What is yield in Python?

Last updated: April 1, 2026

Quick Answer: In Python, yield is a keyword that transforms a function into a generator — a special iterator that produces values one at a time, pausing and preserving the function's entire state between each call. Unlike return, which exits a function permanently, yield suspends execution and resumes from the exact same point on the next call. Introduced in Python 2.2 on December 21, 2001, via PEP 255, generators using yield are dramatically memory-efficient: processing 10 million items with a generator uses roughly 112 bytes, while an equivalent list uses approximately 80 megabytes — a reduction of over 700,000 times.

Overview of the yield Keyword in Python

In Python, the yield keyword transforms an ordinary function into a generator function — a powerful construct that produces values lazily, one at a time, rather than computing and storing an entire sequence upfront. When a generator function is called, Python does not execute its body immediately. Instead, it returns a generator object — an iterator implementing Python's iterator protocol. Each time next() is called on the generator (directly or via a for loop), the function body executes until the next yield statement. At that point, the yielded value is returned to the caller and the function is paused, with all local variables, the exact execution position, and the frame's internal state preserved as-is. The subsequent next() call resumes execution from precisely where it stopped.

This lazy evaluation model is fundamentally different from functions using return, which terminate completely and discard all state. A generator function can yield many values over its lifetime, making it conceptually a stateful, resumable sequence producer. This approach is central to idiomatic, memory-efficient Python, especially for processing large datasets, implementing infinite sequences, and building data pipelines.

The yield keyword was introduced in Python 2.2, released on December 21, 2001, via PEP 255, authored by Neil Schemenauer, Tim Peters, and Magnus Lie Hetland. PEP 255 described generators as a simple and powerful tool for creating iterators that eliminate the need for custom iterator classes with boilerplate __iter__() and __next__() methods. Since then, the generator model has expanded significantly — most importantly with yield from in Python 3.3 and its role as the foundation for Python's modern async/await coroutine system introduced in Python 3.5.

How yield Works: Mechanics, Examples, and Memory Efficiency

Understanding yield requires seeing it in practice. Consider a simple counting generator:

```python
def count_up(start, end):
    current = start
    while current <= end:
        yield current
        current += 1

gen = count_up(1, 5)
print(next(gen))   # Output: 1
print(next(gen))   # Output: 2
for value in gen:
    print(value)   # Output: 3, 4, 5
```

Calling count_up(1, 5) does not execute any code inside the function — Python returns a generator object. The first next(gen) call enters the function, runs until yield current (with current = 1), returns 1, and freezes. The next call resumes after the yield, increments current, loops back, and yields 2. The for loop calls next() implicitly until the generator raises StopIteration, ending the loop naturally.

The memory efficiency of generators versus lists is dramatic. Generating 10 million integers:
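
One way to see the gap is with sys.getsizeof — a rough illustration only, since getsizeof reports the container object's own footprint (for a list, its pointer array) and not the integer objects themselves:

```python
import sys

# Generator expression: the object itself is tiny; no values are stored
gen = (i for i in range(10_000_000))
print(sys.getsizeof(gen))    # a couple hundred bytes, regardless of length

# Equivalent list: all 10 million integers materialized up front
lst = list(range(10_000_000))
print(sys.getsizeof(lst))    # tens of megabytes for the pointer array alone
```

Exact byte counts vary by Python version and platform, but the ratio stays in the hundreds of thousands.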

This represents a memory reduction of over 700,000 times. For applications processing gigabyte-scale log files, streaming financial data, or large database result sets, this difference is what makes the task feasible rather than impossible.

Python 3.3 (released September 29, 2012) introduced yield from via PEP 380, enabling a generator to transparently delegate to a sub-generator or iterable:

```python
def flatten(nested):
    for item in nested:
        if isinstance(item, list):
            yield from flatten(item)
        else:
            yield item

result = list(flatten([1, [2, [3, 4]], 5]))
# result: [1, 2, 3, 4, 5]
```

yield from properly forwards sent values, thrown exceptions, and return values between outer and inner generators — making complex recursive generator compositions clean and correct. It was also foundational to Python's async programming model: asyncio (Python 3.4, PEP 3156) originally used yield from for coroutines before async def and await were added as dedicated syntax in Python 3.5 via PEP 492.

Beyond one-directional value production, generators support two-way communication via the .send(value) method, which passes a value into the running generator where it becomes the result of the yield expression. This bidirectional capability is what made generators suitable as the basis for coroutines.
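
A minimal sketch of this two-way flow, using a hypothetical running-average coroutine (the initial next() call is required to "prime" the generator, advancing it to the first yield before any send()):

```python
def running_average():
    total = 0.0
    count = 0
    average = None
    while True:
        value = yield average   # receives whatever the caller sends
        total += value
        count += 1
        average = total / count

avg = running_average()
next(avg)              # prime: run to the first yield (returns None)
print(avg.send(10))    # 10.0
print(avg.send(20))    # 15.0
print(avg.send(30))    # 20.0
```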

Generator expressions, introduced in Python 2.4 (November 30, 2004) via PEP 289, provide concise inline generator syntax:

```python
# List comprehension: entire list built immediately
squares_list = [x**2 for x in range(1000)]

# Generator expression: lazy, one value at a time
squares_gen = (x**2 for x in range(1000))

# Extra parentheses optional when it is the sole argument of a call:
total = sum(x**2 for x in range(1000))
```

Python's standard library itertools module provides dozens of generator-based utilities — chain(), islice(), groupby(), product(), combinations() — all built on the generator protocol to enable memory-efficient processing of complex sequences without intermediate lists.
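
A few of these utilities in action — each returns a lazy iterator, and list() is used here only to display the results:

```python
import itertools

# chain(): concatenate iterables without building an intermediate list
print(list(itertools.chain("ab", "cd")))            # ['a', 'b', 'c', 'd']

# islice(): take a bounded slice of an infinite iterator
evens = itertools.count(start=0, step=2)            # 0, 2, 4, ... forever
print(list(itertools.islice(evens, 5)))             # [0, 2, 4, 6, 8]

# combinations(): all 2-element subsets, generated on demand
print(list(itertools.combinations([1, 2, 3], 2)))   # [(1, 2), (1, 3), (2, 3)]
```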

Common Misconceptions About yield in Python

Misconception 1: yield and return are interchangeable. Using return terminates a function and sends back one value, discarding all state. Using yield transforms the entire function into a generator function — no code executes on the initial call, and the function can produce multiple values over its lifetime. In Python 3, a generator function may contain a return statement, but the value is not handed back like an ordinary function result: it ends iteration by raising StopIteration, with the returned value attached as the exception's value attribute (which is exactly what yield from captures). In Python 2, writing return value inside a generator was a SyntaxError. The two keywords serve fundamentally different purposes.
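
The Python 3 behavior can be observed directly — the return value rides along on the StopIteration exception (function name here is illustrative):

```python
def gen_with_return():
    yield 1
    return "done"     # becomes StopIteration.value, not a normal result

g = gen_with_return()
print(next(g))        # 1
try:
    next(g)
except StopIteration as exc:
    print(exc.value)  # done
```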

Misconception 2: Generators can be iterated multiple times. Unlike lists, tuples, or strings, a generator is a single-use iterator. Once exhausted, iterating again produces nothing — the generator does not reset. This surprises many developers:

```python
gen = (x for x in range(3))
print(list(gen))  # [0, 1, 2]
print(list(gen))  # [] (generator is exhausted)
```

To iterate the same sequence multiple times, either convert it to a list with list(), store results in another data structure, or recreate the generator object for each pass.

Misconception 3: Generators are always faster than lists. Generators excel at memory efficiency but are not always faster in raw execution speed. For small datasets that fit comfortably in memory, a list comprehension can execute faster because list access has less overhead — no repeated __next__() calls, no generator state save/restore, no re-entry of the function frame. Python core developer Raymond Hettinger has noted publicly that the primary advantage of generators is memory reduction, not raw speed. For large datasets, however, generators improve both memory consumption and time-to-first-result, since the first value is available immediately without waiting for the entire sequence to be computed.

Practical Applications of yield in Python

The yield keyword and generators are central to idiomatic, production-quality Python across many domains: streaming large files line by line, consuming paginated API responses, building multi-stage data pipelines, implementing infinite sequences, and writing custom iterators without boilerplate classes.
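
As a sketch of the pipeline pattern, each stage below is a lazy iterator that pulls one item at a time from the stage before it (the function names and sample log lines are illustrative; in production the source would typically be an open file object, itself a lazy line iterator):

```python
def grep(lines, needle):
    # Lazily keep only the lines containing the needle
    return (line for line in lines if needle in line)

def last_field(lines):
    # Lazily extract the final whitespace-separated field of each line
    for line in lines:
        yield line.split()[-1]

# A small in-memory log stands in for a file here
log = ["INFO  start", "ERROR 500", "INFO  ok", "ERROR 503"]
codes = list(last_field(grep(iter(log), "ERROR")))
print(codes)   # ['500', '503']
```

Because every stage is lazy, the pipeline never holds more than one line in memory at a time, no matter how large the input is.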

Mastering yield is widely considered a hallmark of intermediate-to-advanced Python proficiency. The Python official documentation explicitly recommends generators for any situation requiring a custom iterator, noting they automatically provide correct __iter__() and __next__() implementations that would otherwise require substantial boilerplate in a class-based approach. For any Python developer working with data processing, web development, or systems programming, understanding yield is an essential skill.

Related Questions

What is the difference between yield and return in Python?

The return statement terminates a function immediately and sends a single value back to the caller, discarding all local state permanently. The yield statement suspends the function at that point, returns a value to the caller, and preserves all local variables and execution position so the function can resume exactly where it left off on the next call. A function containing yield is called a generator function; invoking it returns a generator object rather than executing the function body. A key practical difference is that a function can only return once, but a generator function can yield many values across multiple calls before naturally ending by raising StopIteration.
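
The contrast can be shown in a few lines (function names are illustrative):

```python
def give_one():
    return 1              # executes immediately; all state discarded

def give_many():
    yield 1               # calling give_many() runs no code yet
    yield 2

print(give_one())         # 1
gen = give_many()         # a generator object; body has not started
print(list(gen))          # [1, 2]
```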

What is a generator in Python?

A generator in Python is a special iterator that produces values lazily — one at a time on demand — rather than computing and storing all values in memory upfront. Generators are created either by using yield inside a function (a generator function) or by writing a generator expression with parentheses, such as (x**2 for x in range(100)). They automatically implement Python's iterator protocol with correct __iter__() and __next__() methods without any boilerplate. Processing 1 million records using a generator instead of a list can reduce memory consumption from hundreds of megabytes to roughly 100–200 bytes for the generator object itself.

What is yield from in Python?

yield from, introduced in Python 3.3 via PEP 380 (released September 29, 2012), allows a generator to transparently delegate iteration to another iterable or sub-generator. When a generator executes yield from some_iterable, each value from that iterable is yielded directly to the outer caller as though the inner code were part of the outer generator. It also correctly forwards sent values, thrown exceptions, and return values between the outer and inner generators, enabling clean recursive generator composition. yield from was foundational to Python's asyncio coroutine system before the dedicated async/await syntax arrived in Python 3.5.
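
The return-value forwarding can be demonstrated in a small sketch (names are illustrative): the sub-generator's return value becomes the value of the yield from expression in the outer generator.

```python
def inner():
    yield 1
    return "inner done"          # captured by yield from in the caller

def outer():
    result = yield from inner()  # yields 1, then binds the return value
    yield result

print(list(outer()))             # [1, 'inner done']
```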

When should you use yield instead of return in Python?

Use yield instead of return when your function must produce multiple values over time, when the full dataset is too large to hold in memory at once, or when the caller only needs items one at a time. Generators are ideal for reading large files line by line, consuming paginated API responses, generating infinite sequences, or building processing pipelines. For sequences containing fewer than a few thousand items that will be iterated multiple times, a list returned with return may be simpler and fast enough. The Python documentation recommends generators whenever you would otherwise write a class implementing __iter__ and __next__ methods.

What are generator expressions in Python?

Generator expressions are a compact inline syntax for creating generators, introduced in Python 2.4 via PEP 289 (released November 30, 2004). They look like list comprehensions but use parentheses instead of square brackets — (x**2 for x in range(100)) is a generator expression while [x**2 for x in range(100)] is a list comprehension. When a generator expression is the sole argument to a function call, the extra parentheses may be omitted: sum(x**2 for x in range(100)) is valid Python. On a typical 64-bit CPython build, the generator object itself occupies roughly 100–200 bytes regardless of how many items it represents, making generator expressions memory-efficient alternatives to list comprehensions for sequences consumed only once.

Sources

  1. Yield Expressions - Python 3 Language Reference (CC BY-SA 3.0)
  2. PEP 255 – Simple Generators (Open Publication License)
  3. Generator (computer programming) - Wikipedia (CC BY-SA 4.0)