
Generators & yield — How Iteration Works Without Eating Memory

The mechanism behind yield "pausing" function execution

Lists materialize everything into memory up front. 100 million items means 100 million objects resident at once.

# Eats all memory
numbers = [x * 2 for x in range(100_000_000)]

# Almost no memory
numbers = (x * 2 for x in range(100_000_000))

The second is a generator expression. No values are pre-created; each one is computed on demand, one per next() call.
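A quick way to see the difference is to compare the container sizes directly. A rough sketch (exact byte counts vary by Python version):

```python
import sys

# The list holds references to every element; the generator
# expression holds only its iteration state.
as_list = [x * 2 for x in range(1_000_000)]
as_gen = (x * 2 for x in range(1_000_000))

print(sys.getsizeof(as_list))  # megabytes of pointer storage
print(sys.getsizeof(as_gen))   # a couple hundred bytes

# Values are produced lazily, one per next() call
print(next(as_gen))  # 0
print(next(as_gen))  # 2
```

Note that getsizeof measures only the container itself, not the elements, so the real gap for the list is even larger.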

What yield Does

Normal functions destroy the stack frame on return. Local variables, execution position — all gone.

yield is different. It returns a value while preserving the stack frame. The next next() call resumes execution at the line after the yield, with all local variables intact.
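A minimal generator makes the pause-and-resume behavior visible. The local variable n survives between calls because the frame is suspended, not destroyed:

```python
def countdown(n):
    # n lives in the generator's suspended frame between calls
    while n > 0:
        yield n   # pause here; hand n back to the caller
        n -= 1    # execution resumes on this line at the next next()

gen = countdown(3)
print(next(gen))  # 3
print(next(gen))  # 2  (n was preserved across the pause)
print(next(gen))  # 1
```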

CPython Internals

In CPython, a generator is a PyGenObject. gi_frame exposes the preserved frame and gi_code the compiled bytecode. Each next() call re-enters _PyEval_EvalFrameDefault, resuming from the saved frame state. (Since CPython 3.11 the frame data is stored in an internal _PyInterpreterFrame embedded in the generator object, and gi_frame is materialized on demand, but the model is the same.)

Unlike normal function calls, no new frame is created — the existing frame is reused. This is why generators are lightweight.
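These internals are partly observable from Python itself. A sketch using the inspect module and the gi_frame attribute (attribute behavior here matches recent CPython; other implementations may differ):

```python
import inspect

def demo():
    x = "kept alive"
    yield 1
    yield 2

g = demo()
print(inspect.getgeneratorstate(g))  # GEN_CREATED

next(g)
print(inspect.getgeneratorstate(g))  # GEN_SUSPENDED
# The suspended frame, locals included, is reachable via gi_frame
print(g.gi_frame.f_locals["x"])      # kept alive

list(g)  # exhaust the generator
print(inspect.getgeneratorstate(g))  # GEN_CLOSED
print(g.gi_frame)                    # None once the frame is released
```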

Key Points

1. yield returns a value while preserving the function's stack frame
2. next() resumes from the line after yield — local variables preserved
3. In CPython, the frame is preserved in PyGenObject.gi_frame
4. send() injects values at yield points — the basis for coroutines
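The send() point deserves a concrete example. A yield used as an expression receives whatever the caller passes to send(); this is the classic pre-async coroutine pattern:

```python
def accumulator():
    total = 0
    while True:
        # yield pauses and hands total out; send(value) resumes
        # with value as the result of this yield expression
        received = yield total
        if received is not None:
            total += received

acc = accumulator()
next(acc)            # prime: advance to the first yield
print(acc.send(10))  # 10
print(acc.send(5))   # 15
```

The priming next() is required: you can only send() into a generator that is already paused at a yield.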

Use Cases

Large file processing — read line by line, keeping memory constant
Infinite sequences — endless iterators like counters or fibonacci
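Both use cases fit in a few lines. A sketch of each (the grep helper and its path argument are illustrative, not from the original):

```python
from itertools import islice

def fibonacci():
    # Infinite sequence: only two integers of state, ever
    a, b = 0, 1
    while True:
        yield a
        a, b = b, a + b

print(list(islice(fibonacci(), 8)))  # [0, 1, 1, 2, 3, 5, 8, 13]

def grep(path, needle):
    # Constant-memory scan: one line resident at a time,
    # no matter how large the file is
    with open(path) as f:
        for line in f:
            if needle in line:
                yield line.rstrip("\n")
```

islice is the standard way to take a finite slice of an infinite generator; calling list() on fibonacci() directly would never terminate.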