JavaScript generators (function*) are special functions that can pause and resume execution using the yield keyword. They enable lazy evaluation, infinite sequences, bidirectional communication, and elegant async patterns. While generators are sometimes overlooked in favor of async/await, they solve problems that promises cannot — including cancelable streams, infinite data sources, and memory-efficient pipelines.
Generator Basics: function* and yield
A generator function returns a generator object that implements the Iterator protocol. Each yield pauses execution and returns a value; calling next() resumes it. Generators are lazy — they compute values on demand.
// Generator function syntax: function* with yield
function* counter(start = 0) {
  let i = start;
  while (true) {
    yield i++; // pauses here, returns i, resumes on next()
  }
}

const gen = counter(1);
console.log(gen.next()); // { value: 1, done: false }
console.log(gen.next()); // { value: 2, done: false }
console.log(gen.next()); // { value: 3, done: false }
// Finite generator
function* range(start, end, step = 1) {
  for (let i = start; i < end; i += step) {
    yield i;
  }
}

// for...of automatically calls .next() and stops at done: true
for (const num of range(0, 10, 2)) {
  console.log(num); // 0, 2, 4, 6, 8
}

// Spread operator works with generators
const nums = [...range(1, 6)]; // [1, 2, 3, 4, 5]

// Destructuring works too
const [a, b, c] = range(10, 20); // a=10, b=11, c=12

Two-Way Communication: yield as Expression
yield is more than a way to emit values — it is an expression that evaluates to whatever is passed to next(value) when execution resumes. This enables stateful coroutines and the classic Redux-Saga pattern.
// Generators support two-way communication
// yield receives values via next(value)
function* calculator() {
  let result = 0;
  while (true) {
    const input = yield result; // pauses and sends result; receives input
    if (input === null) break;
    result += input;
  }
  return result;
}

const calc = calculator();
calc.next();     // start: { value: 0, done: false }
calc.next(10);   // add 10: { value: 10, done: false }
calc.next(5);    // add 5: { value: 15, done: false }
calc.next(null); // stop: { value: 15, done: true }
// Generator as stateful iterator
function* idGenerator(prefix = 'id') {
  let id = 1;
  while (true) {
    const reset = yield `${prefix}-${id}`;
    if (reset) {
      id = 1;
    } else {
      id++;
    }
  }
}

const ids = idGenerator('user');
console.log(ids.next().value);     // 'user-1'
console.log(ids.next().value);     // 'user-2'
console.log(ids.next(true).value); // 'user-1' (reset)
console.log(ids.next().value);     // 'user-2'

yield*: Delegating to Other Iterables
yield* delegates to another iterable — a generator, array, Set, Map, or string. It yields every value the inner iterable produces, and when the inner iterable is a generator, the yield* expression itself evaluates to that generator's return value.
// yield* — delegate to another iterable
function* innerGen() {
  yield 'a';
  yield 'b';
  yield 'c';
}

function* outerGen() {
  yield 1;
  yield* innerGen(); // delegate: yields 'a', 'b', 'c'
  yield* [4, 5, 6];  // works with any iterable
  yield 7;
}

console.log([...outerGen()]); // [1, 'a', 'b', 'c', 4, 5, 6, 7]
// Practical: flatten nested arrays
function* flatten(arr) {
  for (const item of arr) {
    if (Array.isArray(item)) {
      yield* flatten(item); // recursive delegation
    } else {
      yield item;
    }
  }
}

const nested = [1, [2, [3, 4], 5], [6, 7]];
console.log([...flatten(nested)]); // [1, 2, 3, 4, 5, 6, 7]
// Tree traversal with yield*
function* walkTree(node) {
  yield node.value;
  for (const child of node.children ?? []) {
    yield* walkTree(child); // depth-first traversal
  }
}

Async Generators: Streaming Data
Async generators (async function*) combine generators with async/await. They are perfect for streaming APIs — processing data as it arrives rather than waiting for everything to load.
// Async generators: async function* with yield
async function* streamLines(url) {
  const response = await fetch(url);
  const reader = response.body.getReader();
  const decoder = new TextDecoder();
  let buffer = '';
  while (true) {
    const { done, value } = await reader.read();
    if (done) {
      if (buffer) yield buffer;
      break;
    }
    buffer += decoder.decode(value, { stream: true });
    const lines = buffer.split('\n');
    buffer = lines.pop() ?? '';
    for (const line of lines) {
      yield line; // yield each complete line
    }
  }
}
// Usage with for await...of
async function processCSV(url) {
  let lineNumber = 0;
  for await (const line of streamLines(url)) {
    lineNumber++;
    if (lineNumber === 1) continue; // skip header
    const [name, score] = line.split(',');
    console.log(`${name}: ${score}`);
  }
}
// SSE (Server-Sent Events) as async generator
async function* sseStream(url) {
  const response = await fetch(url);
  const reader = response.body.getReader();
  const decoder = new TextDecoder();
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    const text = decoder.decode(value, { stream: true });
    // NB: a production version should buffer partial lines across
    // chunks, as streamLines does above
    for (const line of text.split('\n')) {
      if (line.startsWith('data: ')) {
        yield JSON.parse(line.slice(6));
      }
    }
  }
}

Real-World Patterns
Generators enable lazy pipelines that process only the data needed. Combined with filter(), map(), and take() generators, you can compose data transformations without creating intermediate arrays.
// Real-world: infinite scroll with generator
// Yields the fetchPage(page) promise; the caller awaits it and passes
// the resolved { items, totalPages } page back in via next(result).
function* paginator(fetchPage) {
  let page = 1;
  let hasMore = true;
  while (hasMore) {
    const { items, totalPages } = yield fetchPage(page);
    hasMore = page < totalPages;
    page++;
  }
}
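Driving this generator takes a small async loop: await each yielded promise, then feed the resolved page back in through next(). A sketch, with a fake fetchPage standing in for a real API (the paginator is repeated so the snippet is self-contained):

```javascript
// Repeated from above so the snippet runs standalone
function* paginator(fetchPage) {
  let page = 1;
  let hasMore = true;
  while (hasMore) {
    const { items, totalPages } = yield fetchPage(page);
    hasMore = page < totalPages;
    page++;
  }
}

// Await each yielded promise, send the resolved page back via next()
async function collectAllPages(fetchPage) {
  const pages = paginator(fetchPage);
  const all = [];
  let step = pages.next(); // run to the first yield
  while (!step.done) {
    const result = await step.value;  // the promise the generator yielded
    all.push(...result.items);
    step = pages.next(result);        // feed the resolved page back in
  }
  return all;
}

// Fake API: 3 pages of 2 items each
const fakeFetchPage = async (page) => ({
  items: [`item-${page}a`, `item-${page}b`],
  totalPages: 3,
});

collectAllPages(fakeFetchPage).then(items => console.log(items.length)); // 6
```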
// Lazy pipeline with generators
function* map(iterable, fn) {
  for (const item of iterable) {
    yield fn(item);
  }
}

function* filter(iterable, predicate) {
  for (const item of iterable) {
    if (predicate(item)) yield item;
  }
}

function* take(iterable, n) {
  let count = 0;
  for (const item of iterable) {
    if (count++ >= n) break;
    yield item;
  }
}
// Lazy pipeline — no intermediate arrays created!
const first10EvenSquares = [
  ...take(
    filter(
      map(range(1, Infinity), x => x * x),
      x => x % 2 === 0
    ),
    10
  )
];
// [4, 16, 36, 64, 100, 144, 196, 256, 324, 400]
// Observable-like: cancelable async iteration with a per-item timeout.
// (Throwing inside a setTimeout callback would be an uncaught error that
// never reaches the consumer, so instead each next() call is raced
// against a rejecting timeout promise.)
async function* withTimeout(asyncIterable, timeoutMs) {
  const iterator = asyncIterable[Symbol.asyncIterator]();
  try {
    while (true) {
      let timer;
      const timeout = new Promise((_, reject) => {
        timer = setTimeout(() => reject(new Error('Stream timed out')), timeoutMs);
      });
      const result = await Promise.race([iterator.next(), timeout]);
      clearTimeout(timer);
      if (result.done) break;
      yield result.value;
    }
  } finally {
    await iterator.return?.(); // close the source stream on exit
  }
}

Generators vs Alternatives
| Feature | Generator | async/await | Promise | Observable |
|---|---|---|---|---|
| Infinite sequences | Perfect | No | No | Yes |
| Lazy evaluation | Yes (pull-based) | No | No | Yes (push) |
| Backpressure | Natural (pull) | No | No | Yes |
| Streaming async | async function* | No | No | Yes (RxJS) |
| Two-way comms | yield expression | No | No | No |
| Browser support | ES2015+ (all) | ES2017+ | ES2015+ | Requires RxJS |
Best Practices
- Use generators for lazy sequences and infinite data sources. Use async/await for one-shot async operations that return a single value.
- Return early from generators with return to set done: true. Any code after return is unreachable. Use try/finally for cleanup when a generator is abandoned.
- Async generators are ideal for streaming HTTP responses, SSE streams, and WebSocket message processing.
- Build reusable pipeline helpers (map, filter, take, zip) as generator functions for memory-efficient data transformations.
- TypeScript: annotate generator return types as Generator<YieldType, ReturnType, NextType> for full type safety.
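The try/finally advice above is worth seeing concretely. A minimal sketch, with a hypothetical row-reading generator (openConnection/teardown details elided):

```javascript
// try/finally runs even when the consumer abandons the generator.
// readRows and its connection string are illustrative placeholders.
function* readRows(connection) {
  try {
    let row = 0;
    while (true) {
      yield `row-${row++} from ${connection}`;
    }
  } finally {
    // Runs on return(), throw(), or a break out of for...of
    console.log(`closing ${connection}`);
  }
}

const rows = readRows('db-1');
console.log(rows.next().value); // 'row-0 from db-1'
rows.return();                  // triggers the finally block: logs 'closing db-1'
console.log(rows.next());       // { value: undefined, done: true }
```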
Frequently Asked Questions
What is the difference between a generator and an iterator?
An iterator is any object with a next() method that returns { value, done }. A generator is a function that automatically creates and manages an iterator. All generators are iterators, but not all iterators are generators. Arrays, Sets, Maps, and strings have built-in iterators; you implement custom iterators manually.
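To make the distinction concrete, here is the same countdown written twice — once as a hand-rolled iterator object, once as a generator:

```javascript
// Manual iterator: an object with next() returning { value, done }
function makeCountdown(from) {
  let current = from;
  return {
    next() {
      return current > 0
        ? { value: current--, done: false }
        : { value: undefined, done: true };
    },
    [Symbol.iterator]() { return this; }, // also make it iterable
  };
}

// Generator: the iterator plumbing is created for you
function* countdown(from) {
  for (let i = from; i > 0; i--) yield i;
}

console.log([...makeCountdown(3)]); // [3, 2, 1]
console.log([...countdown(3)]);     // [3, 2, 1]
```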
Can generators replace async/await?
They once did — before async/await existed, libraries like co() used generators to simulate async code. Today, async/await is simpler for one-shot promises. But async generators shine for streaming: processing data as it arrives, handling SSE, or implementing cancelable operations.
What is the Iterator Protocol?
An object implementing the Iterator Protocol has a next() method returning { value: T, done: boolean }. An Iterable has a [Symbol.iterator]() method returning an iterator. Generator objects implement both. This protocol is what enables for...of, spread, and destructuring to work with any custom data structure.
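For example, a custom data structure opts into for...of, spread, and destructuring by implementing [Symbol.iterator] — most conveniently as a generator method. A minimal, hypothetical LinkedList:

```javascript
// A generator method named [Symbol.iterator] makes the class iterable
class LinkedList {
  #head = null;
  push(value) {
    this.#head = { value, next: this.#head };
    return this;
  }
  *[Symbol.iterator]() {
    for (let node = this.#head; node; node = node.next) {
      yield node.value;
    }
  }
}

const list = new LinkedList().push(1).push(2).push(3);
console.log([...list]); // [3, 2, 1] — most recently pushed first
```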
Are generators good for Redux-Saga?
Yes — Redux-Saga uses generators extensively. Sagas are generator functions that yield effect descriptors (call, put, take, fork). The middleware interprets these effects and handles async execution. This makes side effects testable without mocking — you just check the yielded descriptors.
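A drastically simplified sketch of the idea — a toy effect runner with stand-in call/put helpers, not the real redux-saga middleware API:

```javascript
// Effect descriptors are plain objects; the saga never runs side effects
const call = (fn, ...args) => ({ type: 'CALL', fn, args });
const put = (action) => ({ type: 'PUT', action });

// The runner interprets each yielded descriptor and feeds results back
async function runSaga(saga, dispatch) {
  const it = saga();
  let result = it.next();
  while (!result.done) {
    const effect = result.value;
    if (effect.type === 'CALL') {
      result = it.next(await effect.fn(...effect.args)); // do the async work
    } else if (effect.type === 'PUT') {
      dispatch(effect.action);
      result = it.next();
    }
  }
}

const fetchUser = async (id) => ({ id, name: 'Ada' });
function* userSaga() {
  const user = yield call(fetchUser, 1);
  yield put({ type: 'USER_LOADED', user });
}

// Testable without mocking: just inspect the yielded descriptors
const it = userSaga();
console.log(it.next().value.type); // 'CALL'
```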
How do I cancel a generator or async generator?
Call generator.return(value) to force-finish it. For async generators used in for await...of, breaking or returning from the loop calls the generator's return() method automatically. Use try/finally inside the generator for cleanup on cancellation. AbortController integrates naturally for fetch-based async generators.
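A sketch of the async case: breaking out of for await...of implicitly calls return(), which runs the generator's finally block.

```javascript
// An async generator that ticks forever until the consumer cancels it
async function* ticker(intervalMs) {
  let n = 0;
  try {
    while (true) {
      await new Promise(resolve => setTimeout(resolve, intervalMs));
      yield n++;
    }
  } finally {
    console.log('ticker cancelled'); // runs when the consumer breaks
  }
}

(async () => {
  for await (const tick of ticker(10)) {
    console.log(tick);
    if (tick === 2) break; // implicitly calls the generator's return()
  }
})();
// logs: 0, 1, 2, 'ticker cancelled'
```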