🌊

Node.js Streams — Processing Large Data Without Memory Explosion

How Readable, Writable, Transform + backpressure work

Backpressure

writable.write(chunk) returns false when the internal buffer exceeds highWaterMark. The Readable should pause and wait for the Writable's 'drain' event before resuming. .pipe() handles this handshake automatically.

Without backpressure: fast Readable + slow Writable → buffer grows → memory explodes.

pipeline()

.pipe() doesn't propagate errors between streams, so a failure mid-chain can leave the other streams open and leak file descriptors or memory. pipeline() forwards errors and auto-destroys every stream in the chain.

Key Points

1. Streams process data in chunks — never load everything into memory
2. Chain Readable → Transform → Writable with pipe()
3. write() returns false = buffer full → handle backpressure with pause/drain
4. pipeline() safer than pipe() — auto-destroys all streams on error

Use Cases

CSV/log file processing — process GB-scale files with ~64KB of memory
HTTP proxy — forward request body without buffering