Streams process data chunk by chunk, which lets Node.js handle large files and real-time data efficiently without loading everything into memory.
Stream Types#
Readable: a source of data (fs.createReadStream, an incoming HTTP request)
Writable: a destination for data (fs.createWriteStream, an HTTP response)
Duplex: both readable and writable (a TCP socket)
Transform: a Duplex that modifies data as it passes through (compression, encryption)
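All four types come from Node's built-in stream module. A minimal sketch constructing a source and a sink (the logging sink is just for illustration):

```js
const { Readable, Writable } = require('stream');

// A Readable built from an in-memory array (a stand-in for a real source).
const source = Readable.from(['a', 'b', 'c']);

// A Writable that just logs each chunk it receives.
const sink = new Writable({
  write(chunk, encoding, callback) {
    console.log('got:', chunk.toString());
    callback(); // signal that this chunk has been handled
  },
});

source.pipe(sink);
```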
Reading Streams#
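A readable stream pulls data from its source in chunks and emits them as events. A minimal sketch reading a file (big.log is a placeholder path):

```js
const fs = require('fs');

const readable = fs.createReadStream('big.log', {
  encoding: 'utf8',          // emit strings instead of Buffers
  highWaterMark: 64 * 1024,  // internal buffer size (64 KB is the default)
});

readable.on('data', (chunk) => {
  console.log(`received ${chunk.length} characters`);
});
readable.on('end', () => console.log('done reading'));
readable.on('error', (err) => console.error('read failed:', err));
```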
Writing Streams#
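A writable stream accepts chunks with write() and is closed with end(), which flushes any buffered data before emitting 'finish'. A minimal sketch (out.log is a placeholder):

```js
const fs = require('fs');

const writable = fs.createWriteStream('out.log');

writable.write('first line\n');
writable.write('second line\n');
writable.end('final line\n'); // write the last chunk, then close

writable.on('finish', () => console.log('all data flushed'));
writable.on('error', (err) => console.error('write failed:', err));
```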
Transform Streams#
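A transform sits between a readable and a writable and rewrites chunks in flight. A sketch that upper-cases text, standing in for real work like compression or encryption:

```js
const { Transform } = require('stream');

const upperCase = new Transform({
  transform(chunk, encoding, callback) {
    // First argument is an error (or null), second is the output chunk.
    callback(null, chunk.toString().toUpperCase());
  },
});

// Echo stdin back to stdout, upper-cased.
process.stdin.pipe(upperCase).pipe(process.stdout);
```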
Pipeline#
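pipeline() connects a chain of streams, forwards errors from any stage, and destroys every stream on failure, which chained .pipe() calls do not do. A sketch using the promise version from stream/promises (Node 15+; file names are placeholders):

```js
const fs = require('fs');
const zlib = require('zlib');
const { pipeline } = require('stream/promises');

async function compress() {
  await pipeline(
    fs.createReadStream('big.log'),
    zlib.createGzip(),                  // transform stage
    fs.createWriteStream('big.log.gz'),
  );
  console.log('compression finished');
}

compress().catch((err) => console.error('pipeline failed:', err));
```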
HTTP Streaming#
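An HTTP response is a writable stream, so a file can be sent to the client chunk by chunk instead of being buffered in full. A sketch (the port and file name are placeholders):

```js
const fs = require('fs');
const http = require('http');
const { pipeline } = require('stream');

http.createServer((req, res) => {
  res.writeHead(200, { 'Content-Type': 'text/plain' });
  // pipeline cleans up both streams if the client disconnects mid-transfer.
  pipeline(fs.createReadStream('big.log'), res, (err) => {
    if (err) console.error('stream failed:', err);
  });
}).listen(3000);
```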
Async Generators#
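Readable streams are async iterable, and Readable.from() turns an async generator into a stream, so the two compose in both directions. A minimal sketch:

```js
const { Readable } = require('stream');

// An async generator acting as a data source.
async function* numbers() {
  for (let i = 1; i <= 3; i++) {
    yield `${i}\n`;
  }
}

async function main() {
  const readable = Readable.from(numbers());
  // for await...of consumes the stream chunk by chunk.
  for await (const chunk of readable) {
    process.stdout.write(chunk);
  }
}

main().catch(console.error);
```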
Backpressure#
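When a writable's internal buffer fills, write() returns false; the producer should stop and resume on the 'drain' event so memory stays bounded. A sketch of the standard pattern (out.log and the line count are placeholders):

```js
const fs = require('fs');

const writable = fs.createWriteStream('out.log');

function writeMany(total) {
  let i = 0;
  function write() {
    while (i < total) {
      i++;
      // false means the buffer is full: stop and wait for 'drain'.
      if (!writable.write(`line ${i}\n`)) {
        writable.once('drain', write);
        return;
      }
    }
    writable.end();
  }
  write();
}

writeMany(1e6);
```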
Object Mode Streams#
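In object mode, chunks are arbitrary JavaScript values rather than Buffers or strings, which suits record-by-record processing. A sketch that serializes objects to JSON lines:

```js
const { Readable, Transform } = require('stream');

// Readable.from() over an array of objects yields an object-mode stream.
const source = Readable.from([{ id: 1 }, { id: 2 }]);

const stringify = new Transform({
  objectMode: true, // accept objects in, emit strings out
  transform(record, encoding, callback) {
    callback(null, JSON.stringify(record) + '\n');
  },
});

source.pipe(stringify).pipe(process.stdout);
```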
Best Practices#
Memory:
✓ Use streams for large files
✓ Set appropriate highWaterMark
✓ Handle backpressure
✓ Clean up on error
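A sketch of the highWaterMark point above; the 1 MB figure is illustrative, not a recommendation:

```js
const fs = require('fs');

// Bigger chunks mean fewer events but more memory held per chunk;
// the default for file streams is 64 KB.
const reader = fs.createReadStream('big.log', {
  highWaterMark: 1024 * 1024, // 1 MB chunks
});
```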
Error Handling:
✓ Always handle 'error' events
✓ Use pipeline for cleanup
✓ Destroy streams on error
✓ Check stream state
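A sketch of manual cleanup and state checking, essentially what pipeline automates (file names are placeholders):

```js
const fs = require('fs');

const readable = fs.createReadStream('big.log');
const writable = fs.createWriteStream('out.log');

// If one side fails, destroy the other so no stream is left dangling.
readable.on('error', (err) => {
  console.error('read failed:', err);
  writable.destroy(err);
});
writable.on('error', (err) => {
  console.error('write failed:', err);
  readable.destroy(err);
});

// Check stream state before acting on it.
if (!writable.writableEnded) {
  readable.pipe(writable);
}
```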
Performance:
✓ Tune chunk size for workload
✓ Use object mode for streams of JavaScript objects
✓ Avoid unnecessary transforms
✓ Profile memory usage
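A sketch of profiling memory while a pipeline runs, using the built-in process.memoryUsage(); the one-second interval is arbitrary and the file names are placeholders:

```js
const fs = require('fs');
const zlib = require('zlib');
const { pipeline } = require('stream/promises');

async function main() {
  // Sample heap usage once a second while the work runs.
  const timer = setInterval(() => {
    const mb = process.memoryUsage().heapUsed / 1024 / 1024;
    console.log(`heap: ${mb.toFixed(1)} MB`);
  }, 1000);

  await pipeline(
    fs.createReadStream('big.log'),
    zlib.createGzip(),
    fs.createWriteStream('big.log.gz'),
  );
  clearInterval(timer);
}

main().catch(console.error);
```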
Conclusion#
Streams enable efficient data processing in Node.js. Use pipeline for composing streams safely, handle backpressure to prevent memory issues, and leverage async iteration for cleaner code. Streams are essential for processing large files and real-time data.