How to create Node.js Streams and what types of Streams are available?
Node.js streams have a reputation for being hard to work with, and even harder to understand.
As Dominic Tarr put it, “Streams are Node’s best and most misunderstood idea.” Even Dan Abramov, creator of Redux and a React.js core team member, has admitted to being scared of Node streams.
What are streams?
Streams are one of the core concepts that underpin Node.js applications. They are a data-handling method used to read input and write output sequentially, piece by piece.
The use of streams makes it possible to read and write files, communicate over a network, or exchange any kind of information end-to-end.
In contrast to the traditional approach, where a file is read into memory all at once, streams read chunks of data piece by piece and process the content without keeping the entire file in memory.
This makes streams extremely useful for working with enormous volumes of data. A file can exceed your available memory, for instance, making it impossible to read the whole thing into memory in order to process it. Streams come to the rescue in this situation!
Streams let you handle larger files by processing the data in smaller parts.
For instance, “streaming” services like Netflix or YouTube don’t require you to download the video and audio feed all at once. Instead, the video is delivered to your browser in a continuous stream of chunks, letting viewers start watching and/or listening almost immediately.
Why streams?
Streams essentially offer two key advantages over other data management techniques:
- Memory efficiency: you don’t need to load large amounts of data into memory before you can process it (illustrated in the sketch below).
- Time efficiency: processing data as soon as it arrives takes far less time than waiting until the entire payload has been transmitted.
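As a rough illustration of the memory-efficiency point, the sketch below contrasts reading a hypothetical large file `big.log` in one shot with streaming it chunk by chunk (the file name is an assumption for the example):

```js
const fs = require('fs');

// Buffered: the whole file must fit in memory before processing starts.
fs.readFile('big.log', 'utf8', (err, data) => {
  if (err) throw err;
  console.log(`Read ${data.length} characters in one shot`);
});

// Streamed: data arrives in small chunks (64 KiB by default for files),
// so memory usage stays flat regardless of the file size.
fs.createReadStream('big.log', { encoding: 'utf8' })
  .on('data', (chunk) => console.log(`Got a chunk of ${chunk.length} characters`))
  .on('end', () => console.log('Done'));
```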
Node.js supports 4 different kinds of streams:
- Writable: streams to which data can be written. For instance, fs.createWriteStream() lets us write data to a file using streams.
- Readable: streams from which data can be read. For instance, fs.createReadStream() lets us read the contents of a file.
- Duplex: streams that are both readable and writable. For instance, net.Socket.
- Transform: streams that can modify or transform the data as it is written and read. For instance, in file compression, you can write compressed data to a file and read decompressed data from it (see the sketch after this list).
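To make the Transform case concrete, here is a minimal sketch of a Transform stream that upper-cases everything passing through it; the transform logic is purely illustrative:

```js
const { Transform } = require('stream');

// A Transform sits between a writable side and a readable side:
// whatever is written to it comes out the other end, modified.
const uppercase = new Transform({
  transform(chunk, encoding, callback) {
    // Pass the transformed chunk on to the readable side.
    callback(null, chunk.toString().toUpperCase());
  }
});

process.stdin.pipe(uppercase).pipe(process.stdout);
```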
If you’ve used Node.js before, you may have already come across streams. In a Node.js-based HTTP server, for instance, the request is a readable stream and the response is a writable stream. You may also have used the fs module, which lets you work with both readable and writable file streams. Whenever you use Express, you are using streams to interact with the client, and streams are used by every database connection driver you can work with, because TCP sockets, the TLS stack, and other connections are all based on Node.js streams.
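As a quick illustration of the HTTP case, the sketch below is a minimal echo server in which the request (readable) is piped straight into the response (writable); the port number is arbitrary:

```js
const http = require('http');

const server = http.createServer((req, res) => {
  // req is a readable stream and res is a writable stream,
  // so the request body can be piped straight back to the client.
  req.pipe(res);
});

server.listen(3000);
```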
Creating a readable stream
First, the Readable stream is required and initialized:

```js
const Stream = require('stream');

// Supplying a no-op read() implementation lets us push data into the
// stream manually without a "_read() is not implemented" error.
const readableStream = new Stream.Readable({
  read() {}
});
```

Now that the stream is initialized, we can send data to it:

```js
readableStream.push('ping!');
readableStream.push('pong!');
```
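To actually see those chunks, you can attach a 'data' listener and, once you are done pushing, signal the end of the stream with push(null). A minimal sketch:

```js
readableStream.on('data', (chunk) => {
  console.log(chunk.toString()); // 'ping!' then 'pong!'
});

// Signal that no more data will be pushed.
readableStream.push(null);
```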
Async Iterator
When working with streams, using an async iterator is strongly recommended. According to Dr. Axel Rauschmayer, asynchronous iteration is a protocol for retrieving the contents of a data container asynchronously (meaning the current “task” may pause before retrieving an item). It’s worth noting that the stream async iterator implementation uses the ‘readable’ event internally.
Async iterators can be used to read from readable streams:
```js
import * as fs from 'fs';

async function logChunks(readable) {
  for await (const chunk of readable) {
    console.log(chunk);
  }
}

const readable = fs.createReadStream('tmp/test.txt', { encoding: 'utf8' });
logChunks(readable);
// Output:
// 'This is a test!\n'
```
How to create a writable stream
To write data to a writable stream, you call write() on the stream instance, as in the following example:
```js
const fs = require('fs');

const readableStream = fs.createReadStream('file1.txt');
const writableStream = fs.createWriteStream('file2.txt');

readableStream.setEncoding('utf8');

readableStream.on('data', function (chunk) {
  writableStream.write(chunk);
});
```
The code above is straightforward. It simply reads chunks of data from an input stream and writes them to the destination using write(). This function returns a boolean indicating whether the operation succeeded. If it returns true, the write succeeded and you can keep writing more data. If it returns false, the stream’s internal buffer is full and you should stop writing for the moment. The writable stream will emit a ‘drain’ event when it is ready to accept more data.
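A sketch of honoring that contract, pausing the reader when write() returns false and resuming on 'drain' (file names reused from the example above):

```js
const fs = require('fs');

const readableStream = fs.createReadStream('file1.txt');
const writableStream = fs.createWriteStream('file2.txt');

readableStream.on('data', (chunk) => {
  // write() returns false once the internal buffer is full.
  if (!writableStream.write(chunk)) {
    readableStream.pause();
    // Resume reading once the writable side has drained.
    writableStream.once('drain', () => readableStream.resume());
  }
});

readableStream.on('end', () => writableStream.end());
```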
Calling the writable.end() method signals that no more data will be written to the Writable. If provided, the optional callback function is attached as a listener for the ‘finish’ event.
```js
const fs = require('fs');
const file = fs.createWriteStream('example.txt');

file.write('hello, ');
file.end('world!');
// Writing more now is not allowed!
```
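Since end() attaches its optional callback as a 'finish' listener, you can equivalently listen for 'finish' yourself; a small sketch building on the example above:

```js
const fs = require('fs');
const file = fs.createWriteStream('example.txt');

file.on('finish', () => {
  console.log('All data has been flushed to example.txt');
});

file.write('hello, ');
file.end('world!'); // flushes the remaining data, then emits 'finish'
```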
pipeline()
Piping means feeding the output of one stream into another stream as its input. There is no limit on the number of piping operations; in other words, piping lets you process streamed data in multiple stages.
Stream.pipeline() was introduced in Node 10.x. This module function lets you connect streams together, forwards errors, cleans up properly, and provides a callback when the pipeline is finished.
Here is a use case for the pipeline:
```js
const { pipeline } = require('stream');
const fs = require('fs');
const zlib = require('zlib');

pipeline(
  fs.createReadStream('The.Matrix.1080p.mkv'),
  zlib.createGzip(),
  fs.createWriteStream('The.Matrix.1080p.mkv.gz'),
  (err) => {
    if (err) {
      console.error('Pipeline failed', err);
    } else {
      console.log('Pipeline succeeded');
    }
  }
);
```
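Newer Node.js versions (15 and later) also ship a promise-based pipeline() in the stream/promises module, which fits naturally into async/await code. A sketch using the same files as above:

```js
const { pipeline } = require('stream/promises');
const fs = require('fs');
const zlib = require('zlib');

async function run() {
  // Awaiting the pipeline replaces the completion callback.
  await pipeline(
    fs.createReadStream('The.Matrix.1080p.mkv'),
    zlib.createGzip(),
    fs.createWriteStream('The.Matrix.1080p.mkv.gz')
  );
  console.log('Pipeline succeeded');
}

run().catch((err) => console.error('Pipeline failed', err));
```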
The Stream Module
The Node.js stream module provides the foundation on which all streaming APIs are built.
The stream module is a native Node.js module, available by default. Every stream is an instance of EventEmitter, the Node.js class that handles events asynchronously. Consequently, streams are inherently event-based.
To use the stream module:

```js
const stream = require('stream');
```
The stream module is useful for creating new types of stream instances. However, you usually don’t need the stream module just to consume streams.
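For example, here is a minimal sketch of a custom readable stream built by subclassing stream.Readable; the CounterStream class and its 1-to-5 range are purely illustrative:

```js
const { Readable } = require('stream');

// A readable stream that emits the numbers 1..5, then ends.
class CounterStream extends Readable {
  constructor(options) {
    super(options);
    this.current = 1;
  }

  _read() {
    if (this.current > 5) {
      this.push(null); // signal that no more data is coming
    } else {
      this.push(String(this.current++));
    }
  }
}

new CounterStream().pipe(process.stdout); // prints 12345
```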
The following are some significant events related to writable streams (a short sketch of listening for them follows the list):

- error – emitted to signal that an error occurred while writing or piping.
- pipe – emitted by the writable stream when a readable stream is piped into it.
- unpipe – emitted when you call unpipe() on the readable stream to stop it from piping into the destination stream.
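Here is the sketch, reusing the file-copy example from earlier (calling unpipe() right away is only to demonstrate the event; a real program would unpipe based on some condition):

```js
const fs = require('fs');

const readableStream = fs.createReadStream('file1.txt');
const writableStream = fs.createWriteStream('file2.txt');

writableStream.on('error', (err) => console.error('Write error:', err));
writableStream.on('pipe', (src) => console.log('A readable stream was piped in'));
writableStream.on('unpipe', (src) => console.log('The readable stream was unpiped'));

readableStream.pipe(writableStream);   // triggers 'pipe'
readableStream.unpipe(writableStream); // triggers 'unpipe'
```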
Conclusion
This article focused on the fundamentals of streams. Streams, pipelines, and chaining are among the most essential and powerful features of Node.js. Streams can help you write clean and efficient I/O code.
Additionally, BOB, a Node.js strategic initiative, is worth looking at: it aims to improve Node.js’s streaming data interfaces, both internally within the Node.js core and, ideally, as future public APIs.