Introduction
Node streams are an efficient way to channel and process input and output data in a Node.js application. Using streams, developers can improve the performance, scalability, and maintainability of Node.js applications that work with large amounts of data.
This article covers the types of streams in Node.js, with a practical example of each, and explores the piping and chaining of Node streams.
Streams are abstract interfaces for working with data that can be read or written sequentially. In Node.js, streams are a fundamental concept used to handle data flow between input and output sources.
Streams are an important concept in Node.js because they allow for the efficient handling of large amounts of data. Instead of loading all the data into memory at once, streams process data in chunks as it becomes available. Data can be streamed from a source (like a file or a network socket) to a destination (like a response object or another file) in real-time, without buffering the whole data into memory at once.
For instance, streams can read from or write to a variety of data sources and sinks, including files, network sockets, and stdin/stdout.
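To make this concrete, here is a minimal sketch of serving a file over HTTP in both styles; the file name 'bigfile.txt' and port 8000 are placeholder assumptions:
const http = require('http');
const fs = require('fs');

http.createServer((req, res) => {
  // Buffered approach (commented out): the whole file is read into
  // memory before a single byte is sent to the client.
  // fs.readFile('bigfile.txt', (err, data) => {
  //   if (err) { res.statusCode = 500; return res.end(); }
  //   res.end(data);
  // });

  // Streamed approach: chunks are sent as they are read from disk,
  // so memory usage stays roughly constant regardless of file size.
  fs.createReadStream('bigfile.txt')
    .on('error', () => { res.statusCode = 500; res.end(); })
    .pipe(res);
}).listen(8000);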
There are four types of streams in Node.js, each serving a specific purpose: Readable streams, Writable streams, Duplex streams, and Transform streams.
Readable Streams
Readable streams are used to read data from a source, such as a file or a network socket. They emit a 'data' event whenever new data is available and an 'end' event when the stream has ended. Examples of readable streams in Node.js include fs.createReadStream() for reading files and http.IncomingMessage for reading HTTP requests.
Let us understand the readable Node.js stream with an example.
const fs = require('fs');
// Create a readable stream from a file
const readStream = fs.createReadStream('example.txt', { encoding: 'utf8' });
// Handle 'data' events emitted by the stream
readStream.on('data', (chunk) => {
  console.log(`Received ${chunk.length} bytes of data.`);
});
// Handle the 'end' event emitted by the stream
readStream.on('end', () => {
  console.log('End of file reached.');
});
// Handle errors emitted by the stream
readStream.on('error', (err) => {
  console.error(`Error: ${err}`);
});
Sample output, assuming example.txt contains the text 'My name is john doe':
Received 19 bytes of data.
End of file reached.
In this example, we use the fs module to create a readable stream from a file named ‘example.txt’. We set the encoding option to ‘utf8’ to read the file as a string.
We then handle the ‘data’ event emitted by the stream, which is triggered every time a chunk of data is read from the file. In this case, we simply log the number of bytes received.
We also handle the ‘end’ event emitted by the stream, which is triggered when the end of the file is reached. Finally, we log any errors emitted by the stream to the console.
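Likewise, the http.IncomingMessage mentioned above is a readable stream, so an HTTP request body can be consumed with the same 'data' and 'end' events; a minimal sketch (port 3000 is a placeholder assumption):
const http = require('http');

http.createServer((req, res) => {
  let body = '';
  // req is an http.IncomingMessage, a readable stream
  req.on('data', (chunk) => {
    body += chunk;
  });
  req.on('end', () => {
    console.log(`Received body: ${body}`);
    res.end('ok');
  });
}).listen(3000);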
Writable Streams
Writable streams are used to write data to a destination, such as a file or a network socket. They have a write() method to write data and an end() method to signal the end of the stream. Examples of writable streams in Node.js include fs.createWriteStream() for writing files and http.ServerResponse for writing HTTP responses.
Example of Node.js Writable Stream:
const fs = require('fs');
// Create a writable stream
const writeStream = fs.createWriteStream('output.txt');
// Write data to the file
writeStream.write('Hello from write stream');
// End the writable stream
writeStream.end();
// Handle stream events
writeStream.on('finish', () => {
  console.log('Write Stream Finished!');
});
writeStream.on('error', (error) => {
  console.error(`Write Stream error: ${error}`);
});
In this example, we use the fs module to create a writable stream to a file named 'output.txt'.
We then write data to the stream using the write() method and signal that we are finished using the end() method.
We also handle the ‘finish’ event emitted by the stream, triggered when all data has been written to the file. Finally, we log any errors emitted by the stream to the console.
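The same write()/end() pattern applies to other writable streams as well. As a minimal sketch, here is the http.ServerResponse mentioned above used as a writable stream (port 3000 is a placeholder assumption):
const http = require('http');

http.createServer((req, res) => {
  // res is an http.ServerResponse, which is a writable stream
  res.write('Hello from write stream');
  res.end();
}).listen(3000);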
Duplex Streams
Duplex streams are bidirectional, meaning they can both read and write data. They can be used for tasks such as proxying data from one network socket to another. Duplex streams inherit from both Readable and Writable streams, so they have all the methods of both.
Duplex stream example:
const { Duplex } = require('stream');
const myDuplex = new Duplex({
  write(chunk, encoding, callback) {
    console.log(chunk.toString());
    callback();
  },
  read(size) {
    if (this.currentCharCode > 90) {
      this.push(null);
      return;
    }
    this.push(String.fromCharCode(this.currentCharCode++));
  }
});
myDuplex.currentCharCode = 65;
process.stdin.pipe(myDuplex).pipe(process.stdout);
In this example, we create a new Duplex stream using the Duplex class from the stream module. The write method is called whenever data is written to the stream, and it simply logs the chunk of data to the console. The read method is called whenever the stream is read from; here it pushes the uppercase letters A through Z (character codes 65 to 90) and then pushes null to signal the end of the stream.
We then pipe the standard input stream (process.stdin) to our Duplex stream, and then pipe the Duplex stream to the standard output stream (process.stdout). This allows us to type input into the console, which gets written to the Duplex stream, and then the output from the Duplex stream gets written to the console.
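Because TCP sockets are themselves Duplex streams, the socket-proxying use case mentioned above can be sketched as a minimal echo server that pipes each socket's readable side back into its own writable side (port 8124 is a placeholder assumption):
const net = require('net');

// Each connection is a Duplex stream: piping it into itself
// echoes whatever the client sends back to the client.
const server = net.createServer((socket) => {
  socket.pipe(socket);
});
server.listen(8124);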
Transform Streams
Transform streams are a type of duplex stream that can modify data as it passes through them. They can be used for tasks such as compression, encryption, or data validation. Transform streams inherit from Duplex, so they have both read() and write() methods. When you write data to a transform stream, it is transformed by the transform function before being emitted as output.
Let us see an example of a transform Node.js stream.
const fs = require('fs');
// Importing stream APIs
const { Transform, pipeline } = require('stream');
// Create a readable stream
const readableStream = fs.createReadStream('input.txt');
// Create a writable stream
const writableStream = fs.createWriteStream('output.txt');
// Set the encoding to be utf8.
readableStream.setEncoding('utf8');
// Transform each chunk into uppercase
const uppercaseWordProcessing = new Transform({
  transform(chunk, encoding, callback) {
    console.log(`Data to be transformed: ${chunk}`);
    callback(null, chunk.toString().toUpperCase());
  }
});
// We could connect the streams manually with pipe():
// readableStream
//   .pipe(uppercaseWordProcessing)
//   .pipe(writableStream);
// Here we instead use the pipeline API, which pipes a series of streams
// together and notifies us when the pipeline is fully completed.
pipeline(readableStream, uppercaseWordProcessing, writableStream, (error) => {
  if (error) {
    console.error(`Error occurred while transforming stream: ${error}`);
  } else {
    console.log('Pipeline succeeded!');
  }
});
// Handle stream events
readableStream.on('end', () => {
  console.log('Read Stream Ended!');
});
readableStream.on('error', (error) => {
  console.error(`Read Stream Ended with an error: ${error}`);
});
writableStream.on('finish', () => {
  console.log('Write Stream Finished!');
});
writableStream.on('error', (error) => {
  console.error(`Write Stream error: ${error}`);
});
Sample output, assuming input.txt contains 'My name is john doe':
Data to be transformed: My name is john doe
Read Stream Ended!
Write Stream Finished!
Pipeline succeeded!
In this example, we create a Transform stream using the Transform class from the stream module, passing a transform function that converts each chunk of incoming data to uppercase using the string's toUpperCase() method. The function passes the transformed chunk along by invoking the callback, indicating that we're done processing the chunk.
Finally, we use the pipeline API to pipe the readable file stream through our transform stream and into the writable file stream, so that all data read from input.txt is converted to uppercase and written to output.txt.
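The same transform can also be written as a class that extends the built-in Transform class and overrides the _transform method; a minimal sketch that pipes stdin through it to stdout, so everything typed into the console is printed back in uppercase:
const { Transform } = require('stream');

class UpperCaseTransform extends Transform {
  _transform(chunk, encoding, callback) {
    // Push the uppercased chunk to the readable side of the stream
    this.push(chunk.toString().toUpperCase());
    callback();
  }
}

process.stdin.pipe(new UpperCaseTransform()).pipe(process.stdout);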
Now that we know about the types of Node.js streams, let us get to the business benefits of using them.
Popular companies that run Node.js against enormous amounts of data, such as Netflix, NASA, Uber, and Walmart, leverage Node.js streams to make their applications more manageable, sustainable, and performant. The main advantages of using Node streams in your Node.js application are memory efficiency, since data is processed in chunks instead of being loaded into memory all at once; time efficiency, since processing can begin as soon as the first chunk arrives; and composability, since streams can be piped and chained together into data-processing pipelines.
Piping in Node Streams
In Node.js streaming, piping is a way to connect a readable stream to a writable one using the pipe() method. The pipe() method takes a writable stream as an argument and connects it to a readable stream.
When pipe() is called, it sets up listeners on the readable stream’s ‘data’ and ‘end’ events, and automatically writes data from the readable stream to the writable stream until the end of the readable stream is reached. This makes it easy to chain together multiple streams and create a pipeline for processing data.
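Conceptually, pipe() automates something like the following sketch (simplified: the real pipe() also handles backpressure and other details):
const fs = require('fs');
const readStream = fs.createReadStream('input.txt');
const writeStream = fs.createWriteStream('output.txt');

// Forward each chunk from the readable to the writable ...
readStream.on('data', (chunk) => {
  writeStream.write(chunk);
});
// ... and end the writable when the readable is done.
readStream.on('end', () => {
  writeStream.end();
});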
Here's an example of using the pipe() method:
const fs = require('fs');
// Create a readable stream from a file
const readStream = fs.createReadStream('input.txt');
// Create a writable stream to a file
const writeStream = fs.createWriteStream('output.txt');
// Pipe the readable stream to the writable stream
readStream.pipe(writeStream);
// Handle errors emitted by either stream
readStream.on('error', (err) => {
  console.error(`Error reading file: ${err}`);
});
writeStream.on('error', (err) => {
  console.error(`Error writing file: ${err}`);
});
In this example, we first create a readable stream and a writable stream using the fs module. We then use the pipe() method to connect the readable stream to the writable stream.
We also handle any errors emitted by either stream by listening for their 'error' events.
Note that pipe() is a convenient way to handle stream data flow in Node.js, but it may not always be suitable for complex stream processing scenarios.
Chaining in Node Streams
Because pipe() returns the destination stream, calls to it can be chained, connecting several stream operations so that the output of one becomes the input of the next:
const fs = require('fs');
const { Transform } = require('stream');
// Create a readable stream from a file
const readStream = fs.createReadStream('input.txt');
// Create a writable stream to a file
const writeStream = fs.createWriteStream('output.txt');
// Define transform stream operations
const transformStream1 = new Transform({ /* … */ });
const transformStream2 = new Transform({ /* … */ });
const transformStream3 = new Transform({ /* … */ });
// Chain stream operations to transform the data
readStream
  .pipe(transformStream1)
  .pipe(transformStream2)
  .pipe(transformStream3)
  .pipe(writeStream);
In this example of Node streams, we create a readable stream and a writable stream using the fs module, and define the individual transform stream operations separately. These operations can be any stream type, including Transform, Duplex, or even other Readable streams.
We then chain the operations together, using the pipe() method to connect each one to the next and, finally, to the writable stream at the end of the pipeline.
Note that chaining is a powerful way to process stream data in Node.js, but it may not always be the most efficient or flexible approach.
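One reason is that errors are not forwarded automatically along a pipe() chain, so each stream needs its own 'error' handler. For longer pipelines, the pipeline API shown earlier (also available in promise form from stream/promises since Node.js 15) handles errors and cleanup in one place. A minimal sketch using the built-in zlib.createGzip() transform to compress a file:
const fs = require('fs');
const zlib = require('zlib');
const { pipeline } = require('stream/promises');

async function compressFile() {
  // pipeline() wires the streams together, forwards errors,
  // and destroys every stream in the chain if any one fails.
  await pipeline(
    fs.createReadStream('input.txt'),
    zlib.createGzip(),
    fs.createWriteStream('input.txt.gz')
  );
  console.log('Pipeline succeeded!');
}

compressFile().catch((error) => {
  console.error(`Error occurred while compressing: ${error}`);
});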