
Node Streams Tutorial With Example


Node Streams Tutorial With Example is today’s leading topic. Streams are collections of data, just like arrays or strings. The difference is that a stream might not be available all at once, and it does not have to fit in memory. That makes streams powerful when working with massive amounts of data, or with data that arrives from an external source one chunk at a time. Many of Node’s built-in modules implement the streaming interface. Because stream data arrives over time in chunks rather than all at once, streams are a natural fit for building streaming web applications like Netflix or YouTube.

If you want to learn more about Node.js then check out this course NodeJS – The Complete Guide (incl. MVC, REST APIs, GraphQL)

Node Streams Tutorial With Example

I am using Node.js version v11.3.0; your version might be v10, but it does not matter here. A stream is an abstract interface for working with streaming data in Node.js. The stream module provides the base API that makes it easy to build objects that implement the stream interface.

Streams can be readable, writable, or both.  All streams are instances of EventEmitter.

The stream module can be accessed with the following statement.

const stream = require('stream');

Why Streams

Streams primarily provide the following advantages.

  • Memory efficiency: You don’t need to carry the massive amounts of data in memory before you can process it.
  • Time efficiency: It takes way less time to start processing the data as soon as you have it, rather than waiting till the whole data payload is available to start the process.

Types of Streams

There are four fundamental stream types in Node.js:

  • Readable − a stream from which data can be read, for example, fs.createReadStream().
  • Writable − a stream to which data can be written, for example, fs.createWriteStream().
  • Duplex − a stream that is both readable and writable, for example, a TCP socket.
  • Transform − a duplex stream that can modify the data as it is written and read, for example, zlib.createGzip().

Let us take a basic simple example of filesystem module streams.

Create a project folder using the following command.

mkdir streams

Go inside that folder, create a file called server.js, and add the following code.

// server.js

const http = require('http')
const fs = require('fs')

const server = http.createServer(function (req, res) {
  fs.readFile('data.txt', (err, data) => {
    if (err) throw err
    res.end(data)
  })
})

server.listen(3000, () => {
  console.log('server is running')
})

Here, we have used the http module to create a web server and imported the fs module, which is the filesystem module for Node.js applications.

When a request comes in, we read the content of the file data.txt and send that data as the response to the client.

So, we can see the output inside the browser.

Save that file and also create one more file inside the root called data.txt and add the following content to it.

// data.txt

node streams
file streams

Go to the terminal and start the node server using the following command.

node server

Switch to the browser and go to this URL: http://localhost:3000/

We can see the content of the data.txt file.

Here, one thing to note is that the response is sent only after the reading of the file has completed. So, if the file is very big, it takes some time to read the whole file before anything is sent back to the client.

We can overcome this problem by using a Stream. As we discussed earlier, streams can emit chunks of data to the client over time. So, as soon as a chunk of data is read, it is emitted to the client.

We can use the Streams in the above example like this.

// server.js

const http = require('http')
const fs = require('fs')

const server = http.createServer((req, res) => {
  const stream = fs.createReadStream('data.txt')
  stream.pipe(res)
})

server.listen(3000, () => {
  console.log('Node.js stream on port 3000')
})

Instead of waiting until the file is totally read, we can start streaming it to the HTTP client as soon as we have the chunk of data ready to be sent.

Example 2

Let us take a second example of Node.js Stream. Write the following code inside a server.js file.

// server.js

const fs = require('fs')

let data = ''

const readerStream = fs.createReadStream('data.txt')

readerStream.on('data', (chunk) => {
  data += chunk
})

readerStream.on('end', () => {
  console.log(data)
})

readerStream.on('error', (err) => {
  console.log(err.stack)
})

console.log('Node readerStream')

Here, we get the output inside the terminal because we have not used any web server. This example shows a stream acting as a Node EventEmitter, and the following are its most commonly used events.

  • data − This event is fired when there is data available to read.
  • end − This event is fired when there is no more data to read.
  • error − This event is fired when there is an error receiving or writing data.
  • finish − This event is fired when all the data has been flushed to the underlying system.

Writable Streams

Write the following code inside a server.js file.

// server.js

const fs = require('fs')

const data = 'Writable Stream Example'

const writerStream = fs.createWriteStream('write.txt')

writerStream.write(data, 'UTF8')
writerStream.end()

writerStream.on('finish', function () {
  console.log('Writing completed')
})

writerStream.on('error', function (err) {
  console.log(err.stack)
})

console.log('Streaming Ended')

When the script runs, if the file is not there, Node will create it and write the data inside it. When the writing is over, we can see the output inside the terminal as well as inside the newly created file.

Piping the Streams

Piping is a mechanism where we provide the output of one stream as the input to another stream, so data read from the source stream flows directly into the destination stream.

// server.js

const fs = require('fs')

const readerStream = fs.createReadStream('data.txt')

const writerStream = fs.createWriteStream('data2.txt')

readerStream.pipe(writerStream)

console.log('Piping ended')

So, here, we are reading the data from the data.txt file and writing the data to the other file called data2.txt.

We have piped the readerStream to writerStream.

Difference between fs.readFile() and fs.createReadStream()

The fs.readFile() method loads the entire file into memory and only then writes it to the response, whereas fs.createReadStream() sends the file in small chunks. Dividing the process into chunks means the whole file never has to sit in memory at once, which reduces memory load and memory wastage.

Finally, Node Streams Tutorial With Example is over.
