joseverissimo
Published 6 Jan, 2019

Dealing with back-pressure when streaming data

Although Node.js is fairly fast, when it comes to serving large amounts of data it can suffer from bottlenecks.

If you are following this tutorial, make sure to install the lorem-ipsum dependency using the following command: npm i -S lorem-ipsum. We are using lorem-ipsum to generate random paragraphs that we can then send to the client.
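For a quick feel for the library, here is a minimal sketch (assuming v1.x of lorem-ipsum, where the module exports a single function):

JavaScript
"use strict";

// lorem-ipsum v1.x exports a function that takes an options object.
var loremIpsum = require('lorem-ipsum');

// Generate two sentences of placeholder text and print them.
console.log(loremIpsum({ count: 2, units: 'sentences' }));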

Consider the code below:

simpleServer.js
"use strict";

// Require needed dependencies.
var loremIpsum = require('lorem-ipsum');
var http = require('http');

// Generate random paragraphs.
function generateLorem() {
    return loremIpsum({
        count: 500,
        units: 'paragraphs',
        sentenceLowerBound: 30,
        sentenceUpperBound: 100,
    });
}

// Create a server that we can connect to via curl.
http.createServer((req, res) => {
    res.writeHead(200, { 'Content-Type': 'text/plain' });

    var c = 0; // Counter variable.
    
    while (c < 100) {
        // Generate random paragraphs and send them to the client.
        res.write(c + ': ' + generateLorem() + '\n');
        c++;
    }

    // Response will end when the loop has finished.
    res.end('\n Request ended. \n');

    // Let us know when the server has finished the request.
    res.on('finish', () => { 
        console.log('Request fulfilled');

        // Find out how much memory was used to process the request.
        const heapUsed = process.memoryUsage().heapUsed / 1024 / 1024;
        console.log(`This process used ${Math.round(heapUsed * 100) / 100} MB to execute`);
    });

}).listen(8080, () => console.log('Listening on http://localhost:8080'));

In this snippet, we are creating a simple server that, when called, returns 50,000 paragraphs of Lorem Ipsum text (100 writes of 500 paragraphs each).

Run node simpleServer in your terminal to start the server, and then in another terminal window run curl http://localhost:8080. This will request the Lorem Ipsum data from the server you have just started.

The request command (cURL) will take around a minute to complete. This is to be expected, as we are returning a large amount of data. The interesting part is that if you return to the window where we started the server, you will see output similar to this:

Bash
$ node simpleServer
Listening on http://localhost:8080
Request fulfilled
This process used 107.67 MB to execute

This is interesting: why would a process that is simply sending data use over 100 MB of memory when we are not doing much processing to begin with?

The script is suffering from back-pressure: the server is producing data faster than the client can receive it, creating a bottleneck. The client, in this case, is the cURL command we used.

Because the client cannot keep up, Node.js queues the unsent data (part of it, or all of it) in user memory and waits for the client to be ready again before sending more. This explains why so much memory was used when running the last script.
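If you want to see this queueing for yourself, newer Node.js versions expose how many bytes are sitting in the response's internal buffer. A minimal sketch (writableLength and writableHighWaterMark are standard writable-stream properties; availability on the response object depends on your Node.js version):

JavaScript
// Inside the request handler, after the while loop:
// writableLength is the number of bytes queued in memory waiting to be
// flushed, and writableHighWaterMark is the threshold (typically 16 KB)
// at which write() starts returning false.
console.log(`Buffered: ${res.writableLength} bytes, ` +
    `high-water mark: ${res.writableHighWaterMark} bytes`);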

While this may not seem like that big of a deal, imagine a server with a limited amount of memory allocated to it, or several thousand requests being made at once: the process could run out of memory and crash.

Not only that: if the process does crash mid-response, any data still queued in memory is lost before it ever reaches the client.

Overcoming back-pressure

The easiest solution would be to allocate more memory to your process, but that may not always be possible, so here is what you should do instead.

First of all, according to the Node.js documentation, the write method on the response object returns true if all of the data was flushed successfully to the kernel buffer.

It returns false if all or part of the data was queued in user memory instead. When this is the case, we know that our process is suffering from back-pressure, so we should stop sending more data to the client.

The documentation also says that a 'drain' event is emitted once the buffer is free and the client is ready to receive data again.

With these two pieces of information, we know that the server needs to stop sending data as soon as write returns false, and then wait for the 'drain' event before resuming.
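In isolation, the pattern looks something like this (a minimal sketch; haveMoreData and nextChunk are hypothetical placeholders for whatever produces your data):

JavaScript
function writeChunks(res) {
    // Keep writing while there is data and the buffer has room.
    while (haveMoreData()) {
        const ok = res.write(nextChunk());
        if (!ok) {
            // write() returned false: the buffer is full, so stop and
            // resume once it has been flushed to the client.
            return res.once('drain', () => writeChunks(res));
        }
    }
    res.end();
}

Applying this pattern to our Lorem Ipsum server, the code now looks like the below.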

simpleServer.js
"use strict";

// Require needed dependencies.
var loremIpsum = require('lorem-ipsum');
var http = require('http');

// Generate random paragraphs.
function generateLorem() {
    return loremIpsum({
        count: 500,
        units: 'paragraphs',
        sentenceLowerBound: 30,
        sentenceUpperBound: 100,
    });
}


// Create a server that we can connect to via curl.
http.createServer((req, res) => {
    res.writeHead(200, { 'Content-Type': 'text/plain' });

    var c = 0; // Counter variable.

    function generateAndSendData() {
        while (c < 100) {
            // Generate random paragraphs and send them to the client.
            const result = res.write(c + ': ' + generateLorem() + '\n');
            c++;

            // Are we ok to send more data or have we hit a bottleneck?
            if (!result) {
                // We've hit a bottleneck: return so the while loop stops,
                // and listen for the 'drain' event before resuming.
                return res.once('drain', generateAndSendData);
            }
        }

        res.end('\nThe end...\n', () => console.log('All data was sent'));
    }

    // Kick off the first round of sending data.
    generateAndSendData();

    // Let us know when the server has finished the request.
    res.on('finish', () => {
        console.log('Request fulfilled');

        // Find out how much memory was used to process the request.
        const heapUsed = process.memoryUsage().heapUsed / 1024 / 1024;
        console.log(`This process used ${Math.round(heapUsed * 100) / 100} MB to execute`);
    });

}).listen(8080, () => console.log('Listening on http://localhost:8080'));

Start the server again by running node simpleServer in your terminal, and then in another terminal window run curl http://localhost:8080.

You will notice that it took around the same time to send the data, but if you look at the terminal window where the server is running, you will see that the memory used was significantly less than before.

Bash
$ node simpleServer
Listening on http://localhost:8080
Request fulfilled
This process used 13.82 MB to execute
All data was sent

And why is that? If you look at the revisited code, you can see that as soon as the process hits a bottleneck while writing data to the client, we stop the while loop by returning from the function.

Before returning, we register a one-time listener for the 'drain' event, which calls the same generateAndSendData function again.

This is because, once the 'drain' event fires, we know the client is ready to receive more data. This prevents large amounts of data from queuing up in memory, as we now only generate and send data when the client is free to receive it.
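As a side note, this write/'drain' dance is exactly what Node's built-in stream piping does for you. As an alternative sketch (assuming the same lorem-ipsum setup as above), you could wrap the generator in a Readable stream and let pipe handle the back-pressure bookkeeping:

JavaScript
"use strict";

var loremIpsum = require('lorem-ipsum');
var http = require('http');
var { Readable } = require('stream');

http.createServer((req, res) => {
    res.writeHead(200, { 'Content-Type': 'text/plain' });

    var c = 0;

    // A pull-based stream: read() is only called when the consumer
    // (the response) is ready for more data, so back-pressure is
    // handled for us.
    const lorem = new Readable({
        read() {
            if (c < 100) {
                this.push(c + ': ' + loremIpsum({ count: 500, units: 'paragraphs' }) + '\n');
                c++;
            } else {
                this.push(null); // Signal the end of the stream.
            }
        }
    });

    lorem.pipe(res);
}).listen(8080, () => console.log('Listening on http://localhost:8080'));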

I hope this post has helped you understand back-pressure and bottlenecks when streaming data. If you have any questions, please don't hesitate to contact me via email 👍