How worker threads boosted my Node application

Tugay İlik · Published in Level Up Coding · 3 min read · Feb 12, 2020

The worker_threads module was introduced in Node.js v10.5.0 as an experimental feature and became stable with v12. It enables threads that execute JavaScript in parallel (be aware that parallelism and concurrency are close but different concepts). It is very useful when operations are CPU-intensive.

In my application, I generate dynamic scripts per request with dynamic parameters. To generate a script, I run an npm command via execSync. I couldn't use the async exec, because the steps of creating a dynamic script have to run in sequence.

Let me show what was happening before the change:

[Diagram (draw.io): the generation flow before worker threads]

The process above happened each time a request reached our service from other microservices (AWS Lambda, Node.js). The whole process was synchronous until the 3rd step; only the upload was async, so the last step did not block the thread.

The generator service receives around 30 requests per minute. That means each generation process should finish in 2 seconds at most (60 seconds / 30 requests), but mine took around 5–6 seconds. The application was locking the main thread until the active operation ended, which sometimes delayed incoming generation processes by up to 1 minute.

So, what changed after the worker threads implementation?

[Diagram (draw.io): the generation flow after worker threads]

In code, an example of worker threads looks like the following.

In the index.js file:

// index.js
const { Worker } = require('worker_threads');

const workerData = {
  write: { fileName: 'write.txt', filePath: './' },
  read: { fileName: 'read.json', filePath: './' },
  hash: Math.random(),
};

new Worker('./worker.js', { workerData });

In our worker.js file:

// worker.js
const { readFileSync, appendFileSync, unlinkSync, existsSync } = require('fs');
const { resolve } = require('path');
const { workerData: { read, write, hash } } = require('worker_threads');

console.log('Process triggered.');
console.time('Execution time');

/**
 * Simulates a long-running operation.
 * @return {Promise}
 */
const sleep = function () {
  // "done" instead of "resolve" to avoid shadowing path's resolve above
  return new Promise(done => {
    setTimeout(done, Math.floor(Math.random() * 5000) + 999);
  });
};

(async () => {
  // Act like the process takes time
  await sleep();
  const writePath = resolve(write.filePath, write.fileName + '-' + hash);
  const readPath = resolve(read.filePath, read.fileName);
  const json = JSON.parse(readFileSync(readPath).toString('utf-8'));
  // Remove the file before writing
  if (existsSync(writePath)) {
    unlinkSync(writePath);
  }
  json.forEach(item => {
    appendFileSync(writePath, JSON.stringify(item) + '\n\n\n');
  });
  console.log('Process end.');
  console.timeEnd('Execution time');
})();

With the implementation above, the application's main thread is no longer busy with the generation process. Each process is isolated and does not block incoming ones. That dramatically decreased the delay in the generation process, though CPU usage stayed the same because the workers consume a lot of CPU too.

Besides making generation faster, it brought other benefits too:

  • The read and write file processes got a lot faster 🚀

Since the main thread was always busy, incoming read/write requests were delayed until they could find an empty slot in the execution queue. Now they are handled immediately.

  • Response time decreased dramatically 📉

The application's response time for static files served via express was around 3–5 seconds, sometimes 10, until the file had been cached. It took me a long time to realize what was going on there. Serving static files with express is really fast unless something is consuming CPU and keeping the main thread busy; static files are ultimately served by Node's file-reading functionality.

[Chart: response time metrics from Dynatrace]

The metrics show that the response time decreased after 16:00 (after deploying the new version). Now it is around 100–200 milliseconds, which is perfect.

I have enjoyed implementing worker_threads and seeing the perfect results 🕺.

I've created an example application that mocks how it looks in code.

Hope you enjoyed reading it! ✊

Feel free to contact me on LinkedIn.
