API with NestJS #34. Handling CPU-intensive tasks with queues

JavaScript NestJS TypeScript

This entry is part 34 of 121 in the API with NestJS series

Handling CPU-intensive operations with REST API can be tricky. If our endpoint takes too much time to respond, it might result in a timeout. In this article, we look into queues to help us resolve this issue.

A queue proves to be a very useful part of backend architecture. With it, we can implement asynchronous and distributed processing. A queue is a data structure modeled on a real-world queue. A publisher can post messages to the queue, and a consumer can pick a message up and process it. Once a consumer handles a message, no other consumer can process that message.

With NestJS, we have access to the @nestjs/bull package. It wraps the Bull library, which provides queue functionality based on Redis. Redis is a fast and reliable key-value store that keeps data in its memory. Even if we restart our Node.js application, we don’t lose the data saved in Redis.

Setting up Bull and Redis

Since Bull uses Redis to manage queues, we need to set it up. So far, within this series, we’ve used Docker Compose to help us with our architecture. Thankfully, it is straightforward to set up Redis with Docker.

docker-compose.yml
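The original service definition isn’t reproduced here; below is a minimal sketch that adds a Redis container next to the rest of the stack used in this series (the other services, such as PostgreSQL, are assumed to be defined already).

```yaml
version: '3'
services:
  redis:
    image: redis:alpine
    ports:
      - '6379:6379'
  # ...the other services used in this series, such as PostgreSQL
```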

By default, Redis works on port 6379.

Connecting to Redis requires us to define two additional environment variables: the port and the host.

app.module.ts
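A sketch of the relevant part of the root module, assuming the Joi-based validation of environment variables used in earlier parts of this series; the rest of the module is omitted.

```typescript
import { Module } from '@nestjs/common';
import { ConfigModule } from '@nestjs/config';
import * as Joi from '@hapi/joi';

@Module({
  imports: [
    ConfigModule.forRoot({
      validationSchema: Joi.object({
        REDIS_HOST: Joi.string().required(),
        REDIS_PORT: Joi.number().required(),
        // ...the other variables validated in this series
      }),
    }),
    // ...the other modules
  ],
})
export class AppModule {}
```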

.env
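The values below are only an example for local development; adjust them to your setup.

```
REDIS_HOST=localhost
REDIS_PORT=6379
```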

We also need to install the necessary dependencies.
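Assuming the official Bull integration, that means the @nestjs/bull wrapper, the bull library itself, and its type definitions. The image-related packages used later in this article, such as imagemin and adm-zip, can be added the same way.

```bash
npm install @nestjs/bull bull
npm install --save-dev @types/bull
```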

Once we’ve got all of the above configured, we can establish a connection with Redis.

app.module.ts
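A sketch of the Bull setup in the root module, assuming the asynchronous configuration pattern with ConfigService used throughout this series.

```typescript
import { Module } from '@nestjs/common';
import { BullModule } from '@nestjs/bull';
import { ConfigModule, ConfigService } from '@nestjs/config';

@Module({
  imports: [
    BullModule.forRootAsync({
      imports: [ConfigModule],
      inject: [ConfigService],
      useFactory: (configService: ConfigService) => ({
        redis: {
          host: configService.get('REDIS_HOST'),
          port: Number(configService.get('REDIS_PORT')),
        },
      }),
    }),
    // ...the other modules
  ],
})
export class AppModule {}
```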

Thanks to calling BullModule.forRootAsync in the AppModule, we can use Bull across all of our modules.

We can pass more options besides the redis object when configuring Bull. For a whole list, check out the documentation.

Managing queues with Bull

Let’s create a queue that can help us optimize multiple PNG images. We will start by defining a module.

optimize.module.ts
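A sketch of the module, assuming a queue named image; the controller and processor class names are illustrative.

```typescript
import { Module } from '@nestjs/common';
import { BullModule } from '@nestjs/bull';
import { OptimizeController } from './optimize.controller';
import { OptimizeProcessor } from './optimize.processor';

@Module({
  imports: [
    // Register a queue named 'image' that uses the globally configured Redis connection
    BullModule.registerQueue({
      name: 'image',
    }),
  ],
  controllers: [OptimizeController],
  providers: [OptimizeProcessor],
})
export class OptimizeModule {}
```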

Above, we register our queue using BullModule.registerQueue. Thanks to doing so, we can use it in our controller.

optimize.controller.ts
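A sketch of the upload endpoint, assuming the image queue registered above and a job named optimize; it also assumes @types/multer is installed so that the Express.Multer.File type is available.

```typescript
import {
  Controller,
  Post,
  UploadedFiles,
  UseInterceptors,
} from '@nestjs/common';
import { FilesInterceptor } from '@nestjs/platform-express';
import { InjectQueue } from '@nestjs/bull';
import { Queue } from 'bull';

@Controller('optimize')
export class OptimizeController {
  constructor(
    @InjectQueue('image') private readonly imageQueue: Queue,
  ) {}

  @Post('image')
  @UseInterceptors(FilesInterceptor('files'))
  async processImages(@UploadedFiles() files: Express.Multer.File[]) {
    // Add a job named 'optimize' to the queue, with the uploaded files as its data
    const job = await this.imageQueue.add('optimize', { files });
    return {
      jobId: job.id,
    };
  }
}
```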

Above, we follow the NestJS documentation on how to upload multiple files with Multer. To do that, we need the FilesInterceptor and the @UploadedFiles() decorator.

Once we have the files, we need to add a job to our queue using the add method. We pass two arguments to it: the name of the job that we later refer to, and the data it needs.

In the above endpoint, we respond with the id of the job. This will allow the user to ask for the return value of the job later.

 

Consuming the queue

Now we need to define a consumer. With it, we can process jobs added to the queue.

To optimize images, we use the imagemin library. Since we expect the user to upload multiple images, we compress the result into a zip archive using the adm-zip package.

optimize.processor.ts
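A sketch of the consumer, assuming the queue and job names from the controller above and PNG optimization with the imagemin-pngquant plugin; the class and method names are illustrative.

```typescript
import { Process, Processor } from '@nestjs/bull';
import { Job } from 'bull';
import imagemin from 'imagemin';
import imageminPngquant from 'imagemin-pngquant';
import AdmZip from 'adm-zip';

@Processor('image')
export class OptimizeProcessor {
  @Process('optimize')
  async handleOptimization(job: Job) {
    const files: Express.Multer.File[] = job.data.files;
    const zip = new AdmZip();

    for (const file of files) {
      // The buffer was serialized on its way to Redis, so we recreate a Buffer instance
      const optimizedBuffer = await imagemin.buffer(Buffer.from(file.buffer), {
        plugins: [imageminPngquant()],
      });
      zip.addFile(file.originalname, optimizedBuffer);
    }

    // Whatever we return here is stored in Redis as the job's return value
    return zip.toBuffer();
  }
}
```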

To make the process more transparent, we could update the progress of the job by calling the progress method when we finish optimizing some of the images.
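For example, assuming we switch the loop above to track the index of the current file, a call along these lines would report the percentage of files processed so far (the percentage scheme is just one possible approach):

```typescript
// e.g. inside: for (const [index, file] of files.entries()) { ... }
await job.progress(Math.round(((index + 1) / files.length) * 100));
```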

Above, we manipulate buffers. A Node.js buffer represents a fixed-length sequence of bytes. If you want to know more about buffers, check out Node.js TypeScript #3. Explaining the Buffer.

We call the Buffer.from function because our buffers stopped being instances of the Buffer class when they were serialized and put into the Redis store.

Returning the result of the job

The crucial part of our consumer is the fact that it returns a buffer. Thanks to that, Bull saves our optimized images to Redis, and we can refer to them later.

To do that, let’s create a new endpoint that takes the job’s id as a parameter.

optimize.controller.ts
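A sketch of the new handler, which would live in the same controller as the upload endpoint; the route, header values, and variable names are illustrative.

```typescript
import { Controller, Get, Param, Res } from '@nestjs/common';
import { InjectQueue } from '@nestjs/bull';
import { Queue } from 'bull';
import { Response } from 'express';
import { Readable } from 'stream';

@Controller('optimize')
export class OptimizeController {
  constructor(
    @InjectQueue('image') private readonly imageQueue: Queue,
  ) {}

  @Get('image/:id')
  async getOptimizedImages(@Param('id') id: string, @Res() response: Response) {
    const job = await this.imageQueue.getJob(id);
    if (!job) {
      return response.status(404).send();
    }

    const isCompleted = await job.isCompleted();
    if (!isCompleted) {
      // The job exists but hasn't finished yet; include its current progress
      return response.status(202).send({ progress: job.progress() });
    }

    // The return value was serialized in Redis, so we turn it back into a Buffer
    const result = Buffer.from(job.returnvalue);

    response.set({
      'Content-Type': 'application/zip',
      'Content-Disposition': 'attachment; filename="images.zip"',
    });

    // Create a readable stream from the buffer and send it to the user
    const stream = Readable.from(result);
    stream.pipe(response);
  }
}
```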

If we updated the progress of the job in the consumer, we could respond with it when the job is not yet complete.

Above, we use the getJob method to get the job with a given id. Since we’ve used the @Res() decorator, we put NestJS into the library-specific mode for this handler. Because of that, we are responsible for managing the response manually, for example, with the send method.

If the job with the specified id exists but hasn’t yet been completed, we respond with the 202 Accepted status. It indicates that we’ve accepted the request and are processing it, but we haven’t yet completed it.

If the job is completed, we create a readable stream from the buffer and send it to the user.

If you want to know more about streams, check out Node.js TypeScript #4. Paused and flowing modes of a readable stream

If you want to use Postman to download the result, use the “Send and download” button.

Running jobs in separate processes

Our job processors can run in separate processes for better performance.

optimize.module.ts
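A sketch of the adjusted module: instead of providing the processor class, we point Bull at a separate file that it forks. The path refers to the compiled JavaScript output, and the file name is assumed.

```typescript
import { Module } from '@nestjs/common';
import { BullModule } from '@nestjs/bull';
import { join } from 'path';
import { OptimizeController } from './optimize.controller';

@Module({
  imports: [
    BullModule.registerQueue({
      name: 'image',
      // Bull forks the process defined in this compiled file
      processors: [join(__dirname, 'image.processor.js')],
    }),
  ],
  controllers: [OptimizeController],
})
export class OptimizeModule {}
```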

Since we execute our image processor in a forked process, dependency injection isn’t available. If we needed any dependencies, we would have to initialize them ourselves.

image.processor.ts
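A sketch of a processor meant to run in a forked process: following Bull’s sandboxed-processor pattern, the file exports a single function instead of using the @Processor and @Process decorators. The function name is illustrative.

```typescript
import { Job } from 'bull';
import imagemin from 'imagemin';
import imageminPngquant from 'imagemin-pngquant';
import AdmZip from 'adm-zip';

// This function runs in a separate process, so Nest's dependency injection
// is not available here; any dependencies would have to be set up manually.
export default async function imageProcessor(job: Job) {
  const files: Express.Multer.File[] = job.data.files;
  const zip = new AdmZip();

  for (const file of files) {
    // Recreate the Buffer instance lost during serialization to Redis
    const optimizedBuffer = await imagemin.buffer(Buffer.from(file.buffer), {
      plugins: [imageminPngquant()],
    });
    zip.addFile(file.originalname, optimizedBuffer);
  }

  // The resolved value becomes the job's return value
  return zip.toBuffer();
}
```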

If you want to know more about child processes, read Node.js TypeScript #10. Is Node.js single-threaded? Creating child processes

Summary

In this article, we’ve learned the basics of managing queues with NestJS and Bull. To do that, we’ve implemented an example in which we optimize multiple images at once. Thanks to doing that through the queue, we can better manage our resources. We can also avoid timeouts on CPU-intensive tasks and run them in separate processes.

Comments
refi slak
3 years ago

that was very good to know. thanks <3

Nutan Shrivastava
3 years ago

I am trying to integrate worker_threads with NestJS. I am successful, but I am unsure how I can perform DB operations within the worker. Since the worker is not a class, it cannot be added to a module, and it gives me issues when I try to inject my data model into the worker to perform any CRUD ops. Any suggestions?