Unlock Peak Performance for Your NestJS Application: A Comprehensive Guide
[UPDATED: 09/02/2025]
Are you struggling with slow response times, high resource usage, or scalability issues in your NestJS application? What if you could fix 80% of your performance problems by optimising just 20% of your code?
Welcome to the ultimate guide to supercharging your NestJS app! Here, we'll dive into actionable strategies, from identifying bottlenecks using the Pareto Principle to leveraging cutting-edge techniques.
- Why This Matters
Performance optimisation isn't just about speed; it's about delivering a seamless user experience, reducing costs, and ensuring your application can scale effortlessly. By focusing on the most impactful areas, you can achieve dramatic improvements without wasting time on low-priority fixes.
Techniques
- Identify Bottlenecks using Pareto Principle (80/20 Rule)
- NodeJS version upgrade
- Enable HTTP/3 or HTTP/2
- NestJS logger performance
- Buffering logs improvement
- Request-Scoped injection - @Injectable({ scope: Scope.REQUEST })
- Use Fastify instead of ExpressJS
- Use NodeJS compression middleware
- Caching Strategies types
- In-Memory Caching = CacheModule.register({
- Redis-based Caching (Distributed) = CacheModule.register({ store: redisStore,
- Browser HTTP Caching with Headers - 'Cache-Control':
- Proxy Caching
- Db Query Caching
- Caching environment variables at runtime = ConfigModule()
- Caching storage - cache-manager
- Caching Server Side External API Calls to Reduce Latency
- Redis CacheInterceptor Global implementation
- HTTP caching with @CacheKey and @CacheTTL
- Cache busting
- DB schema synchronise
- Define in the Module class which repository to use
- Use Lazy-loading: route-based lazy loading + Dynamic module loading
- Use Compression
- Database Query Optimisation
- Minimise the Number of Queries
- Use ORM Frameworks with Streaming Capabilities
- Reduce the Amount of Data Fetched
- Leverage Database Indexing
- Avoid N+1 Query Problem
- Caching query results
- Implement WebWorkers - WW
- Implement WebSockets - WS
- Implement true Load balancing across multiple servers = NodeJS Clustering -> Cloud-based Load Balancer + Reverse Proxy
- NodeJS Clustering
- Cloud-based Load Balancer
- Reverse Proxy
- Choose the most appropriate Dependency Injection Library
- API Architecture performance:
- API Composition pattern (efficient Microservice Communication)
- API caching
- Metric for insights
- Documenting your API
- Adding Swagger annotations to DTOs and entities
- Recommendations
Explanation
- Pareto Principle (80/20)
The 80/20 rule is a powerful concept that can significantly improve efficiency and effectiveness in software development.
Identify bottlenecks using profiling tools (e.g., Jaeger, Zipkin, ClinicJS) and prioritize critical optimisations.
-- Where to look:
- Database Layer: Slow queries, connection pooling, caching.
- API Endpoints: High response times, unoptimized middleware, lack of caching.
- CPU-Intensive Operations: Heavy computation, unoptimized libraries, lack of offloading.
- Memory Usage: Memory leaks, large data structures.
- Third-Party Integrations: Slow API calls, unoptimized libraries, lack of caching.
- Event Loop Latency: Synchronous operations, unoptimized loops.
- Logging and Monitoring: Excessive logging, lack of monitoring.
- Deployment and Infrastructure: Resource limits, lack of scaling, unoptimised Docker images.
- NodeJS version upgrade
One of the easiest performance wins I achieved was simply upgrading the runtime version from 16 to 18, or even 20 where possible, while making little to no changes to the NestJS code.
- This quickly improves performance thanks to V8 engine updates and optimised garbage collection.
- TypeORM: Can indeed benefit from the performance improvements introduced in NodeJS 20+, especially when handling large datasets or complex queries:
- Native ECMAScript Modules (ESM) Support: This can lead to cleaner code, improved tree-shaking, and potentially better performance.
Performance improvements in the newer V8 engine: This can lead to faster execution of TypeORM's query-building logic and data manipulation operations.
Data Fetching Strategies:
When dealing with large datasets, consider optimising the data-fetching strategies with TypeORM (see the sketch below):
Pagination: Limit the number of records retrieved in a single query.
Selective Fetching: Retrieve only the necessary fields instead of entire entities.
Batching: When fetching related entities, use batching to reduce the number of queries executed.
To fully leverage these improvements, ensure the application is updated to use NodeJS 20+ and review the TypeORM configuration and database design. Regular profiling and performance testing will help identify bottlenecks and ensure your application scales efficiently with large datasets.
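As a minimal sketch of those strategies (entity and repository names are illustrative), pagination and selective fetching with TypeORM could look like this:
// Pagination: limit the number of records fetched per query
const page = 2;
const pageSize = 50;
const users = await this.userRepository.find({
  skip: (page - 1) * pageSize, // offset
  take: pageSize,              // limit
});
// Selective fetching: retrieve only the fields you actually need
const userEmails = await this.userRepository.find({
  select: ['id', 'email'],
});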
Enable HTTP/3 or HTTP/2
Multiplexing:
In HTTP/1.1, only one request can be processed at a time per connection (head-of-line blocking), meaning subsequent requests must wait until the first one is completed. With HTTP/2, multiple streams can be opened, allowing the server to send multiple responses at once without waiting for previous requests to finish.
Improved Resource Loading:
Simultaneous loading of resources improves the overall time it takes for a page to become interactive.
Fallback to HTTP/2: While HTTP/3 is preferable, if a client or network does not support HTTP/3, your server should fall back to HTTP/2
If your server and infrastructure (like Azure) support HTTP/3, it's recommended to utilize it, as it generally offers better performance and user experience than HTTP/1.1
Ensure that your Azure App Service is configured with a valid SSL certificate. This is necessary because HTTP/2 requires a secure connection (HTTPS)
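As a hedged sketch (assuming TLS is terminated by NodeJS itself rather than at a proxy, and using hypothetical certificate paths), HTTP/2 can be enabled through the Fastify adapter, which passes these options straight to Fastify:
import { readFileSync } from 'fs';
import { NestFactory } from '@nestjs/core';
import { FastifyAdapter, NestFastifyApplication } from '@nestjs/platform-fastify';
import { AppModule } from './app.module';

async function bootstrap() {
  const app = await NestFactory.create<NestFastifyApplication>(
    AppModule,
    new FastifyAdapter({
      http2: true,
      https: {
        allowHTTP1: true, // fall back to HTTP/1.1 for clients without HTTP/2 support
        key: readFileSync('./certs/server.key'),  // hypothetical certificate paths
        cert: readFileSync('./certs/server.crt'),
      },
    }),
  );
  await app.listen(3000);
}
bootstrap();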
- NestJS logger performance
NestJS comes with a default logger (the Logger class) which internally writes logs to process.stdout and process.stderr.
This can affect performance in NodeJS applications, including NestJS, particularly because most I/O operations are asynchronous, but process.stdout and process.stderr are synchronous under certain conditions, which can block the event loop and produce performance issues.
- In brief:
Most I/O operations in NodeJS are asynchronous, allowing the event loop to remain non-blocking.
However, process.stdout and process.stderr can behave synchronously under certain conditions, such as when their output is redirected (e.g., piped to a file), potentially blocking the event loop and impacting performance.
import { Logger } from '@nestjs/common';
const logger = new Logger('AppModule');
logger.log('This is a log message');
- Alternatives:
-- Winston logger:
Winston logger will write logs asynchronously, which significantly improves performance by not blocking the event loop during I/O operations.
npm install nest-winston winston
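A minimal sketch using the widely used nest-winston integration (assuming that is the Winston wrapper you install) to replace the default logger with a Winston-backed one:
import { NestFactory } from '@nestjs/core';
import { WinstonModule } from 'nest-winston';
import * as winston from 'winston';
import { AppModule } from './app.module';

async function bootstrap() {
  const app = await NestFactory.create(AppModule, {
    // Swap the default Logger for a Winston-backed LoggerService
    logger: WinstonModule.createLogger({
      transports: [
        new winston.transports.Console({
          format: winston.format.combine(winston.format.timestamp(), winston.format.json()),
        }),
      ],
    }),
  });
  await app.listen(3000);
}
bootstrap();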
- Buffering logs improvement
All logs will be buffered until the custom logger is attached
import { NestFactory } from '@nestjs/core';
import { AppModule } from './app.module';
import { Logger } from 'nestjs-pino';
async function bootstrap() {
const app = await NestFactory.create(AppModule, { bufferLogs: true });
app.useLogger(app.get(Logger));
await app.listen(3000);
}
bootstrap();
- Request-Scoped injection - @Injectable({ scope: Scope.REQUEST })
This is a design pattern used in Dependency Injection (DI) frameworks like NestJS, where a service is instantiated once per incoming request.
- BENEFICIAL Request-Scoped Injection
-- Isolation = safer
-- Concurrency Safety = no risk of shared mutable state
-- Lifecycle Management = simplified memory management
Although it all sounds quite intimidating, a properly designed application that
leverages request-scoped providers
should not slow down by more than ~5% latency-wise.
- NOT BENEFICIAL: Transactional Services, Logging, or Metrics Collection
Increased Resource Consumption: Each request creates a new instance of the service, which can lead to increased memory usage.
Performance Impact: The overhead of instantiating services for each request can impact performance.
While Nest tries to cache as much metadata as possible, it will still have to create an instance of your class on each request. This means a new instance of SomeBrokenLoggingService would be instantiated for each request, which would impact the application's performance.
While this is useful for isolation, it adds "significant overhead" since each request requires creating, managing, and destroying service instances.
In NestJS, you can define a service as request-scoped, meaning a new instance of the service is created for every request. This negatively impacts performance because it creates a new instance of the service for every incoming request.
import { Inject, Injectable, Scope } from '@nestjs/common';
import { REQUEST } from '@nestjs/core';
import { Request } from 'express';
@Injectable({ scope: Scope.REQUEST })
export class SomeBrokenLoggingService {
// A new instance will be created for every request
constructor(@Inject(REQUEST) private request: Request) {}
}
Alternative Approaches
Use Async Local Storage (ALS):
AsyncLocalStorage is a NodeJS API that allows you to store and retrieve context for the duration of a specific request, making it ideal for tracking request-specific data like correlation IDs without creating new service instances.
import { AsyncLocalStorage } from 'async_hooks';
const asyncLocalStorage = new AsyncLocalStorage();
app.use((req, res, next) => {
asyncLocalStorage.run(new Map(), () => {
// Set UserID for the request
asyncLocalStorage.getStore().set('UserId', generateUserId());
next();
});
});
// Retrieve the UserId within a service (generateUserId above is a hypothetical helper)
const UserId = asyncLocalStorage.getStore().get('UserId');
LINK
https://docs.nestjs.com/fundamentals/injection-scopes#performance
- Use Fastify instead of ExpressJS
NestJS comes with the Express platform by default, but it can easily be switched to Fastify; the documentation does a great job explaining this on the "Performance (Fastify)" page.
Fastify is a fast and low memory overhead web framework for NodeJS, built with a focus on high performance and low memory usage.
While ExpressJS remains the default option for most NestJS applications, Fastify offers superior performance, lower memory usage, and built-in features like JSON schema validation and logging.
- Why
Low-memory overhead:
Fastify processes requests faster than many other frameworks (like Express) due to its low-overhead design, and it is known for using less memory than Express, which is important in environments with constrained resources or applications that handle a high number of concurrent requests.
Built-in Logging
:
Fastify has built-in high-performance logging using the pino logger. Logging is asynchronous by default, reducing the performance hit caused by I/O operations, something ExpressJS does not handle as efficiently without additional configuration.
Async/Await First
:
Fastify is designed around async/await, making it more modern and easier to use in newer JavaScript/TypeScript projects, whereas Express was not designed with async/await in mind.
npm install @nestjs/platform-fastify fastify
import { NestFactory } from '@nestjs/core';
import { AppModule } from './app.module';
import {
FastifyAdapter,
NestFastifyApplication,
} from '@nestjs/platform-fastify';
async function bootstrap() {
const app = await NestFactory.create<NestFastifyApplication>(
AppModule,
new FastifyAdapter(),
);
await app.listen(3000);
}
bootstrap();
- LINK
NestJS Performance: Fastify
https://docs.nestjs.com/techniques/performance#performance-fastify
Avoiding NestJS performance bottlenecks
https://medium.com/@Fcmam5/avoiding-nestjs-performance-bottlenecks-78fa2bc66372
Optimize Your NestJS App Performance With These Practices
https://www.brilworks.com/blog/optimize-your-nest-js-app-performance-with-these-practices/
- Use NodeJS compression middleware
The compression package helps to compress HTTP responses, which can significantly improve performance, particularly for large files such as HTML, CSS, and JS
Compression helps keep data transfer minimal during client interactions by reducing the payload.
With compression enabled, responses from your NestJS App will be compressed before they are sent to the client, reducing the amount of data that needs to be transferred over the network and improving performance.
Now, by default, the compression() middleware will use the gzip compression algorithm. You can also pass in options to customize the behavior of the middleware, such as the threshold at which responses will be compressed, using the following syntax:
app.use(compression({
threshold: 512 // set the threshold to 512 bytes
}));
npm i --save compression
// This setup will compress responses sent from both your Angular and NestJS API
import { NestFactory } from '@nestjs/core';
import { AppModule } from './app.module';
import * as compression from 'compression';
async function bootstrap() {
const app = await NestFactory.create(AppModule);
// Enable compression middleware
app.use(compression());
await app.listen(3000);
}
bootstrap();
- Caching Strategies
NodeJS itself DOESN'T provide built-in caching mechanisms (though one can be built); fortunately, NestJS does provide the CacheModule.
1. In-Memory Caching = CacheModule.register({
Like Post-it notes on your computer screen, it's quick to write and read, but disappears when you shut down
Stores cache data in memory, local to the current instance of the application.
NestJS provides a built-in CacheModule, which can be configured for in-memory caching using libraries like node-cache or redis for distributed caching.
import { CacheModule, Module } from '@nestjs/common';
@Module({
imports: [
CacheModule.register({
ttl: 5, // seconds
max: 10, // maximum number of items in cache
}),
],
})
export class AppModule {}
- 2. Redis-based Caching (Distributed) =
CacheModule.register({ store: redisStore,
Provides distributed caching, ideal for scalable applications behind load balancers, to persist the cache across multiple instances or services.
import { CacheModule, Module } from '@nestjs/common';
import * as redisStore from 'cache-manager-redis-store';
@Module({
imports: [
CacheModule.register({
store: redisStore,
host: 'localhost',
port: 6379,
ttl: 600, // seconds
}),
],
})
export class AppModule {}
- 3. Browser HTTP Caching with Headers -
'Cache-Control':
import { Controller, Get, Res } from '@nestjs/common';
import { Response } from 'express';
@Controller('products')
export class ProductsController {
@Get()
findAll(@Res() response: Response) {
// Setting HTTP cache headers
response.set({
'Cache-Control': 'public, max-age=3600', // cache for 1 hour
});
response.send({ products: [...] });
}
}
- 4. Proxy Caching
Caches requests and responses at the proxy level to reduce the load on the server and speed up response times.
Caching can also be handled at the reverse proxy level
The reverse proxy caches responses and serves them from its cache to improve performance.
- 5. Db Query Caching
More like a pocket notebook, this type of cache is stored in the database and is a bit more permanent
TypeORM can cache database queries to reduce the load on your database server
Caches the results of frequently queried database data to minimise database load.
const cachedResults = await this.productRepository.find({
cache: true,
});
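Note that TypeORM's per-query cache only takes effect when caching is also enabled in the connection options; a minimal sketch (by default the results are stored in a dedicated cache table in the database):
import { Module } from '@nestjs/common';
import { TypeOrmModule } from '@nestjs/typeorm';

@Module({
  imports: [
    TypeOrmModule.forRoot({
      type: 'postgres',
      host: 'localhost',
      port: 5432,
      username: 'postgres',
      password: 'password',
      database: 'postgres',
      autoLoadEntities: true,
      cache: true, // enable TypeORM's query result cache
    }),
  ],
})
export class AppModule {}
Individual queries can then keep using cache: true as above, or override the duration, e.g. cache: 30000 to hold a result for 30 seconds.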
- 6. Caching environment variables at runtime =
ConfigModule()
If we have a lot of frequent accesses to environment variables, we want to optimise the retrieval of configuration data.
In a NestJS application, environment variables are commonly accessed through the ConfigService, which is used to retrieve configuration settings such as database credentials, API keys, and other sensitive information. Since accessing process.env can be relatively slow when done repeatedly, enabling caching can improve performance.
Improved Performance: Reduces the overhead of accessing environment variables multiple times, which can lead to performance bottlenecks.
import { ConfigModule } from '@nestjs/config';
@Module({
imports: [
ConfigModule.forRoot({
cache: true, // Enable caching for better performance
}),
],
})
export class AppModule {}
LINK
https://docs.nestjs.com/techniques/configuration#cache-environment-variables
- Caching storage: cache-manager
This cache manager offers a unified interface that supports various cache storage providers beyond just the default in-memory option.
import { Module } from '@nestjs/common';
import { CacheModule } from '@nestjs/cache-manager';
import { AppController } from './app.controller';
@Module({
imports: [CacheModule.register()], // enable in-memory caching
controllers: [AppController],
})
export class AppModule {}
npm install @nestjs/cache-manager cache-manager
LINK
https://docs.nestjs.com/techniques/caching
- 8. Caching Server Side External API Calls to Reduce Latency
Cache external API responses to reduce response times and API call frequency, improving performance.
We can significantly reduce latency and improve overall performance: In-Memory Caching with NodeJS: cache-manager, node-cache, redis, etc
npm install @nestjs/cache-manager cache-manager
npm install cache-manager-redis-store --save # Only if using Redis
npm install redis --save # Redis client
import { Inject, Injectable, Module } from '@nestjs/common';
import { CACHE_MANAGER, CacheModule } from '@nestjs/cache-manager';
import { HttpModule, HttpService } from '@nestjs/axios';
import { Cache } from 'cache-manager';
import { firstValueFrom } from 'rxjs';

@Injectable()
export class AppService {
  constructor(
    private readonly httpService: HttpService,
    @Inject(CACHE_MANAGER) private readonly cacheManager: Cache,
  ) {}

  async getExternalData(): Promise<any> {
    // Serve the cached payload and skip the network call when possible
    const cached = await this.cacheManager.get('externalApi');
    if (cached) return cached;
    const response = await firstValueFrom(this.httpService.get('https://api.example.com/data'));
    // Cache the external API response to reduce latency on subsequent calls
    await this.cacheManager.set('externalApi', response.data);
    return response.data;
  }
}

@Module({
  imports: [
    CacheModule.register({
      ttl: 60, // Time to live in seconds
      max: 100, // Maximum number of items in cache
    }),
    HttpModule, // Import HttpModule (from @nestjs/axios) to make HTTP requests
  ],
  providers: [AppService],
})
export class AppModule {}
- 9. Redis CacheInterceptor Global implementation: Think of this as taking a screenshot of a web page to quickly look at it later without reloading the entire thing.
NestJS's built-in CacheInterceptor allows automatic caching of responses based on decorators or globally for all requests.
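A minimal sketch of binding the CacheInterceptor globally (the Redis store can be plugged in exactly as in the Redis-based example above):
import { Module } from '@nestjs/common';
import { APP_INTERCEPTOR } from '@nestjs/core';
import { CacheInterceptor, CacheModule } from '@nestjs/cache-manager';

@Module({
  imports: [
    CacheModule.register({
      isGlobal: true, // make the cache available everywhere without re-importing
      ttl: 300,
    }),
  ],
  providers: [
    // Automatically cache GET responses application-wide
    { provide: APP_INTERCEPTOR, useClass: CacheInterceptor },
  ],
})
export class AppModule {}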
- 11. HTTP caching with @CacheKey and @CacheTTL
NestJS also provides decorators such as @CacheKey() and @CacheTTL() that offer more granular
control over caching at the method level. These can be particularly useful for caching the results of
specific API calls.
import { Controller, Get, UseInterceptors } from '@nestjs/common';
import { CacheInterceptor, CacheKey, CacheTTL } from '@nestjs/common';
import { DataService } from './data.service'; // hypothetical service providing the data

@Controller('data')
@UseInterceptors(CacheInterceptor) // required unless CacheInterceptor is registered globally
export class DataController {
  constructor(private readonly dataService: DataService) {}

  @Get()
  @CacheKey('my-custom-key')
  @CacheTTL(300)
  findAll() {
    return this.dataService.findAll();
  }
}
• @CacheKey: Assigns a custom key to the cached response, which can be useful when you
need to cache similar data under different contexts
• @CacheTTL: Overrides the default TTL, allowing you to specify how long the result of this
specific method should be cached
This level of control ensures that only the most critical data is cached, reducing unnecessary cache
pollution and ensuring that your cache is as effective as possible.
12. Cache busting
Caching is powerful, but stale data can be a significant drawback if not managed properly. Cache
busting refers to the process of invalidating or refreshing the cache when the underlying data changes.
In NestJS, you can manually clear the cache when necessary using the CacheManager service:
import { Inject, Injectable } from '@nestjs/common';
import { CACHE_MANAGER } from '@nestjs/cache-manager';
import { Cache } from 'cache-manager';
@Injectable()
export class MyService {
constructor(@Inject(CACHE_MANAGER) private cacheManager: Cache) {}
async updateData() {
// Update data logic
await this.cacheManager.del('my-custom-key');
}
}
This method removes the cached data associated with the specified key, ensuring that the next request
retrieves fresh data. Using cache-busting strategies ensures that your users always receive up-to-date information without
sacrificing the performance benefits of caching.
- Database schema synchronise
The database schema will be auto-created on every application launch.
synchronize ensures that our TypeORM entities stay in sync with the DB every time we run our App.
This is GREAT for development, BUT disable it in Production!
TypeOrmModule.forRoot({
type: 'postgres',
host: 'localhost',
port: 5432,
username: 'postgres',
password: 'password',
database: 'postgres',
autoLoadEntities: true,
synchronize: true, // disable in Prod
}),
// test postgres
psql -U postgres -d postgres
[Nest] 18032 - 27/09/2024, 16:04:39 LOG [InstanceLoader] TypeOrmModule dependencies initialized +181ms
...
[Nest] 18032 - 27/09/2024, 16:04:39 LOG [InstanceLoader] TypeOrmCoreModule dependencies initialized +127ms
- Define in the Module class which repository to use
We need to specify to the Module class which repository to use in the current scope (the product module's scope).
By doing so, we are telling the ProductsModule which repository we need for this scope.
NestJS does this to avoid having to load all the repositories present in the project, which would impact
the app's overall performance
// products/products.module.ts
@Module({
imports: [TypeOrmModule.forFeature([Product]), CommonModule],
// ...
})
export class ProductsModule {}
- Use Lazy-loading: route-based lazy loading + Dynamic module loading
Lazy loading is a powerful design pattern that delays the initialization of resources until they are actually needed. Once loaded, the module is cached, so any subsequent invocation will be very fast. Rather than loading all objects at once, it allows the application to load only the necessary data.
- Routing level: route-based lazy loading (loading a module from RouterModule)
import { Module } from '@nestjs/common';
import { RouterModule } from '@nestjs/core';
@Module({
imports: [
RouterModule.register([
{
path: 'lazy',
loadChildren: () => import('./lazy/lazy.module').then(m => m.LazyModule),
},
]),
// other imports
],
// controllers, providers
})
export class AppModule {}
- Dynamic module loading within a service
// app.controller.ts
import { Controller, Get } from '@nestjs/common';
import { LazyModuleLoader } from '@nestjs/core';
import { ReportsModule } from './reports/reports.module';
import { ReportsService } from './reports/reports.service';
@Controller()
export class AppController {
constructor(private readonly lazyModuleLoader: LazyModuleLoader) {}
@Get()
async getLazyReport(): Promise<string> {
// use console.time() and console.timeEnd()
//to get the initialization time of ReportsModule
console.time();
const moduleRef = await this.lazyModuleLoader.load(() => ReportsModule);
const reportsService = moduleRef.get(ReportsService);
console.timeEnd();
return reportsService.getReport();
}
}
- LINK
https://docs.nestjs.com/fundamentals/lazy-loading-modules
https://medium.com/@Abdelrahman_Rezk/lazy-loading-in-nestjs-boosting-performance-and-efficiency-2c6350a6ab84
https://blog.devgenius.io/mastering-lazy-loading-in-nestjs-enhancing-application-performance-863612abaea0
- Database Query Optimisation
Optimize database queries effectively
- 1. Minimize the Number of Queries
Problem: If you make multiple small queries to the database for each part of your logic, you can introduce performance bottlenecks. Each query adds network latency and processing overhead, and many individual requests can overwhelm the database.
SOLUTION: Use Batch Queries
Instead of making multiple separate queries, you can batch them together. This reduces the round trips between your server and the database.
For example, in NestJS using TypeORM, you can retrieve all the required data in a single query using joins or relations, rather than multiple separate queries.
// TypeORM
const usersWithPosts = await this.userRepository.find({
relations: ['posts'], // Fetches users along with their related posts in a single query
});
- 2. Use ORM Streaming Capabilities:
Problem: When dealing with large datasets, retrieving all data at once can overwhelm both the server's memory and the database. Loading huge result sets in one go can slow down the system significantly.
SOLUTION: TypeORM can help here. Use streaming to fetch data in chunks, pagination to limit the number of results, or cursor-based fetching to load data incrementally (see the sketch below).
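A minimal streaming sketch with TypeORM's query builder (repository name as in the other examples; driver support for streaming varies, and processRow is a hypothetical per-row handler):
// Stream rows instead of materialising the whole result set in memory
const stream = await this.userRepository
  .createQueryBuilder('user')
  .where('user.active = :active', { active: true })
  .stream();

stream.on('data', (row) => {
  processRow(row); // handle each raw row as it arrives
});
stream.on('end', () => console.log('Finished streaming users'));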
- 3. Reduce the Amount of Data Fetched:
Problem: Fetching all columns or related data when you only need a subset puts unnecessary load on the database and increases the response time.
Solution: Fetch only the necessary data. Retrieve only the necessary fields or rows using selective querying.
For example, if you only need a user's name and email, don't fetch the entire user object.
// TypeORM
const userNames = await this.userRepository.find({
select: ['name', 'email'], // Fetch only the fields you need
});
- 4. Leverage Database Indexing:
Problem:
When the database has to scan all rows in a table to find matching results (a full table scan), performance decreases significantly, especially with large datasets.
Solution:
Leverage indexing
Use indexes to make querying faster by creating indexes on columns that are frequently used in WHERE clauses or joins. Indexes allow the database to find matching rows faster without scanning the entire table.
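With TypeORM, for example, indexes can be declared directly on the entity (column names here are illustrative):
import { Column, Entity, Index, PrimaryGeneratedColumn } from 'typeorm';

@Entity()
@Index(['lastName', 'firstName']) // composite index for combined lookups
export class User {
  @PrimaryGeneratedColumn()
  id: number;

  @Index() // single-column index for frequent WHERE email = :email queries
  @Column()
  email: string;

  @Column()
  firstName: string;

  @Column()
  lastName: string;
}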
- 5. Avoid N+1 Query Problem:
PROBLEM
The N+1 query problem happens when you first fetch a list of entities (e.g. N users), and then for each one make an additional query to fetch related data (e.g. their posts). This results in N+1 total queries, leading to performance issues and increased load times.
SOLUTION:
Avoid the N+1 query problem by using eager loading or joins to retrieve related data in a single query, reducing the number of queries. We can use TypeORM to do this.
// TypeORM
// Instead of fetching users first and then fetching posts in separate queries
const usersWithPosts = await this.userRepository.find({
relations: ['posts'], // Fetch users and their posts in one query
});
- 6. Caching query results
Temporarily storing the results of expensive database queries, and later Serve Cached Results
- Handling Malicious Request Data
-- Validate DTO
NestJS can automatically validate, whitelist, and reject malformed DTO requests.
npm i class-validator class-transformer
// main.ts
app.useGlobalPipes(new ValidationPipe());
// .dto.ts
// validate
import {IsString} from 'class-validator';
export class CreateCoffeeDto {
@IsString()
readonly name: string;
@IsString()
readonly brand: string;
@IsString({ each: true })
readonly flavors: string[];
}
-- whitelist
// main.ts
// whitelist
app.useGlobalPipes(new ValidationPipe(
{whitelist: true,}
));
-- forbidNonWhiteListed
// main.ts
app.useGlobalPipes(new ValidationPipe(
{
whitelist: true,
forbidNonWhitelisted: true,
}
));
-- PartialType
Avoid redundant CRUD code
npm i @nestjs/mapped-types
// update.dto.ts
import {PartialType} from '@nestjs/mapped-types'
import {CreateCoffeeDto} from '../create-cofee.dto/create-cofee.dto'
export class UpdateDto extends PartialType(CreateCoffeeDto) {}
- Implement WebWorkers - WW
Web workers enable multitasking and can run without interfering with the main event loop.
Web Workers are a way to run scripts in a separate thread from the main thread of a web application.
NodeJS ships the worker_threads module, which NestJS applications can use directly for this purpose.
Web workers can be a powerful tool for improving the performance and scalability of web applications, especially for long-running or computationally intensive tasks.
// worker.js
const { parentPort } = require('worker_threads');
parentPort.on('message', (message) => {
console.log(`Worker thread received message: ${message}`);
parentPort.postMessage(`Worker thread received message: ${message}`);
});
// app.controller.ts
// This will start the NestJS application and create a new worker thread when the root endpoint is called
import { Controller, Get } from '@nestjs/common';
import { Worker } from 'worker_threads';
@Controller()
export class AppController {
@Get()
async getHello(): Promise<string> {
return new Promise((resolve) => {
const worker = new Worker('./../worker.js');
worker.on('message', (message) => {
console.log(`Main thread received message: ${message}`);
resolve(message);
});
worker.postMessage('Hello from main thread!');
});
}
}
- Implement WebSockets - WS
NestJS provides a built-in module, "@nestjs/platform-socket.io", for Socket.IO-based applications
We can create real-time-communication, WebSocket gateways, handle events, and manage connections directly within your NestJS application. It handles the underlying Socket.IO implementation, making it easier to work with real-time data.
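A minimal gateway sketch (requires @nestjs/websockets and @nestjs/platform-socket.io; the event name and payload are illustrative):
import {
  WebSocketGateway,
  WebSocketServer,
  SubscribeMessage,
  MessageBody,
} from '@nestjs/websockets';
import { Server } from 'socket.io';

@WebSocketGateway({ cors: true })
export class EventsGateway {
  @WebSocketServer()
  server: Server;

  @SubscribeMessage('events')
  handleEvent(@MessageBody() data: string): string {
    this.server.emit('events', data); // broadcast to all connected clients
    return data; // acknowledge the sender
  }
}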
- Implement true Load balancing across multiple servers
From native but basic NodeJS Clustering to a true solution: a Cloud-based Load Balancer plus a Reverse Proxy.
Load balancing is a technique of evenly distributing incoming network traffic among multiple servers to prevent any single server from becoming overloaded.
-- 1. NodeJS Clustering
Although not a full load balancer, NodeJS itself provides a way to distribute requests across multiple processes using the cluster module. NestJS, being built on top of NodeJS, supports this natively. NodeJS clustering can help implement load balancing, but it operates in a more limited capacity compared to traditional load balancers.
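A minimal clustering sketch (forking one worker per CPU core; cluster.isPrimary requires Node 16+):
import cluster from 'node:cluster';
import * as os from 'node:os';
import { NestFactory } from '@nestjs/core';
import { AppModule } from './app.module';

async function bootstrap() {
  const app = await NestFactory.create(AppModule);
  await app.listen(3000);
}

if (cluster.isPrimary) {
  // The primary process forks workers and distributes incoming connections among them
  for (let i = 0; i < os.cpus().length; i++) {
    cluster.fork();
  }
  cluster.on('exit', (worker) => {
    console.log(`Worker ${worker.process.pid} died, forking a replacement`);
    cluster.fork();
  });
} else {
  bootstrap();
}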
--- Distribution limitation
As requests come in, they are assigned to different worker processes, helping to utilise CPU cores efficiently. This is limited to the resources of a single machine: it CANNOT distribute requests "across multiple servers or containers" for true load balancing across multiple servers.
For that we need a Cloud-based Load Balancer + Reverse Proxy.
--- 2. Clouds Load Balancers
Cloud load balancers often come with built-in scalability features and can easily handle failover and health checks.
Combining (Azure) Cloud Load Balancers + Reverse Proxy = is a common and effective approach to managing traffic in cloud-based applications. This architecture can help you achieve a scalable, reliable, and performant application.
--- 3. Reverse Proxy
A common approach is to use a reverse proxy. In your NestJS application with ExpressJS, you can implement load balancing effectively:
npm install --save @nestjs/platform-express http-proxy-middleware
// The reverse proxy is mounted on the /api path, so any incoming requests to /api will be forwarded to one of the upstream servers.
import { NestFactory } from '@nestjs/core';
import { ExpressAdapter } from '@nestjs/platform-express';
import { AppModule } from './app.module';
import * as express from 'express';
import { createProxyMiddleware } from 'http-proxy-middleware';
async function bootstrap() {
const server = express();
const app = await NestFactory.create(
AppModule,
new ExpressAdapter(server)
);
// Upstream NestJS instances to balance across
const upstreams = ['http://localhost:3001', 'http://localhost:3002'];
let next = 0;
server.use('/api', createProxyMiddleware({
target: upstreams[0],
changeOrigin: true,
pathRewrite: { '^/api': '' },
// Round-robin: pick the next upstream for every incoming request
router: () => upstreams[next++ % upstreams.length],
}));
await app.listen(3000);
}
bootstrap();
// Start the reverse proxy server:
node server.js
// Start the upstream servers:
// This will start two upstream servers on ports 3001 and 3002.
// That's it. You now have a reverse proxy load balancer that will distribute incoming traffic across multiple servers using ExpressJS in NestJS.
node server1.js
node server2.js
- Choose the most appropriate Dependency Injection Library
NestJS provides built-in support for dependency injection (DI), however, you can look beyond in-build options and use other DI libraries.
There are several popular DI libraries that are compatible with NestJS, including InversifyJS, Awilix, TypeDI, and tsyringe.
Conclusion:
- Fastest Option: tsyringe is likely the fastest due to its lightweight design and simplicity, making it ideal for performance-critical applications.
- Balanced Performance and Features: Awilix is also very fast and strikes a good balance between simplicity and the features it offers.
- For Advanced DI Needs: If you need more flexibility and advanced DI features, InversifyJS and TypeDI are great choices, though they come with slightly more overhead.
- If raw performance is your top priority, tsyringe would be the best choice
- API Architecture performance
-- 1. API Composition pattern (efficient Microservice Communication)
We get high latency when multiple network calls are made from the client to fetch data from different backend services. This pattern is particularly helpful when you are dealing with multiple microservices that need to be composed into a single API response.
Essentially, the API Composer acts as a gateway or orchestrator that manages and distributes requests to different API services, especially in microservice architectures.
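A minimal composer sketch (downstream service URLs are hypothetical, and HttpModule from @nestjs/axios is assumed to be imported in the module): the gateway fans out the calls in parallel and merges the results, so the client makes a single round trip.
import { Controller, Get, Param } from '@nestjs/common';
import { HttpService } from '@nestjs/axios';
import { firstValueFrom } from 'rxjs';

@Controller('customers')
export class CustomerCompositionController {
  constructor(private readonly httpService: HttpService) {}

  @Get(':id/overview')
  async getOverview(@Param('id') id: string) {
    // Fan out to the downstream services in parallel instead of sequentially
    const [profile, orders] = await Promise.all([
      firstValueFrom(this.httpService.get(`http://profile-service/users/${id}`)),
      firstValueFrom(this.httpService.get(`http://order-service/orders?userId=${id}`)),
    ]);
    // Compose a single response for the client
    return { profile: profile.data, orders: orders.data };
  }
}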
-- 2. API caching using Redis
Using Redis for API Caching in NestJS
CacheModule.register({
store: redisStore,
host: 'your-redis-hostname.redis.cache.windows.net',
port: 6380, // default SSL port for Azure Redis
password: 'your-redis-access-key',
ttl: 300, // seconds
db: 0,
ssl: true, // Enable SSL for Azure Redis
}),
- Metric for insights
Consider the following metrics to get useful insights on your application's state:
• Logging and monitoring: Keep an eye on your application metrics. Tools such as Grafana and Prometheus can give you insights that are crucial for scaling.
• Performance metrics: Response times, error rates, and other key performance indicators (KPIs) should be monitored to understand how well your application is scaling.
• Monitor with NestJS DevTools and profilers: Use NestJS DevTools or integrated APM solutions, such as Jaeger or Zipkin, to monitor the performance of your services. These tools can provide insights into inter-service communication, response times, and bottlenecks
// visualize real-time performance metrics and identify areas where communication delays are occurring
npm install --save @nestjs/terminus @nestjs/devtools-integration
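As a starting point for monitoring, a minimal @nestjs/terminus health endpoint might look like this (TerminusModule, and HttpModule for the HTTP indicator, are assumed to be imported in the module; the pinged URL is just an example):
import { Controller, Get } from '@nestjs/common';
import { HealthCheck, HealthCheckService, HttpHealthIndicator } from '@nestjs/terminus';

@Controller('health')
export class HealthController {
  constructor(
    private readonly health: HealthCheckService,
    private readonly http: HttpHealthIndicator,
  ) {}

  @Get()
  @HealthCheck()
  check() {
    // Aggregate one or more indicators into a single health report
    return this.health.check([
      () => this.http.pingCheck('nestjs-docs', 'https://docs.nestjs.com'),
    ]);
  }
}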
- Documenting your API
--- Key components of effective API documentation
• Overview of the API: Start with a high-level overview of what the API does, its main features, and its potential use cases.
• Authentication and authorization: Clearly explain how clients should authenticate and authorise with your API. Include any keys or tokens they might need.
• Endpoint descriptions: Each endpoint should be thoroughly documented with its purpose, URI, required headers, request and response formats, and any query or path parameters.
• Error codes and messages: Document common error responses and what they mean to help users troubleshoot issues.
• Examples and use cases: Provide practical examples of requests and responses. Real-world scenarios or use cases can significantly enhance understanding.
- Adding Swagger annotations to DTOs and entities
To enhance your API documentation using Swagger in NestJS, you can use the @ApiProperty decorator from the @nestjs/swagger package. This decorator allows you to add metadata to your DTOs (Data Transfer Objects) and entities, making your API documentation more descriptive and user-friendly.
--- Key Benefits of Using Swagger Annotations:
• Improved API Documentation: Clearly describe your API endpoints, request/response structures, and expected behavior.
• Interactive Testing: Swagger UI allows developers to test API endpoints directly from the documentation
• Consistency: Ensures that your API documentation stays in sync with your codebase.
• Onboarding: Makes it easier for new developers to understand and use your API
// Install @nestjs/swagger
npm install @nestjs/swagger
// Add Swagger Decorators to DTOs
import { ApiProperty } from '@nestjs/swagger';
import { IsEmail, IsString, MinLength } from 'class-validator';
export class CreateUserDto {
@ApiProperty({
description: 'The email address of the user',
example: 'user@example.com',
})
@IsEmail()
email: string;
@ApiProperty({
description: 'The password of the user',
example: 'password123',
minLength: 8,
})
@IsString()
@MinLength(8)
password: string;
@ApiProperty({
description: 'The full name of the user',
example: 'John Doe',
required: false,
})
@IsString()
fullName?: string;
}
// Add Swagger Decorators to Entities
import { ApiProperty } from '@nestjs/swagger';
import { Column, Entity, PrimaryGeneratedColumn } from 'typeorm';
@Entity()
export class User {
@ApiProperty({
description: 'The unique identifier of the user',
example: 1,
})
@PrimaryGeneratedColumn()
id: number;
@ApiProperty({
description: 'email address of the user',
example: 'user@example.com',
})
@Column()
email: string;
@ApiProperty({
description: 'The full name of the user',
example: 'John Doe',
})
@Column()
fullName: string;
@ApiProperty({
description: 'The date the user was created',
example: '2023-10-01T12:00:00Z',
})
@Column()
createdAt: Date;
}
// Document API Endpoints in Controllers
// (import paths for the DTO, entity, and service are assumed)
import { Body, Controller, Get, Param, Post } from '@nestjs/common';
import { ApiBody, ApiOperation, ApiParam, ApiResponse, ApiTags } from '@nestjs/swagger';
import { CreateUserDto } from './dto/create-user.dto';
import { User } from './user.entity';
import { UsersService } from './users.service';
@ApiTags('users') // Groups endpoints under "users" in Swagger UI
@Controller('users')
export class UsersController {
constructor(private readonly usersService: UsersService) {}
@Post()
@ApiOperation({ summary: 'Create a new user' })
@ApiBody({ type: CreateUserDto })
@ApiResponse({
status: 201,
description: 'The user has been successfully created.',
type: User,
})
@ApiResponse({ status: 400, description: 'Bad Request.' })
async create(@Body() createUserDto: CreateUserDto): Promise<User> {
return this.usersService.create(createUserDto);
}
@Get(':id')
@ApiOperation({ summary: 'Get a user by ID' })
@ApiParam({ name: 'id', description: 'User ID', example: 1 })
@ApiResponse({
status: 200,
description: 'The user has been successfully retrieved.',
type: User,
})
@ApiResponse({ status: 404, description: 'User not found.' })
async findOne(@Param('id') id: number): Promise<User> {
return this.usersService.findOne(id);
}
}
// main.ts
// Swagger configuration
import { NestFactory } from '@nestjs/core';
import { AppModule } from './app.module';
import { SwaggerModule, DocumentBuilder } from '@nestjs/swagger';
async function bootstrap() {
const app = await NestFactory.create(AppModule);
// Swagger configuration
const config = new DocumentBuilder()
.setTitle('User Management API')
.setDescription('API for managing users')
.setVersion('1.0')
.addTag('users')
.build();
const document = SwaggerModule.createDocument(app, config);
SwaggerModule.setup('api', app, document);
await app.listen(3000);
}
bootstrap();
- Recommendations
Last but not least, I recommend reading the great book Scalable Application Development with NestJS, by Pacifique Linjanaja.