Outline
nestia and fastify enhance NestJS server performance by about 10x to 30x.
- Previous Article: Boost up NestJS server much faster and easier
- Github Repository: https://github.com/samchon/nestia
- Guide Documents: https://nestia.io/docs
In the previous article, I introduced my library nestia, which makes NestJS development much easier and much faster. In that article, introducing the performance enhancements, I told you that nestia can boost validation speed up to 20,000x (and JSON serialization up to 200x).
By the way, some people have asked me questions like this:
Okay, your nestia makes the NestJS server faster by boosting validation and serialization speeds enormously. But how about the entire server performance? I especially want to know how much nestia can increase the number of simultaneous connections. How is nestia's performance at the entire server level?
Today, I came back with the answer. The answer is: nestia (+ fastify) increases NestJS server availability about 10x to 30x. I'll show you how I measured it, and describe how validation and JSON serialization alone can affect the entire server performance.
Measured on Surface Pro 8
For reference, you can run the benchmark program on your own computer by following the commands below. After the benchmark, a report will be issued under the nestia/benchmark/results/{YOUR-CPU-NAME} directory. If you send the result as a PR to my repository (https://github.com/samchon/nestia), I'd be pleased and would appreciate it even more.
git clone https://github.com/samchon/nestia
cd nestia/benchmark
npm install
npm start
Validation
How to use
import { TypedBody, TypedParam, TypedRoute } from "@nestia/core";
import { Controller } from "@nestjs/common";
import { IBbsArticle } from "./IBbsArticle";
@Controller("bbs/articles")
export class BbsArticlesController {
@TypedRoute.Put(":id")
public async update(
@TypedParam("id", "uuid") id: string,
@TypedBody() input: IBbsArticle.IUpdate, // 20,000x faster validation
): Promise<void> {}
}
When you develop a NestJS backend server with nestia, you can easily validate request body data just by using the @TypedBody() decorator function of @nestia/core like above.
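The article does not show the full DTO, so here is a minimal sketch of what IBbsArticle might look like; everything beyond the files, id, and created_at fields (which appear elsewhere in this article) is a hypothetical assumption:
// Hypothetical sketch of the DTO; only files, id, and created_at are
// taken from this article, the other fields are assumptions.
export interface IBbsArticle extends IBbsArticle.IStore {
  id: string; // uuid, assigned by the server
  created_at: string; // ISO 8601 date-time
}
export namespace IBbsArticle {
  export interface IStore {
    title: string;
    body: string;
    files: IAttachmentFile[] | null;
  }
  export type IUpdate = Partial<IStore>;
}
export interface IAttachmentFile {
  name: string;
  extension: string | null;
  url: string;
}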
For reference, unlike class-validator and class-transformer used by NestJS, which require triple duplicated schema definitions, nestia can utilize pure TypeScript types. Look at the code snippet below, and you will understand how easy nestia makes DTO schema definition.
//----
// NESTJS (class-validator + class-transformer) REQUIRES
// TRIPLE DUPLICATED DEFINITION
//----
import { ApiProperty } from "@nestjs/swagger";
import { Type } from "class-transformer";
import { IsArray, IsObject, IsOptional, ValidateNested } from "class-validator";
export class BbsArticle {
@ApiProperty({
type: () => AttachmentFile,
nullable: true,
isArray: true,
description: "List of attached files.",
})
@Type(() => AttachmentFile)
@IsArray()
@IsOptional()
@IsObject({ each: true })
@ValidateNested({ each: true })
files!: AttachmentFile[] | null;
}
//----
// BESIDES, NESTIA UNDERSTANDS PURE TYPESCRIPT TYPE
//----
export interface IBbsArticle {
files: IAttachmentFile[] | null;
}
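Under the hood, the decorator relies on typia's assertion function. Here is a minimal standalone sketch of the kind of check performed per request; typia.assert<T>() and TypeGuardError are typia's public API, but the wiring inside nestia is simplified here:
import typia, { TypeGuardError } from "typia";
import { IBbsArticle } from "./IBbsArticle";

// typia.assert<T>() returns the input when it matches T,
// and throws a TypeGuardError describing the first mismatch otherwise.
export function parseUpdate(input: unknown): IBbsArticle.IUpdate {
  try {
    return typia.assert<IBbsArticle.IUpdate>(input);
  } catch (error) {
    if (error instanceof TypeGuardError)
      console.error(`invalid ${error.path}: expected ${error.expected}`);
    throw error;
  }
}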
Individual Performance
Measured on Intel i5-1135g7, Surface Pro 8
When measuring validation performance alone, nestia (which utilizes the typia.assert<T>() function) is up to 20,000x faster than class-validator, the validator NestJS uses by default.
What do you think would happen if such a fast validation speed were applied at the entire server level? Since validation of request body data takes only a small portion of the entire backend server's work, is this performance difference not sufficiently impactful at the overall server level? Or is a 20,000x gap such an enormous value that it would affect the entire server performance anyway?
Let's see the server benchmark graph below.
Server Performance
Measured on Intel i5-1135g7, Surface Pro 8
The answer was that the entire server level performance is affected significantly.
When comparing performance at the entire server level with simultaneous connections, nestia can handle about 10x more simultaneous connections than NestJS. If you adapt fastify as well, the performance gap increases up to 25x. Besides, adapting fastify in NestJS alone only gains about 1~2% of performance.
I think such a significant difference is caused by two reasons.
The 1st reason is that validations are processed in the main thread. As you know, the strength of NodeJS is its event-driven, non-blocking I/O, all of which runs in the background. However, request body data validation is processed in the main thread, so if the validation logic is slow, it blocks the entire backend server.
The 2nd reason is simply the 20,000x gap itself. Even though request body data validation is a small piece of work within the entire server's processing, when the performance gap is 20,000x, it becomes a significant difference.
Considering a main thread operation with a 20,000x performance gap, the benchmark result above is reasonable enough.
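To see why a slow synchronous step hurts so much, here is a tiny standalone sketch (not from the benchmark itself) of how a synchronous task on the main thread delays every other pending request:
import { createServer } from "node:http";

// Hypothetical stand-in for a slow synchronous validator.
function slowSyncValidation(): void {
  const end = Date.now() + 50; // burn ~50ms of CPU on the main thread
  while (Date.now() < end) {}
}

// While one request is stuck in slowSyncValidation(), the event loop
// cannot accept or answer any other request: every connection waits.
createServer((req, res) => {
  slowSyncValidation();
  res.end("done");
}).listen(3000);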
Reference
For reference, the request body validation benchmark used an Array instance with length 100. If you reduce the length to 10, the performance enhancement is roughly halved (about 60%). Conversely, as you increase the length further, the performance enhancement increases dramatically.
// "IBox3D" SIMILAR DTOS ARE USED, WITH 100 LENGTH ARRAY
export interface IBox3D {
scale: IPoint3D;
position: IPoint3D;
rotate: IPoint3D;
pivot: IPoint3D;
}
export interface IPoint3D {
x: number;
y: number;
z: number;
}
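For illustration, a payload like the one described above could be generated as below; the use of typia.random here is my own sketch (assuming the interfaces above live in ./IBox3D), not necessarily how the benchmark builds its data:
import typia from "typia";
import { IBox3D } from "./IBox3D";

// Build a 100-length array of random IBox3D instances,
// similar in shape to the request body used in the benchmark.
const payload: IBox3D[] = Array.from({ length: 100 }, () =>
  typia.random<IBox3D>(),
);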
JSON Serialization
How to use
import { TypedParam, TypedRoute } from "@nestia/core";
import { Controller } from "@nestjs/common";
import typia from "typia";
import { IBbsArticle } from "./IBbsArticle";
@Controller("bbs/articles")
export class BbsArticlesController {
@TypedRoute.Get(":id") // 200x faster JSON serialization
public async at(
@TypedParam("id", "uuid") id: string
): Promise<IBbsArticle> {
return typia.random<IBbsArticle>();
}
}
When you develop a NestJS backend server with nestia, you can easily boost JSON serialization speed just by using the @TypedRoute.${method}() decorator function like above.
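For reference, the specialized serialization underneath roughly corresponds to typia's dedicated stringify function. A minimal standalone sketch, assuming typia's typia.json.stringify<T>() helper (older typia versions exposed it as typia.stringify<T>()):
import typia from "typia";
import { IBbsArticle } from "./IBbsArticle";

// typia.json.stringify<T>() serializes with a writer specialized for T
// at compile time, instead of the generic JSON.stringify().
const article: IBbsArticle = typia.random<IBbsArticle>();
const json: string = typia.json.stringify<IBbsArticle>(article);
console.log(json.length);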
As in the validation section above, nestia understands pure TypeScript types here as well, so the triple duplicated schema definitions required by class-validator and class-transformer are unnecessary.
Individual Performance
Do you remember? I'd written an article about my another library typia
and had compared JSON serialization performance between typia
and class-transformer
. In the previous benchmark, typia
was maximum 200x times faster than class-transformer
.
For reference, nestia
utilizes typia
, and NestJS
utilizes class-transformer
.
- Previous Article: I made Express faster than Fastify
- Github Repository: https://github.com/samchon/typia
- Guide Documents: https://typia.io/docs
Measured on Intel i5-1135g7, Surface Pro 8
What do you think would happen if such a fast JSON serialization speed were applied at the entire server level? Since the JSON serialization enhancement is much smaller than in the validator case (200x vs 20,000x), is this performance difference not sufficiently impactful at the overall server level? Or would a 200x gap still affect the entire server performance, because JSON serialization is heavier work than validation?
Let's see the server benchmark graph below.
Server Performance
Measured on Intel i5-1135g7, Surface Pro 8
The answer was that the entire server level performance is affected significantly, too.
When comparing performance at the entire server level with simultaneous connections, nestia can handle about 10x more simultaneous connections than NestJS. If you adapt fastify as well, the performance gap increases up to 18x. Besides, adapting fastify in NestJS alone only gains about 0~10% of performance.
I think such a significant difference is caused by two reasons.
The 1st reason is the same as in the validation case: JSON serialization is processed in the main thread. As you know, the strength of NodeJS is its event-driven, non-blocking I/O, all of which runs in the background. However, JSON serialization is processed in the main thread, so if the serialization logic is slow, it blocks the entire backend server.
The 2nd reason is that JSON serialization is a heavier process than validation. Therefore, even though JSON serialization gains less of a speedup than validation (200x vs 20,000x), it is still significant at the entire server level.
Considering the main thread operation and that JSON serialization is heavier than validation, the benchmark result above is reasonable enough.
Composite Performance
import { TypedBody, TypedRoute } from "@nestia/core";
import { Controller } from "@nestjs/common";
import { IBbsArticle } from "./IBbsArticle";
@Controller("bbs/articles")
export class BbsArticlesController {
@TypedRoute.Post()
public async store(
@TypedBody() input: IBbsArticle.IStore
): Promise<IBbsArticle> {
return {
...input,
id: "2b5e21d8-0e44-4482-bd3e-4540dee7f3d6",
created_at: "2023-04-23T12:04:54.168Z",
}
}
}
The last benchmark is about composite performance: validating request body data and serializing JSON response data at the same time. As nestia has shown significant performance gaps in each case, the composite benchmark also shows significant performance gaps.
Let's see the benchmark graph below, and imagine how much performance would increase if you adapted nestia in your NestJS backend server. I think there is no more reason not to use nestia. It is much faster, and even much easier.
Measured on Intel i5-1135g7, Surface Pro 8
Conclusion
- nestia boosts up NestJS server performance significantly
- If you adapt fastify with nestia, the performance increases even more
- Otherwise, if you adapt fastify without nestia, the performance would not be increased
- Let's use nestia when developing a NestJS backend server
  - Much faster
  - Much easier
  - Even supports SDK generation like tRPC
Left is server code, and right is client (frontend) code
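As a rough illustration of the SDK side, the generated client call might look like the sketch below; the import paths and the exact function namespace (api.functional.bbs.articles.store) are assumptions based on the controllers above, not copied from the article:
// Hypothetical import paths; the SDK is generated by nestia from the server code.
import api from "./api";
import { IBbsArticle } from "./api/structures/IBbsArticle";

async function main(): Promise<void> {
  // The generated SDK mirrors controller routes as typed functions, like tRPC.
  const article: IBbsArticle = await api.functional.bbs.articles.store(
    { host: "http://localhost:3000" },
    { title: "hello", body: "nestia", files: null },
  );
  console.log(article.id);
}
main().catch(console.error);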
Top comments (10)
Here @TypedBody() allows additional props after meeting the interface or type. But allowing additional props is not a nice thing.
You can do it by editing the tsconfig.json file. Configure it like below, and then additional properties will be blocked:
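The exact configuration snippet from this comment thread is not preserved here; below is a sketch of the kind of setting meant, assuming @nestia/core's transform plugin accepts a validate option that can be switched to typia's assertEquals (which rejects extra properties). Please check the nestia documentation for the exact option names:
{
  "compilerOptions": {
    "plugins": [
      { "transform": "typia/lib/transform" },
      {
        // the option name below is an assumption; see the nestia docs
        "transform": "@nestia/core/lib/transform",
        "validate": "assertEquals"
      }
    ]
  }
}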
Thanks, that worked! Actually, I had tried some basic typia functions but didn't have in-depth knowledge of all the typia functions; that's why I failed to interpret your comment about that JSON file.
Last question I have on this library:
I was trying your nestia core portion, but there is one feature of class-transformer that I couldn't reproduce using nestia core.
github.com/ToxicalNoob3062/CarSell...
Here in this file you can see that I can use the @Transform() decorator to transform a passed object property into a different value! But in nestia I can't tell my DTO to transform some return object values into something else before sending them.
Nestia recommends using pure TypeScript types instead of classes.
What you want is in preparation, but I cannot be sure when it will be completed:
github.com/samchon/typia/issues/683
Ok man, no problem! Try your best! ❤ And thanks for nestia and typia, they are useful enough! But it feels like only 50% is there, which is class validation; the other 50%, class transformation, is not there.
What are the cons of using nestia + fastify instead of the others?
Well, nothing?
Well, as of today you can't do code-first GraphQL servers with nestia? Nestia only does REST?
Only REST for now. I plan to support a TypeScript type to GraphQL schema converter in typia, but its priority is low. Anyway, I have seen requests for GraphQL features in nestia in many channels. If you really want the feature, can you suggest how to design the related CLI and SDK interfaces by writing a detailed feature request issue in nestia?
After those jobs are completed, I'll try what you want: generating a GraphQL schema from pure TypeScript types in typia, just as typia already generates the swagger.json file.
This is all very nice, but very misleading: this cannot replace class-transformer, as it lacks the ability to transform data.
typia is, at best, just a replacement for class-validator; using it will require you to transform your data in a separate way, which will require an additional schema definition, negating the advantage you outline here.