Preface
- Demonstration
- Shopping A.I. Chatbot Application: https://nestia.io/chat/shopping
- Shopping Backend Repository: https://github.com/samchon/shopping-backend
- Shopping Swagger Document (@nestia/editor): https://nestia.io/editor/?url=...
- Related Libraries
- Source Code: ShoppingChatApplication.tsx
The above demonstration shows a Shopping A.I. chatbot built from a Swagger document.
As you can see, in the Shopping A.I. chatbot application, the user can do everything written in the Swagger document just through conversation. Searching products, placing orders, checking delivery status: the user can do all of these by chatting.
Just by providing the Swagger document, a Super A.I. chatbot performing LLM (Large Language Model) function calling is composed automatically. The Super A.I. chatbot selects proper functions defined in the Swagger document by analyzing the conversation context with the user. It then asks the user, through conversation, to provide arguments for the selected functions, and actually calls the API function with those arguments. This is the key concept of the Nestia A.I. chatbot.
In other words, every backend server providing a Swagger document can be converted into an A.I. chatbot. In the new A.I. era, you no longer need to develop GUI (Graphical User Interface) applications. Just prepare a Swagger document, and let the A.I. chatbot do the rest. The A.I. chatbot can replace GUI application development, and it can be more efficient and user-friendly than traditional GUI applications.
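To make the workflow concrete (select a function, collect arguments, call the API), here is a minimal conceptual sketch. Note that the "selection" step below is a toy keyword matcher standing in for a real LLM, and the two function schemas are hypothetical examples, not the actual shopping backend's API:

```typescript
// Minimal, self-contained sketch of the LLM function calling loop.
// The schemas and the keyword-based "selector" are illustrative toys;
// a real agent delegates selection and argument filling to the LLM.

interface IFunctionSchema {
  name: string;
  description: string;
  parameters: string[]; // parameter names (a real schema would be JSON schema)
}

// Function schemas as they might be derived from a Swagger document.
const schemas: IFunctionSchema[] = [
  {
    name: "searchProducts",
    description: "Search products by keyword",
    parameters: ["keyword"],
  },
  {
    name: "getDeliveryStatus",
    description: "Check the delivery status of an order",
    parameters: ["orderId"],
  },
];

// Step 1: select a proper function by analyzing the conversation context.
// A real agent asks the LLM to do this; a keyword match stands in here.
function selectFunction(utterance: string): IFunctionSchema | null {
  if (utterance.includes("delivery"))
    return schemas.find((s) => s.name === "getDeliveryStatus") ?? null;
  if (utterance.includes("search") || utterance.includes("find"))
    return schemas.find((s) => s.name === "searchProducts") ?? null;
  return null;
}

// Steps 2-3: fill the arguments and call the API function.
// Here the "call" just renders the invocation instead of hitting a server.
function call(schema: IFunctionSchema, args: Record<string, string>): string {
  return `${schema.name}(${schema.parameters.map((p) => args[p]).join(", ")})`;
}

const selected = selectFunction("Please find me a shirt");
console.log(selected?.name); // searchProducts
console.log(call(selected!, { keyword: "shirt" })); // searchProducts(shirt)
```

The real agent repeats this loop over the whole conversation, which is why nothing beyond the Swagger document is required from the developer.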
Playground
You can test your backend server's A.I. chatbot in the following playground.
Upload your Swagger document file to the playground website, and start a conversation with your backend server. If your backend server's documentation is well written enough that the A.I. chatbot quality is satisfactory, you can start your own A.I. chatbot service as described in the next section, #Application Development.
Application Development
import { NestiaAgent } from "@nestia/agent";
import { NestiaChatApplication } from "@nestia/chat";
import {
  HttpLlm,
  IHttpConnection,
  IHttpLlmApplication,
  OpenApi,
} from "@samchon/openapi";
import OpenAI from "openai";
import { useEffect, useState } from "react";

export const ShoppingChatApplication = (
  props: ShoppingChatApplication.IProps,
) => {
  const [application, setApplication] =
    useState<IHttpLlmApplication<"chatgpt"> | null>(null);
  useEffect(() => {
    (async () => {
      setApplication(
        HttpLlm.application({
          model: "chatgpt",
          document: OpenApi.convert(
            await fetch(
              "https://raw.githubusercontent.com/samchon/shopping-backend/refs/heads/master/packages/api/customer.swagger.json",
            ).then((r) => r.json()),
          ),
        }),
      );
    })().catch(console.error);
  }, []);

  if (application === null)
    return (
      <div>
        <h2>Loading Swagger document</h2>
        <hr />
        <p>Wait for a moment please.</p>
        <p>Loading Swagger document...</p>
      </div>
    );
  const agent: NestiaAgent = new NestiaAgent({
    provider: {
      type: "chatgpt",
      api: props.api,
      model: "gpt-4o-mini",
    },
    controllers: [
      {
        protocol: "http",
        name: "main",
        application,
        connection: props.connection,
      },
    ],
    config: {
      locale: props.locale,
    },
  });
  return <NestiaChatApplication agent={agent} />;
};

export namespace ShoppingChatApplication {
  export interface IProps {
    api: OpenAI;
    connection: IHttpConnection;
    name: string;
    mobile: string;
    locale?: string;
  }
}
Developing a Super A.I. chatbot is very easy. Load your Swagger document, and compose the OpenAI function calling schema with the HttpLlm.application() function of the @samchon/openapi library. Then render the <NestiaChatApplication /> component with the schema.
After that, you can start a conversation with your backend server. Have a good time with your backend server, and feel the new A.I. era.
Make your own A.I. chatbot
The above @nestia/agent and @nestia/chat libraries are just for testing and demonstration. I've made them to prove a concept: that every TypeScript class can be converted into an A.I. chatbot, and that typia / nestia are especially efficient for A.I. chatbot development.
However, @nestia/agent supports only OpenAI, and it has not been optimized for any specific purpose. As it runs without any RAG (Retrieval Augmented Generation) model, it may consume much more LLM cost than you expect. Therefore, use @nestia/agent for studying A.I. chatbot development, or for demonstrating your TypeScript class before production development.
- Source Codes:
- @nestia/agent: https://github.com/samchon/nestia/tree/master/packages/agent
- @nestia/chat: https://github.com/samchon/nestia/tree/master/packages/chat
When developing your own A.I. chatbot, you need to learn how to convert an OpenAPI document into an LLM function calling schema. Visit the @samchon/openapi repository and read its README document; then you will easily understand.
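To get an intuition for that conversion before reading the README, here is a toy illustration of its shape. This is NOT the real @samchon/openapi implementation (use HttpLlm.application() for that); the operation type, naming rule, and example endpoint below are simplified assumptions for illustration only:

```typescript
// Toy sketch: each OpenAPI operation becomes one LLM-callable function
// schema, with its parameters flattened into a JSON-schema-like object.
// Hypothetical types and naming scheme, for illustration only.

interface IOperation {
  method: string;
  path: string;
  description: string;
  parameters: { name: string; type: string }[];
}

interface ILlmFunction {
  name: string;
  description: string;
  parameters: {
    type: "object";
    properties: Record<string, { type: string }>;
    required: string[];
  };
}

function toLlmFunction(op: IOperation): ILlmFunction {
  return {
    // e.g. "get" + "/shoppings/sales" -> "get_shoppings_sales"
    name: [op.method, ...op.path.split("/").filter((s) => s.length)].join("_"),
    description: op.description,
    parameters: {
      type: "object",
      properties: Object.fromEntries(
        op.parameters.map((p) => [p.name, { type: p.type }]),
      ),
      required: op.parameters.map((p) => p.name),
    },
  };
}

const fn = toLlmFunction({
  method: "get",
  path: "/shoppings/sales",
  description: "List up every sales on sale",
  parameters: [{ name: "keyword", type: "string" }],
});
console.log(fn.name); // get_shoppings_sales
```

The real library additionally handles request bodies, path parameters, references, and per-model schema dialects, which is exactly what its README walks through.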
Next Episode
I'll introduce a new open source A.I. chatbot builder platform that everyone can use, even those who do not know programming at all.