This is a submission for the Cloudflare AI Challenge.
What I Built
A simple chatbot app using the llama-2-7b-chat-fp16 model from Cloudflare Workers AI and the Vercel AI SDK.
Demo
My Code
Journey
Just a simple chatbot built with Next.js: the prebuilt useChat hook from the Vercel AI SDK handles the streaming responses coming from the Cloudflare Worker, and the prebuilt chat bubble component from DaisyUI provides the UI.
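For readers curious what the Worker side of this looks like, here is a minimal sketch of a Cloudflare Pages Function that streams Workers AI output back to useChat. The route path (`/api/chat`) and the binding name `AI` are my assumptions, not taken from the repo, and I return a plain streaming Response here to keep the sketch dependency-free (the actual project uses the SDK's StreamingTextResponse):

```typescript
// functions/api/chat.ts - hypothetical Pages Function; the route path
// and the `AI` binding name are assumptions, not taken from the repo.
export const MODEL = "@cf/meta/llama-2-7b-chat-fp16";

// Minimal shape of the Workers AI binding used below.
interface Env {
  AI: { run(model: string, input: unknown): Promise<ReadableStream> };
}

export const onRequestPost = async (ctx: { request: Request; env: Env }) => {
  // useChat POSTs the whole message history as { messages: [...] }.
  const { messages } = (await ctx.request.json()) as { messages: unknown[] };

  // With stream: true, Workers AI returns a server-sent-events stream
  // of tokens that the client can render as they arrive.
  const stream = await ctx.env.AI.run(MODEL, { messages, stream: true });

  return new Response(stream, {
    headers: { "content-type": "text/event-stream" },
  });
};
```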
Tech Stack
- Cloudflare Pages – deployment
- Cloudflare Workers AI – text generation
- Next.js – framework
- Vercel AI SDK – AI library
- TailwindCSS, DaisyUI – styling
Top comments (9)
Great demo :)
It's fascinating that you used the Vercel AI SDK for streaming.
I wasn't even aware there was a Vercel AI SDK; I wish I had known about it a few months ago!
Nice work!
Why did you use DaisyUI when the Vercel AI SDK comes with shadcn?
shadcn/ui doesn't have a chat bubble component like DaisyUI does, and as a mobile developer I'm not very proficient in web UI, so I prefer to use the components DaisyUI already provides.
Nice work Trieu! I tried to run a local version and ran into some issues.
Opened an issue on github.
It was a breaking change to StreamingTextResponse in Vercel AI SDK 3.0.20. I've fixed the issue in a new commit.
Wow, quick fix thank you! Just got it working.
In preview mode ("npm run preview") the response isn't streamed; there's a delay, and then the full reply arrives all at once when generation finishes (feels like REST).
Is this expected behaviour? It would be nice to see it stream locally in dev mode.
I also encountered a similar issue when running in preview mode.