This is a submission for the Cloudflare AI Challenge.
What I Built
I created a conversational AI chatbot that uses Cloudflare Workers, Workers AI, and KV storage to give users a persistent, intelligent conversational experience. Running on Cloudflare's distributed network and serverless execution environment keeps the chatbot highly available and responsive, so interactions feel seamless.
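At a high level, each chat turn is handled by a single Worker that loads the session's history from KV, appends the new user message, and writes the updated thread back. The sketch below illustrates that flow; the binding name CHAT_HISTORY, the request shape, and the placeholder reply are illustrative assumptions, not the exact code from the repository.

```ts
// Minimal sketch: persist a conversation thread in Workers KV, keyed by session id.
// Assumes a KV binding named CHAT_HISTORY and a JSON body of { sessionId, message }.

export interface Env {
  CHAT_HISTORY: KVNamespace; // assumed binding name, for illustration only
}

type Message = { role: 'user' | 'assistant'; content: string };

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    const { sessionId, message } =
      await request.json<{ sessionId: string; message: string }>();

    // Load prior turns for this session, or start a fresh thread.
    const history = (await env.CHAT_HISTORY.get<Message[]>(sessionId, 'json')) ?? [];
    history.push({ role: 'user', content: message });

    // The model call (shown in the model section below) would go here;
    // a fixed reply keeps this sketch self-contained.
    const reply = `You said: ${message}`;
    history.push({ role: 'assistant', content: reply });

    // Persist the updated thread so the next request sees the full context.
    await env.CHAT_HISTORY.put(sessionId, JSON.stringify(history));

    return Response.json({ reply, turns: history.length });
  },
};
```

Storing the whole thread as a single JSON value keeps each turn to one KV read and one KV write.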
Demo
Experience the AI chatbot in action here: Deployed Chatbot
My Code
Explore the code repository here:
https://github.com/tom-log/contextual-conversation-worker
Journey
During development, I delved into the capabilities of Cloudflare Workers and AI models. I decided to focus on building a chatbot that could store the context of a conversation, which introduced me to Cloudflare Workers KV.
The documentation provided by Cloudflare was instrumental in helping me understand how to integrate AI models with serverless functions. Throughout this project, I improved my understanding of asynchronous JavaScript and how to interact with external APIs in a serverless environment.
What I'm particularly proud of is the chatbot's ability to maintain a conversation thread, which makes the interaction feel more natural and engaging.
Multiple Models and/or Triple Task Types
For this project, I focused on a single task type: conversational AI. The model used was @cf/meta/llama-2-7b-chat-int8, which generates chatbot responses based on the conversation context stored in Cloudflare KV.
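Here is a minimal sketch of that model call, assuming a Workers AI binding named AI and the same message shape as the earlier sketch; the system prompt is illustrative and may differ from what the repository actually sends.

```ts
// Sketch: generate the next reply from the stored conversation context
// using the Workers AI binding (assumed name: AI).

type Message = { role: 'system' | 'user' | 'assistant'; content: string };

async function generateReply(env: { AI: Ai }, history: Message[]): Promise<string> {
  // Prepend a system prompt (illustrative) and send the stored turns as chat context.
  const messages: Message[] = [
    { role: 'system', content: 'You are a helpful assistant.' },
    ...history,
  ];

  const result = await env.AI.run('@cf/meta/llama-2-7b-chat-int8', { messages });

  // Text-generation models return their output on the `response` field.
  return (result as { response?: string }).response ?? '';
}
```

The stored history is passed straight through as the messages array, so the model sees every prior turn when generating the next reply.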
If you have insights to share or questions to ask, I'd love to hear from you in the comments!