Participating in the TiDB Future App Hackathon 2024 was an incredible journey that pushed the boundaries of our creativity and technical skills. We embarked on a mission to develop Hangout AI, a personalized travel itinerary generator that leverages advanced AI and data technologies to deliver tailored travel plans for users across Jakarta, Singapore, and Kuala Lumpur.
As someone who reached the top 60 finalists in the TiDB Future App Hackathon 2023 and received some great merchandise, I was determined to make this year even better. This time, I was well-prepared and, more importantly, not alone, which made all the difference.
Inspiration
The idea for Hangout AI was born from our shared love of exploring cafes in Jakarta as WFC (Work From Cafe) enthusiasts. We often planned hangouts that didn’t always materialize, so we thought, why not create a tool that makes planning easier and more fun? Our goal was to design an AI-driven application that could generate personalized itineraries, taking into account not just the user’s preferences but also real-time data like weather and location details.
What It Does
Hangout AI generates customized travel itineraries from the user's input, location, date, and weather conditions. By pairing a Large Language Model (LLM) with Retrieval-Augmented Generation (RAG), we grounded recommendations in real location data, keeping them accurate, relevant, and personalized. The platform also offers visual previews of locations, giving users a more immersive planning experience.
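To make the idea concrete, here is a minimal sketch of how a prompt for such an itinerary generator might be assembled from user input, date, weather, and retrieved places. The function name and field layout are our illustration, not the actual Hangout AI code:

```python
def build_itinerary_prompt(city, date, weather, preferences, retrieved_spots):
    """Combine user context and retrieved locations into one LLM prompt."""
    spots = "\n".join(f"- {s}" for s in retrieved_spots)
    return (
        f"Plan a one-day itinerary in {city} for {date}.\n"
        f"Weather forecast: {weather}.\n"
        f"User preferences: {preferences}.\n"
        f"Consider these nearby places:\n{spots}\n"
        "Return a morning-to-evening schedule with a short reason for each stop."
    )

prompt = build_itinerary_prompt(
    "Jakarta", "2024-07-20", "light rain",
    "cozy cafes, indoor activities",
    ["Cafe A", "Museum B"],
)
print(prompt)
```

Feeding weather into the prompt is what lets the model prefer, say, indoor spots on a rainy day.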
How We Built It
We assembled a robust tech stack to bring Hangout AI to life:
- LLM Service: We integrated the Groq Llama3-70B-8192 model for generating travel itineraries.
- API Server: Built using FastAPI and Node.js, it handles user requests and interacts with the LLM.
- Client Application: Developed with React, Vite, and Tailwind CSS to create an engaging and responsive user interface.
- Database: We utilized TiDB and PingCAP Vector MySQL for efficient data management and retrieval.
- Data Integration: Our data pipeline involved scraping location data with Google Maps scraper and cleaning it using Python and Pandas.
- Deployment: We deployed the application using Vercel, Heroku, and a personal VPS, ensuring reliable performance.
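The scraping-and-cleaning step in the pipeline above can be sketched roughly like this. The column names and sample rows are hypothetical; the real scraper output will differ, but the Pandas pattern (dedupe, then drop incomplete rows) is the same:

```python
import pandas as pd

# Hypothetical raw scrape of location listings (illustrative columns only).
raw = pd.DataFrame({
    "name": ["Kopi Tuku", "Kopi Tuku", "Common Grounds", "Unnamed Spot"],
    "address": ["Jl. A No.1", "Jl. A No.1", "Jl. B No.2", None],
    "rating": [4.6, 4.6, 4.5, None],
})

def clean_locations(df: pd.DataFrame) -> pd.DataFrame:
    """Drop duplicate listings, then rows missing an address or rating."""
    out = df.drop_duplicates(subset=["name", "address"])
    out = out.dropna(subset=["address", "rating"])
    return out.reset_index(drop=True)

cleaned = clean_locations(raw)
print(cleaned)  # two usable rows remain
```

Cleaning before embedding matters: duplicate or incomplete rows would otherwise pollute the retrieval index.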
TiDB Utilization Overview
TiDB played a critical role in our project. We utilized:
- RAG with PingCAP Vector MySQL Database: For efficient retrieval and embedding of location-based data to enhance the LLM's output.
- TiDB Serverless for MetaLocations: To manage and retrieve non-sensitive data used for visual itinerary previews.
- MySQL Database from TiDB: For handling user profiles, location data, and chat records, ensuring scalability and seamless integration.
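The RAG retrieval step boils down to nearest-neighbor search over location embeddings; in production that ranking runs inside the vector-enabled database, but the logic can be sketched in plain Python. The tiny 3-dimensional vectors and location names below are purely illustrative:

```python
import math

def cosine_distance(a, b):
    """1 - cosine similarity; smaller means more similar."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return 1.0 - dot / (norm_a * norm_b)

# Hypothetical pre-computed embeddings for a few locations.
locations = {
    "Cafe A": [0.9, 0.1, 0.0],
    "Museum B": [0.1, 0.9, 0.1],
    "Park C": [0.2, 0.2, 0.9],
}

def retrieve(query_vec, k=2):
    """Return the k locations closest to the query embedding."""
    ranked = sorted(locations.items(),
                    key=lambda kv: cosine_distance(query_vec, kv[1]))
    return [name for name, _ in ranked[:k]]

print(retrieve([1.0, 0.0, 0.0]))  # 'Cafe A' ranks first
```

The retrieved names and their metadata are then stuffed into the LLM prompt, which is what keeps the generated itinerary anchored to real places.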