
Thomas Hansen for AINIRO.IO

Originally published at ainiro.io

Using the Web as your API

In our last article we demonstrated how to build an AI-based shopping cart that can sell items directly from within our AI chatbot.

In this article we take it one step further: we ask our chatbot what the weather will be like tomorrow, then have it find the best restaurants in our city, list each restaurant's reviews on TripAdvisor, and finally give us the restaurant's phone number so we can call and book a table.

We even use our AI chatbot to hire a ProductHunt hunter for us, by having it crawl and scrape ProductHunt and return the LinkedIn profile page of today's winner.

How it works

Basically, we're using a combination of system messages (instructions) and a VSS-based RAG database. Our most important functions, such as "scrape the web" and "search the web", live in our system message, while edge functions that are less frequently needed are stored in our RAG database as training snippets.
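To give a feeling for how this division of labour could work, here is a minimal Python sketch of appending only the RAG snippets that match the user's question to the core system message. All names and the embedding function are hypothetical illustrations, not the cloudlet's actual implementation.

```python
# Rough sketch only: core functions live in the system message, while less
# important "edge" functions are stored as snippets in a vector database and
# only appended when they match the user's question. All names are hypothetical.
import numpy as np

CORE_SYSTEM_MESSAGE = "You can search and scrape the web by returning FUNCTION_INVOCATION blocks ..."

SNIPPETS = [
    "How to list today's ProductHunt launches ...",
    "How to look up TripAdvisor reviews for a restaurant ...",
]

def embed(text: str) -> np.ndarray:
    """Placeholder embedding; a real system would call an embeddings API."""
    rng = np.random.default_rng(abs(hash(text)) % 2**32)
    return rng.random(1536)

def build_system_message(question: str, top_k: int = 1) -> str:
    def cosine(a: np.ndarray, b: np.ndarray) -> float:
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    query_vector = embed(question)
    ranked = sorted(SNIPPETS, key=lambda s: cosine(embed(s), query_vector), reverse=True)
    return CORE_SYSTEM_MESSAGE + "\n\n" + "\n\n".join(ranked[:top_k])
```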

When we tell OpenAI that we want to search the web, it will return something resembling the following.

___
FUNCTION_INVOCATION[/modules/openai/workflows/workflows/web-search.hl]:
{
  "query": "[query]",
  "max_tokens": "[max_tokens]"
}
___

It is always invoked with 4,000 as max_tokens, but the query itself is dynamically created by ChatGPT to produce an optimised search query that is more likely to return the correct result. If we for instance ask "Does Magnus Carlsen have a girlfriend? Create a two paragraph summary", the AI chatbot will probably end up searching for "Magnus Carlsen girlfriend", scraping the top results from DuckDuckGo, and creating a two paragraph summary.
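To make the format concrete, below is a minimal Python sketch of how a FUNCTION_INVOCATION block such as the one above could be extracted from the model's reply. This is purely illustrative; the actual plumbing is Hyperlambda running in the cloudlet, and the regular expression is an assumption based on the example above, not the exact grammar.

```python
import json
import re

# Matches FUNCTION_INVOCATION[<path>] followed by a JSON object with arguments.
FUNCTION_PATTERN = re.compile(
    r"FUNCTION_INVOCATION\[(?P<path>[^\]]+)\]:\s*(?P<args>\{.*?\})",
    re.DOTALL,
)

def parse_function_invocation(reply: str):
    """Return (workflow_path, arguments) if the reply asks for a function, else None."""
    match = FUNCTION_PATTERN.search(reply)
    if match is None:
        return None
    return match.group("path"), json.loads(match.group("args"))

reply = """FUNCTION_INVOCATION[/modules/openai/workflows/workflows/web-search.hl]:
{
  "query": "Magnus Carlsen girlfriend",
  "max_tokens": 4000
}"""
print(parse_function_invocation(reply))
```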

The point is that the file path after FUNCTION_INVOCATION refers to a Hyperlambda workflow, which invokes DuckDuckGo with a search query, scrapes the top results until it has collected enough tokens, and then sends the result back to OpenAI to answer our original question. Below is the sequence, followed by a rough sketch of such a workflow.

  1. Invoke ChatGPT
  2. If ChatGPT doesn't return a function we're done
  3. If ChatGPT returns a function invocation we execute that function
  4. Our cloudlet invokes ChatGPT again, now with the result of our function invocation, to answer the original query supplied by the user
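
As mentioned, here is a rough Python sketch of what a workflow such as web-search.hl boils down to. The real implementation is Hyperlambda running in the cloudlet; the DuckDuckGo endpoint, CSS selector, and token heuristic below are assumptions for illustration only.

```python
import requests
from bs4 import BeautifulSoup

def approximate_tokens(text: str) -> int:
    # Very rough heuristic (~4 characters per token); a real implementation
    # would use a proper tokenizer.
    return len(text) // 4

def web_search(query: str, max_tokens: int = 4000) -> str:
    # DuckDuckGo's HTML endpoint and the "result__a" selector are assumptions
    # and may change; they stand in for whatever search backend is used.
    response = requests.get(
        "https://html.duckduckgo.com/html/",
        params={"q": query},
        headers={"User-Agent": "Mozilla/5.0"},
        timeout=10,
    )
    links = [a.get("href") for a in BeautifulSoup(response.text, "html.parser").select("a.result__a")]
    links = [href for href in links if href]

    collected, budget = [], max_tokens
    for url in links:
        if url.startswith("//"):
            url = "https:" + url
        try:
            page = requests.get(url, headers={"User-Agent": "Mozilla/5.0"}, timeout=10)
        except requests.RequestException:
            continue
        text = BeautifulSoup(page.text, "html.parser").get_text(" ", strip=True)
        collected.append(text[: budget * 4])            # crude truncation to the remaining budget
        budget -= approximate_tokens(collected[-1])
        if budget <= 0:
            break
    # The joined text is then sent back to OpenAI together with the original question.
    return "\n\n".join(collected)
```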

The whole thing is recursive in nature, allowing us to phrase questions implying multiple function invocations, such as:

Search for xyz and create a 5 paragraph summary. Then find the authors of each article and find their LinkedIn profiles.

For security reasons, and to avoid infinite loops eating up your OpenAI tokens, we stop the process after a maximum of 5 invocations towards OpenAI.
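
Below is a minimal Python sketch of that loop, assuming the official OpenAI Python SDK and reusing parse_function_invocation and web_search from the sketches above. The model name and the way function results are fed back are assumptions; the actual cloudlet implements all of this in Hyperlambda.

```python
from openai import OpenAI  # official OpenAI Python SDK (v1+)

client = OpenAI()
MAX_ROUNDS = 5  # hard cap to avoid infinite loops eating up tokens

def execute_workflow(path: str, arguments: dict) -> str:
    """Stub; the cloudlet would execute the Hyperlambda file at `path` here."""
    if path.endswith("web-search.hl"):
        return web_search(arguments["query"], int(arguments["max_tokens"]))
    return "Unknown workflow"

def answer(question: str, system_message: str) -> str:
    messages = [
        {"role": "system", "content": system_message},
        {"role": "user", "content": question},
    ]
    for _ in range(MAX_ROUNDS):
        reply = client.chat.completions.create(
            model="gpt-4o-mini",      # model name is an assumption
            messages=messages,
        ).choices[0].message.content
        invocation = parse_function_invocation(reply)  # from the earlier sketch
        if invocation is None:
            return reply              # no function requested, we're done
        path, arguments = invocation
        result = execute_workflow(path, arguments)
        messages.append({"role": "assistant", "content": reply})
        messages.append({"role": "user", "content": f"Function result:\n{result}"})
    return "Stopped after 5 function invocations."
```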

Wrapping up

In the above video we demonstrate how to check the weather, how to search TripAdvisor for a restaurant in a specific city, how to follow links to each restaurant's TripAdvisor profile, and how to list the top 5 reviews for a specific restaurant. We then ask our AI chatbot to find one of the restaurants' websites, scrape it, and return the phone number so we can book a table.

Afterwards we ask our AI chatbot to list all products on ProductHunt, crawl into 3 individual launches, and find the creators, then search the web for one of the creators and create a summary of her skills. Finally we ask the AI chatbot to find the person's LinkedIn profile and return it as a hyperlink.

3 days ago all of this was pure science fiction, and not even ChatGPT can do things like this. Today it's all available for you in your AINIRO cloudlet to install into your AI chatbot.

Top comments (2)

Capte

Awesome post!

Thomas Hansen

Thx mate :)
Did you see the viddi ...?