Create agentic systems by just describing what you want.

Introduction

Agentic AI is a growing trend among developers and companies that want to get the most out of AI models. There is a growing list of impactful use cases, from marketing automation to customer support and beyond.

However, building agentic systems today is still a cumbersome process, even for seasoned AI engineers. Most frameworks, like LangChain and CrewAI, introduce unnecessary abstractions that make debugging difficult. In fact, many developers prefer to write their code from scratch after a few frustrated attempts.

But can LLMs themselves help write the code for agentic systems, and so reduce the pain of development? In principle yes, just like any other code. But current frameworks are not optimized for that: because they are so verbose, the LLM spends a large share of its output tokens and attention generating classes and method calls that exist only to satisfy the framework and have nothing to do with the workflow logic.

GenSphere: a programming language designed for LLMs to use while coding

GenSphere is an open-source, Python-based project that enables developers to define AI workflows using simple YAML files, outlining the tasks and their connections without delving into procedural complexities. Each node in the workflow represents a high-level operation: a function call, an LLM API request, or another nested workflow. By abstracting the execution logic, GenSphere lets you focus on orchestrating sophisticated AI applications declaratively.

Because the YAML files are an economical description of the workflow logic that abstracts away the execution code, you can use LLMs themselves to generate those files from high-level prompts. The LLM spends its output tokens and attention entirely on the workflow logic instead of framework boilerplate. This opens the possibility of building full-fledged text-to-agent workflows.

How GenSphere works

In GenSphere, you build LLM applications with YAML files that define an execution graph. Nodes can be LLM API calls, regular function executions, or other graphs.

Because graphs nest easily, you can build complex applications without losing control. The YAML states which tasks need to be done and how they connect; beyond that, you only write the individual Python functions that are called during execution. There are no new classes or abstractions to learn.
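To make this concrete, here is a minimal sketch of what a GenSphere project can look like: a YAML file describing the graph, plus a functions.py with plain Python functions. The node keys (type, function, params, outputs), the {{ node.output }} reference syntax, and the GenFlow entry point follow my reading of the project's README and may differ from the current API, so treat this as illustrative rather than definitive.

```python
# Minimal GenSphere project sketch. File layout, node keys, and the GenFlow
# entry point are based on the project's README at the time of writing --
# check the GenSphere docs for the exact, current API.

workflow_yaml = """
nodes:
  - name: read_task
    type: function_call          # a regular Python function from functions.py
    function: read_task_file
    outputs:
      - task_text

  - name: draft_answer
    type: llm_service            # an LLM API call
    service: openai
    model: "gpt-4o"
    params:
      prompt: "Complete the following task: {{ read_task.task_text }}"
    outputs:
      - answer

  # A third node of type yml_flow could reference another YAML file here,
  # nesting an entire sub-graph as a single node.
"""

functions_py = '''
def read_task_file():
    # Plain Python, nothing to subclass. Assumed convention: return a dict
    # keyed by the output names declared in the YAML node.
    with open("task.txt") as f:
        return {"task_text": f.read()}
'''

with open("workflow.yaml", "w") as f:
    f.write(workflow_yaml)
with open("functions.py", "w") as f:
    f.write(functions_py)

# Execute the graph (pip install gensphere). A schema file for structured
# outputs can also be supplied, but is omitted in this sketch.
from gensphere.genflow import GenFlow

flow = GenFlow("workflow.yaml", "functions.py")
flow.run()
print(flow.outputs)
```

Everything the graph does is visible in the YAML, and everything you wrote yourself is ordinary Python, which is what keeps debugging manageable.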

Generate agentic systems by prompting

As GenSphere projects are defined by simple YAML files (and their associated functions and schemas), you can use LLMs themselves to generate those files.

This Google Colab example shows how to use a GenSphere project from the public Hub to take a task.txt file as input and output another GenSphere project that accomplishes that task. You can then run that GenSphere project to get your final output.
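Outside the notebook, the loop looks roughly like this. The generator project file names below are hypothetical placeholders I use for illustration (the notebook pulls the real generator project from the public Hub), and the GenFlow calls assume the API sketched above.

```python
# Rough outline of the text-to-agent flow. The generator project files named
# here are hypothetical placeholders; the linked Colab notebook pulls the real
# generator project from the public GenSphere Hub.

from gensphere.genflow import GenFlow

# 1. Describe the agentic system you want in plain text.
with open("task.txt", "w") as f:
    f.write("Generate scripts for 10 YouTube videos, about 5 minutes long each...")

# 2. Run the generator project. It reads task.txt and writes a brand-new
#    GenSphere project (a workflow YAML plus its functions file).
generator = GenFlow("agent_generator.yaml", "agent_generator_functions.py")
generator.run()

# 3. Run the generated project to get the final output.
generated = GenFlow("generated_workflow.yaml", "generated_functions.py")
generated.run()
print(generated.outputs)
```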

The example in the notebook uses the following task as input:

Your task is to generate scripts for 10 YouTube videos, about 5 minutes long each.
Our aim is to generate content for YouTube in an ethical way, while also ensuring we will go viral.
You should discover which are the topics with the highest chance of going viral today by searching the web.
Divide this search into multiple granular steps to get the best out of it. You can use Tavily and Firecrawl_scrape
to search the web and scrape URL contents, respectively. Then you should think about how to present these topics in order to make the video go viral.
Your script should contain detailed text (which will be passed to a text-to-speech model for voiceover),
as well as visual elements which will be passed as prompts to image AI models like MidJourney.
You have full autonomy to create highly viral videos following the guidelines above. 
Be creative and make sure you have a winning strategy.

It outputs a full workflow with 12 nodes: multiple rounds of searching and scraping the web, LLM API calls (some nodes autonomously attaching tools and using structured outputs), and function calls.


Learn more about GenSphere
