John Cook
7 open-source tools you need to master AI development as a beginner 🧙‍♂️🪄

AI is eating software, and being AI-ready can give you an edge in the years to come, whether you want to join an emerging company or build cutting-edge applications.
However, choosing the tools to build your apps can be challenging. A good tool should have:

  • Active development
  • A great community
  • Great support

So, I have made a list of tools that can make your AI development journey smooth and rewarding.



1. Composio 👑 - A comprehensive toolkit for building production-ready AI agents

I was struggling to integrate GitHub and Linear with my agent to convert tickets into pull requests; handling the authentication flows alone was taking too much time. I wanted a simple solution, and Composio was the one that stood out. It offers over 250 such integrations and supports frameworks like LangChain and LlamaIndex.

Getting started with it is very easy.

npm install composio-core openai

Connect your GitHub Account

import { Composio } from "composio-core";

const client = new Composio({ apiKey: "<your-api-key>" });

const entity = await client.getEntity("Jessica");
const connection = await entity.initiateConnection({appName: 'github'});

console.log(`Open this URL to authenticate: ${connection.redirectUrl}`);
Initialize Composio and OpenAI

import { OpenAI } from "openai";
import { OpenAIToolSet } from "composio-core";

const openai_client = new OpenAI();
const composio_toolset = new OpenAIToolSet();
Fetch GitHub actions and pass them to the LLM

const tools = await composio_toolset.getTools({
  actions: ["github_star_a_repository_for_the_authenticated_user"]
});

const instruction = "Star the repo composiohq/composio on GitHub";

const response = await openai_client.chat.completions.create({
  model: "gpt-4o",
  messages: [{ role: "user", content: instruction }],
  tools: tools,
  tool_choice: "auto",
});
Execute the tool calls.

const result = await composio_toolset.handleToolCall(response);
console.log(result);

The documentation covers Composio in more depth, including how it works and the key concepts for building capable, production-ready agents.

With Composio, you can focus on building great AI experiences while it handles the complexity of tool integration and security.


Star the Composio repository ⭐


2. E2B - Open-source runtime for executing AI-generated code in secure sandboxes

E2B is your AI agent's trusted companion for code execution. It provides secure, isolated sandboxes in the cloud where AI-generated code can run safely, supporting any LLM and multiple AI frameworks. It is perfect for building AI agents that need to analyze data, execute code, or perform complex reasoning tasks.

Installation:

npm i @e2b/code-interpreter dotenv

Usage:

import 'dotenv/config' // loads E2B_API_KEY from .env
import { Sandbox } from '@e2b/code-interpreter'

async function runCode() {
  // Create a new cloud sandbox
  const sandbox = await Sandbox.create()

  // Execute code in the sandbox (Python by default) and print its output
  const result = await sandbox.runCode(`print("Hello from the sandbox")`)
  console.log(result.logs.stdout)

  // Shut down the sandbox when done
  await sandbox.kill()
}

runCode()

E2B's sandbox environments can run for up to 24 hours, making it perfect for long-running AI agent tasks and complex data analysis workflows.
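
If you need a sandbox to live longer than the default few minutes, you can ask for a longer timeout when you create it. A minimal sketch, assuming the timeoutMs option of Sandbox.create in @e2b/code-interpreter (check the E2B docs for the exact option name and the limits of your plan):

import { Sandbox } from '@e2b/code-interpreter'

// Keep the sandbox alive for up to one hour for a long-running agent task
// (timeoutMs is an assumption here; verify it against the E2B docs)
const sandbox = await Sandbox.create({ timeoutMs: 60 * 60 * 1000 })

// ... let the agent execute code over the next hour ...

await sandbox.kill()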


Star the E2B repository ⭐


3. Vercel AI SDK - The full-stack AI SDK for building AI-powered user interfaces

The Vercel AI SDK simplifies building AI-powered streaming text and chat UIs. It provides a set of hooks and utilities that make integrating various AI models into your Next.js applications easy.

npm install ai
Usage:
import { useChat } from 'ai/react'

export default function Chat() {
  const { messages, input, handleInputChange, handleSubmit } = useChat()

  return (
    <div>
      <ul>
        {messages.map((m, i) => (
          <li key={i}>{m.content}</li>
        ))}
      </ul>
      <form onSubmit={handleSubmit}>
        <input
          value={input}
          onChange={handleInputChange}
          placeholder="Say something..."
        />
      </form>
    </div>
  )
}

The Vercel AI SDK streamlines AI integration with built-in support for streaming, a wide range of AI providers, and UI hooks for frameworks like React, Svelte, and Vue.
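
Under the hood, useChat posts the conversation to an API route (by default /api/chat) and renders the streamed response. Here is a minimal sketch of such a route using streamText and the @ai-sdk/openai provider package, which you would install separately; the exact response helper name can vary between AI SDK versions, so treat this as a starting point rather than the canonical setup:

// app/api/chat/route.ts
import { openai } from '@ai-sdk/openai'
import { streamText } from 'ai'

export async function POST(req: Request) {
  const { messages } = await req.json()

  // Stream the model's reply back to the useChat hook on the client
  const result = streamText({
    model: openai('gpt-4o'),
    messages,
  })

  return result.toDataStreamResponse()
}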


Star the Vercel AI SDK repository ⭐


4. LangGraph - Build and execute stateful LLM agents and flows

LangGraph and LangSmith together provide a powerful toolkit for creating complex AI workflows. LangGraph lets you build stateful agents that maintain context and execute multi-step reasoning tasks, while LangSmith adds monitoring and debugging.

npm install @langchain/langgraph @langchain/openai langsmith
Usage:
import { StateGraph, Annotation, START, END } from "@langchain/langgraph";
import { ChatOpenAI } from "@langchain/openai";

const llm = new ChatOpenAI({ model: "gpt-4o" });

// Shared state that flows between the nodes of the graph
const State = Annotation.Root({
  input: Annotation(),
  output: Annotation(),
  analysis: Annotation(),
});

const workflow = new StateGraph(State)
  .addNode("process", async (state) => {
    const response = await llm.invoke("Process this request: " + state.input);
    return { output: response.content };
  })
  .addNode("analyze", async (state) => {
    const response = await llm.invoke("Analyze this response: " + state.output);
    return { analysis: response.content };
  })
  .addEdge(START, "process")
  .addEdge("process", "analyze")
  .addEdge("analyze", END)
  .compile();

const result = await workflow.invoke({ input: "User query" });

LangGraph and LangSmith together provide a robust foundation for building and monitoring production-grade AI applications.
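
To send traces from a graph like the one above to LangSmith, it is usually enough to set a few environment variables before starting your app; the names below are the commonly documented ones, so double-check them against the LangSmith docs:

export LANGCHAIN_TRACING_V2=true
export LANGCHAIN_API_KEY=<your-langsmith-api-key>
export LANGCHAIN_PROJECT=my-agent-project

With tracing enabled, each workflow.invoke call shows up as a run in the LangSmith UI, where you can inspect every node's inputs, outputs, and latency.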


Star the LangGraph repository ⭐


5. ChromaDB - The AI-native open-source embedding database

ChromaDB is a modern database built specifically for AI applications, offering efficient storage and retrieval of embeddings. It's perfect for semantic search, recommendation systems, and other vector-based AI applications.

npm install chromadb
Usage:
import { ChromaClient } from 'chromadb'

async function searchDocuments() {
  // Connect to a running Chroma server
  const client = new ChromaClient()

  // Create a collection
  const collection = await client.createCollection({ name: 'documents' })

  // Add documents with precomputed embeddings
  await collection.add({
    ids: ["id1", "id2"],
    embeddings: [[1.1, 2.3, 3.2], [4.5, 6.9, 7.2]],
    documents: ["first document", "second document"]
  })

  // Query for the most similar documents
  const results = await collection.query({
    queryEmbeddings: [[1.1, 2.3, 3.2]],
    nResults: 2
  })

  console.log(results)
}

ChromaDB makes building and scaling vector search applications easy with its simple API and powerful features.
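
In practice you rarely pass raw embedding vectors yourself; Chroma can embed documents for you through an embedding function. A rough sketch using OpenAIEmbeddingFunction, which has shipped with the chromadb JS client (the class name, its options, and the package it lives in may differ between versions, so verify against the Chroma docs):

import { ChromaClient, OpenAIEmbeddingFunction } from 'chromadb'

const client = new ChromaClient()

// Assumption: OpenAIEmbeddingFunction is exported by your chromadb version
const embedder = new OpenAIEmbeddingFunction({ openai_api_key: process.env.OPENAI_API_KEY })

// The collection now embeds documents automatically on add() and query()
const collection = await client.createCollection({ name: 'notes', embeddingFunction: embedder })

await collection.add({
  ids: ['n1', 'n2'],
  documents: ['Chroma is a vector database', 'LLMs work with embeddings'],
})

const results = await collection.query({ queryTexts: ['What is Chroma?'], nResults: 1 })
console.log(results.documents)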


Star the ChromaDB repository ⭐


6. LiteLLM - Universal API for LLM calls

LiteLLM provides a unified interface to call multiple LLM providers, making it simple to switch between or combine different AI models in your applications. It supports OpenAI, Anthropic, Google, and many other providers.

npm install litellm
Usage:
import { completion } from 'litellm'

async function callMultipleModels() {
  // Provider keys (OPENAI_API_KEY, ANTHROPIC_API_KEY, ...) are read from the environment

  // Call different models with the same interface
  const gpt4Response = await completion({
    model: "gpt-4",
    messages: [{ role: "user", content: "Hello!" }]
  })

  const claudeResponse = await completion({
    model: "claude-2",
    messages: [{ role: "user", content: "Hello!" }]
  })

  console.log(gpt4Response.choices[0].message.content)
  console.log(claudeResponse.choices[0].message.content)
}

LiteLLM simplifies model management and provides consistent error handling across different LLM providers.
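
Because every provider goes through the same completion call, falling back from one model to another is just ordinary error handling. A small sketch (the model names here are only examples):

import { completion } from 'litellm'

async function completeWithFallback(messages) {
  const models = ["gpt-4", "claude-2"]

  for (const model of models) {
    try {
      // Return the first successful response
      return await completion({ model, messages })
    } catch (err) {
      // Log the failure and try the next provider
      console.error(`Call to ${model} failed:`, err.message)
    }
  }
  throw new Error("All providers failed")
}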


Star the LiteLLM repository ⭐


7. LlamaIndex - Data framework for LLM applications

LlamaIndex helps you connect custom data sources to LLMs. It provides tools for data ingestion, structuring, and query optimization, making it easier to build LLM applications that can reason over your private data.

npm install llamaindex
Usage:
import { Document, VectorStoreIndex } from 'llamaindex'

async function queryDocuments() {
  // Create documents
  const documents = [
    new Document({ text: "Sample document content" }),
    new Document({ text: "Another document here" })
  ]

  // Build a vector index over the documents and query it
  const index = await VectorStoreIndex.fromDocuments(documents)
  const queryEngine = index.asQueryEngine()

  const response = await queryEngine.query({
    query: "What are the main topics in the documents?"
  })

  console.log(response.toString())
}

LlamaIndex bridges the gap between your data and LLMs, making it easier to build powerful AI applications with custom knowledge.


Star the LlamaIndex repository ⭐


Thanks for reading! Let me know which tools you've found really helpful.
