Mrunmay Shelar for LangDB

Introduction to AI Gateway

Rise Of LLMs

In the ever-evolving landscape of technology, few innovations have created waves as transformative as artificial intelligence (AI).

The rise of AI—fueled by large language models (LLMs)—is reshaping how we think about software, automation, and the user experience. Much like the pivotal shifts brought by mobile computing, cloud infrastructure, and microservices architecture, AI represents a foundational shift in how we design and deliver technology.

As we embrace this new era of AI, large language models are not just reshaping what’s possible—they’re redefining how we approach technology itself. However, this transformation is only as strong as the underlying frameworks that support it. APIs, as the silent workhorses of modern software, serve as the crucial bridge connecting AI’s potential with real-world applications, ensuring its seamless integration into the fabric of our digital ecosystem.

What is an AI Gateway?

An AI Gateway is a middleware platform that simplifies the integration, management, and scaling of artificial intelligence (AI) models and services within an organization’s IT infrastructure. It serves as a critical bridge between AI systems—such as large language models (LLMs)—and the applications or services that consume them.

By acting as the backbone of AI integration, AI Gateways empower developers to harness the full potential of AI while minimizing operational challenges, paving the way for smoother, faster, and more effective AI adoption.

This is particularly useful in scenarios where you want to try out multiple models from different AI providers (e.g., OpenAI, Anthropic, Gemini) that need to be orchestrated, where performance and cost need to be optimized, or where guardrails like data security and monitoring are crucial. By centralizing and streamlining AI integrations, AI Gateways empower developers to focus on building smarter applications instead of reinventing the operational wheel.
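To make the centralization idea concrete, here is a minimal sketch of what a gateway's routing layer does conceptually: a single entry point resolves each model name to its provider configuration, so application code never hard-codes provider details. The registry contents below are illustrative assumptions, not LangDB internals.

```python
# Illustrative sketch of gateway-style routing: one registry maps model
# names to provider configuration, so callers only ever name a model.
PROVIDERS = {
    "gpt-4o-mini": {"provider": "openai", "endpoint": "https://api.openai.com/v1"},
    "claude-3-5-sonnet-20240620": {"provider": "anthropic", "endpoint": "https://api.anthropic.com/v1"},
}

def route(model: str) -> dict:
    """Resolve a model name to its provider config, as a gateway would."""
    try:
        return PROVIDERS[model]
    except KeyError:
        raise ValueError(f"Unknown model: {model}")
```

A real gateway layers authentication, cost tracking, and guardrails on top of this lookup, but the core value is the same: one stable interface in front of many providers.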

How LangDB can help

LangDB’s AI Gateway offers a streamlined way to integrate and manage multiple Large Language Models (LLMs) with minimal effort. With just a single line of code, developers can seamlessly connect to various AI providers, ensuring efficient operations while keeping costs under control. It’s designed to simplify the complexity of working with LLMs, so you can focus on building smarter applications, faster.

Workflow of LangDB

Let’s look at an example of how LangDB works with LangChain to show its simplicity and power in action:

from langchain import hub
from langchain.agents import AgentExecutor, create_tool_calling_agent
from langchain_openai import ChatOpenAI
import os
import uuid

from langchain_community.agent_toolkits.load_tools import load_tools
from langchain_community.tools.tavily_search.tool import TavilySearchResults
from langchain_community.utilities.tavily_search import TavilySearchAPIWrapper


api_base = "https://api.staging.langdb.ai"  # LangDB API base URL
pre_defined_run_id = str(uuid.uuid4())  # header values must be strings
default_headers = {
    "x-project-id": "xxxx",  # Enter your LangDB Project ID
    "x-thread-id": pre_defined_run_id,
}
os.environ['OPENAI_API_KEY'] = 'xxxxx'  # Enter your LangDB API Key
os.environ['TAVILY_API_KEY'] = 'xxxxx'  # Enter your Tavily API Key



def get_function_tools():
    # Combine Tavily web search with the Wikipedia tool.
    search = TavilySearchAPIWrapper()
    tavily_tool = TavilySearchResults(api_wrapper=search)

    tools = [tavily_tool]
    tools.extend(load_tools(['wikipedia']))

    return tools



def init_action():
    # Route the ChatOpenAI client through LangDB by overriding the API base URL.
    llm = ChatOpenAI(
        model_name='gpt-4o-mini',
        temperature=0.3,
        openai_api_base=api_base,
        default_headers=default_headers,
    )
    prompt = hub.pull("hwchase17/openai-functions-agent")
    tools = get_function_tools()
    agent = create_tool_calling_agent(llm, tools, prompt)
    agent_executor = AgentExecutor(agent=agent, tools=tools, verbose=True)
    response = agent_executor.invoke(
        {"input": "Who is the owner of Tesla company? Let me know details about owner."}
    )
    print(response["output"])

init_action()

Dynamic Model Switching

One of LangDB’s most powerful features is the ability to seamlessly switch between different models without major changes to your existing codebase. In the above example, if you want to use Claude 3.5 Sonnet, all you need to do is update the model name in your configuration:

# Switch to Anthropic's Claude 3.5 Sonnet model
llm = ChatOpenAI(
    model_name='claude-3-5-sonnet-20240620',  # Change model here
    temperature=0.3,
    openai_api_base=api_base,
    default_headers=default_headers
)

This simple modification enables you to experiment with different models, optimize for performance or cost, and stay future-proof as new models become available. LangDB handles the operational complexities, making it easy to adapt your application to the best AI tools on the market.
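Because only the model name changes, you can sketch a small helper that builds the same configuration for any model, making side-by-side experiments straightforward. The helper name here is a hypothetical convenience for illustration; it simply mirrors the ChatOpenAI keyword arguments used above.

```python
# Hypothetical helper: build the same LangDB-backed configuration for any
# model name, so call sites differ only in that one string.
def make_llm_config(model_name: str, temperature: float = 0.3) -> dict:
    return {
        "model_name": model_name,
        "temperature": temperature,
        "openai_api_base": "https://api.staging.langdb.ai",
    }

# Compare two models by swapping a single argument.
configs = [make_llm_config(m) for m in ("gpt-4o-mini", "claude-3-5-sonnet-20240620")]
```

Each dict could then be unpacked straight into `ChatOpenAI(**config)`, leaving the rest of the agent code untouched.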

Conclusion

LangDB’s AI Gateway redefines how developers integrate and manage Large Language Models, offering a streamlined, efficient, and scalable approach to leveraging AI in real-world applications. With features like seamless integration, intelligent routing, and dynamic model switching, LangDB empowers developers to focus on innovation while simplifying the complexities of AI adoption.

In an era where AI is transforming industries, LangDB stands out as a critical tool for bridging the gap between cutting-edge technology and practical application. Whether you’re scaling enterprise solutions or experimenting with multiple models, LangDB ensures that your AI workflows are efficient, cost-effective, and ready for the future.

Check out LangDB here: LangDB
