Introduction to Gemini 2.0 Flash
Gemini 2.0 Flash, Google's efficient generative AI model, excels at tasks requiring low latency and strong performance. Its agentic capabilities, large context window, and native tool use make it ideal for finance applications. In this blog, we'll create a stock insights agent using Gemini 2.0 Flash on Vertex AI, leveraging Google Search grounding for real-time data. We'll use the google-genai library and build a Gradio UI for interactive stock analysis.
Step-by-Step Guide: Developing the Stock Insights Agent
Let's break down the process into environment setup, connecting to Gemini 2.0 Flash, integrating Google Search, and creating a Gradio web app.
1. Environment Setup and Authentication
Before coding, prepare your Google Cloud environment:
- Enable the Vertex AI and Vertex AI Search and Conversation APIs in your Google Cloud project. Ensure you have the necessary permissions.
- Install the Google Gen AI SDK for Python:
pip install --upgrade google-genai
- Authentication:
- If you are running this code in a Google Colab notebook, authenticate your environment by running the cell below.
import sys

if "google.colab" in sys.modules:
    # Authenticate the user to Google Cloud
    from google.colab import auth
    auth.authenticate_user()
- Otherwise, set up authentication using a service account. Go to the Google Cloud Console, create a service account with Vertex AI access, and download its JSON key. Then set the GOOGLE_APPLICATION_CREDENTIALS environment variable to the path of this JSON key to authenticate your API calls. For example:
export GOOGLE_APPLICATION_CREDENTIALS="/path/to/your/service-account-key.json"
Initialize the Gen AI SDK client, pointing it at Vertex AI:
from google import genai

PROJECT_ID = "[your-gcp-project-id]"  # Replace with your GCP project ID
LOCATION = "us-central1"  # Or another region where Gemini 2.0 Flash is available

client = genai.Client(vertexai=True, project=PROJECT_ID, location=LOCATION)
2. Connecting to the Gemini 2.0 Flash Model
Connect to the Gemini 2.0 Flash model using the google-genai library:
MODEL_ID = "gemini-2.0-flash-001" # Or the latest available model ID
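As a quick sanity check, you can send a simple request to confirm that the client and model ID work. This step is optional, and the prompt here is only an illustration:
# Optional smoke test: confirm the client can reach the model
test_response = client.models.generate_content(
    model=MODEL_ID,
    contents="In one sentence, what does a stock ticker symbol represent?",
)
print(test_response.text)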
3. Enabling Google Search Grounding for Real-Time Stock Data
Use Google Search to ground responses with up-to-date information:
from google.genai.types import GoogleSearch, Tool, GenerateContentConfig
import json

# Create a Google Search retrieval tool
google_search_tool = Tool(google_search=GoogleSearch())

# Define our query or prompt for the model
company_name = "Alphabet Inc."  # example company
prompt = (
    f"Provide information about {company_name} in JSON format with the following keys:\n"
    "- `company_info`: A brief overview of the company and what it does.\n"
    "- `recent_news`: An array of 2-3 recent news items about the company, each with a `title` and a short `description`.\n"
    "- `trend_prediction`: An analysis of the stock trend and a suggested target price for the near future."
)

# Configure the model to use the Google Search tool
config = GenerateContentConfig(tools=[google_search_tool])

# Generate a response using the model, grounded with Google Search data
response = client.models.generate_content(
    model=MODEL_ID,
    contents=prompt,
    config=config,
)

try:
    data = json.loads(response.text)
    print(json.dumps(data, indent=4))  # Print nicely formatted JSON
except json.JSONDecodeError:
    print("Error decoding JSON:", response.text)
4. Testing the Model Response (Example)
Test the model with a sample query:
user_question = (
    "Provide information about Tesla in JSON format with the following keys:\n"
    "- `company_info`: A brief overview of the company and what it does.\n"
    "- `recent_news`: An array of 2-3 recent news items about the company, each with a `title` and a short `description`.\n"
    "- `trend_prediction`: An analysis of the stock trend and a suggested target price for the near future."
)

config = GenerateContentConfig(tools=[google_search_tool])

response = client.models.generate_content(
    model=MODEL_ID,
    contents=user_question,
    config=config,
)

try:
    data = json.loads(response.text)
    print(json.dumps(data, indent=4))  # Print nicely formatted JSON
except json.JSONDecodeError:
    print("Error decoding JSON:", response.text)
5. Gradio UI Implementation
Create an interactive web interface with Gradio:
import gradio as gr
import json
# Function to get stock insights for a given company name
def get_stock_insights(company):
    if not company:
        return "", "", ""

    # Construct a prompt for the LLM to generate a structured JSON response
    prompt = (
        f"Provide information about {company} in JSON format with the following keys:\n"
        "- `company_info`: A brief overview of the company and what it does.\n"
        "- `recent_news`: An array of 2-3 recent news items about the company, each with a `title` and a short `description`.\n"
        "- `trend_prediction`: An analysis of the stock trend and a suggested target price for the near future."
    )

    # Use the Vertex AI model with Google Search grounding
    config = GenerateContentConfig(tools=[google_search_tool])
    response = client.models.generate_content(
        model=MODEL_ID,
        contents=prompt,
        config=config,
    )

    try:
        data = json.loads(response.text)
        company_info = data.get("company_info", "")
        news_items = data.get("recent_news", [])
        news_string = ""
        for news in news_items:
            news_string += f"- **{news.get('title', 'No Title')}**: {news.get('description', 'No Description')}\n"
        trend_prediction = data.get("trend_prediction", "")
        return company_info, news_string, trend_prediction
    except json.JSONDecodeError:
        return "Error decoding JSON. Please try again.", "", ""
# Define Gradio interface components
company_input = gr.Textbox(label="Company or Stock Ticker", placeholder="e.g. Tesla or TSLA")
company_info_output = gr.Markdown(label="Company Information")
news_info_output = gr.Markdown(label="Recent News")
prediction_info_output = gr.Markdown(label="Trend & Prediction")
# Create the Gradio interface
demo = gr.Interface(
    fn=get_stock_insights,
    inputs=company_input,
    outputs=[company_info_output, news_info_output, prediction_info_output],
    title="📈 Stock Insights Agent",
    description="Enter a company name or stock ticker to get an overview, latest news, and an AI-driven stock trend prediction.",
)
# Launch the app (if running locally, this will start a web server for the UI)
demo.launch()
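If you want to serve the app beyond localhost (for example from a container or a VM), Gradio's launch() accepts a few common options; the values below are illustrative and would replace the bare demo.launch() call above:
# Illustrative alternative: bind to all interfaces on a fixed port,
# or pass share=True to get a temporary public Gradio link
demo.launch(server_name="0.0.0.0", server_port=7860)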
A few notes on this implementation:
- Prompt: The prompt explicitly requests the output in JSON format, specifying the structure with the keys company_info, recent_news (an array of title and description objects), and trend_prediction.
- JSON Parsing: The get_stock_insights function parses the JSON response with json.loads().
- Gradio Output Population: The function extracts the values from the parsed JSON and populates the corresponding Gradio output components (company_info, news_string, trend_prediction). The news_string is formatted as a Markdown list for better presentation.
- Error Handling: A try...except block handles a potential JSONDecodeError if the model doesn't return valid JSON.
GitHub Repository Structure
When packaging this project for collaboration and deployment, a clear repository structure is essential. Here's a suggested structure:
stock-insights-agent/
├── README.md
├── requirements.txt
├── app.py
├── .env (optional)
├── utils/
│   ├── __init__.py
│   └── data_utils.py
└── notebooks/
    └── development.ipynb (optional)
- README.md: Project overview, setup, dependencies, API credentials, and usage examples.
- requirements.txt: Pinned Python libraries (e.g., google-genai, gradio).
- app.py: Main application script (Vertex AI initialization, Gradio interface).
- .env (optional): Environment variables (API keys, project IDs). Important: add it to .gitignore.
- utils/: Helper functions such as prompt construction and result parsing (see the sketch after this list).
- notebooks/: Jupyter notebook for development/prototyping (optional).
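As an example of what might live in utils/, here is a sketch of data_utils.py; the function names build_prompt and parse_insights are illustrative, not part of the tutorial code above:
# utils/data_utils.py (illustrative sketch)
import json
import re


def build_prompt(company: str) -> str:
    """Build the structured-JSON prompt used by the agent."""
    return (
        f"Provide information about {company} in JSON format with the following keys:\n"
        "- `company_info`: A brief overview of the company and what it does.\n"
        "- `recent_news`: An array of 2-3 recent news items about the company, each with a `title` and a short `description`.\n"
        "- `trend_prediction`: An analysis of the stock trend and a suggested target price for the near future."
    )


def parse_insights(raw_text: str) -> dict:
    """Parse the model reply into a dict, tolerating optional Markdown code fences."""
    match = re.search(r"```(?:json)?\s*(.*?)\s*```", raw_text, re.DOTALL)
    payload = match.group(1) if match else raw_text
    return json.loads(payload)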
Ensure the README includes instructions for Google Cloud credentials and API enablement. Document the agent's functionality and any limitations.
Conclusion
In this tutorial, we built a Stock Insights Agent using Gemini 2.0 Flash on Vertex AI. By leveraging Google Search grounding and the google-genai library, we created an interactive Gradio interface for real-time stock analysis.
Next steps: Consider deploying the Gradio app, integrating robust data sources, refining prompts, or enabling multi-turn conversations. Gemini 2.0 Flash offers many possibilities in finance, from personalized assistants to automated report generators.
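As one concrete next step, multi-turn follow-ups could reuse the client, model ID, and search tool from above through the SDK's chat interface; this is a minimal sketch rather than production code:
# Sketch: a grounded, multi-turn conversation about a single company
chat = client.chats.create(
    model=MODEL_ID,
    config=GenerateContentConfig(tools=[google_search_tool]),
)
print(chat.send_message("Summarize Alphabet Inc.'s most recent quarterly results.").text)
print(chat.send_message("How did the market react to that report?").text)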