
Olga Braginskaya

Originally published at datobra.com

Scaling Data Analytics: Building a Starter Kit with Neon, Airflow, and Streamlit

Setting up a data analytics project from scratch can be a headache. You need a database, a way to pull in data, and a dashboard to make sense of it all. I wanted to make that process easier—something lightweight, flexible, and beginner-friendly.

That’s how I ended up building the Data Analytics Dashboard Starter Kit during the Neon Challenge. It’s a simple but powerful setup that automates data collection, stores it in a serverless PostgreSQL database, and displays insights in an interactive dashboard—all with just Python. For the demo, I used cryptocurrency data from CoinGecko, but you can plug in any dataset and make it your own.

Live Dashboard Demo

GitHub Repository

Technologies That Power This Kit

To make this work, I used three key tools:

1. Neon (Serverless PostgreSQL)

Neon offers a serverless PostgreSQL experience with automatic scaling—perfect for analytics projects where you don’t want to manage infrastructure.
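Because Neon speaks plain Postgres, any driver (psycopg2, SQLAlchemy, etc.) can consume its connection string unchanged. A small sketch of pulling such a string apart before handing it to a driver; the hostname, user, and database below are illustrative, not taken from the kit:

```python
from urllib.parse import urlparse

def parse_neon_dsn(dsn: str) -> dict:
    """Break a Postgres connection string into the pieces a driver needs."""
    parts = urlparse(dsn)
    assert parts.scheme in ("postgresql", "postgres"), "not a Postgres DSN"
    return {
        "host": parts.hostname,
        "database": parts.path.lstrip("/"),
        "user": parts.username,
        # Neon requires TLS, so the DSN normally carries sslmode=require.
        "ssl_required": "sslmode=require" in (parts.query or ""),
    }

# Illustrative only -- a real Neon DSN comes from the Neon console or an env var.
example = "postgresql://app_user:secret@ep-example-123456.us-east-2.aws.neon.tech/neondb?sslmode=require"
print(parse_neon_dsn(example))
```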

2. Airflow (via Astronomer)

Airflow automates ETL workflows. I use it to fetch historical and real-time crypto data from CoinGecko and store it in Neon.

3. Streamlit (Interactive Dashboards)

Streamlit is one of the easiest ways to build a Python-based dashboard without frontend experience. Just write a script, and Streamlit handles the UI.

How the Data Flows

  1. Airflow pulls cryptocurrency data from the CoinGecko API at scheduled intervals.
  2. Neon stores this data efficiently for querying.
  3. Streamlit fetches the stored data and visualizes it in an interactive dashboard.
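The three steps above can be sketched end to end. For illustration only, this uses the standard library's sqlite3 in place of Neon Postgres and a canned API response in place of a live CoinGecko call; the table and variable names are made up:

```python
import sqlite3

# Step 1 (stand-in): pretend these rows came from CoinGecko's OHLC endpoint.
# Each row is [timestamp_ms, open, high, low, close].
api_rows = [
    [1700000000000, 35000.0, 35500.0, 34800.0, 35200.0],
    [1700003600000, 35200.0, 35600.0, 35100.0, 35400.0],
]

# Step 2: store the rows (sqlite3 standing in for the Neon table).
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE ohlc (ts INTEGER PRIMARY KEY, open REAL, high REAL, low REAL, close REAL)"
)
conn.executemany("INSERT INTO ohlc VALUES (?, ?, ?, ?, ?)", api_rows)

# Step 3: query back what a dashboard would plot.
latest = conn.execute("SELECT ts, close FROM ohlc ORDER BY ts DESC LIMIT 1").fetchone()
print(latest)  # the most recent close price
```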

The heart of the ETL job is a function that retrieves OHLC (Open, High, Low, Close) data for Bitcoin from CoinGecko.
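A minimal sketch of such a fetcher, using only the standard library and assuming CoinGecko's public `/coins/{id}/ohlc` endpoint, which returns rows of `[timestamp_ms, open, high, low, close]` (the function names here are mine, not the kit's):

```python
import json
from urllib.request import urlopen

def fetch_bitcoin_ohlc(days: int = 1) -> list[list[float]]:
    """Fetch raw OHLC candles from CoinGecko: [[ts_ms, open, high, low, close], ...]."""
    url = f"https://api.coingecko.com/api/v3/coins/bitcoin/ohlc?vs_currency=usd&days={days}"
    with urlopen(url, timeout=10) as resp:
        return json.load(resp)

def to_records(rows: list[list[float]]) -> list[dict]:
    """Turn raw candle rows into dicts ready for insertion into the database."""
    keys = ("ts", "open", "high", "low", "close")
    return [dict(zip(keys, row)) for row in rows]
```

An Airflow task would call `fetch_bitcoin_ohlc` on a schedule and write the output of `to_records` into Neon.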

Project Structure

I wanted a modular structure that keeps things clean:

```
├── astronomer/    # Airflow DAGs for ETL jobs
│   ├── dags/
│   ├── Dockerfile
│   ├── requirements.txt
├── frontend/      # Streamlit app code
│   ├── app.py
│   ├── Dockerfile
│   ├── requirements.txt
├── .pre-commit-config.yaml
├── compose.yaml
├── README.md
```
  • astronomer/ → Contains Airflow DAGs for data ingestion.
  • frontend/ → Houses the Streamlit app for visualization.
  • Docker support → Everything is containerized for easy deployment.

Getting Started: Run the Project Locally

1. Clone the Repository

```
git clone https://github.com/olgazju/data_analytics_dashboard_starter_kit.git
cd data_analytics_dashboard_starter_kit
```

2. Set Up a Python Virtual Environment

```
brew install pyenv pyenv-virtualenv
pyenv install 3.12.0
pyenv virtualenv 3.12.0 da_kit
pyenv local da_kit
```

3. Run with Docker

Ensure Docker is installed, then run:

```
docker-compose up --build
```

The dashboard will be available at http://localhost:8501.

Deploying the Dashboard

Deploy Airflow DAGs

Navigate to the astronomer/ folder and deploy the DAGs:

```
astro deploy
```

Deploy the Streamlit App

Use Streamlit Cloud to host the app. Link your GitHub repo, and Streamlit will take care of the deployment.

Next Steps

If you’re interested in data analytics and want a quick way to get started, try out the Data Analytics Dashboard Starter Kit. Fork the repo, experiment with new data sources, and let me know what you build.

Would love to hear your thoughts—feel free to share feedback or suggestions.
