How to Retrieve and Export AWS Cost Data by Service Using Python

Dmitry Romanoff

When managing cloud infrastructure, one of the most important tasks is to keep track of costs across different AWS services. AWS provides a powerful service called Cost Explorer, which helps you view detailed cost data. But what if you want to automate the process of retrieving and exporting that data for further analysis?

In this article, we will walk through a Python script that uses the AWS SDK (boto3) to interact with AWS Cost Explorer, fetch cost data for specific AWS services, and export the results to a CSV file.

Prerequisites

Before we dive into the code, ensure you have the following:

  1. Python installed on your machine: version 3.6 or higher.
  2. AWS CLI configured: you need AWS credentials (an access key and secret key) configured in your environment. Use the AWS CLI to set this up:
   aws configure
  3. The boto3 library installed: this is the AWS SDK for Python. You can install it via pip:
   pip install boto3
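
If you want to confirm that boto3 can see your credentials before running anything else, a quick sanity check is to call STS GetCallerIdentity, which works with any valid credentials:

import boto3

# Prints the account ID and identity ARN of the configured credentials.
# If this raises an error, boto3 cannot find valid credentials.
identity = boto3.client('sts').get_caller_identity()
print(identity['Account'], identity['Arn'])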

The Goal

We want to retrieve AWS cost data for February 2025, focusing on the AmortizedCost for each service (excluding support, credits, and refunds). Then, we will export this data into a CSV file, which can be analyzed further or visualized.

The Code

Let's break down the Python code that achieves this goal.

import boto3
import csv
from datetime import datetime

# Initialize the AWS Cost Explorer client
client = boto3.client('ce', region_name='us-east-1')

# Define the date range for February 2025.
# Note: the Cost Explorer API treats the End date as exclusive,
# so use the first day of the following month to cover the full month.
start_date = '2025-02-01'
end_date = '2025-03-01'

# Fetch cost data grouped by service and export it to a CSV file
def get_aws_cost_and_usage(start_date, end_date):
    # Prepare the parameters
    response = client.get_cost_and_usage(
        TimePeriod={
            'Start': start_date,
            'End': end_date
        },
        Granularity='MONTHLY',
        Metrics=['AmortizedCost'],  # Use 'AmortizedCost' or 'BlendedCost' if necessary
        GroupBy=[{'Type': 'DIMENSION', 'Key': 'SERVICE'}]  # Group by service
    )

    # Create a list to hold service and cost data
    service_costs = []

    # Extract and store service-wise costs
    for result in response['ResultsByTime']:
        for group in result['Groups']:
            service = group['Keys'][0]
            cost = float(group['Metrics']['AmortizedCost']['Amount'])

            # Exclude support, credit, and refund line items.
            # Service names from Cost Explorer vary (e.g. "AWS Support (Business)"),
            # so match on keywords rather than exact names.
            if any(keyword in service.lower() for keyword in ('support', 'credit', 'refund')):
                continue  # Skip these line items

            # Exclude negative costs (credits or refunds)
            if cost < 0:
                continue  # Skip if cost is negative (i.e., credit or refund)

            # Round the cost to 2 decimal places
            cost = round(cost, 2)
            service_costs.append((service, cost))

    # Sort the list by cost in descending order
    service_costs.sort(key=lambda x: x[1], reverse=True)

    # Write the sorted services and their costs to a CSV file
    with open('cost_report.csv', 'w', newline='') as file:
        writer = csv.writer(file)
        writer.writerow(['Service', 'Cost'])  # Write the header
        for service, cost in service_costs:
            writer.writerow([service, cost])  # Write each service and its cost

# Run the function to get costs for February 2025 and generate the CSV report
get_aws_cost_and_usage(start_date, end_date)

How It Works

  1. Setting Up the AWS Cost Explorer Client:

    The first step in the script is to initialize the AWS Cost Explorer client using boto3.client('ce'). This client lets us interact with the AWS Cost Explorer API and fetch cost data. The client is created in us-east-1 because the Cost Explorer API is served from that region's endpoint.

  2. Define Date Range:

    We set the date range to February 2025 (start_date = '2025-02-01' and end_date = '2025-03-01'). The End date is exclusive in the Cost Explorer API, so using the first day of the next month captures the whole month. You can adjust these dates as needed; a sketch for computing them programmatically follows this list.

  3. API Call to Get Cost and Usage:

    The get_aws_cost_and_usage function calls the get_cost_and_usage API. The parameters passed here define the granularity (monthly), the metric to be fetched (AmortizedCost), and the grouping (by service). For accounts with many services, the response may be paginated; see the pagination sketch after this list.

  4. Processing the Data:

    The response from the AWS Cost Explorer API contains detailed information about each service and its costs. The script walks the grouped results, filters out support, credit, and refund line items by keyword, and skips any negative amounts (which typically indicate refunds or credits).

  5. Sorting the Data:

    After filtering, the services are sorted by cost in descending order to highlight the highest-cost services.

  6. Exporting to CSV:

    The sorted cost data is written to a CSV file (cost_report.csv). The CSV file will have two columns: Service and Cost. This file can be easily opened in spreadsheet applications like Excel for further analysis.
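
For step 2, rather than hard-coding the dates, you can compute the start and exclusive end of any target month with the standard library. This is a minimal sketch; month_range is a helper name introduced here for illustration:

from datetime import date

def month_range(year, month):
    """Return (start, end) ISO date strings for one month; end is exclusive."""
    start = date(year, month, 1)
    # Roll over to the first day of the following month
    end = date(year + 1, 1, 1) if month == 12 else date(year, month + 1, 1)
    return start.isoformat(), end.isoformat()

start_date, end_date = month_range(2025, 2)  # ('2025-02-01', '2025-03-01')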
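
And for step 3, get_cost_and_usage returns a NextPageToken when there is more data than fits in one response. A single month grouped by service usually fits in one page, but a paginated loop is the safer pattern; here is a minimal sketch (get_all_results is a helper name introduced for illustration):

def get_all_results(client, **params):
    """Collect every ResultsByTime entry across all pages of get_cost_and_usage."""
    results = []
    token = None
    while True:
        if token:
            params['NextPageToken'] = token
        response = client.get_cost_and_usage(**params)
        results.extend(response['ResultsByTime'])
        token = response.get('NextPageToken')
        if not token:
            break  # No more pages
    return results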

Running the Script

To run this script, simply execute it from the command line or any Python environment. After running, you will find a file called cost_report.csv in the same directory as your script.

python fetch_aws_costs.py

This will generate a CSV report with the services and their corresponding amortized costs for February 2025, sorted from the highest to the lowest cost.
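
If you just want a quick look at the biggest line items without opening a spreadsheet, a small follow-up snippet can print the top five rows of the report (the file is already sorted by cost, descending):

import csv

# Print the five most expensive services from the generated report
with open('cost_report.csv', newline='') as file:
    reader = csv.reader(file)
    next(reader)  # Skip the header row
    for service, cost in list(reader)[:5]:
        print(f'{service}: ${cost}')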

Conclusion

This script demonstrates how you can automate the process of fetching and exporting AWS cost data for specific services using Python and AWS Cost Explorer. By filtering out unwanted services like AWS Support and credits, and organizing the data by cost, you can easily track and analyze your AWS spending.

Feel free to modify this script to suit your specific needs, such as adjusting the date range, changing the granularity, or adding additional filters. This approach is powerful for managing cloud costs and can save you significant time if you need to pull detailed reports regularly.
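
For example, instead of filtering credits and refunds in Python, Cost Explorer can exclude them server-side through the Filter parameter on the RECORD_TYPE dimension. The sketch below reuses the client and dates from the script above and also switches to daily granularity as a second illustrative tweak; treat the exact RECORD_TYPE values as an assumption to verify against your account's data:

# Sketch: daily granularity, with credits and refunds
# excluded server-side via a RECORD_TYPE filter
response = client.get_cost_and_usage(
    TimePeriod={'Start': start_date, 'End': end_date},
    Granularity='DAILY',
    Metrics=['AmortizedCost'],
    GroupBy=[{'Type': 'DIMENSION', 'Key': 'SERVICE'}],
    Filter={
        'Not': {
            'Dimensions': {
                'Key': 'RECORD_TYPE',
                'Values': ['Credit', 'Refund']
            }
        }
    }
)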

If you're new to boto3 or the AWS Cost Explorer API, I recommend checking out the official AWS documentation to dive deeper into the available features.
