Serge Artishev

Azure SQL to Storage Account Export Tool

In the world of cloud-based data management, efficiently moving data in and out of Azure SQL databases is a common challenge. Whether you're performing backups, migrating data, or setting up test environments, having reliable tools for exporting and importing data is crucial. Today, I'm excited to share a set of bash scripts I've developed to streamline this process: the Azure SQL Data Export and Import Tools.

The Problem

Working with Azure SQL databases often requires exporting large amounts of data for backup or analysis, and then importing that data back into the same or different databases. While Azure provides various methods for these tasks, they can sometimes be cumbersome or require multiple steps. I needed a solution that was:

  1. Scriptable and easy to automate
  2. Flexible enough to handle different table structures
  3. Capable of compressing data for efficient storage and transfer
  4. Able to generate DDL scripts for table recreation

The Solution

To address these needs, I created two main scripts:

  1. export_to_tsv.sh: Exports data from Azure SQL tables to TSV (Tab-Separated Values) files, optionally compresses them, and uploads them to Azure Blob Storage.
  2. import_from_tsv.sh: Downloads TSV files from Azure Blob Storage, decompresses them if necessary, and imports the data into Azure SQL tables.

Let's dive into the key features and how to use these tools.

Export Tool (export_to_tsv.sh)

This script exports data from specified Azure SQL tables to TSV files, with options to compress the files and generate DDL scripts for table recreation.


Key Features:

  • Exports multiple tables in a single run
  • Compresses TSV files using gzip (optional)
  • Generates DDL scripts for table recreation (optional)
  • Uploads exported files to Azure Blob Storage
  • Supports overwriting existing blobs in Azure Storage (optional)
  • Provides detailed logging and error reporting

Usage:

./export_to_tsv.sh --schema <schema_name> --tables <table1,table2,...> --container <container-name> --path <blob-path> [--delimiter <delimiter>] [--compress] [--generate-ddl] [--overwrite] [--dry-run]

Example:

./export_to_tsv.sh --schema "archive" --tables "logs-2023-08-25" --container "sqlbackup" --path "archive/logs" --compress --generate-ddl --overwrite

This command exports the logs-2023-08-25 table from the archive schema, compresses it, generates a DDL script, and uploads both files to the specified Azure Blob Storage location.
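
One note on configuration: connection details aren't passed as flags but come from environment variables. The SOURCE_* names below are the ones used in the bcp snippet later in this post; the storage account variable is an assumption on my part, so adjust it to your setup:

# Connection settings read by the export script. The SOURCE_* names match the
# bcp call shown later in this post; AZURE_STORAGE_ACCOUNT is assumed here
# (the az storage commands read it for account context).
export SOURCE_AZURE_SQL_SERVER="myserver.database.windows.net"
export SOURCE_AZURE_SQL_DATABASE="mydatabase"
export SOURCE_AZURE_SQL_USERNAME="export_user"
export SOURCE_AZURE_SQL_PASSWORD='s3cr3t-p@ss'
export AZURE_STORAGE_ACCOUNT="mystorageaccount"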

Import Tool (import_from_tsv.sh)

This script downloads TSV files from Azure Blob Storage, decompresses them if necessary, and imports the data into Azure SQL tables.


Key Features:

  • Imports multiple tables in a single run
  • Handles compressed (gzip) files
  • Provides detailed logging and error reporting
  • Supports dry-run mode for testing

Usage:

./import_from_tsv.sh --schema <schema_name> --tables <table1,table2,...> --container <container-name> --path <blob-path> [--delimiter <delimiter>] [--compressed] [--dry-run]

Example:

./import_from_tsv.sh --schema "archive" --tables "logs-2023-08-25" --container "sqlbackup" --path "archive/logs" --compressed

This command imports the logs-2023-08-25 table into the archive schema, downloading and decompressing the TSV file from the specified Azure Blob Storage location.
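
Conceptually, the import is just the export in reverse. Here's a simplified sketch of the per-table step (variable names are assumed to mirror the export script's conventions; the real script adds logging, error handling, and the --dry-run check):

# Simplified per-table import step (illustrative; variable names assumed).
az storage blob download --container-name "$CONTAINER" \
    --name "$BLOB_PATH/${table}.tsv.gz" --file "${table}.tsv.gz"
gunzip -f "${table}.tsv.gz"    # only when --compressed is set
bcp "[$SCHEMA].[$table]" in "${table}.tsv" -c -t "$DELIMITER" \
    -S "$AZURE_SQL_SERVER" -d "$AZURE_SQL_DATABASE" \
    -U "$AZURE_SQL_USERNAME" -P "$AZURE_SQL_PASSWORD"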

Behind the Scenes

Both scripts use a combination of Azure CLI commands, sqlcmd, and bcp utilities to interact with Azure services and perform data operations. They also implement robust error handling and logging to ensure smooth operation and easy troubleshooting.
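
The error handling boils down to a simple pattern: a small log helper, plus a check on each command's exit status so one failing table doesn't abort the whole run. A minimal sketch of that pattern (illustrative, not the scripts' literal code; export_table stands in for the real export logic):

# Minimal logging/error-handling pattern; export_table is a hypothetical helper.
log() { echo "[$(date '+%Y-%m-%d %H:%M:%S')] $*" >&2; }

failed=()
for table in "${TABLES[@]}"; do
    log "Exporting [$SCHEMA].[$table]..."
    if ! export_table "$table"; then
        log "ERROR: export of [$SCHEMA].[$table] failed, continuing"
        failed+=("$table")
    fi
done
log "Finished with ${#failed[@]} failure(s)."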

Here's a snippet from the export script that demonstrates how we're using bcp to export data:

bcp "SELECT * FROM [$SCHEMA].[$table]" queryout "$tsv_file" -c -t "$DELIMITER" \
    -S "$SOURCE_AZURE_SQL_SERVER" -d "$SOURCE_AZURE_SQL_DATABASE" \
    -U "$SOURCE_AZURE_SQL_USERNAME" -P "$SOURCE_AZURE_SQL_PASSWORD"
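
Compression and upload are then just two more commands. Roughly (the az flags shown are real, though the script's exact invocation may differ):

gzip -f "$tsv_file"    # when --compress is set; produces ${tsv_file}.gz
az storage blob upload \
    --container-name "$CONTAINER" \
    --name "$BLOB_PATH/$(basename "$tsv_file").gz" \
    --file "${tsv_file}.gz" \
    --overwrite    # only added when --overwrite is passed to the script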

And here's how we're using sqlcmd to generate DDL scripts:

sqlcmd -S "$SOURCE_AZURE_SQL_SERVER" -d "$SOURCE_AZURE_SQL_DATABASE" -U "$SOURCE_AZURE_SQL_USERNAME" -P "$SOURCE_AZURE_SQL_PASSWORD" -Q "
SET NOCOUNT ON;
DECLARE @TableName NVARCHAR(128) = '$table';
DECLARE @SchemaName NVARCHAR(128) = '$SCHEMA';
DECLARE @SQL NVARCHAR(MAX) = '';

-- Generate CREATE TABLE statement
SELECT @SQL = 'CREATE TABLE [' + @SchemaName + '].[' + @TableName + '] ('

-- ... (rest of the DDL generation logic)

PRINT @SQL
" -o "$ddl_file"
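
The elided portion builds the column list. One way to flesh it out is with INFORMATION_SCHEMA.COLUMNS; this is a sketch of the idea rather than the script's actual logic, and it ignores edge cases like IDENTITY columns, defaults, and constraints:

-- Append one line per column (sketch; ignores IDENTITY, defaults, constraints)
SELECT @SQL = @SQL + CHAR(10) + '    [' + COLUMN_NAME + '] ' + UPPER(DATA_TYPE)
    + CASE WHEN DATA_TYPE IN ('char', 'nchar', 'varchar', 'nvarchar', 'varbinary')
           THEN '(' + CASE WHEN CHARACTER_MAXIMUM_LENGTH = -1 THEN 'MAX'
                           ELSE CAST(CHARACTER_MAXIMUM_LENGTH AS NVARCHAR(10)) END + ')'
           ELSE '' END
    + CASE WHEN IS_NULLABLE = 'NO' THEN ' NOT NULL' ELSE ' NULL' END + ','
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_SCHEMA = @SchemaName AND TABLE_NAME = @TableName
ORDER BY ORDINAL_POSITION;

-- Trim the trailing comma and close the CREATE TABLE statement
SET @SQL = LEFT(@SQL, LEN(@SQL) - 1) + CHAR(10) + ');';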

Conclusion

These Azure SQL Data Export and Import Tools have significantly streamlined our data management processes. They provide a flexible, scriptable solution for moving data in and out of Azure SQL databases, with features like compression and DDL generation that save time and storage space.

While these scripts are tailored to our specific needs, they can serve as a starting point for others facing similar challenges. Feel free to adapt and extend them to fit your own requirements.

You can find the full source code and detailed documentation in my GitHub repository.

I hope you find these tools useful in your Azure SQL data management tasks. Happy data wrangling!
