
Pedro Correia

How I automated the execution of database scripts on Azure Pipelines

Introduction

As the title suggests, this post is about the time I needed to create a pipeline that automatically deploys every change to the database schema: adding or altering tables and columns, stored procedures, and so on.

My Use Case

The use case is a project built with Azure SQL Server, Dapper, and .NET 7, where each client has its own instance of the database. Within the .NET solution there's a Database project responsible for versioning all the database changes, and that part works fine. The issue arises when deploying a new schema change, because it had to be done manually using Schema Compare (a feature of Database projects in the Visual Studio IDE). This manual process became time-consuming as the number of clients increased, requiring more schema comparisons and more manual control. That is the reason for automating the process. So how did I automate it?

(Screenshot: the Database project in the solution)
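
For context, the manual step being replaced is, in effect, what SqlPackage's Publish action does from the command line: push the compiled DACPAC's schema to a target database. A rough sketch of that command, where the DACPAC path, server, database, and credentials are all placeholders rather than values from my project:

SqlPackage /Action:Publish ^
  /SourceFile:"Database\bin\Release\Database.dacpac" ^
  /TargetServerName:"client-a.database.windows.net" ^
  /TargetDatabaseName:"ClientA_Db" ^
  /TargetUser:"deploy_user" ^
  /TargetPassword:"<secret>"

Running a publish like this once per client, for every schema change, is exactly the repetitive work the pipeline below takes over.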

What solution did I apply?

To begin with, I created a new Azure repository specifically to store the Database project, allowing me to subsequently create a dedicated pipeline for it.


I created a YAML pipeline with the following instructions: it builds the database project and publishes the artifact to the drop folder. That artifact is then used for the actual deployment to Azure SQL Database.

# Run the pipeline when changes land on any of these branches
trigger:
- main
- stage
- development

pool:
  vmImage: windows-latest

stages:

- stage: 'ContinuousIntegration'
  displayName: 'Continuous Integration'
  jobs:
    - job: BuildPublishArtifacts
      displayName: 'Build & Publish Solution Artifacts'
      steps:
        - task: MSBuild@1
          displayName: 'Building Solution'
          inputs:
            solution: '**/*.sln'
            clean: true
            configuration: 'Release'
            platform: 'Any CPU'

        - task: CopyFiles@1
          displayName: 'Copying Database Artifacts to Drop Folder'
          inputs:
            # The Database project's Release output contains the generated .dacpac
            SourceFolder: '$(Agent.BuildDirectory)\s\Database\bin\Release'
            Contents: '**'
            TargetFolder: '$(Build.ArtifactStagingDirectory)\DBArtifact'

        - task: PublishBuildArtifacts@1
          displayName: 'Publishing Solution Artifact'
          inputs:
            PathtoPublish: '$(Build.ArtifactStagingDirectory)'
            ArtifactName: 'drop'
            publishLocation: Container

Finally, the last step was to create a release, point it at the drop artifact produced by the pipeline (which contains the compiled DACPAC in the DBArtifact folder), and add a task that deploys the DACPAC.

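I built this part in the classic release editor, so there's no YAML for it in the repository. For reference, the equivalent step expressed in pipeline YAML would look roughly like the sketch below, using the built-in SqlAzureDacpacDeployment task; the service connection, server, database, and variable names are placeholders, not values from my project:

- task: SqlAzureDacpacDeployment@1
  displayName: 'Deploy DACPAC to Azure SQL'
  inputs:
    azureSubscription: 'MyAzureServiceConnection'  # placeholder service connection
    AuthenticationType: 'server'
    ServerName: 'client-a.database.windows.net'    # placeholder server
    DatabaseName: 'ClientA_Db'                     # placeholder database
    SqlUsername: '$(SqlUser)'                      # assumed to be stored as pipeline secrets
    SqlPassword: '$(SqlPassword)'
    deployType: 'DacpacTask'
    DeploymentAction: 'Publish'
    DacpacFile: '$(System.DefaultWorkingDirectory)/**/DBArtifact/*.dacpac'

In a multi-client setup like mine, one such task (or release stage) per client database, each with its own server and database values, lets every instance be updated from the same drop artifact.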

As you can see, this isn't a guide; instead, I'm sharing a use case that I encountered during my work and explaining why I chose to automate this process.

Have you ever automated something like this?
