SqlBuildManager

SQL Build Manager is an all-in-one tool for building and maintaining a package of SQL scripts to manage builds and updates to your SQL Server databases.

Leveraging Azure Batch for distributed builds

Set up your Batch Account

To use Azure Batch, you will need an Azure Batch account and must upload the SQL Build Manager code package zip file (either from a GitHub release or a custom build) to the account. This setup is a one-time event and can be done via scripts or manually via the Azure portal.

Option 1: PowerShell and ARM Template

Note: this method has the added benefit of also uploading the latest release zip file to your Batch account so it is ready to go.

  1. Prerequisite: Make sure you have the Azure PowerShell Modules installed
  2. Download the files in the templates folder
  3. Edit the azuredeploy.parameters.json file, giving your Azure Batch and Azure Storage account names (keep in mind the rules for naming storage accounts)
  4. Run the deploy_batch.ps1 file, providing values for:
    • -subscriptionId - Guid for the subscription to deploy to
    • -resourceGroupName - Resource group name to put the Batch and storage accounts into
    • -resourceGroupLocation - Azure region for the accounts. You can list the valid location values via PowerShell: Get-AzLocation | Select-Object -Property Location
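For step 3, a minimal azuredeploy.parameters.json might look like the sketch below. The parameter names shown are illustrative assumptions; match them to the parameters actually defined in the template you downloaded.

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "batchAccountName": { "value": "mybatchaccountname" },
    "storageAccountName": { "value": "mystorageaccountname" }
  }
}
```

Remember that storage account names must be 3-24 characters, lowercase letters and numbers only, and globally unique.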

Assuming the script succeeds, the last few lines will provide pre-populated arguments that you can save for your command-line execution (you can also retrieve this data later from the Azure portal):

Pre-populated command line arguments. Record these for use later: 

/BatchAccountName=mybatchaccountname
/BatchAccountUrl=https://mybatchaccountname.eastus.batch.azure.com
/BatchAccountKey=CLHvqpOqRW2X+z6Z2G/25zY9sQn/ePLMRknX1EbA79AJ74UVLV/7X1HqE91xV0UF24fPJYZfqDM/cfU6c1lPTA==
/StorageAccountName=mystorageaccountname
/StorageAccountKey=deDGkC2D3eOzI2BiVVmrxVpP1PPf7AdllA89HRYRAxD703iM/Me4D815aNYJTan8xiRypmfQ7QxCnZhM7QlYog==

Option 2: Direct deployment

Use the “Deploy to Azure” button to deploy the template via the Azure portal. Once the resources are created, you will need to collect the account information from them, the same as in step 7 below.

Option 3: Manually via Azure Portal

  1. Log in to the Azure Portal at https://portal.azure.com
  2. Click “Create a Resource”
  3. Search for “Batch Service”
  4. On the information page, select “Create”
  5. Fill out the new batch and storage account information and click “Create”
  6. Wait for the Azure Batch account to be provisioned (should only take a few minutes)
  7. Collect Azure Batch and storage account information
    • Create a new text document to capture the information you will collect
    • In the Azure Portal, navigate to your new Batch account
    • On the Keys blade, record the Batch account name, URL, and Primary access key values
    • On the Storage Account blade, record the Storage Account Name and Key1 values

Upload SQL Build Manager code package

  8. First, make sure you have a build of SQL Build Manager, either from a GitHub Release or built locally (clone/pull from GitHub and build; the executables will be in the root bin/debug or bin/release folder)
  9. Zip up all of the files in the build folder - or grab the latest release zip file from here
  10. In the Azure Portal, navigate to your Azure Batch account and click the “Applications” blade.
  11. Click the “+ Add” link
  12. Fill in the Application Id with “sqlbuildmanager” (no quotes) and the version field (can be any alpha-numeric)
  13. Use the folder icon to select your zip file that contains the compiled binaries
  14. Click the “OK” button - this will upload your zip package and it will now show up in the Application list

Running a Batch execution

(for a full end-to-end example, see this document)

Azure Batch builds are started locally via SqlBuildManager.Console.exe. This process communicates with the Azure Storage account and Azure Batch account to execute in parallel across the pool of Batch compute nodes. The number of nodes that are provisioned is determined by your command line arguments.

Pre-stage the Azure Batch pool VMs

Execute SqlBuildManager.Console.exe with the /Action=BatchPreStage directive. This will create the desired number of Azure Batch VMs ahead of time (NOTE: it can take 10-20 minutes for the VMs to be provisioned and ready). See the argument details here

Execute batch build

Execute SqlBuildManager.Console.exe with the /Action=batch directive. See the argument details here

This will start the following process:

  1. The provided command line arguments are validated for completeness
  2. The target database list is split into pieces for distribution to the compute nodes
  3. The resource files are uploaded to the Storage account
  4. The workload tasks are sent to each Batch node
  5. The local executable polls for node status, waiting for each to complete
  6. Once complete, the aggregate return code is used as the exit code for SqlBuildManager.Console.exe
  7. The log files for each of the nodes are uploaded to the Storage account associated with the Batch account
  8. A SAS token URL granting read-only access to the log files is included in the console output. You can also view these files via the Azure portal or the Azure Batch Explorer
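Step 2 above - splitting the target database list for distribution to the compute nodes - can be sketched roughly as follows. This is a simplified illustration, not the tool's actual implementation; the round-robin strategy and the `server:database` target format are assumptions.

```python
def split_targets(databases, node_count):
    """Distribute a flat list of target databases across Batch compute
    nodes round-robin, so each node receives a near-equal share."""
    buckets = [[] for _ in range(node_count)]
    for i, db in enumerate(databases):
        buckets[i % node_count].append(db)
    return buckets

# Example: 7 hypothetical targets spread across 3 nodes
targets = [f"server1:db{n}" for n in range(7)]
chunks = split_targets(targets, 3)
```

Each chunk would then be handed to one node as its task workload, which is why a larger /BatchNodeCount yields smaller per-node batches.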

Inspect logs if an issue is reported

If there is an issue with the execution - either with the SQL updates or with the program itself - logs will be created. See the log details to learn what files to expect.

Cleanup post build

  1. Execute SqlBuildManager.Console.exe with the /Action=BatchCleanup directive. This will delete the Azure Batch VMs so you are no longer charged for the compute. See the argument details here. NOTE: this will not delete the log files; they are generally needed longer term and will stay in the storage account

Alternative run options

If you prefer a one-step execution, you can create and delete the pool VMs in-line with your run. To do this, use the /Action=Batch action argument along with the additional arguments to create and delete the pool.

Examples

The following command contains all of the required arguments to run a Batch job:

SqlBuildManager.Console.exe /Action=Batch /override="C:\temp\override.cfg" /PackageName=c:\temp\mybuild.sbm /username=myname /password=P@ssw0rd! /DeleteBatchPool=false /BatchNodeCount=5 /BatchVmSize=STANDARD_DS1_V2 /BatchAccountName=mybatch /BatchAccountUrl=https://mybatch.eastus.batch.azure.com /BatchAccountKey=x1hGLIIrdd3rroqXpfc2QXubzzCYOAtrNf23d3dCtOL9cQ+WV6r/raNrsAdV7xTaAyNGsEagbF0VhsaOTxk6A== /StorageAccountName=mystorage /StorageAccountKey=lt2e2dr7JYVnaswZJiv1J5g8v2ser20B0pcO0PacPaVl33AAsuT2zlxaobdQuqs0GHr8+CtlE6DUi0AH+oUIeg==

The following command line uses a generated DACPAC and assumes that the Batch, Storage and password settings are in the /SettingsFile:

SqlBuildManager.Console.exe /Action=batch /SettingsFile="C:\temp\my_settings.json" /override="C:\temp\override.cfg" /PlatinumDbSource="platinumDb" /PlatinumServerSource="platinumdbserver" /database=targetDb /server="targetdbserver"

Azure Batch - Pre-Stage Batch nodes (/Action=BatchPreStage)

Note: You can also leverage the SettingsFile option to reuse most of the arguments


Azure Batch Execution (/Action=Batch)

In addition to the authentication and runtime arguments above, these are specifically needed for Azure Batch executions. Note:

  1. You can also leverage the SettingsFile option to reuse most of the arguments
  2. Either /PlatinumDacpac or /PackageName is required. If both are given, /PackageName will be used.

Additional arguments

If you don’t run the /Action=BatchPreStage and /Action=BatchCleanup command sequence, you will need to use the following:


Azure Batch Clean Up (delete) nodes (/Action=BatchCleanUp)

Note: You can also leverage the SettingsFile option to reuse most of the arguments


Log Details

The logs will be stored in the Azure Storage account associated with the Batch account. You can view the logs in several ways, including the Azure portal, Azure Batch Explorer, and Azure Storage Explorer. However you view the logs, you will find a set of files and folders:

Files

Note: All of these files are consolidated logs from across all of your Batch nodes.

Folders

Troubleshooting tips

If you have SQL errors in your execution, you will probably want to figure out what happened. Here is a suggested troubleshooting path:

  1. Open up the failuredatabases.cfg file to see what databases had problems
  2. Taking note of the server and database name, open the server folder then the database folder
  3. Open the logfile in the database folder. This file should contain an error message that will guide your troubleshooting should you need to correct some scripts
  4. Once you have determined the problem, use the failuredatabases.cfg file as your /Override argument to run your updates again - hopefully successfully this time!
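Before re-running with failuredatabases.cfg as your /Override argument, it can help to skim the failed targets programmatically. The sketch below assumes a simple one-target-per-line `server:database` layout, which is an assumption - check your actual failuredatabases.cfg for the exact format.

```python
def read_failed_targets(cfg_text):
    """Parse override-style lines into (server, database) pairs.
    The 'server:database' per-line layout is an assumed format."""
    pairs = []
    for line in cfg_text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):  # skip blanks and comments
            continue
        server, _, database = line.partition(":")
        pairs.append((server, database))
    return pairs

# Hypothetical failuredatabases.cfg contents
sample = """myserver1:salesdb
myserver2:inventorydb
"""
failed = read_failed_targets(sample)
```

A quick count of the pairs tells you how many targets the retry run will touch before you kick it off.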