Deploying Database Projects with Octopus Deploy

Are you interested in automating your deployment process? Leave a comment and I would be more than happy to help you along the path; you won’t look back!

At DrDoctor we are currently undergoing a massive initiative to automate our deployment process. We have made excellent progress over the last couple of months, and as it stands we can confidently and effortlessly release new versions of the various software components of the system (IIS websites, Windows services).

My colleague Steve has written about the stack which we are using to automate; at the core of it are TeamCity and Octopus Deploy.

From the beginning of this initiative there has always been an elephant in the room, a task that no one has wanted to tackle nor known how to. You can probably guess from the title of this post that I’m talking about database deployments.

At the moment the database upgrade part of our process looks like this:

  1. Build the SQL Server Database Project in Visual Studio
  2. Find the output directory of the SQL project
  3. Connect to the remote server
  4. Copy the DacPac file to the remote server
  5. Look up syntax for SqlPackage.exe
  6. Generate update script (using SqlPackage.exe)
  7. Look over script in management studio
  8. Run script

As you can see there are a lot of steps and one needs to remember various things, like the syntax for SqlPackage (I’ve just about memorised it now).

We are using Octopus Deploy as our deployment system, it works by copying NuGet packages to remote servers and then uses PowerShell scripts to install and configure. We also use TeamCity as our build server.

The guys at Octopus Deploy have a handy NuGet package (OctoPack) which you install into the projects you want Octopus to deploy; during the Build stage in TeamCity it will then turn these projects into NuGet packages, ready to be deployed.

Unfortunately there is no way to install NuGet packages into SQL Server Database Project types (not that I could find anyway). There are, however, two ways of turning your database project into a NuGet package on TeamCity.

The first is to manually edit the project file to add the targets for OctoPack; the other method is to use a NuSpec file and the NuGet Pack build step in TeamCity. I went with the first option: the following snippet needs to be pasted at the very bottom of the .sqlproj file, right before the </Project> tag.

<Import Project="..\..\..\packages\OctoPack.3.0.31\tools\OctoPack.targets" Condition="Exists('..\..\..\packages\OctoPack.3.0.31\tools\OctoPack.targets')" />
<Target Name="EnsureOctoPackImported" BeforeTargets="BeforeBuild" Condition="'$(OctoPackImported)' == ''">
  <Error Condition="!Exists('..\..\..\packages\OctoPack.3.0.31\tools\OctoPack.targets') And ('$(RunOctoPack)' != '' And $(RunOctoPack))" Text="You are trying to build with OctoPack, but the NuGet targets file that OctoPack depends on is not available on this computer. This is probably because the OctoPack package has not been committed to source control, or NuGet Package Restore is not enabled. Please enable NuGet Package Restore to download them. For more information, see" HelpKeyword="BCLBUILD2001" />
  <Error Condition="Exists('..\..\..\packages\OctoPack.3.0.31\tools\OctoPack.targets') And ('$(RunOctoPack)' != '' And $(RunOctoPack))" Text="OctoPack cannot be run because NuGet packages were restored prior to the build running, and the targets file was unavailable when the build started. Please build the project again to include these packages in the build. You may also need to make sure that your build server does not delete packages prior to each build. For more information, see" HelpKeyword="BCLBUILD2002" />

NB: You will need to make sure that the relative path to OctoPack is correct.

The obvious downside is that when a new version of OctoPack comes out I’ll have to manually update this path to point to the new version.

Now TeamCity will turn this into a NuGet package; at this point steps 1-4 are automated.
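
For the package to actually be produced, the build needs to run with OctoPack enabled. Assuming a standard TeamCity MSBuild or Visual Studio build step, one way to do this is to pass something like the following in the step’s command line parameters (the version property is optional and %build.number% is just one choice of version source):

/p:RunOctoPack=true /p:OctoPackPackageVersion=%build.number%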

Automating steps 5 and 6 requires writing a PowerShell script which will call SqlPackage.exe. The following script has the required parameters filled in during deployment by Octopus Deploy and uses them to execute SqlPackage.exe:

$dbName = $OctopusParameters['DbName']
$user = $OctopusParameters['DbUser']
$pwd = $OctopusParameters['DbPassword']
$server = $OctopusParameters['DbServerName']

Write-Host DbName: $dbName
Write-Host User: $user
Write-Host Server: $server

& 'C:\Program Files (x86)\Microsoft SQL Server\110\DAC\bin\SqlPackage.exe' /Action:Script /OutputPath:Upgrade.sql /SourceFile:National.dacpac /TargetServerName:$server /TargetDatabaseName:$dbName /p:IncludeCompositeObjects=true /TargetUser:$user /TargetPassword:$pwd
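
One way to have this run automatically is to include it in the NuGet package as Deploy.ps1; by convention Octopus Deploy executes a script with that name during deployment, though your setup may differ.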

It used to take about 30 minutes on average to do all this and was a massive pain just to deploy a relatively minor change. Now it takes between 5 and 10 minutes, depending on the number of changes being applied. We now have a very simple process to follow.


Octopus Deploy doing the deployment of the database project and generating the upgrade script


The upgrade.sql script generated by SqlPackage.exe


Nancy Testing – Razor View with Referenced Models

For the last few weeks I’ve been working with Nancy (an awesome web framework for .NET); the core goal of the framework is the super-duper-happy path:

The goal of the framework is to stay out of the way as much as possible and provide a super-duper-happy-path to all interactions.

This means that everything in Nancy is setup to have sensible defaults and conventions, instead of making you jump through hoops and go through configuration hell just to get up and running. With Nancy you can go from zero to website in a matter of minutes. Literally.

From the experience that I’ve had with Nancy to date, I would say it easily lives up to this goal for both development and testing. It is very easy to get started writing tests and most of what you need to know is covered in the docs.

However, on Friday I started experiencing a problem with some of our existing tests which check that the Razor view has been rendered correctly – by counting the number of rows in a <table>.

The cause of the failing test was that I changed the ViewModel to include a List of objects which live outside of the Web App assembly. To get my Nancy project to work when deployed I had to update the web.config file to include a list of assemblies for Nancy to reference when rendering the Razor views.

This morning I came back to this problem afresh and found the solution 😀 Reading through the Razor View Engine docs on GitHub, there is mention of the IRazorConfiguration interface:

You can specify assemblies and default namespaces that Razor needs to use whilst compiling the views by bootstrapping your own IRazorConfiguration implementation, thus removing the need to add the @using statements to each view. This step is totally optional if you don’t require additional references or namespaces in your view.

After trying various things I came across two different ways to get my tests to pass:

1. Update the App.Config file

This is the simple and easy solution; unfortunately I discovered it after the harder one below. On Friday I made the mistake of updating the web.config file in my Tests project, and my tests kept failing – no surprise there.

What I should have done was update my App.Config file; after I did this all the tests passed.

Add the following section to the configSections group:

<section name="razor" type="Nancy.ViewEngines.Razor.RazorConfigurationSection, Nancy.ViewEngines.Razor" />

Then add the following razor section to the end of the config file, just before </configuration>:

<razor>
  <assemblies>
    <add assembly="Domain" />
    <add assembly="Some.OtherAssembly" />
  </assemblies>
</razor>

With that simple change my view tests passed. I guess the web.config file was added when I added the Nancy NuGet package.

2. Implement the IRazorConfiguration interface

The slightly harder solution is to implement the IRazorConfiguration interface and reference this when setting up the ConfigurableBootstrapper – the advantage of this is that you have finer control over what is going on; the downside is that you will need to include it everywhere you are using the ConfigurableBootstrapper.

You will need to create a RazorConfigurationTestingStub which implements IRazorConfiguration; mine looked like this:

internal class RazorConfigurationTestingStub : IRazorConfiguration
{
    public bool AutoIncludeModelNamespace
    {
        get { return true; }
    }

    public IEnumerable<string> GetAssemblyNames()
    {
        return new List<string>() { "Domain", "Some.OtherAssembly" };
    }

    public IEnumerable<string> GetDefaultNamespaces()
    {
        return Enumerable.Empty<string>();
    }
}

As you can see the list of Assembly names is the same list which was included in the App.Config file (in my Tests project) and the same list which is included in the web.config file (in my Nancy Web App project).

When setting up the ConfigurableBootstrapper you will need to include the stub as the implementation for the IRazorConfiguration dependency:

var bootstrapper = new ConfigurableBootstrapper(cfg =>
{
    cfg.ApplicationStartup((container, pipelines) =>
    {
        // application startup configuration elided
    });
    cfg.RootPathProvider(new TestRootPathProvider());
    cfg.Dependency<IRazorConfiguration>(new RazorConfigurationTestingStub());
});
After I did this all the tests passed.

Biml From the Trenches – Convention over Configuration

Biml, from what I have observed, is still in its infancy. Don’t get me wrong, what can actually be done with Biml is impressive; however, it seems to me that how we use it and how we talk about it still need to mature before it can be fully and effectively embraced by the community at large.

There is a growing number of people in the SQL Server community contributing excellent guidance and driving forward its maturity, as well as an ever-growing set of examples to learn from. I firmly believe that the future of Biml is bright; if you don’t believe me, check out this interview with its creator, Scott Currie.

What I would like to see more of is people talking about implementing design patterns in Biml – let’s have fewer examples and more theory.

An area where there is still little guidance is convention-based Biml.

Talk to any experienced software developer and they will tell you all about “Convention over Configuration” and how it makes their life easier, but if you’re a BI developer this might be a brand new concept to you.

Wikipedia has a great description:

Convention over configuration is a software design paradigm which seeks to decrease the number of decisions that developers need to make, gaining simplicity, but not necessarily losing flexibility.


The phrase essentially means a developer only needs to specify unconventional aspects of the application.

In my mind there are a few different types of conventions:

  1. Those that require strict adherence
  2. Those that can be overridden
  3. Those that are on or off by default

In this post I’m going to give two examples of the first type – those I class as “strict adherence required” – from a recent ETL project I was working on. I’ll cover the other types in subsequent posts.

NB: I’m deliberately not including any Biml or screenshots of SSIS packages in this post. The reason for this is that there are numerous ways to implement design patterns, so rather than showing you how I do it, I want to introduce the concepts and get you thinking about it.

Use a convention to Extract data

What I want to achieve: I want to move data from a source system into my target database, but only a subset of tables in the source system.

My convention: For each table in the Import schema of my Target database, there is a table with the same name and structure in the source system. Then, for each table in the Import schema, generate a Data Flow task which will read from the source table and copy the data into the target table.

The advantage: When I need to start importing more data from the source system, all I need to do is create a new table in my Import schema that matches the source and then regenerate the extract package. No configuration is needed and I don’t need to maintain a list of tables which need to be imported; the presence of the table in the Import schema is enough. Simple.

Use a convention to Transform data

The ETL project that I was working on already had a number of queries written which would transform the source system data to match our table structure. We had a number of tables in the Stage schema which the transformed data needed to go into before being loaded into the live tables.

What I want to achieve: Transform data from the Import schema tables and load it into tables in the Stage schema.

My convention: For each table in the Stage schema, there is a corresponding stored procedure with a name like: [Stage table name]_Transform

The advantage: Once again, when I need to start transforming more data, only two things need to be done: create a new table in the Stage schema and create the transform stored procedure. Then I just regenerate my SSIS package; no other changes are required.

What’s the point?

Hopefully you’ve figured it out, but the point of “convention over configuration”, or “coding to a convention”, is to reduce the number of decisions that you need to make during the development effort. This allows you to spend more time on the hard stuff and lets the computer take care of the rest.

TeamCity – Branching Databases and Continuous Integration

This week there have been lots of build failures, and thanks to our HipChat and TeamCity integration I’m notified constantly. As a “senior developer”, this matters to me. The problem comes down to multiple feature branches using a single CI database, which means that it isn’t always “up to date”, hence the failing tests.


In this post I’m going to show you how I fixed this by setting up TeamCity to publish a database for each of our feature branches.

There are two bits that need to be done to achieve this end goal:

  1. TeamCity needs to publish the SQL Server Database Project into a separate database for each branch
  2. All connection strings throughout the solution need to be updated to point to this new database

Publishing the SQL Database Project

To accomplish this we will be using the SqlPackage.exe command line utility (MSDN docs), which is shipped with SQL Server Data Tools (you will probably have already installed this on your build server to get it to build the sqlproj). This command line tool does the same thing as when you right-click on your project in Visual Studio and select Publish.

We can use SqlPackage.exe to publish a new database by using the following command:
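
The exact command depends on your paths and naming; a sketch along these lines (the dacpac file, server and database names are placeholders – the full path to SqlPackage.exe is shown in the TeamCity step below) should give the idea:

SqlPackage.exe /Action:Publish /SourceFile:Database.dacpac /TargetServerName:MyBuildSqlServer /TargetDatabaseName:MyDatabase /Profile:Database.CI.publish.xml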


The arguments are pretty self-explanatory, except perhaps for the last one. The profile, as you will see in a minute, allows you to define various parameters in an XML file which are then used during deployment.

Creating the Continuous Deployment Profile

For our CI environment deployments I always want the database to be dropped and recreated; this can be configured in the deployment profile.

In Visual Studio, right click on your database project and click Publish. This will bring up the Publish Database window.


Then click Advanced; this will bring up the Advanced Publish Settings. Tick “Always re-create database”. The options you can select are extensive and worth looking through.


Click OK to close this and go back to the Publish Settings window.

To create the profile, simply click the “Create Profile” button; this will add a new {projname}.publish.xml file to your project.


I’ve renamed mine to Database.CI.publish.xml

Setting up TeamCity

The next thing to do is to have TeamCity actually publish the database project. In TeamCity add a new Build step:

  1. Runner type: Command Line
  2. Step name: Publish database
  3. Run: Executable with parameters
  4. Command executable: C:\Program Files (x86)\Microsoft SQL Server\110\DAC\bin\SqlPackage.exe
  5. Command parameters:
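
The parameters mirror the publish command shown earlier. A hedged example (the paths, server name and database naming are placeholders; using %teamcity.build.branch% in the database name is what gives each feature branch its own database, and you may want to sanitise the branch name first) might look like:

/Action:Publish /SourceFile:Database\bin\Release\Database.dacpac /TargetServerName:MyBuildSqlServer /TargetDatabaseName:MyDatabase_%teamcity.build.branch% /Profile:Database\Database.CI.publish.xml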


Then click Save.


Updating Connection Strings

Once TeamCity has built the solution and published the latest version of the database, I needed a way to update all the connection strings in the various app.config and web.config files.

To accomplish this I put together the following PowerShell script:

$connectionStringName = $args[0];
$newConnectionString = $args[1];

# get the directory of this script file
$currentDirectory = [IO.Path]::GetDirectoryName($MyInvocation.MyCommand.Path)

$fileList = gci $currentDirectory -Recurse -Filter *.config

foreach($file in $fileList)
{
    $appConfig = New-Object XML
    # load the config file as an xml object
    $appConfig.Load($file.FullName)

    foreach($connectionString in $appConfig.configuration.connectionStrings.add)
    {
        if ($connectionString.name -eq $connectionStringName)
        {
            # write the file name to the console
            'file: ' + $file.FullName
            # write the name to the console
            'name: ' + $connectionString.name
            # write the connection string to the console
            'connectionString: ' + $connectionString.connectionString
            # change the connection string
            $connectionString.connectionString = $newConnectionString
        }
    }

    # save the updated config file
    $appConfig.Save($file.FullName)
}
This script finds all *.config files recursively from the directory that the PowerShell script lives in, then loads each config file as an XML document and finds all the connection strings; any that have a name matching the first argument will have their connectionString property updated to the value of the second argument.
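
Before wiring it into TeamCity you can try the script locally from the solution root, for example (the connection string name and value below are just placeholders):

.\ConnectionStringUpdate.ps1 "MyDatabaseConnString" "Data Source=MyBuildSqlServer;Initial Catalog=MyDatabase_my-branch;Integrated Security=True"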

To configure TeamCity add a new build step:

  1. Runner type: Powershell
  2. Step name: Update connection strings
  3. Powershell run mode: Version 2.0; Bitness: x86
  4. Script: File
  5. Script file: ConnectionStringUpdate.ps1
  6. Script execution mode: Execute .ps1 script with “-File” argument
  7. Script arguments: “MyDatabaseConnString” “;Server=…”


Back into the green

Now when code is pushed up to BitBucket TeamCity will run through the following steps:

  1. Build the solution
  2. Publish the database project(s)
  3. Update connection strings
  4. Run unit tests / integration tests – currently one step

TeamCity is back to being happy


and more importantly, when the build completes HipChat notifies me that the build has been successful.