This topic is closely tied to your product development process. Developers tend to modify the dev database and forget about the changes they’ve made… This bad habit leads to problems like:

  • We have to use RedGate tools to compare databases and find the differences…
  • How do we install our software at a client’s site? Just generate scripts from one of our existing databases…
  • Which version of the database does our client have? Nobody knows…

These examples are not imaginary! They are real… big software companies I’ve worked with were unable to keep their database projects in shape. This is my solution for avoiding these problems and keeping a database project in check.

1. Create a folder structure in your project’s directory.

You need to keep SQL scripts in the same place as your code.

<projectDir>
   CreateDatabase.sql
   InitDbVersioning.sql
   <release001>
      001.CreateSchema.sql
      002.InsertDefaultData.sql
      003.<customScript1>.sql
      ...
   <release002>
      001.<customScript1>.sql
      002.<customScript2>.sql
      003.<customScript3>.sql
      ...
   <release003>
      001.<customScript1>.sql
      002.<customScript2>.sql
      003.<customScript3>.sql
      ...

CreateDatabase.sql – This script creates an empty database, its users, and SQL logins. It should be reusable – this means that a user or an automated tool can execute it many times, but only one database will be created, and subsequent executions won’t raise errors. You can read more here.
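In practice, reusability means wrapping every `create` in an existence check. Here is a minimal sketch of the idea – using Python with SQLite instead of SQL Server, and a hypothetical Customers table, purely for illustration – where repeated executions create the object only once and never raise an error:

```python
import sqlite3

# An in-memory database stands in for the server. The "script" below is
# idempotent: running it any number of times creates the table only once
# (mirroring the if-not-exists guards used in the T-SQL scripts).
conn = sqlite3.connect(":memory:")
script = """
CREATE TABLE IF NOT EXISTS Customers (
    Id   INTEGER PRIMARY KEY,
    Name TEXT NOT NULL
);
"""

for _ in range(3):          # simulate three runs of CreateDatabase.sql
    conn.executescript(script)

tables = [r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table'")]
print(tables)               # the table exists exactly once, no errors raised
```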

InitDbVersioning.sql – You can find more info in the next step.

<releasexxx> – A directory for the scripts of each release; xxx has to match the release number. The numeric order is very important.

yyy.<customScriptx>.sql – A script file which is part of a release. It’s important to keep these scripts in ascending order. Each script is a diff against the previous version. Moreover, each script is created from a template – this will be explained in the next points!

2. Prepare the database for versioning.

This step is actually the InitDbVersioning.sql script. The code creates three objects:

  • DbUpdate table – This table stores information about executed scripts (the database version). Keep in mind that you have to use the template from point 3!
  • DbLog table – This table stores the log output from every execution of a template-based script (point 3). These logs help when an automated database installation fails.
  • AppendLine – An internal stored procedure used to append strings during the execution of a template-based script (point 3).
print 'Creating DbUpdate.'
if not exists (select * from sys.tables where name like 'DbUpdate') 
begin
    create table [DbUpdate]([Id] int identity(1,1) not null, [FileIndex] int not null,
        [FileDescription] nvarchar(max) not null, [FolderIndex] int not null, 
        [FolderDescription] nvarchar(max) not null, [Date] datetime not null)
    print 'Done'
end else begin print 'Skipped' end

-- ###
print 'Creating DbUpdate.PK_DbUpdate.'
if not exists (select * from information_schema.table_constraints 
    where constraint_type = 'PRIMARY KEY' and [table_name] = 'DbUpdate')
begin
    exec('alter table [DbUpdate] add constraint [PK_DbUpdate] primary key clustered ([Id] asc)')
    print 'Done'
end else begin print 'Skipped' end

-- ###
print 'Creating DbLog.'
if not exists (select * from sys.tables where name like 'DbLog')
begin
    exec('create table [DbLog]([Id] int identity(1,1) not null, 
        [Text] nvarchar(max) not null, [Date] datetime not null)')
    print 'Done'
end else begin print 'Skipped' end

-- ###
print 'Creating DbLog.PK_DbLog.'
if not exists (select * from information_schema.table_constraints 
    where constraint_type = 'PRIMARY KEY' and [table_name] = 'DbLog')
begin
    exec('alter table [DbLog] add constraint [PK_DbLog] primary key clustered ([Id] asc)')
    print 'Done'
end else begin print 'Skipped' end

-- ###
print 'Creating AppendLine.'
if not exists (select * from sys.objects where type = 'P' and name = 'AppendLine')
begin
    exec('create procedure [AppendLine](@lineToAppend nvarchar(max), @variable nvarchar(max) output)
        as
        begin
            print isnull(@lineToAppend, ''null'')
            set @variable = @variable + isnull(@lineToAppend, ''null'') + char(13) + char(10) 
        end')
    print 'Done'
end else begin print 'Skipped' end

As you may have noticed, this script is reusable: it can be executed many times on an existing database, and only the first execution takes effect. It’s important that subsequent executions of this script do not raise errors.

Hint! Primary keys are created using a separate add constraint statement. The advantage of this approach is that we keep control over the PK names. Otherwise SQL Server would generate a name with some numeric suffix.

3. Version scripts using the pattern given below.

This is a template for all scripts. Developers should use this pattern when creating a new change. Executing a script created from this pattern will cause:

  • Saving its output to the DbLog table (even if execution fails).
  • Executing your changes in a transaction.
  • Saving the database version to the DbUpdate table (this protects against executing our code multiple times).
set xact_abort on

declare @dbUpdate_FileIndex int
declare @dbUpdate_FileDescription nvarchar(max)
declare @dbUpdate_FolderIndex int
declare @dbUpdate_FolderDescription nvarchar(max)

-- ### USER VALUES BELOW

-- Update script file number
set @dbUpdate_FileIndex = @@fileIndex@@
-- Update script file description
set @dbUpdate_FileDescription = '@@fileDescription@@'
-- Update script folder number
set @dbUpdate_FolderIndex = @@folderIndex@@
-- Update script folder description
set @dbUpdate_FolderDescription = '@@folderDescription@@'

-- ### USER VALUES ABOVE

declare @msg nvarchar(max)
set @msg = ''
begin transaction
begin
    begin try
        exec [AppendLine] '### Beginning Script', @msg output

        exec [AppendLine] '### File index:', @msg output
        exec [AppendLine] @dbUpdate_FileIndex, @msg output
        exec [AppendLine] '### File description:', @msg output
        exec [AppendLine] @dbUpdate_FileDescription, @msg output

        exec [AppendLine] '### Folder index:', @msg output
        exec [AppendLine] @dbUpdate_FolderIndex, @msg output
        exec [AppendLine] '### Folder description:', @msg output
        exec [AppendLine] @dbUpdate_FolderDescription, @msg output

        if (select COUNT(*) from [DbUpdate] where [FileIndex] = @dbUpdate_FileIndex
            and [FolderIndex] = @dbUpdate_FolderIndex) = 0
        begin
            exec [AppendLine] '### Beginning transaction', @msg output
            -- ### USER CODE BELOW ###

            -- ### USER CODE ABOVE ###
            insert into [DbUpdate] ([FileIndex], [FileDescription], [FolderIndex], [FolderDescription], [Date]) 
                values (@dbUpdate_FileIndex, @dbUpdate_FileDescription, @dbUpdate_FolderIndex,
                    @dbUpdate_FolderDescription, getdate())
            commit transaction
            exec [AppendLine] '### Transaction committed', @msg output
        end else begin 
            exec [AppendLine] '### Update applied already', @msg output 
            rollback transaction
        end
    end try
    begin catch
        declare @ErrorNumber int = isnull(error_number(),-1)
        declare @ErrorMessage nvarchar(4000) = isnull(error_message(),'null')
        declare @ErrorProcedure nvarchar(4000) = isnull(error_procedure(), 'null')
        declare @ErrorLine int = isnull(error_line(), -1)
        declare @tmp nvarchar(max)
        set @tmp = 'An error occurred within a user transaction.
            Error Number: ' + cast(@ErrorNumber as nvarchar(max)) + '
            Error Message: ' + @ErrorMessage + '
            Procedure: ' + @ErrorProcedure + '
            Line Number: ' + cast(@ErrorLine as nvarchar(max))
        exec [AppendLine] @tmp, @msg output
        raiserror(@msg, 16,1)
        if @@trancount > 0 begin
            exec [AppendLine] '### Rolling back', @msg output
            rollback transaction;
        end
    end catch
    insert into [DbLog] ([Text], [Date]) values (@msg, getdate())
end

After pasting this code into your new script file, you have to fill in the parameters:

@@fileIndex@@ – The file number (the first three digits of the filename, e.g. “002” in “002.AddFieldsToCustomersTable.sql”).

@@fileDescription@@ – The part of the filename after its index (e.g. “AddFieldsToCustomersTable” in “002.AddFieldsToCustomersTable.sql”).

@@folderIndex@@ – The folder (release) number (e.g. “021” in “Release021”).

@@folderDescription@@ – The description part of the folder name (e.g. “Release021”).

Hint! The rules above are only a suggestion; you can change them freely. The most important thing is that these four variables help you find the script file in the directory structure.

Important! Don’t change file or folder indexes in existing scripts. Otherwise the checking logic in the template will execute those scripts more than once!

After filling in the indexes and descriptions, you can fill the template with SQL code. Just paste it:

...
-- ### USER CODE BELOW ###

Here :)

-- ### USER CODE ABOVE ###
...
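To see why the guard in the template works, here is its control flow in miniature – a sketch in Python with SQLite standing in for SQL Server (the table change and the `run_versioned_script` helper are only examples, not the real T-SQL template): check DbUpdate, apply the changes in a transaction, and record the version, so a second run is skipped.

```python
import sqlite3

# Minimal model of the template's control flow; names mirror the article
# but this is only an illustrative sketch.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE DbUpdate (FileIndex INT, FolderIndex INT)")
conn.execute("CREATE TABLE Customers (Id INTEGER PRIMARY KEY)")

def run_versioned_script(conn, file_index, folder_index, statements):
    """Apply `statements` once per (folder, file) pair, inside a transaction."""
    (count,) = conn.execute(
        "SELECT COUNT(*) FROM DbUpdate WHERE FileIndex = ? AND FolderIndex = ?",
        (file_index, folder_index)).fetchone()
    if count:
        return "skipped"            # update applied already
    with conn:                      # commit on success, roll back on error
        for sql in statements:
            conn.execute(sql)
        conn.execute(
            "INSERT INTO DbUpdate (FileIndex, FolderIndex) VALUES (?, ?)",
            (file_index, folder_index))
    return "applied"

change = ["ALTER TABLE Customers ADD COLUMN Name TEXT"]
first  = run_versioned_script(conn, 2, 21, change)   # runs the change
second = run_versioned_script(conn, 2, 21, change)   # guard kicks in
print(first, second)
```

Without the guard, the second run would fail on the duplicate column; with it, re-executing the whole script is always safe.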

4. Use a VCS or DVCS to version scripts under the same rules as code files.

The structure presented in the first point should be stored in a version control system under the same rules as your code. The main difference is that if you want to change something in the database, you cannot modify existing scripts. You have to create a new one and treat it as a diff against the existing databases. This approach keeps the scripts backward compatible and eases automatic installation.

5. Automate the installation.

This structure is ready for automation. The easiest way is to create your own installation tool that opens scripts in ascending order, recursively (folders first, then the scripts inside them). The tool can be simple: it just has to execute every script it finds. The first two scripts are reusable and will not throw exceptions; all the other scripts are created from the template, which checks whether the script has already been executed.
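As a sketch of such a tool’s ordering logic (Python here; the release folder and custom script names are hypothetical, and actually executing each script against the server – plus the two root-level scripts – is left out):

```python
from pathlib import Path
import tempfile

def collect_scripts(project_dir):
    """Return script names in installation order: release folders in
    ascending name order, then each folder's *.sql files in ascending order.
    (CreateDatabase.sql and InitDbVersioning.sql would be run before these.)"""
    ordered = []
    for folder in sorted(p for p in Path(project_dir).iterdir() if p.is_dir()):
        ordered.extend(sorted(f.name for f in folder.glob("*.sql")))
    return ordered

# Build a throwaway directory tree matching the article's structure;
# the custom script names are invented examples.
root = Path(tempfile.mkdtemp())
layout = {
    "release002": ["002.AddIndex.sql", "001.AddColumn.sql"],
    "release001": ["001.CreateSchema.sql", "002.InsertDefaultData.sql"],
}
for folder, files in layout.items():
    (root / folder).mkdir()
    for name in files:
        (root / folder / name).touch()

scripts = collect_scripts(root)
print(scripts)  # release001's scripts first, each folder sorted ascending
```

Note that this relies on zero-padded folder and file numbers, so that lexicographic order matches numeric order.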

Hint! Keep in mind that scripts can be parametrized. You can put markers like “@@USER_NAME@@” into your SQL script and fill them with values during execution.
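A minimal sketch of such marker substitution, assuming markers of the form @@NAME@@ (the `fill_markers` helper and the parameter names are only examples):

```python
import re

def fill_markers(sql_text, params):
    """Replace every @@NAME@@ marker with its value from `params`;
    fail loudly if a marker has no supplied value."""
    def replace(match):
        name = match.group(1)
        if name not in params:
            raise KeyError(f"no value supplied for marker @@{name}@@")
        return params[name]
    return re.sub(r"@@(\w+)@@", replace, sql_text)

script = "create login [@@USER_NAME@@] with password = '@@PASSWORD@@'"
filled = fill_markers(script, {"USER_NAME": "app_user", "PASSWORD": "secret"})
print(filled)  # -> create login [app_user] with password = 'secret'
```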

Following these clues will lead you to an automated installer/updater for your databases. You can treat this idea as homework :)