At the opening keynote of PASS Summit 2013 it was mentioned that SQL Server was adding backup encryption along with the ability to back up databases to an Azure storage container. In my last post I wrote about my early testing of those features, albeit in very simple scenarios. I was able to encrypt a database backup to both local and Azure storage and restore to the original server and a secondary DR server.
But back to the keynote for a moment. Another new feature relating to database backups was announced: Microsoft was releasing a tool that could also encrypt backups and save them to Azure, but it would work with older SQL Server instances. In this post I begin to look at using this new tool to back up to local storage.
The SQL Server Backup to Windows Azure Tool (download link is http://www.microsoft.com/en-us/download/details.aspx?id=40740) requires Windows Server 2008 or higher on servers, or at least Windows 7 if you're running it on your own PC. It's only a 10 MB download, so obtaining and installing the tool took no time at all. By default it installs to C:\Program Files\Microsoft SQL Server Backup to Windows Azure Tool. I haven't yet identified any SQL objects it created.
The tool's documentation claims it works against instances as old as SQL Server 2005 and makes no mention of any specific edition. Using it you can compress and encrypt your database backups, even those from a SQL 2005 Standard instance. You have the choice of saving your backup files locally or to Azure storage, as long as you have an account there. I don't see any mention of support for other cloud vendors.
As before, I set up two new virtual environments running Windows 2008 R2 for my testing. I wanted to test against the oldest version of SQL that I could so I installed SQL 2005 Standard Edition with SP4. I tracked down an old version of AdventureWorks to use as my demo database. The copy I’m using is about 180 MB with an 18 MB log file.
My first step after setting up the two servers was to create a benchmark. Backing up AdventureWorks without the new tool took between 5 and 6 seconds across 3 runs, each creating a 166 MB backup file. I ran the restore 3 times as well, each finishing in about 4.5 seconds.
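For the benchmark I used a plain BACKUP statement, something along these lines (the path and file name reflect my test setup, not anything the tool requires):

```sql
-- Baseline backup, no tool rule in effect
BACKUP DATABASE AdventureWorks
TO DISK = 'C:\SQL2005\Backups\AdventureWorks_full.bak'
WITH INIT, STATS = 10;
```

WITH INIT overwrites the file on each run so every test starts from the same point.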
Now I’m ready to try using the new and wonderfully descriptively named Microsoft SQL Server Backup to Windows Azure Tool.
After installing you just run the tool's Configuration. Basically it's a configuration wizard that lets you add backup rules. You can add more than one rule to customize your backup plans to suit your needs; maybe one rule to keep some backups locally and another to move others to Azure. You only need to run the tool when you want to create, modify, or delete a rule.
The first screen asks where to monitor for backups. You can choose the whole computer or a specific location. In my tests I chose a specific location and set it to look for any .bak file.
The second screen asks where to save your backups: locally or to Azure. My initial backups are to local storage; I'll test Azure later. For Azure I needed to enter my account information, similar to when I set up a certificate in my last post.
The third and final screen allows you to select encryption and/or compression. If you choose encryption you'll need to enter a secure key. Note that it only encrypts using the AES-256 algorithm; you don't get a choice of anything weaker or stronger.
And that's it. After clicking Finish you get a screen showing all the rules you've set up.
Now let's start the tests. First I back up to local storage using both the encryption and compression options. I back up using plain T-SQL, with writing the backup to disk as the only option. The backups took a little longer than the baseline, about 2 extra seconds on average.
The tool uses the .bak file to store metadata about the backup: where the backup is, whether it's compressed or encrypted, and so on. Depending on the options in your rule, the backup creates an additional file containing the actual backup data. Choosing encryption creates a file with a .enc extension. Choosing just compression gives you a .gz file. Backing up without either option produces a .data file. And if you back up without any rule in place, a regular .bak backup file is created.
Backing up locally with compression and encryption created a 2 KB .bak file and a .enc file of about 56 MB.
If you open the .bak file in a text editor you can see the metadata values. This file is not encrypted even if you set that option. One of my backups produced the following (I've changed the strings; you'll see much longer runs of random characters in your backups):
"Trailer": "Extremely long string here",
"Salt": "Salt string here",
"Iv": "IV string here",
"Mac": "Mac string here",
"Salt": "salt string 2",
"Iv": "IV string 2",
"Mac": "Mac string 2"
Next I wanted to test restoring the database. Again I just used a RESTORE T-SQL script, with overwriting the existing database as the only option. Restores usually took about 30 seconds, much longer than the 4.5-second baseline restores without compression or encryption. You'll need to keep the rule that was in effect when you ran the backup; if you delete the rule, you won't be able to restore the backup.
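The restore script was equally minimal; a sketch, assuming my test paths:

```sql
-- Restore over the existing copy of the database;
-- the tool intercepts the read from the 2 KB .bak stub
RESTORE DATABASE AdventureWorks
FROM DISK = 'C:\SQL2005\Backups\AdventureWorks_full.bak'
WITH REPLACE, STATS = 10;
```

Nothing in the script changes when a rule is in effect; the tool transparently decrypts and decompresses the .enc file behind the .bak stub.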
Restoring the backup to another server was similar. I copied the backup files from my first instance and tried to restore, getting the following error:
Msg 3241, Level 16, State 0, Line 1
The media family on device 'C:\SQL2005\Backups\AdventureWorks_full.bak' is incorrectly formed. SQL Server cannot process this media family.
Msg 3013, Level 16, State 1, Line 1
RESTORE DATABASE is terminating abnormally.
After I installed the new tool on my secondary instance and configured a rule identical to the one on the source server, the restore completed successfully. One thing I noticed about backing up locally: you can't write the backup to the same file using WITH APPEND. If the file exists you'll get an error. You'll need to either move your local backups or create backups with unique names.
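Since reusing an existing file fails under a rule, one workaround is to stamp a date and time into each backup file name so every run writes a fresh file. A sketch, again assuming my test path:

```sql
-- Build a date-stamped file name so each backup gets a fresh file
DECLARE @file NVARCHAR(260);
SET @file = N'C:\SQL2005\Backups\AdventureWorks_full_'
          + CONVERT(NVARCHAR(8), GETDATE(), 112)   -- yyyymmdd
          + N'_'
          + REPLACE(CONVERT(NVARCHAR(8), GETDATE(), 108), ':', '')  -- hhmmss
          + N'.bak';

BACKUP DATABASE AdventureWorks
TO DISK = @file
WITH INIT, STATS = 10;
```

You'd then want a cleanup job to prune old files, since each run leaves its .bak stub and data file behind.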
In my next post I’ll cover backing up to Azure with the new tool.