Using Microsoft’s New Database Backup Tool – Azure Storage

In my last post I wrote about using Microsoft’s new SQL Backup to Azure tool to compress and encrypt database backups to local storage. In this post I’m going to talk about using it to back up to an Azure storage container.

Before I get to the Azure tests let me go over some of the basics of using the tool from my last post. 

Before Microsoft released the SQL Server Backup to Windows Azure Tool (download link is http://www.microsoft.com/en-us/download/details.aspx?id=40740) you needed a third-party tool to compress and/or encrypt your database backups. The upcoming SQL 2014 release will have both features built in, and will also let you mirror a backup to an Azure storage container. But if you’re still running older instances of SQL you can use the new tool to get the same functionality, all the way back to SQL 2005.

The Microsoft SQL Server Backup to Windows Azure Tool runs as a Windows service under a new local account named SQLBackup2Azure. To use the tool you just need to configure three options. The first asks where your backups are stored. The second asks if you want to use local or Azure storage. If you choose Azure you’ll need to enter your Azure storage account name, access key, and container name.


The third step lets you choose whether to compress and/or encrypt your backups, and enter your encryption password. When you finish you’ll get a screen where you can manage your backup rules.

Then just back up and restore your databases as you normally would, either with SSMS or with T-SQL commands.

The files you get depend on the options in your backup rule. The tool will create a .bak file that stores the metadata about your backup. This file is not encrypted, even if you choose that option. If you choose to encrypt your backups you’ll see a .enc file with the same name as the .bak file. If you choose only compression then a .gz file is created. And if you didn’t choose compression or encryption for your rule then a .data file is created. Backing up with no rule at all just creates the .bak file.

I’m using the same two virtual environments I set up before, running Windows Server 2008 R2 and SQL 2005 SP4 Standard Edition, for my testing. My demo database will be AdventureWorks. The copy I’m using is about 180 MB with an 18 MB log file. I’m using my MSDN Azure subscription, where I created a blob storage container named “backups” for my tests.

For my first test I backed up AdventureWorks using a T-SQL command. Backing up to Azure with both the compression and encryption options took 2 minutes 11 seconds. The tool created a 3 KB .bak file containing just metadata in the local location I specified. In my Azure container it appended a string to my backup name and created the 3 KB .bak and 56 MB .enc files.
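For reference, the backup command itself is nothing special. Here’s a sketch of what I’m running; the path is the local folder my backup rule watches, and the WITH options are just my usual defaults:

-- A plain full backup written to the folder the backup rule watches.
-- The tool intercepts the write and sends the backup data on to Azure.
BACKUP DATABASE AdventureWorks
TO DISK = N'C:\SQL2005\Backups\AdventureWorks_full.bak'
WITH INIT, STATS = 10;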


Restoring the backup also took just over 2 minutes. I ran RESTORE FROM DISK and pointed to the .bak file on local storage. The restore read the metadata from that file and rebuilt the database using the files stored in Azure.
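The restore command is just as ordinary; a sketch, assuming the same local path the rule watches:

-- Point RESTORE at the small local .bak; the tool reads its metadata
-- and pulls the actual backup data down from the Azure container.
RESTORE DATABASE AdventureWorks
FROM DISK = N'C:\SQL2005\Backups\AdventureWorks_full.bak'
WITH REPLACE, STATS = 10;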

Restoring the backup file on my second DR instance was easy. As I did when testing restores to local storage using the tool, I created the same rule as on my source server and copied the 3 KB .bak file created there. Running the same RESTORE command took 2 1/2 minutes.

I’m not entirely sold on using this tool to back up to Azure, for a few reasons. First and most importantly, it doesn’t always work. At least half of the time when backing up with the tool I get one of the following errors:

Msg 3202, Level 16, State 2, Line 1
Write on “C:\SQL2005\Backups\AdventureWorks_full.bak” failed: 1723(The RPC server is too busy to complete this operation.)
Msg 3013, Level 16, State 1, Line 1
BACKUP DATABASE is terminating abnormally.

Msg 3201, Level 16, State 1, Line 1
Cannot open backup device ‘C:\SQL2005\Backups\AdventureWorks_full.bak’. Operating system error 31(A device attached to the system is not functioning.).
Msg 3013, Level 16, State 1, Line 1
BACKUP DATABASE is terminating abnormally.

What seems even worse is that these failed backups still show up in my Azure container, with the .enc file reporting a size of 1 TB.


I haven’t been able to troubleshoot either error yet, so I don’t know whether the problem is on my side, with Azure, or with the backup tool. To rule out my virtual server setup I installed the tool on my laptop running Windows 7 and SQL 2008 SP3; I get the same errors there.

Another drawback is that I haven’t been able to download the files from Azure to local storage and restore those copies. I’ve edited the metadata to point to the downloaded files and changed my rule to look at local storage, but so far I haven’t had a successful restore. Again, this could be my error; I’m probably doing something wrong (the documentation that comes with the tool is pretty thin).

I’m going to keep testing, but until I can consistently back up to Azure without errors and figure out how to restore from a downloaded file, I’m not trusting it with any critical backups. I also need to run some tests on backing up transaction logs.

SSDT – Arranging windows to customize your workflow

One of my favorite features in the new SQL Server Data Tools is the query window. Specifically, what I like is the ability to rearrange the panes to better see query results and different sections of a long query or procedure. If you use Visual Studio you’re probably already familiar with rearranging code windows. But if you’ve only used SSMS for queries this might be new to you.

If you just run a query the way you do in SSMS, you see the results the same way. This screenshot is from SSDT for Visual Studio 2010, but you can do the same things in Visual Studio 2012.

[screenshot]

But because you’re in Visual Studio you can rearrange the windows. You can show the results off to the side, which is nice when you return a lot of records. Just click the little switch button to toggle between horizontal and vertical layouts.


But even better is the ability to split a large block of code into two sections. Ever find yourself jumping around a long procedure, repeatedly going from the middle back to the top to see where variables get their values? Well, in SSDT you can split the code window and scroll each section independently. Drag the splitter to resize the sections to whatever sizes you’re comfortable with.



Finally, just like in SSMS, you can separate the different tabs to display them side by side. You can also move a tab to a second monitor.

SQL Server Data Tools for Business Intelligence released

Last week saw the long-awaited release of SQL Server Data Tools for Business Intelligence. This version of SSDT has the templates for creating Integration Services packages, Reporting Services reports, and Analysis Services projects, both multidimensional cubes and tabular models, inside Visual Studio 2012. That’s great news, as a lot of developers, myself included, have been waiting for this functionality.

However, the bad news is that it doesn’t include the database project templates released last year. You’ll still need to install those separately, but the two will work together.

So if you want just the BI templates for Visual Studio 2012, you only have to install the BI version of SSDT. If you also want the database projects, you’ll need to install both the BI templates and the database templates. And if you want to use test plans with your new database projects and also create SSRS reports or SSIS packages, you’ll need a full edition of VS 2012, either Premium or Ultimate, plus both the database and BI templates.

You can get the database project templates for either VS 2010 or VS 2012 from the SSDT Team blog. The BI templates for VS 2012 can be found as a Microsoft download. And you can read more about the added functionality on the SSRS Team blog, the SSIS Team blog, or the SSAS Team blog.

I can’t keep up with SSDT

Earlier today I posted a quick little walk-through on extracting and publishing a database using Microsoft’s SQL Server Data Tools. A few hours later the SSDT team announced another update. This one adds database unit testing. It also rolls the SSDT Power Tools into this version. It looks like I’ve got some more research to do, but I don’t mind; unit testing of databases is something I’d like to learn more about anyway.

The SSDT team says they will blog soon about the new functionality, so stay tuned to their blog for more information. And get the December update here.

Quickly Build a Database Sandbox Using SSDT Power Tools

I’ve written a few posts on using SQL Server Data Tools (SSDT) to build a database sandbox to play in. There are a few ways to build out the schema, either from database object scripts or directly from an existing database. The problem with those approaches is that they don’t include the data, though there are ways to populate the sandbox database. But if you use the SSDT Power Tools you can quickly create and deploy a fully populated copy.

First, go to the SSDT blog site to download and install the latest version of the SSDT Power Tools. There’s one for Visual Studio 2010 and one for Visual Studio 2012, so be sure to grab the right one. Also grab the latest SSDT update if you haven’t already. After that it’s a simple two-step process.

Step One is to right-click the database you’re making a copy of and choose Extract Data-Tier Application. Enter the name and file path for the extract, then choose the extract settings. You have the option to extract only the schema, or you can include the data; if you include the data you can select one or more tables whose data you want. Finally, check any other options you want, like scripting logins and permissions. When you’re done your screen will look something like the one below. Click OK to start the extract.

[screenshot]

You can watch the progress while the extract is running. The time required depends on the size of the database. In my little demo I’m using a small database so it only took 30 seconds. The process creates a more detailed log that you can read if you want to see more than just the summary.


Step Two is just Step One in reverse. This time right-click the Databases node of the SQL instance you’re deploying to and choose Publish Data-Tier Application. Select the .dacpac file you created in Step One, enter a database name, edit the target database connection if you need to, and choose whatever other options you want. Click Publish when you’re ready. In this case I’m publishing back to the same SQL instance where the original lives, but with a different database name; you’ll most likely be publishing to a different instance.


Again, you can watch the progress of your deployment and view the log for more detailed info. My little sample only took 20 seconds.
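If you want a quick sanity check that the data came across, compare row counts between the source and the copy. The database and table names below are placeholders, so substitute your own:

-- MySourceDb, MySandboxDb, and dbo.Customers are hypothetical names;
-- use your own databases and one of the tables you extracted with data.
SELECT COUNT(*) AS SourceRows FROM MySourceDb.dbo.Customers;
SELECT COUNT(*) AS SandboxRows FROM MySandboxDb.dbo.Customers;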

After that you can go play in your new database sandbox. Not counting mouse clicks it only took me 50 seconds!


First Look at updated SQL Server Data Tools

Just about the time I was wrapping up my earlier posts on SQL Server Data Tools the SSDT team announced that new versions and new functionality were available. So, sooner than I thought, here’s a little more on the September 2012 SSDT.

  • If you installed SSDT from the SQL 2012 or VS 2012 media, or from a web download before 9/25/2012, you’ve got the old version of SSDT. That version opened the VS 2010 shell even if you had VS 2012 installed, which was one of the biggest complaints about SSDT I’ve heard. But now you have a choice: you can download either the VS 2012 or the VS 2010 version. The new version will upgrade the existing one, so you don’t have to uninstall it first. And if you’re installing the VS 2010 version it will also install SP1. But if the computer you’re installing on already has VS 2010, you need to make sure SP1 is there before continuing with SSDT.
  • One of the biggest changes to SSDT was in how you use the databases you create in database projects. Before the update you’d get a new instance for every project you created. Talk about server sprawl! Now you get two instances. The first is (localdb)\V11.0, the SQL Express LocalDB instance; you can download and install this separately from SSDT, just go to the SQL Server 2012 Express page to get it. The second instance is called (localdb)\Projects, and all the databases you create in SSDT projects will run from this instance. Much cleaner from an organizational standpoint (see the sketch after this list for a quick way to check what’s running where).
  • One of my biggest disappointments about SSDT is that it doesn’t include the Business Intelligence templates. If you need to create SSIS packages or SSRS reports you’ll still need Visual Studio. And I really wish they’d bring back the Data Generator tool. I’m keeping a full VS 2010 version around just for the “Data Dude” features that haven’t made it over to SSDT yet.
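If you want to see those instances for yourself, connect a query window to (localdb)\Projects (or (localdb)\V11.0) and run something like the sketch below; it only assumes you’ve deployed at least one project database there.

-- Shows the instance you're connected to and the databases it's hosting.
SELECT @@SERVERNAME AS InstanceName, @@VERSION AS Build;
-- database_id > 4 skips the system databases.
SELECT name FROM sys.databases WHERE database_id > 4;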

Links

  • A good source of info on SSDT is available on the SQL Server Data Tools Team Blog. This is where they announce the availability of upgrades. You’ll also find links to the September 2012 SSDT upgrades for VS 2012 and VS 2010.

  • If you just want the LocalDB database engine, go to the Microsoft SQL Server 2012 Express page.

SQL Server Data Tools – A Last Look (for now)

This post will wrap up my introduction to SQL Server Data Tools (SSDT). In my first post I wrote about using SSDT to set up a developer sandbox copy of a SQL database. My second post covered using SSDT to build a prototype database using LocalDB instead of a SQL instance. I added a third post to clear up some facts about LocalDB that I didn’t cover correctly. Today I’m going to show what LocalDB looks like, as well as a little more about the .dacpac files created when you take a snapshot of your project.

When you create your first SQL database project SSDT will start SQL Server, but not as a service. You can verify this by opening Task Manager and looking at the running processes. You should see a sqlservr.exe process running under your name. If you’re also running a SQL instance you’ll see a second sqlservr.exe process running under whatever its startup account is. Notice that the location of the executable is C:\Program Files\Microsoft SQL Server\110\LocalDB\Binn.

Sqlservr.exe will also create a new set of system databases for your project. You’ll find these at C:\Users\your name\AppData\Local\Microsoft\Microsoft SQL Server LocalDB\Instances, and you’ll see an instance folder for every SQL database project you create. You can confirm this by querying sysfiles. Open that location and you’ll also see the error logs and black box traces you’d normally see on a SQL instance; open either to see how your instance is behaving.
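Here’s the kind of query I mean, run against one of the project instances; the paths you get back will depend on your user name and project names.

-- Files for the current database (run it from master to see where the
-- system databases for the instance live)...
USE master;
SELECT name, filename FROM sys.sysfiles;
-- ...or every file the instance knows about, project databases included.
SELECT name, physical_name FROM sys.master_files;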


Open the location where you saved your SSDT project. You should see a folder called Sandbox. Inside you’ll see a .mdf and .ldf file for your database. You can detach and reattach it, or back it up and restore it, on another server when you’re ready to move it.
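Here’s a minimal sketch of the detach/attach route. It assumes the database is named Sandbox, and the target file paths are made up, so substitute the .mdf and .ldf names you found in the project folder:

-- On the source (LocalDB) instance:
USE master;
EXEC sp_detach_db @dbname = N'Sandbox';

-- Copy the files to the target server, then attach them there:
CREATE DATABASE Sandbox
ON (FILENAME = N'D:\Data\Sandbox.mdf'),
   (FILENAME = N'D:\Data\Sandbox_log.ldf')
FOR ATTACH;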

There are a few things about the snapshots I’d like to mention. There are other ways to use them besides Schema Compare. You can pass the latest .dacpac file from your project to another developer, who can create a new project based on your snapshot. You can open .dacpacs created in one version of Visual Studio in another, 2012 to 2010 for example. And you can also open one to see what it contains.

A .dacpac file is like a compressed folder. If you open it you can extract the files to a new directory. Inside you’ll see some .xml files with configuration information, database settings for example. You’ll also see a model.sql file that has a CREATE statement for every object in your database. There are no INSERT statements, though.


Finally, the SSDT team announced an update last Friday. There’s some new stuff there, like support for VS 2012 and creating only one LocalDB instance instead of one for each project. Check it out on their blog. And Google/Bing for SSDT; there’s a lot of good information out there.


SQL Server 2008 Training Kit is available

One of the things I like about Microsoft is the wealth of training they make available for their products. There was a training kit for ASP.NET, one for Visual Basic 2008, and now there’s one for SQL Server 2008. You can get yours at http://www.microsoft.com/downloadS/details.aspx?familyid=E9C68E1B-1E0E-4299-B498-6AB3CA72A6D7&displaylang=en