Backing up SSRS Encryption Key with PowerShell

A few months back I wrote a post about how I wanted a way to automate a backup of the encryption key used by our SQL 2008 reporting servers. I thought that PowerShell would be the way to go, but I ran into a few issues. First, I couldn’t see a way to use PowerShell to access the key. I could always use rskeymgmt from a command line, but I’d have to feed in a Y since I couldn’t suppress the prompt. I found a way to do that, but I didn’t want the password saved inside a batch file. And I wanted to copy the key file to a second location, in case I needed to use it to restore a report server. You can read my earlier post here.

One day, while I had my twitter feed open on my desktop, I saw a tweet from Laerte Junior (Blog | Twitter). In it he mentioned a post he wrote about accessing the report server WMI objects. And finally everything clicked into place.

The first thing I did was to install the SQLPSX modules on our report servers. The modules are available on Codeplex and you can download them here. Next I created a profile for all users to load the SQL Server snap-ins, since I use Invoke-Sqlcmd to query a database.


Add-PsSnapin SqlServerCmdletSnapin100
Add-PsSnapin SqlServerProviderSnapin100

The final step was to create a table inside a database to store a password to use for the encryption key file. I figured I could easily control access to that password to only those who needed to know it. Now I was ready to create a script to do everything I wanted.
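Since the table only needs to hold one password, its setup can be as simple as the sketch below. The AdminDB database, dbo.SSRS table, and pwd column names match the query used later in this post; the column type and the sample password are my assumptions, so adjust both, and remember to restrict SELECT rights on the table to only the accounts that need the password.

```powershell
# One-time setup: create the table that holds the key-file password.
# Assumes the AdminDB database already exists on the instance.
$ddl = @"
CREATE TABLE dbo.SSRS (pwd NVARCHAR(128) NOT NULL);
INSERT INTO dbo.SSRS (pwd) VALUES (N'MyStr0ngP@ssw0rd');
"@
Invoke-Sqlcmd -Query $ddl -ServerInstance "Java" -Database "AdminDB"
```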

Getting the password out and storing it in a variable was easy:


$results = Invoke-Sqlcmd -Query "SELECT pwd FROM AdminDB.dbo.SSRS" -ServerInstance "Java"

Thanks to Laerte Junior I knew which namespace, class, and method to use. I simply needed to pass in the password stored in my variable.


$SSRSClass = Get-Wmiobject -namespace "root\microsoft\sqlserver\reportserver\rs_mssqlserver\v10\admin" -class "MSReportServer_ConfigurationSetting"

$key = $SSRSClass.BackupEncryptionKey($results.pwd)

One thing to note: the server in my example is the default instance and is named “JAVA”, so when I reference the namespace I use the default instance name of “rs_mssqlserver”. If you’re using a named instance, just change it to rs_YourInstanceName.
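If you’re not sure what the instance portion of the namespace should be, one way to check (a sketch, using the standard WMI __NAMESPACE class) is to list the child namespaces under the ReportServer root:

```powershell
# List the rs_* namespaces so you can see the exact instance name to use
Get-WmiObject -Namespace "root\Microsoft\SqlServer\ReportServer" -Class "__NAMESPACE" |
    Select-Object -ExpandProperty Name
```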

Now that I’ve backed up the key, I need to write it to a file.


$stream = [System.IO.File]::Create("C:\SSRS.snk", $key.KeyFile.Length)
$stream.Write($key.KeyFile, 0, $key.KeyFile.Length)
$stream.Close()

The last thing I do is to copy the file to a second, safe location.


Copy-Item "C:\SSRS.snk" "\\DEVSERVER\SQLBackups"

And that’s all there is to it. I can use the Report Server Configuration Manager to restore the key when I need to.


Now I have this script on all of our report servers, and it runs after we promote new reports. There are a few things I’ve been meaning to add, like deleting the old file before I create the new one, and error handling. I’ll be putting those into v2.
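Those v2 additions can be sketched roughly like this, reusing the same query, namespace, and file names as the script above. The minimal try/catch and the HRESULT check are my assumptions about what "error handling" should look like here; adjust to taste.

```powershell
try {
    # Remove last run's key file before writing the new one
    if (Test-Path "C:\SSRS.snk") { Remove-Item "C:\SSRS.snk" -Force }

    $results = Invoke-Sqlcmd -Query "SELECT pwd FROM AdminDB.dbo.SSRS" -ServerInstance "Java"
    $SSRSClass = Get-WmiObject -Namespace "root\microsoft\sqlserver\reportserver\rs_mssqlserver\v10\admin" -Class "MSReportServer_ConfigurationSetting"
    $key = $SSRSClass.BackupEncryptionKey($results.pwd)

    # An HRESULT of 0 means the WMI method call succeeded
    if ($key.HRESULT -ne 0) { throw "BackupEncryptionKey failed with HRESULT $($key.HRESULT)" }

    $stream = [System.IO.File]::Create("C:\SSRS.snk", $key.KeyFile.Length)
    $stream.Write($key.KeyFile, 0, $key.KeyFile.Length)
    $stream.Close()

    Copy-Item "C:\SSRS.snk" "\\DEVSERVER\SQLBackups"
}
catch {
    Write-Error "Encryption key backup failed: $_"
}
```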

Monitoring SSRS Report Performance – Part 4


This is the last post in my series on monitoring performance on a server hosting Microsoft’s SQL Server Reporting Services. In Part 1 I showed a way to monitor reports that are running. In Part 2 I shared a few queries to see historical report performance. In Part 3 I discussed building an SSAS cube that lets me analyze the performance in Excel or SSRS reports. Today I’m going to look at the report server itself.

A good place to start is with the report server performance counters. Some of the ones I keep a close eye on are Requests Queued, Requests Executing, Requests Rejected, and Tasks Queued. Together these counters give me a good picture of how active the server is; continuously high values in any of them may tell me that some reports are running sluggishly. Another similar counter is Active Connections. Ordinarily this number is pretty small, but I’ve seen cases where a high number of active connections caused CPU and memory issues. Though I haven’t been able to pinpoint why it happens, at least I get a heads up that it’s about to. Memory Pressure State is also useful, and I’ll discuss that a little more later. You can view the full list of available counters on Books Online.
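You can also sample these counters from PowerShell instead of opening Performance Monitor. This is just a sketch: the "ReportServer:Service" counter set name below is what I’d expect for a native-mode SQL 2008 default instance, but verify the exact object and counter names on your server first (Get-Counter -ListSet * will show them).

```powershell
# Sample a couple of SSRS service counters every 15 seconds, 4 times
$counters = "\ReportServer:Service\Active Connections",
            "\ReportServer:Service\Memory Pressure State"
Get-Counter -Counter $counters -SampleInterval 15 -MaxSamples 4 |
    ForEach-Object { $_.CounterSamples | Select-Object Path, CookedValue }
```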

Second, look at the report server error logs. Any time a report runs, it’s logged in these files. Usually the entry just records the fact that the report was processed, but if you see reports that show a status other than rsSuccess in the ExecutionLog2 view, the reason is probably here as well. In this screenshot I can identify a report that probably didn’t render fully because there’s a data type mismatch, most likely in the data source of the report. Then I can check other times this report ran to see if there were similar errors, which could help in determining whether the fault is with the report or with the data for the time this report ran. Remember that you can see the report parameters back in the ExecutionLog2 view.
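When I want a quick look for problems across the logs without opening each file, a simple scan works. A sketch: the directory below is the default log location for a SQL 2008 default instance, so adjust the path for your install.

```powershell
# Find ERROR entries in the three most recent SSRS trace logs
$logDir = "C:\Program Files\Microsoft SQL Server\MSRS10.MSSQLSERVER\Reporting Services\LogFiles"
Get-ChildItem $logDir -Filter "ReportServerService_*.log" |
    Sort-Object LastWriteTime -Descending |
    Select-Object -First 3 |
    Select-String -Pattern "ERROR" |
    Select-Object Filename, LineNumber, Line
```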

Ted Krueger, a SQL MVP who writes for LessThanDot (Blog | Twitter), wrote an excellent post on the logs here. And Books Online also shows you how to control the logs in the Server Trace Log topic: what info to store, how long to store it, and where the files can be found.

Finally, check out the settings in the rsreportserver.config file. Again, Books Online has a breakdown of each option (RSReportServer Configuration File). Some of the settings here can really be helpful in getting the best performance out of your SSRS server. Remember the Memory Pressure State performance counter I mentioned earlier? Here is where you can define what memory pressure is. There are some great articles online about what each option is used for. One of the best (Exploring the RSReportServer.config File) was written by Jes Schultz Borland (Blog | Twitter) for SQL University.

I hope I was able to help out a bit. Definitely check out the links I’ve provided if you need more information. All those authors have written about SSRS much better than I can.

Other Resources

There’s another very useful post on Reporting Services Performance Optimizations on the SQLCAT site. Especially helpful is the section on Memory Configurations on both SQL 2008 and SQL 2005 instances.

Also, check out Configuring Available Memory for Report Server Applications in Books Online.

Monitoring SSRS Report Performance – Part 3

In Part 1 I discussed building a small VB.NET project to see what reports were currently running on a report server. In Part 2 I talked about the ExecutionLog2 view to show how long individual queries ran. Now in Part 3 I’m going to show how I look at performance using SSIS, SSAS, SSRS, and Excel 2010.

If you’ve read some of my older posts you may have run across a few where I mention using SSIS to collect data from the report servers. Well, I finally got around to building my package. I ran across a few excellent examples on how to do this, and the links are in the References section at the end of this post. I borrowed some from each and came up with what I began using a few weeks ago. My whole solution isn’t complete; I’ve left some features out of the SSAS cube for now.

The design of my data warehouse is fairly simple. I built a fact table based on the ExecutionLog2 view and use execution, pagination, and processing results as my measures. My dimension tables are on servers, reports, users, sources, formats, statuses, and date. My date dimension doesn’t use time, so I can relate TimeStart and TimeEnd only to a date and not a time of day.

My ETL package runs nightly against 2 SQL 2008 SSRS servers. First it collects all the data from the last 24 hours and stores it in a local staging table. Then it loads the data into my fact and dimension tables. I’ve been keeping the staging table for archiving reasons for now, but I’ll probably start archiving soon, as the size of the table is growing pretty fast and I’m going to run out of room.

The final step is the cube I build using SSAS. There’s nothing fancy in the data source or data view; I’m just pointing to my data warehouse database. I did create a named query that points to the staging table, and there are a few simple named calculations that concatenate values. I set the properties on all the dimensions so that I can return the name and not the key values. The cube is a star schema and looks like this:


Inside the cube I defined calculations for Average Data Retrieval, Average Processing Time, and Average Rendering Time. I partitioned the fact table by month, and right now I have 7 partitions going back to June of this year. Then I used the wizard to build 11 aggregations. Finally I created a drilldown action that will show individual report execution details when I click on a cell. I haven’t set up any KPIs yet; that will be in the next version.

So now, I can query the cube in SSMS. Here I’m looking at last month’s data by server.


Using the drilldown action I defined, I can look at some of the report execution log details. This screenshot hides some columns; there are more than I display here.


Or I can use Excel to connect to the cube and view the data. There I can set up any slicers I want. This one is for the report server.


And I can also see the data by running a report. This example shows average report times for November 2011, sorted by the time it took to execute the report, and displaying the ones that took over a minute in red.


As I said, this is still a work in progress. I’ve got more I want to play with, especially using MDX to query SSAS. But it should help me better understand Microsoft’s Business Intelligence stack, and also help when I finally get around to taking the 70-448 exam.

I’ll have one final post in the series, on monitoring performance on the server itself.

References

There are two very good examples on building SSIS packages for report server performance data. You can find the first on MSSQL Tips. The examples there are for SQL 2005, but they should work on SQL 2008 with just minor tweaks.

http://www.mssqltips.com/sqlservertip/1908/analyze-report-execution-and-usage-statistics-in-sql-server-reporting-services/

The second example is on Codeplex. This version was built for SQL 2008 and the download contains code to build reports against the data.

http://scrubs.codeplex.com/releases/view/36223

Monitoring SSRS Report Performance: Part 2

In part 1 of this series I discussed how I built a small VB.NET application that would show me what reports were currently running on our SQL 2008 Report Servers. In this post I’m going to show how to look at the performance of reports after they run.

Reporting Services stores execution history in a table in the ReportServer database called ExecutionLogStorage. The ReportID can be joined to the Catalog table to get the name and path of the report, but Reporting Services also provides a view called ExecutionLog2 that you can use instead.

Most of the column names are self-explanatory, and if you visit the links in the references section you’ll see a full description. The columns I look at most are:

  • Parameters: Here I can see the parameters used for the report execution. This will let me rerun the report the same way the user did, helpful to determine where the problem is.
  • Status: This column shows if the user could view the report with no problems. I’m looking for rsSuccess. If the report was unsuccessful you’ll see why here. Depending on the Status I can look at the report server logs for more information.
  • TimeStart and TimeEnd: These columns show me when the report ran and how long it took. I can see if the report execution happened when the server was under pressure or if it was the only report running at the time.
  • TimeDataRetrieval: This column shows the time, in milliseconds, that the report took to fetch the data. High values here may indicate the problem is with the report data source and not the report server. It’s also nice to compare times when the report has changed.
  • TimeProcessing and TimeRendering: How long did it take the report server to deliver the report to the user? Values here are in milliseconds.
  • ByteCount and RowCount: How big is the report? Bigger reports take longer to process.
  • AdditionalInfo: An XML column with extra details, including information related to the memory use and paging of the report.

Here are two scripts I use to look at the ExecutionLog2 view. The first will show each report execution for the time range I provide. I usually run it once in the morning for a quick look and again when users report performance problems.

DECLARE @TimeStart DATETIME, @TimeEnd DATETIME
SET @TimeStart = '2011-10-24 06:00'
SET @TimeEnd = '2011-10-24 12:00'
SELECT ReportPath
       , UserName
       , [Status]
       , TimeStart
       , TimeEnd
       , DATEDIFF(ss,TimeStart,TimeEnd) AS ElapsedSec
       , TimeDataRetrieval /1000.00 AS TimeDataRetrieval
       , TimeProcessing /1000.00 AS TimeProcessing
       , TimeRendering /1000.00 AS TimeRendering
       , ByteCount
       , [RowCount]
       , ISNULL(AdditionalInfo.value('(/AdditionalInfo/ScalabilityTime/Pagination)[1]', 'int'),0) AS PaginationScalabilityTime
       , ISNULL(AdditionalInfo.value('(/AdditionalInfo/ScalabilityTime/Processing)[1]', 'int'),0) AS ProcessingScalabilityTime
       , ISNULL(AdditionalInfo.value('(/AdditionalInfo/ScalabilityTime/Pagination)[1]', 'int'),0) +
         ISNULL(AdditionalInfo.value('(/AdditionalInfo/ScalabilityTime/Processing)[1]', 'int'),0) AS TotalScalabilityTime
       , ISNULL(AdditionalInfo.value('(/AdditionalInfo/EstimatedMemoryUsageKB/Pagination)[1]', 'bigint'),0) AS PaginationEstimatedMemUsageKB
       , ISNULL(AdditionalInfo.value('(/AdditionalInfo/EstimatedMemoryUsageKB/Processing)[1]', 'bigint'),0) AS ProcessingEstimatedMemUsageKB
       , ISNULL(AdditionalInfo.value('(/AdditionalInfo/EstimatedMemoryUsageKB/Pagination)[1]', 'bigint'),0) +
         ISNULL(AdditionalInfo.value('(/AdditionalInfo/EstimatedMemoryUsageKB/Processing)[1]', 'bigint'),0) AS TotalEstimatedMemUsageKB
FROM ReportServer.dbo.ExecutionLog2
WHERE TimeStart BETWEEN @TimeStart AND @TimeEnd
ORDER BY TimeStart

The second script is similar to the first except that it shows report averages for the desired time frame. I usually run this once a month to get a baseline, then again at certain high-volume times.

DECLARE @TimeStart DATETIME, @TimeEnd DATETIME

SET @TimeStart = '2011-09-01 00:00' 
SET @TimeEnd = '2011-10-01 00:00' 

SELECT
      ReportPath
    , COUNT(ExecutionId) AS NumReports
    , AVG(DATEDIFF(ss,TimeStart,TimeEnd)) AS AvgElapsedSec
    , AVG(TimeDataRetrieval /1000.00) AS AvgTimeDataRetrieval
    , AVG(TimeProcessing /1000.00) AS AvgTimeProcessing
    , AVG(TimeRendering /1000.00) AS AvgTimeRendering
    , AVG(ByteCount) AS AvgByteCount
    , AVG([RowCount]) AS AvgRowCount
    , AVG(ISNULL(AdditionalInfo.value('(/AdditionalInfo/ScalabilityTime/Pagination)[1]', 'int'),0)) AS AvgPaginationScalabilityTime
    , AVG(ISNULL(AdditionalInfo.value('(/AdditionalInfo/ScalabilityTime/Processing)[1]', 'int'),0)) AS AvgProcessingScalabilityTime
    , AVG(ISNULL(AdditionalInfo.value('(/AdditionalInfo/EstimatedMemoryUsageKB/Pagination)[1]', 'bigint'),0)) AS AvgPaginationEstimatedMemUsageKB
    , AVG(ISNULL(AdditionalInfo.value('(/AdditionalInfo/EstimatedMemoryUsageKB/Processing)[1]', 'bigint'),0)) AS AvgProcessingEstimatedMemUsageKB
 FROM ReportServer.dbo.ExecutionLog2
 WHERE TimeStart BETWEEN @TimeStart AND @TimeEnd
 GROUP BY ReportPath

I’ll have one final post where I look at report history with SSIS and SSAS.

Reference:

Robert Bruckner has a great blog on Report Server performance. This post is a great description of the ExecutionLog2 view.

ExecutionLog2 View – Analyzing and Optimizing Reports

Monitoring SSRS report performance: Part 1

One of my self-assigned tasks is to monitor the performance of our reports. It’s a good way to see how reports behave over time, or what effect changes to a report have. There are a few methods I use, from seeing what report is running right now to how that report compares to other report runs.

Reporting Services saves report execution statistics in a table called ExecutionLog in the ReportServer database on your report server. In SQL 2008 there’s also a view called ExecutionLog2 that does a few joins and shows the values in a way that may make more sense. I’m not going to go into too much detail about that view or table here, they’ve been written up pretty well in posts by others. See the references section at the end for a few helpful links.

The only problem with the ExecutionLog table is that it doesn’t show a row for a report until the report execution is complete, so it doesn’t help you see what report is running at a particular time. There are times when the report server is under load and I want to track what reports are running at that exact moment. To do that I wrote a small VB.NET program that shows me what’s running anytime I choose. I wrote mine in Visual Studio 2010, but it could be created in earlier versions, and you can easily convert this to C# if you wish.

To start, create a new Windows Forms project called Current Reports and create 2 forms: set one up to enter login information, and on the other drop a data grid with a label for each report server you want to monitor, a timer control, and a refresh button. Next create a web reference to a SQL 2008 report server and call it ReportService:

http://localhost/ReportServer_Reporting/ReportService2005.asmx

Then set a reference to the report service, one for each server you’re monitoring. Pass in the login information from your login form.

Imports Current_Reports.ReportService

Public rsServer1 As New ReportingService2005

rsServer1.Url = "http://localhost:85/ReportServer_Reporting/ReportService2005.asmx"
rsServer1.Credentials = New Net.NetworkCredential(UserName, Password, Domain)

Finally, write a procedure to fetch the information you want to see. I call the procedure when I first load the form, again when the timer fires (I set the timer interval to 30 seconds), or on demand when I click the refresh button.

Public Sub ListRunningJobs()
    dgvServer1.Rows.Clear()

    Dim Server1Jobs As Job() = Nothing
    Server1Jobs = rsServer1.ListJobs()
    Dim job1 As Job

    For Each job1 In Server1Jobs
        ' New is a VB keyword, so the enum value needs to be escaped with brackets
        If job1.Status = JobStatusEnum.Running Or job1.Status = JobStatusEnum.[New] Then
            Dim strValues(3) As String
            strValues(0) = job1.Name
            strValues(1) = job1.JobID
            strValues(2) = job1.StartDateTime.ToString()
            strValues(3) = job1.User
            dgvServer1.Rows.Add(strValues)
        End If
    Next job1

    Me.Text = "Currently Running Reports as of " & Date.Now.ToString

End Sub  'ListRunningJobs

This code just checks the server for any running jobs. For each job it finds, it returns the report name, the job ID, when it started, and who’s running it. It also shows the time the job info was fetched, so you can see if there are any reports that have been running for a long time. You need to duplicate this for each server you want to monitor; I’m monitoring two, so that’s why I used two data grids.


In Part 2 I’ll describe how I check the execution logs.

References:

Robert Bruckner has a great blog on Report Server performance. This post is a great description of the ExecutionLog2 view.

ExecutionLog2 View – Analyzing and Optimizing Reports