Working with a SQL Report Server Part 3 – Configuration Manager

This is the third in a series of posts on administering SQL Server Reporting Services. In Part 1 we went over the basic installation of SSRS in native mode. Part 2 covered the components that were installed. This post will discuss configuring SSRS using the Reporting Services Configuration Manager, a graphical tool for setting up your SSRS instance.

When we installed our new instance we chose the Install and Configure option with the default settings. Let’s take a look to see what they are. Open the Reporting Services Configuration Manager and connect to your instance. I’m using SQL 2014 Developer, but the tool is similar for other editions and versions.


Once you’re connected you’ll see the status of the service and instance information. You can start SSRS if it isn’t already.


Next, you can set up the account that Reporting Services runs under. Since we chose the default settings during our installation the service is running under the new ReportServer account. You can change to another built-in account or any other domain account.


Configure the Web Service URL on the next screen. Our default settings created a virtual directory called ReportServer; you can create a different one if you want. Also, all IP addresses on the server are assigned using port 80. If you click the Advanced button you can set a single address and change the port number. There’s also an option to use an SSL certificate if you’ve installed one.


The screen will display the web service URL when you’re finished. Clicking it will open the web service in your browser. At this point there’s probably not much to see; later you’ll see any folders and reports you’ve deployed.


The next screen is where you set up the databases. SSRS requires two: one to store reports and one for session data. The defaults are ReportServer and ReportServerTempDB, but you can create new ones if you like. This is helpful if you’ve restored databases from an existing report server. You can also change the account SSRS uses to connect to the databases. Again, the default is the ReportServer account.


Continuing on, the next step is to configure the Report Manager. It’s similar to setting up the web service.


And like the web service, nothing to see until you deploy some reports.


If you’re going to send reports via email subscriptions you need to set up an email account for SSRS first.


Some of your reports may need to open images stored on the network somewhere. Or you may have reports that need to connect to data sources that don’t require authentication. For those cases you’ll need to configure a default Execution Account. Specify a domain account that has the privileges you’ll need. Note that this step is optional.


SSRS uses a symmetric key to encrypt sensitive data, such as passwords used by reports. You should back up the key periodically. One way is through the Reporting Services Configuration Manager. You can also use the rskeymgmt command line tool, which I’ll discuss in a later post. I’ll also show you a way to schedule a backup using PowerShell.


I’m not going to discuss Scale-out Deployment here; that’s a much bigger topic.

In the next post I’ll talk about configuring SSRS through its config files.


Reporting Services Configuration Manager (SSRS Native Mode)

Working with a SQL Report Server Part 2 – What you get

This is my second post in a series that will discuss SQL Server Reporting Services from an administrator’s standpoint. I hope to show you how to work with SSRS, how to configure your instance, and what to look for when troubleshooting. None of my posts will talk about report design (except some general tips), SharePoint mode, or scale-out installations. Part 1 walked through a default setup using native mode.

Before we start discussing how to configure a report server, let’s look at what is installed when you choose the native mode default configuration.

Report Server service

This is just what it sounds like: the actual SSRS service, which processes report requests. It also includes the Report Manager and Report Server web services for managing your reports, and the URL addresses for both. A Network Service account is created for the SSRS service to run under, but you can change this later.


I mentioned in my first post that SQL will install two databases. The first is the ReportServer database, where SSRS stores the report files. You’ll also see report subscriptions, snapshots, report history, encryption keys, and other server settings there.

The second database is called ReportServerTempDB. Session data is stored here, as well as cached reports and other temporary objects.

We’ll talk much more about these databases when discussing server performance.


The Reporting Services Configuration Manager is the pretty GUI way to make configuration changes. If you chose the Install and Configure option when installing SSRS, it will be configured with all the defaults. But you can still come here to make changes, or just to see what those defaults are. I’ll cover using the Configuration Manager in Part 3.

SSRS also installs three command line tools to help deploy reports and manage your server. Use rs.exe to run scripts that deploy reports to the report server. rsconfig.exe encrypts and stores account information in the RSReportServer.config file, and rskeymgmt.exe is used to manage the encryption key. I’ll cover these utilities when I talk about configuration.
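As a hedged preview of how these tools are invoked, here are sketch examples; the server URL, file names, account names, and passwords are all placeholders, and switches vary slightly between versions, so check each tool’s -? help for yours:

```powershell
# Run a deployment script file against a report server with rs.exe
# (PublishReports.rss is a placeholder script name)
rs.exe -i PublishReports.rss -s http://localhost/ReportServer

# Store encrypted database connection credentials with rsconfig.exe
rsconfig.exe -c -s localhost -d ReportServer -a Windows -u MYDOMAIN\ReportUser -p P@ssw0rd

# Back up the symmetric encryption key to a password-protected file with rskeymgmt.exe
rskeymgmt.exe -e -f C:\Backup\ssrs.snk -p P@ssw0rd
```

These all run fine from a PowerShell or cmd prompt on the report server itself.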

I haven’t mentioned any of the tools for designing reports since this series is only covering the administration side of SSRS.

In the next post we will walk through using the Report Server Configuration Manager.


Install Reporting Services Native Mode Report Server (SSRS)

Reporting Service Tools

Working with a SQL Report Server Part 1 – Installing


I first heard about SQL Server Reporting Services (SSRS) when it was announced as an add-on to SQL 2000. If my memory is correct, it was originally presented as a beta product, something to be added in a future release. But the clamor for it was so great that Microsoft released it as a separate product. When SQL 2005 was released SSRS was part of the installation, along with Analysis Services (SSAS) and Integration Services (SSIS).

I started playing with the original beta, mainly just writing and publishing simple reports to see how easy it was to work with. My company started using SSRS with the SQL 2005 release for a few new reports we were developing. Later we upgraded to SQL 2008 as a replacement for the reporting engine we were using at the time (don’t laugh, it was MS Access 97).

But there was a learning curve. It wasn’t just a matter of installing the service and publishing reports. I had to learn how to troubleshoot the server, not just from a SQL standpoint. I had to find out how to identify slow-running reports, why they were running so slowly, and how to optimize the server to run smoothly. In this series I would like to share what I’ve learned along the way. I’ve written about some of the same topics a few years back, but this time I want to tie them all together and cover some things I skipped over earlier. I hope you find these posts helpful. And please, feel free to add your own experiences in the comments section.

This first post covers installing SSRS. It’s a basic task you’re probably already familiar with, but not everyone is. I should also mention that I’m only going to talk about the native configuration, saving the SharePoint side of things for another day. I also won’t discuss setting up a server farm here, again that will be covered later.

Setting Up The Scenario

You’re the DBA at your company, and you’re in charge of keeping the production SQL servers humming along. You know how to maintain those servers: run DBCC CHECKDB on a regular basis, back up the databases, check for bottlenecks that affect performance, help teammates tune their queries, and handle any other tasks that come up. One day your boss calls you in for a talk.

“DBA,” your boss says, “our customers are complaining about their reports running slowly. Sometimes they can’t even connect when we know the server is running. I want you to see what’s happening on those report servers. They’re just another SQL instance after all, so you’re just the person to check it out and fix it. Oh, and we need you to set up another SSRS instance. It’ll come in handy later.”

“I’m on it. Should be a piece of cake,” you reply, and you head back to your desk to get started on installing the new server.

The Installation

For this installation I’m going to use the Developer edition of SQL 2014, but SSRS is not an Enterprise-only feature; you can also use Standard. The process isn’t much different for SQL 2008 R2 or SQL 2012, where the Reporting Services for SharePoint feature was also a separate install.

The server you’re installing on should meet the same hardware and software requirements as the SQL edition you’re planning on using. You will need a SQL instance to install SSRS on, as the service requires two databases. You can add SSRS as a feature to an existing instance or install a new instance of the database engine. I’m going to choose a new instance.


Next, choose the features you want for your new instance. You will need to include the database engine, as SSRS will create two new databases. For our new server I’m also going to choose Reporting Services – Native and the Management Tools.

Continue with your install and you’ll finally get to the Reporting Services Configuration screen. You’ll see two choices under Native Mode: Install and Configure, and Install Only. The difference is that Install and Configure will create the two databases and set up SSRS with all the default settings. Whichever option you choose, you can still configure the service manually later. I’ll have much more on configuration in my next post.

We didn’t choose to integrate with SharePoint, so that option isn’t available. For this instance we’ll choose Install and Configure and complete the SQL instance installation.


And that’s just about all that’s needed to get an SSRS instance up and running quickly. If you open Management Studio and connect to your new SQL instance you’ll see the two new databases SSRS will be using, ReportServer and ReportServerTempDB. Both will be covered in detail in a separate post. For now, briefly: the ReportServer database is where the reports live, and ReportServerTempDB stores session information for SSRS.
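If you’d rather verify from a query than by browsing Object Explorer, a quick sketch like the following works; the instance name is a placeholder, and Invoke-Sqlcmd ships with the SQL Server management tools:

```powershell
# Quick sanity check: list the SSRS databases created on the new instance
Invoke-Sqlcmd -ServerInstance "localhost" `
    -Query "SELECT name FROM sys.databases WHERE name LIKE 'ReportServer%'"
```

You should see both ReportServer and ReportServerTempDB in the results.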



Next week I’ll discuss SSRS configuration.

Backing up SSRS Encryption Key with PowerShell

A few months back I wrote a post about how I wanted a way to automate backing up the encryption key used by our SQL 2008 reporting servers. I thought that PowerShell would be the way to go, but I ran into a few issues. First, I couldn’t see a way to use PowerShell to access the key. I could always use rskeymgmt from a command line, but I’d have to feed in a Y since I couldn’t suppress the prompt. I found a way to do that, but I didn’t want the password to be saved inside a batch file. And I wanted to copy the key file to a second location, in case I needed it to restore a report server. You can read my earlier post here.

One day, while I had my twitter feed open on my desktop, I saw a tweet from Laerte Junior (Blog | Twitter). In it he mentioned a post he wrote about accessing the report server WMI objects. And finally everything clicked into place.

The first thing I did was to install the SQLPSX modules on our report servers. The modules are available on CodePlex and you can download them here. Next I created a profile for all users to load the SQL Server snap-ins, since I use Invoke-Sqlcmd to query a database.

Add-PsSnapin SqlServerCmdletSnapin100
Add-PsSnapin SqlServerProviderSnapin100

The final step was to create a table inside a database to store a password to use for the encryption key file. I figured I could easily control access to that password to only those who needed to know it. Now I was ready to create a script to do everything I wanted.

Getting the password out and storing it in a variable was easy:

$results = Invoke-Sqlcmd -Query "SELECT pwd FROM AdminDB.dbo.SSRS" -ServerInstance "Java"

Thanks to Laerte Junior I knew what namespace, class and method to use. I simply needed to pass in the password stored in my variable.

$SSRSClass = Get-WmiObject -Namespace "root\microsoft\sqlserver\reportserver\rs_mssqlserver\v10\admin" -Class "MSReportServer_ConfigurationSetting"

$key = $SSRSClass.BackupEncryptionKey($results.pwd)

One thing to note: the server in my example is a default instance named “JAVA”, so when I reference the namespace I use “rs_mssqlserver”. If you’re using a named instance, just change it to rs_YourInstanceName.

Now that I’ve backed up the key I need to create the file.

$stream = [System.IO.File]::Create("C:\SSRS.snk", $key.KeyFile.Length)
$stream.Write($key.KeyFile, 0, $key.KeyFile.Length)
$stream.Close()

The last thing I do is to copy the file to a second, safe location.

Copy-Item "C:\SSRS.snk" "\\DEVSERVER\SQLBackups"

And that’s all there is to it. I can use the Reporting Services Configuration Manager to restore the key when I need to.


Now I have this script on all of our report servers, and it runs after we promote new reports. There are a few things I’ve been meaning to add, like deleting the old file before I create the new one, and I should put in error handling. I’ll be putting that into v2.
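For what it’s worth, here’s a sketch of what that v2 might look like, folding the steps above into one script with the old-file cleanup and some basic error handling. The HRESULT check is my assumption about the WMI method’s return object; the server names and paths are the same placeholders as above:

```powershell
# Hypothetical v2 sketch: delete the old key file first, wrap everything in error handling
try {
    if (Test-Path "C:\SSRS.snk") { Remove-Item "C:\SSRS.snk" }

    # Fetch the key file password from the admin database
    $results = Invoke-Sqlcmd -Query "SELECT pwd FROM AdminDB.dbo.SSRS" -ServerInstance "Java"

    # Back up the encryption key through the report server WMI class
    $SSRSClass = Get-WmiObject -Namespace "root\microsoft\sqlserver\reportserver\rs_mssqlserver\v10\admin" `
                               -Class "MSReportServer_ConfigurationSetting"
    $key = $SSRSClass.BackupEncryptionKey($results.pwd)
    if ($key.HRESULT -ne 0) { throw "BackupEncryptionKey failed with HRESULT $($key.HRESULT)" }

    # Write the key bytes to disk and copy the file to a second location
    $stream = [System.IO.File]::Create("C:\SSRS.snk", $key.KeyFile.Length)
    $stream.Write($key.KeyFile, 0, $key.KeyFile.Length)
    $stream.Close()
    Copy-Item "C:\SSRS.snk" "\\DEVSERVER\SQLBackups"
}
catch {
    Write-Error "Encryption key backup failed: $_"
}
```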

Monitoring SSRS Report Performance – Part 4

This is the last in my series on monitoring performance on a server hosting Microsoft’s SQL Server Reporting Services. In Part 1 I showed a way to monitor reports that are currently running. In Part 2 I shared a few queries to see historical report performance. In Part 3 I discussed building an SSAS cube that lets me analyze performance in Excel or SSRS reports. Today I’m going to look at the report server itself.

A good place to start is with the report server performance counters. Some of the ones I keep a close eye on are Requests Queued, Requests Executing, Requests Rejected, and Tasks Queued. Overall these counters give me a good picture of how active the server is. Continuously high values in any of them may tell me that some reports are running sluggishly. Another similar counter is Active Connections. Ordinarily this number is pretty small, but I’ve seen issues where a high number of active connections caused CPU and memory problems. Though I haven’t been able to pinpoint why it happens, at least I get a heads up that it’s about to. Memory Pressure State is also useful, and I’ll discuss that a little more later. You can view the full list of available counters on Books Online.
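To grab those counters without opening Performance Monitor, a PowerShell sketch like this works. The counter set name here is an assumption, since it varies by SSRS version; run Get-Counter -ListSet *report* on your server to find yours:

```powershell
# Sample the SSRS service counters mentioned above, every 30 seconds, ten times
$counters = @(
    '\ReportServer:Service\Requests Queued',
    '\ReportServer:Service\Requests Executing',
    '\ReportServer:Service\Requests Rejected',
    '\ReportServer:Service\Tasks Queued',
    '\ReportServer:Service\Active Connections',
    '\ReportServer:Service\Memory Pressure State'
)
Get-Counter -Counter $counters -SampleInterval 30 -MaxSamples 10 |
    ForEach-Object { $_.CounterSamples | Select-Object Path, CookedValue }
```

Pipe the output to Export-Csv if you want to keep a baseline for comparison.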

Secondly, look at the report server error logs. Anytime a report runs it’s logged in these files. Usually it’s just the fact that the report was processed, but if you see reports with a status other than rsSuccess in the ExecutionLog2 view the reason is probably here as well. In this screenshot I can identify a report that probably didn’t render fully because of a data type mismatch, most likely in the data source of the report. Then I can check other times this report ran to see if there were similar errors. That could help in determining if the fault is with the report or with the data for the time this report ran. Remember that you can see the report parameters back in the ExecutionLog2 view.

Ted Krueger, a SQL MVP who writes for LessThanDot (Blog | Twitter), wrote an excellent post on the logs here. And Books Online also shows you how to control the logs in Server Trace Log: what info to store, how long to store it, and where the logs can be found.

Finally, check out the settings in the rsreportserver.config file. Again, Books Online has a breakdown of each option (RSReportServer Configuration File). Some of the settings here can really be helpful in getting the best performance out of your SSRS server. Remember the Memory Pressure State performance counter I mentioned earlier? Here is where you can define what memory pressure is. There are some great articles online about what each option is used for. One of the best (Exploring the RSReportServer.config File) was written by Jes Schultz Borland (Blog | Twitter) for SQL University.

I hope I was able to help out a bit. Definitely check out the links I’ve provided if you need more information. All those authors have written about SSRS much better than I can.

Other Resources

There’s another very useful post on Reporting Services Performance Optimizations on the SQLCAT site. Especially helpful is the section on Memory Configurations for both SQL 2008 and SQL 2005 instances.

Also, check out Configuring Available Memory for Report Server Applications in Books Online.

Monitoring SSRS Report Performance – Part 3

In Part 1 I discussed building a small VB.NET project to see what reports were currently running on a report server. In Part 2 I talked about using the ExecutionLog2 view to show how long individual queries ran. Now in Part 3 I’m going to show how I look at performance using SSIS, SSAS, SSRS, and Excel 2010.

If you’ve read some of my older posts you may have run across a few where I mention using SSIS to collect data from the report servers. Well, I finally got around to building my package. I ran across a few excellent examples on how to do this, and the links are in the References section at the end of this post. I borrowed some from each and came up with what I began using a few weeks ago. My whole solution isn’t complete; I’ve left some features out of the SSAS cube for now.

The design of my data warehouse is fairly simple. I built a fact table based on the ExecutionLog2 view and use execution, pagination, and processing results as my measures. My dimension tables cover servers, reports, users, sources, formats, statuses, and dates. My date dimension doesn’t use time, so I can relate TimeStart and TimeEnd only to a date and not a time of day.

My ETL package runs nightly against two SQL 2008 SSRS servers. First it collects all the data from the last 24 hours and stores it in a local staging table. Then it loads the data into my fact and dimension tables. I’ve been keeping the staging table for archiving reasons for now. I’ll probably start archiving soon, as the size of the table is growing pretty fast and I’m going to run out of room.

The final step is the cube I build using SSAS. There’s nothing fancy in the data source or data source view; I’m just pointing at my data warehouse database. I did create a named query that points to the staging table, and there are a few simple named calculations that concatenate values. I set the properties on all the dimensions so that I can return the names and not the key values. The cube is a star schema and looks like this:


Inside the cube I defined calculations for Average Data Retrieval, Average Processing Time, and Average Rendering Time. I partitioned the fact table by month, and right now I have 7 partitions going back to June of this year. Then I used the wizard to build 11 aggregations. Finally I created a drilldown action that will show individual report execution details when I click on a cell. I haven’t set up any KPIs yet; that will be in the next version.

So now, I can query the cube in SSMS. Here I’m looking at last month’s data by server.


Using the drilldown action I defined, I can look at some of the report execution log details. This screenshot hides some columns; there are more than I display here.


Or I can use Excel to connect to the cube and view the data. There I can set up any slicers I want. This one is for the report server.


And I can also see the data by running a report. This example shows average report times for November 2011, sorted by the time it took to execute the report, and displaying the ones that took over a minute in red.


As I said, this is still a work in progress. I’ve got more I want to play with, especially using MDX to query SSAS. But it should help me better understand Microsoft’s Business Intelligence stack, and also help when I finally get around to taking the 70-448 exam.

I’ll have one final post in the series, on monitoring performance on the server itself.


References

There are two very good examples of building SSIS packages for report server performance data. You can find the first on MSSQL Tips. The examples there are for SQL 2005, but they should work on SQL 2008 with just minor tweaks.

The second example is on Codeplex. This version was built for SQL 2008 and the download contains code to build reports against the data.

Monitoring SSRS Report Performance: Part 2

In Part 1 of this series I discussed how I built a small VB.NET application that would show me what reports were currently running on our SQL 2008 report servers. In this post I’m going to show how to look at the performance of reports after they run.

Reporting Services stores execution history in a table in the ReportServer database called ExecutionLogStorage. The ReportID column can be joined to the Catalog table to get the name and path of the report. Reporting Services also provides a view called ExecutionLog2 that does this for you, so you can use it instead.
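That join can be sketched like this; I’m running it through Invoke-Sqlcmd here, and the server name is a placeholder:

```powershell
# Join ExecutionLogStorage to Catalog to get report names and paths --
# roughly what the ExecutionLog2 view does for you
$query = @"
SELECT c.Name, c.Path, e.TimeStart, e.TimeEnd, e.Status
FROM ReportServer.dbo.ExecutionLogStorage AS e
JOIN ReportServer.dbo.Catalog AS c ON c.ItemID = e.ReportID
ORDER BY e.TimeStart DESC;
"@
Invoke-Sqlcmd -ServerInstance "localhost" -Query $query
```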

Most of the column names are self-explanatory, and if you visit the links in the references section you’ll see a full description. The columns I mostly look at are:

  • Parameters: Here I can see the parameters used for the report execution. This will let me rerun the report the same way the user did, helpful to determine where the problem is.
  • Status: This column shows if the user could view the report with no problems. I’m looking for rsSuccess. If the report was unsuccessful you’ll see why here. Depending on the Status I can look at the report server logs for more information.
  • TimeStart and TimeEnd: These columns show me when the report ran and how long it took. I can see if the report execution happened when the server was under pressure or if it was the only report running at the time.
  • TimeDataRetrieval: This column shows the time, in milliseconds, that the report took to fetch the data. High values here may indicate the problem is with the report data source and not the report server. It’s also nice to compare times when the report has changed.
  • TimeProcessing and TimeRendering: How long did it take the report server to deliver the report to the user? Values here are in milliseconds.
  • ByteCount and RowCount: How big is the report? Bigger reports take longer to process.
  • AdditionalInfo: There’s information here related to the memory usage and paging of the report.

Here are two scripts I use to look at the ExecutionLog2 view. The first will show each report execution for the time range I provide. I usually run it once in the morning for a quick look and again when users report performance problems.

DECLARE @TimeStart DATETIME, @TimeEnd DATETIME

SET @TimeStart = '2011-10-24 06:00'
SET @TimeEnd = '2011-10-24 12:00'

SELECT ReportPath
       , UserName
       , [Status]
       , TimeStart
       , TimeEnd
       , DATEDIFF(ss,TimeStart,TimeEnd) AS ElapsedSec
       , TimeDataRetrieval /1000.00 AS TimeDataRetrieval
       , TimeProcessing /1000.00 AS TimeProcessing
       , TimeRendering /1000.00 AS TimeRendering
       , ByteCount
       , [RowCount]
       , ISNULL(AdditionalInfo.value('(/AdditionalInfo/ScalabilityTime/Pagination)[1]', 'int'),0) AS PaginationScalabilityTime
       , ISNULL(AdditionalInfo.value('(/AdditionalInfo/ScalabilityTime/Processing)[1]', 'int'),0) AS ProcessingScalabilityTime
       , ISNULL(AdditionalInfo.value('(/AdditionalInfo/ScalabilityTime/Pagination)[1]', 'int'),0) +
         ISNULL(AdditionalInfo.value('(/AdditionalInfo/ScalabilityTime/Processing)[1]', 'int'),0) AS TotalScalabilityTime
       , ISNULL(AdditionalInfo.value('(/AdditionalInfo/EstimatedMemoryUsageKB/Pagination)[1]', 'bigint'),0) AS PaginationEstimatedMemUsageKB
       , ISNULL(AdditionalInfo.value('(/AdditionalInfo/EstimatedMemoryUsageKB/Processing)[1]', 'bigint'),0) AS ProcessingEstimatedMemUsageKB
       , ISNULL(AdditionalInfo.value('(/AdditionalInfo/EstimatedMemoryUsageKB/Pagination)[1]', 'bigint'),0) +
         ISNULL(AdditionalInfo.value('(/AdditionalInfo/EstimatedMemoryUsageKB/Processing)[1]', 'bigint'),0) AS TotalEstimatedMemUsageKB
FROM ReportServer.dbo.ExecutionLog2
WHERE TimeStart BETWEEN @TimeStart AND @TimeEnd
ORDER BY TimeStart

The second script is similar to the first, except that it shows report averages for the desired time frame. I usually run it once a month to get a baseline, then again at certain high-volume times.


DECLARE @TimeStart DATETIME, @TimeEnd DATETIME

SET @TimeStart = '2011-09-01 00:00'
SET @TimeEnd = '2011-10-01 00:00'

SELECT ReportPath
    , COUNT(ExecutionId) AS NumReports
    , AVG(DATEDIFF(ss,TimeStart,TimeEnd)) AS AvgElapsedSec
    , AVG(TimeDataRetrieval /1000.00) AS AvgTimeDataRetrieval
    , AVG(TimeProcessing /1000.00) AS AvgTimeProcessing
    , AVG(TimeRendering /1000.00) AS AvgTimeRendering
    , AVG(ByteCount) AS AvgByteCount
    , AVG([RowCount]) AS AvgRowCount
    , AVG(ISNULL(AdditionalInfo.value('(/AdditionalInfo/ScalabilityTime/Pagination)[1]', 'int'),0)) AS AvgPaginationScalabilityTime
    , AVG(ISNULL(AdditionalInfo.value('(/AdditionalInfo/ScalabilityTime/Processing)[1]', 'int'),0)) AS AvgProcessingScalabilityTime
    , AVG(ISNULL(AdditionalInfo.value('(/AdditionalInfo/EstimatedMemoryUsageKB/Pagination)[1]', 'bigint'),0)) AS AvgPaginationEstimatedMemUsageKB
    , AVG(ISNULL(AdditionalInfo.value('(/AdditionalInfo/EstimatedMemoryUsageKB/Processing)[1]', 'bigint'),0)) AS AvgProcessingEstimatedMemUsageKB
 FROM ReportServer.dbo.ExecutionLog2
 WHERE TimeStart BETWEEN @TimeStart AND @TimeEnd
 GROUP BY ReportPath

I’ll have one final post where I look at report history with SSIS and SSAS.


References

Robert Bruckner has a great blog on Report Server performance. This post is a great description of the ExecutionLog2 view.

ExecutionLog2 View – Analyzing and Optimizing Reports

Monitoring SSRS report performance: Part 1


One of my self-assigned tasks is to monitor the performance of our reports. It’s a good way to see how reports behave over time or what effect changes to a report have. There are a few methods I use, from seeing what report is running right now to how that report compares to other report runs.

Reporting Services saves report execution statistics in a table called ExecutionLog in the ReportServer database on your report server. In SQL 2008 there’s also a view called ExecutionLog2 that does a few joins and shows the values in a way that may make more sense. I’m not going to go into too much detail about the view or table here; they’ve been written up pretty well in posts by others. See the references section at the end for a few helpful links.

The only problem with the ExecutionLog table is that it doesn’t show a row for a report until the report execution is complete, so it doesn’t help you see what report is running at a particular time. There are times when the report server is under load and I want to track what reports are running at that exact moment. To do that I wrote a small VB.NET program that will show me what’s running anytime I choose. I wrote mine in Visual Studio 2010 but it could be created in earlier versions, and you could easily convert it to C# if you wish.

To start, create a new Windows Forms project called Current Reports. Create two forms: set one up to enter login information, and on the other drop a data grid with a label for each report server you want to monitor, a timer control, and a refresh button. Next create a web reference to a SQL 2008 report server and call it ReportService:

    http://localhost/ReportServer_Reporting/ReportService2005.asmx

Then set a reference to the report service, one for each server you’re monitoring, and pass in the login information from your login form.

    Imports Current_Reports.ReportService

    Public rsServer1 As New ReportingService2005

    rsServer1.Url = "http://localhost:85/ReportServer_Reporting/ReportService2005.asmx"
    rsServer1.Credentials = New Net.NetworkCredential(UserName, Password, Domain)
Finally, write a procedure to fetch the information you want to see. I call the procedure when I first load the form, again when the timer fires (I set the timer interval to 30 seconds), and on demand when I click the refresh button.

    Public Sub ListRunningJobs()
        dgvServer1.Rows.Clear()

        Dim Server1Jobs As Job() = Nothing
        Server1Jobs = rsServer1.ListJobs()
        Dim job1 As Job

        For Each job1 In Server1Jobs
            If job1.Status = JobStatusEnum.Running Or job1.Status = JobStatusEnum.[New] Then
                Dim strValues(3) As String
                strValues(0) = job1.Name
                strValues(1) = job1.JobID
                strValues(2) = job1.StartDateTime.ToString()
                strValues(3) = job1.User
                dgvServer1.Rows.Add(strValues)
            End If
        Next job1

        Me.Text = "Currently Running Reports as of " & Date.Now.ToString()
    End Sub  'ListRunningJobs

This code just checks the server for any running jobs. For each job it finds, it returns the report name, the job ID, when it started, and who’s running it. It also shows the time the job info was fetched, so you can see if any reports have been running for a long time. You’ll need to duplicate this for each server you want to monitor; I’m monitoring two servers, so that’s why I used two data grids.


In Part 2 I’ll describe how I check the execution logs.


References

Robert Bruckner has a great blog on Report Server performance. This post is a great description of the ExecutionLog2 view.

ExecutionLog2 View – Analyzing and Optimizing Reports