In Part 1 I discussed building a small VB.NET project to see what reports were currently running on a report server. In Part 2 I talked about the ExecutionLog2 view to show how long individual queries ran. Now in Part 3 I’m going to show how I look at performance using SSIS, SSAS, SSRS, and Excel 2010.
If you’ve read some of my older posts you may have run across a few where I mention using SSIS to collect data from the report servers. Well, I finally got around to building my package. I ran across a few excellent examples on how to do this, and the links are in the References section at the end of this post. I borrowed some from each and came up with what I began using a few weeks ago. My whole solution isn’t complete; I’ve left some features out of the SSAS cube for now.
The design of my data warehouse is fairly simple. I built a fact table based on the ExecutionLog2 view and use the execution, pagination, and processing results as my measures. My dimension tables cover servers, reports, users, sources, formats, statuses, and dates. My date dimension doesn't include time, so I can relate TimeStart and TimeEnd only to a date, not a time of day.
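Stripped down, the layout looks roughly like this (a simplified sketch; the table and column names are illustrative rather than my exact schema):

```sql
-- Simplified sketch of the fact table; the dimension tables (DimServer,
-- DimReport, etc.) each hold a surrogate key plus the descriptive name.
CREATE TABLE dbo.FactReportExecution (
    ServerKey         int NOT NULL REFERENCES dbo.DimServer (ServerKey),
    ReportKey         int NOT NULL REFERENCES dbo.DimReport (ReportKey),
    UserKey           int NOT NULL REFERENCES dbo.DimUser   (UserKey),
    SourceKey         int NOT NULL REFERENCES dbo.DimSource (SourceKey),
    FormatKey         int NOT NULL REFERENCES dbo.DimFormat (FormatKey),
    StatusKey         int NOT NULL REFERENCES dbo.DimStatus (StatusKey),
    DateKey           int NOT NULL REFERENCES dbo.DimDate   (DateKey),
    -- Measures carried over from ExecutionLog2 (times in milliseconds)
    TimeDataRetrieval int    NOT NULL,
    TimeProcessing    int    NOT NULL,
    TimeRendering     int    NOT NULL,
    ByteCount         bigint NOT NULL,
    [RowCount]        bigint NOT NULL
);
```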
My ETL package runs nightly against two SQL 2008 SSRS servers. First it collects all the data from the last 24 hours and stores it in a local staging table. Then it loads the data into my fact and dimension tables. I've been keeping the staging table around for archiving purposes, but it's growing pretty fast and I'm going to run out of room soon, so I'll probably start archiving it shortly.
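The extract step boils down to a query like this against each server's catalog database (simplified; my real package also tags each row with the server it came from):

```sql
-- Pull the last 24 hours of activity from the ExecutionLog2 view
-- on the SSRS catalog database into the local staging table.
SELECT  InstanceName, ReportPath, UserName, Format, Source, Status,
        TimeStart, TimeEnd,
        TimeDataRetrieval, TimeProcessing, TimeRendering,
        ByteCount, [RowCount]
FROM    ReportServer.dbo.ExecutionLog2
WHERE   TimeStart >= DATEADD(HOUR, -24, GETDATE());
```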
The final step is the cube I build using SSAS. There's nothing fancy in the data source or the data source view; I'm just pointing to my data warehouse database. I did create a named query that points to the staging table, and there are a few simple named calculations that concatenate values. I set the properties on all the dimensions so that queries return names rather than key values. The cube is a star schema and looks like this:
Inside the cube I defined calculations for Average Data Retrieval, Average Processing Time, and Average Rendering Time. I partitioned the fact table by month, and right now I have seven partitions going back to June of this year. Then I used the wizard to build 11 aggregations. Finally, I created a drilldown action that shows individual report execution details when I click on a cell. I haven't set up any KPIs yet; that will be in the next version.
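The averages are simple calculated members in the cube script. Average Data Retrieval, for example, looks something like this (the measure names are illustrative and will match whatever you named the underlying measures):

```mdx
// Sketch of the Average Data Retrieval calculated member.
// Guard against divide-by-zero when a cell has no executions.
CREATE MEMBER CURRENTCUBE.[Measures].[Average Data Retrieval]
AS  IIF([Measures].[Execution Count] = 0,
        NULL,
        [Measures].[Time Data Retrieval] / [Measures].[Execution Count]),
FORMAT_STRING = '#,##0',
VISIBLE = 1;
```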
So now I can query the cube in SSMS. Here I'm looking at last month's data by server.
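The MDX behind that is along these lines (cube, dimension, and member names here are illustrative; yours will match your own design):

```mdx
// Last month's execution counts and averages, broken out by server.
SELECT
    { [Measures].[Execution Count],
      [Measures].[Average Data Retrieval] } ON COLUMNS,
    NON EMPTY [Server].[Server Name].[Server Name].MEMBERS ON ROWS
FROM [Report Executions]
WHERE ( [Date].[Month].&[201111] )
```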
Using the drilldown action I defined, I can look at some of the report execution log details. This screenshot hides some columns; there are more than I display here.
Or I can use Excel to connect to the cube and view the data. There I can set up any slicers I want. This one is for the report server.
And I can also see the data by running a report. This example shows average report times for November 2011, sorted by the time it took to execute the report, and displaying the ones that took over a minute in red.
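The red highlighting is just a color expression on the execution-time text box, something along these lines (the field name comes from my dataset and will vary):

```vb
' Show report execution times over one minute in red.
' Times from ExecutionLog2 are stored in milliseconds.
=IIf(Fields!AvgTimeExecution.Value > 60000, "Red", "Black")
```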
As I said, this is still a work in progress. I've got more I want to play with, especially using MDX to query SSAS. But it should help me better understand Microsoft's Business Intelligence stack, and it should also help when I finally get around to taking the 70-448 exam.
I’ll have one final post in the series, on monitoring performance on the server itself.
There are two very good examples on building SSIS packages for report server performance data. You can find the first on MSSQL Tips. The examples there are for SQL 2005, but they should work on SQL 2008 with just minor tweaks.
The second example is on Codeplex. This version was built for SQL 2008 and the download contains code to build reports against the data.