Execution Log Performance and Audit Reports
Objective and Business Scenario
As a basis for our practice examples in both our last and current sessions, we established a Reporting Services business requirement for a group of information consumers at a hypothetical client location. The consumers expressed a need to monitor report activity in their recently implemented Reporting Services installation, for purposes of performance and auditing evaluation. They asked that we work with them to create a database that is populated and maintained with this data for many potential uses. We accepted the project, informing the group that, while we had created numerous relatively elaborate databases of this sort for clients in the past, our initial pilot in this instance would include Execution Log reporting capabilities that could be implemented rapidly and easily, using the tools that accompany the installation of Reporting Services.
We made it clear that we would likely expand the capabilities to many other measures for overall performance and auditing monitoring after our pilot. We explained that beginning with the prefabricated "starter set" would be efficient, because Reporting Services provided a script to construct a basic reporting database to house the Execution Log data that it generates. In addition to the creation script, the bundle included a DTS package to transform the data and to populate (and prospectively update) the new database. Finally, the same "extras" bundle included a set of basic sample reports, which we could run as soon as we had transformed data from the Execution Log into the newly created database.
We pointed out that we
had found this approach to be a "quick win" with regard to getting
basic functionality in place; we also indicated that this was a great time in
the Reporting Services implementation to be making performance and audit
analysis capabilities available - just as we were beginning to write our first
enterprise reports. The sample reports would serve several complementary,
value-added functions, including:
- Provision of a means for the Administrator to see immediately who was performing what actions within the new system;
- Provision of an excellent learning tool for new report authors, whereby they could obtain confirmation that their new reports had executed;
- Presentation of basic report samples, upon which the ultimate users could base requests for customization. (This could save a great deal of time in obtaining requirements from users who, although they knew they needed these reports, may not have known "where to start" in providing a useful specification for their design.)
- Facilitation of optimization of reports and their underlying queries, both during and after creation and at various points as part of ongoing system upkeep and maintenance (tuning might be suggested, for example, as data sources grew in size, as the number of information consumers increased, as security needs changed, and so forth).
With the Execution
Log data source now in place, we are ready to proceed with aligning and
publishing the sample report set to kick off the new reporting efforts. Included
on the Reporting Services installation CD is a group of sample Execution Log
reports. These reports include the following titles:
- Longest Running Reports
- Report Parameters
- Reports by Month
- Reports by User
- Reports Executed by Day
- Report Size
- Report Success Rate
- Today's Reports
The
sample files also include the Visual Studio project (.rptproj) and solution
(.sln) files for the report set, together with a data source connection file
(.rds). The reports and objects are designed to be used with the Execution Log
database we created and populated in our last session. A diagram of this database is shown in Illustration 1.
Illustration 1: Simple Database Diagram (MS Visio) of the New Reporting Database
The Execution Log Database
In our previous session, we discussed the reasons for creating a reporting database rather than simply using the Execution Log in its original state. We then opened and executed the provided table creation script, using MSSQL Server Query Analyzer, to create the schema for our new reporting database, and we loaded and executed the accompanying DTS package to transform the Execution Log data and populate the new database tables.
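As a quick sanity check after the DTS package completes, we can run a couple of simple queries to confirm that the transformation actually populated the new tables. The sketch below assumes the names used by the sample creation script (a database named RSExecutionLog containing an ExecutionLogs table with a TimeStart column); verify these against the schema the script created on your own server before running the queries in Query Analyzer.

-- Hypothetical sanity check of the newly populated reporting database.
-- Names (RSExecutionLog, ExecutionLogs, TimeStart) assume the defaults
-- from the sample creation script; adjust to match your installation.
USE RSExecutionLog
GO

-- Count the execution records transferred by the DTS package
SELECT COUNT(*) AS ExecutionRecords
FROM dbo.ExecutionLogs

-- Show the date range covered by the transferred data
SELECT MIN(TimeStart) AS EarliestExecution,
       MAX(TimeStart) AS LatestExecution
FROM dbo.ExecutionLogs
GO

If the record count comes back as zero, the DTS package has probably not been pointed at the correct source or destination connection, and is worth rechecking before we proceed to the sample reports.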
The transformed data contains details that can assist us in determining report efficiency and frequency of usage, together with myriad details that assist in resource planning, monitoring the activities of users, and so forth. Specific data items that we can obtain from our Execution Log database include the following (a sample query appears after the list):
- The name of the Report Server Instance that handled a given request;
- The Report Identifier;
- The User Identifier;
- The Request Type (either User or System);
- The Rendering Format of the report;
- Any Parameter values used for the execution of a given report;
- The Start and Stop times of a given report process, from which duration is derived;
- The Percentage of Processing Duration spent retrieving the data, processing the report, and rendering the report;
- The Source (including Live, Cache, Snapshot, or History) of report execution;
- The Status of Processing for a given report (indicating either a successful process or the code of the first Error Condition encountered);
- The size of each generated report;
- The number of rows returned from the query (or queries) underlying each report.
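To illustrate how several of these items come together in practice, the following query sketch lists report executions with their durations, sizes, and row counts. The table and column names shown (ExecutionLogs, Reports, Users, TimeStart, TimeEnd, ByteCount, RowCount) are assumptions based on the schema generated by the sample creation script, so confirm them against your own reporting database before use.

-- Hypothetical example: report executions with duration, size, and rows returned.
-- Table and column names assume the schema built by the sample creation script.
SELECT r.Name AS ReportName,
       u.UserName,
       e.TimeStart,
       DATEDIFF(ss, e.TimeStart, e.TimeEnd) AS DurationSeconds,
       e.ByteCount AS ReportSizeBytes,
       e.[RowCount] AS RowsReturned
FROM dbo.ExecutionLogs e
     INNER JOIN dbo.Reports r ON r.ReportID = e.ReportID
     INNER JOIN dbo.Users u ON u.UserID = e.UserID
ORDER BY DurationSeconds DESC

Ordering by the derived duration mirrors the intent of the Longest Running Reports sample; grouping the same data by month or by user, or filtering on the status information, would approximate several of the other sample reports listed earlier.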
Considerations and Comments
Before we can work with the sample reports, we need to take several preparatory steps to establish connectivity. As part of our setup, we will create a Data Source Connection and then upload the report files from the convenient location to which we have relocated them, as well as perform various read and other actions with the sample reports. We will also modify a copy of one report to customize it to fit a specific consumer need, both to get some practice in creating new capabilities within our report set and to perform other actions within Reporting Services.
To complete the procedures we will undertake, you will need the appropriate access and privileges to perform the respective actions at the MSSQL Server 2000 level, as well as within Reporting Services, MSSQL Server Analysis Manager, and the file system. You will also need access to the Reporting Services installation CD, from which we will be copying the sample report files to our local drive.