How to Do SQL Server Performance Trend Analysis Part 2: Storing Performance Data in SQL Server

In this
four-part tutorial, you
will learn how to use NT Server 4.0’s Performance Monitor and
Microsoft Excel to monitor and analyze SQL Server performance.
You will also learn how to use a SQL Server database to store
your Performance Monitor logs. This tutorial assumes that you
already know the basics of using Performance Monitor, Excel, and
of course, SQL Server.

This is part two of the four-part tutorial. This part discusses how to use SQL Server to store Performance Monitor
log data. Part three will show you how to use Microsoft
Excel to analyze the Performance Monitor Data. Part four will
show you how to interpret your results.



In the first part
of this series, we took a look at how you can use NT Server
4.0’s Performance Monitor to collect performance data from your
SQL Servers. While the Performance Monitor tool is great for
collecting performance data, and fair at analyzing it, it is
very poor when it comes to storing the performance data for
later use.

Performance Monitor stores the log data it collects in a proprietary format
that cannot be easily used by other software. But what you can
do is to export the data from the Performance Monitor into the
ASCII format so that it can be imported into other software,
such as SQL Server, for storage.


Why Do We Need to Store Performance Monitor Data?

The whole purpose
of this article is to learn how to do SQL Server performance
trend analysis. When we talk about trend analysis, we are
talking about collecting SQL Server performance data every day,
day after day, storing it, and then analyzing what is
happening over various time periods.

For example, as a
DBA, it is important for me to know how many users are using my
server. In addition, I need to know: Is the number going up or
down? Is the number changing slowly or quickly? Are the users light
or heavy users? If the number of users is going up fast, will
this significantly impact SQL Server’s performance? Unless you
have hard data to review, it is hard to make wise decisions
about what to do to ensure good responsiveness of your SQL
Server for your users.

We want to collect data over long periods of time
in order to provide
the data we need to help identify current performance issues and to
help plan for
the future. Is the current surge in CPU activity due to more
users, or because the same number of users are running more
queries? To answer this question, and hundreds of more questions
like this one, you must have good historical performance data.
Without it, you can only guess as to what’s currently going on.

Predicting the
future, at least as far as hardware needs go, is also
important. Most DBAs just can’t go to their boss and request
that a new server be installed tomorrow to resolve a newly
discovered performance issue. Instead, the DBA needs to try to project
future needs, and before the need actually exists, begin the
often long process of initiating the purchase of new equipment.
This process often requires the DBA to justify the request, and
if good data exists, then the justification process is made much
easier.

Whether you are the DBA
of one SQL Server or of hundreds of SQL Servers, you need to
collect Performance Monitor data, and then store it for easy retrieval.


Microsoft Doesn’t Make It Easy to Store and Analyze Performance Monitor Data

When Microsoft
designed Performance Monitor, they apparently didn’t plan on
people using it to store data for easy later retrieval. While
this article will show you how to store Performance Monitor data in
SQL Server, this task is not as straightforward as it should
be. In fact, once you read this article, you might decide not to
bother because of the hassle.

If you do come to
this conclusion, then you are making a short-sighted mistake.
Yes, it is somewhat of a hassle, but overall, it is worth it.

But what if you
don’t have the time needed to collect and store your Performance
Monitor data? If you don’t have the time, but
you have the money, you may want to consider purchasing third-party
tools to make the task of collecting, storing, and
analyzing Performance Monitor data easier.

While there are a
number of tools on the market that collect Performance Monitor
data, there are two that I am specifically
familiar with: Event Log Monitor from TNT, and AppManager from NetIQ.
Both tools will automatically capture SQL Server performance
data and store it in a SQL Server 7 database.

But if you don’t have
the money for specialized software, then you need to store your
Performance Monitor data manually, as described in this article. As you read
the steps I have outlined below, keep in mind that you don’t
have to do every step exactly as I describe. The key thing to
keep in mind is the big picture; the details of how you move
data from Performance Monitor to SQL Server are not critical.
What is critical is that you save your performance data for
later trend analysis.


Overview of How to Store Performance Monitor Data Using SQL Server 7.0

Below are the
steps you can follow to export SQL Server Performance Monitor
data from Performance Monitor to SQL Server 7. Moving data to
SQL Server 2000 should be very similar, although a few of the
minor details will be different. These steps assume you have
already collected SQL Server Performance log data using the
techniques described in part one of this article.

In brief, here is
what you need to do to export data from Performance Monitor and
import it into SQL Server:

  • Point to the
    log data using Performance Monitor

  • Select the
    counters to export from your log data

  • Export the
    data into the comma-delimited ASCII format

  • Clean up the
    Performance Monitor data using Microsoft Excel

  • One time only,
    create a database to store your performance monitor data

  • Use DTS to
    create a package to import the Excel data into a SQL Server
    table
  • Import the
    Excel spreadsheet data into a SQL Server table


Point to the Log Data Using Performance Monitor

  • Before you can
    export the performance data from Performance Monitor, you
    must first point to the log that contains the data you want
    to store in SQL Server.


  • To do this,
    start Performance Monitor and select the Graph Mode if it
    is not already selected. Next, select "Options"
    from the drop-down menu and then select "Data
    From". This displays the "Data From" dialog box.


  • In the
    "Data From" dialog box, click on the "Log
    File" radio button, then click on the browse button
    (the button with three periods). This displays the
    "Open Input Log File" box. Here find and select
    the log file you created earlier, then click on the
    "Open" button. You are returned to the "Data
    From" dialog box. Now click on the "OK" button. Now,
    Performance Monitor knows where the log file is and can work
    with it.


Select the Counters to Export from Your Log Data

  • Your next step
    is to select the counters that you want to store in SQL
    Server for later trend analysis. As was discussed in part
    one of this four-part series, data for every
    counter is collected for every object you selected when
    creating your log file, even if you didn’t want to monitor or
    analyze all the counters
    for all the objects you selected. This means your next step is to select only
    those counters you want to store in SQL Server, while
    ignoring all the rest. While you can attempt to store all
    the counters for all the objects in SQL Server, you will
    probably end up with too much data, most of which will
    probably never be used.


  • The way you
    select only the counters you want to store in SQL Server is to add the
    counters to the Graph Mode of the Performance Monitor, just like
    you do when you monitor counters in real-time in the Graph
    Mode. You will need
    to add every counter you want to track from every object
    that was logged.


Export the Data into the Comma-Delimited ASCII Format

  • Once you have
    added all the counters to the Graph Mode of the Performance
    Monitor, the screen may be very hard to read because there
    is just too much data to easily see. Don’t worry about this
    as it does not affect how we export the counter data.
    Your next step is to save all these counters so that you
    won’t have to reselect them all over again the next time you
    need to import them.


  • You can save these counters by clicking
    on the "File" menu option and then selecting
    "Save Chart Settings". You will be presented with
    a dialog box where you can save a file, with a name you
    assign, that contains all your settings. These settings then
    can be retrieved at any time later by clicking on the
    "File" menu option and then selecting
    "Open". This displays another dialog box you can
    use to select and load the file that contains all of your
    settings.


  • Now that you
    have selected all the counters you want to export and store
    in SQL Server, you are ready to perform the actual export
    from Performance Monitor. To do this, click on the
    "File" menu option, and then select "Export
    Chart". This will display the "Performance Monitor
    – Export As" dialog box. Here, you decide where you
    want to store the export file, the name you want to give to
    the export file, and the format in which you want to store
    the data. The default export file format is "Export
    TSV Files", which is the tab-delimited ASCII format.
    Instead of using the default format, select "Export CSV
    Files", which exports the data in the Comma-delimited ASCII
    format. Either one will work, but the example I use in this
    article uses the Comma-delimited format.


Clean Up the Performance Monitor Data Using Microsoft Excel

  • The next step
    before you can import the log file into SQL Server is to do
    some clean-up of the file using Microsoft Excel. We
    have to do this because the export file has some
    header information that needs to be removed before we can
    get a clean import into a SQL Server table using DTS. In the
    illustration below, you can see the kinds of header
    information that is produced in the export file. It is our
    job to clean it up.

  • Here is what
    you will need to do to clean up this file. The first six
    rows contain header data, and row seven is a blank line, so
    all of these rows need to be deleted. Row 11 contains the
    name of the object for each counter. We don’t need this
    information, so delete row 11. Row 12 includes the names of
    the Date and Time columns, but the rest of the columns
    indicate the name of the server being monitored. What I do
    here is move the Date and Time column names to row 8, which
    is where the rest of the column names are located. After
    moving these two column headings, I then delete row 12. You
    may or may not have any data in row 9. This row is used
    for the name of the counter’s instance, if there is one. In
    this example, there are a few cases where there is more than one instance
    of a counter, so what I do is move the instance name from
    the column in row 9 and add it to the column name in row 8.
    This way, I know which counter goes with each instance. Row 10 is also blank, and I
    delete it. This should leave only row 8 at the top of
    the spreadsheet, which should
    now have all the column names that will be exported to a SQL
    Server table.


  • Once my data
    is cleaned up, I save the spreadsheet as an Excel
    spreadsheet, not as a CSV file. The format you decide to
    save the file in is not too important, but for this article,
    I will assume the Excel format.
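If you find yourself repeating these Excel steps for every log, they can also be scripted. Below is a rough sketch in Python (not part of the original workflow; it assumes the same row layout as the example above — rows 1-6 header, row 7 blank, row 8 counter names, row 9 instances, row 10 blank, row 11 objects, row 12 Date/Time labels — and your exports may differ):

```python
import csv

def clean_perfmon_csv(in_path, out_path):
    """Apply the manual clean-up steps to a Performance Monitor CSV export."""
    with open(in_path, newline="") as f:
        rows = list(csv.reader(f))
    # rows[0:7] are the six header rows plus the blank row 7 -> dropped.
    counters = rows[7]      # row 8: counter (column) names
    instances = rows[8]     # row 9: instance names, often blank
    datetime_row = rows[11] # row 12: "Date"/"Time" labels plus server names
    header = []
    for i, name in enumerate(counters):
        if i < 2:
            # The first two columns take their names ("Date", "Time") from row 12.
            header.append(datetime_row[i])
        elif instances[i]:
            # Fold the instance name into the counter name so duplicates stay distinct.
            header.append(f"{name} ({instances[i]})")
        else:
            header.append(name)
    with open(out_path, "w", newline="") as f:
        w = csv.writer(f)
        w.writerow(header)
        w.writerows(rows[12:])  # everything after row 12 is sample data
```

The result is a file with a single heading row followed by the samples, which is exactly what the DTS import expects.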


Create a Database to Store Your Performance Monitor Data

  • Before you can
    import the data into a SQL Server table, you will need to
    create a database. I call my database SQL_Performance_Data.
    I created it to be 5MB with a 1MB log file, but you can start
    with any sizes you feel appropriate. To store the data, you will create a table
    later when you import the performance data into
    the database using DTS for the first time.


Use DTS to Create a Package to Import the Excel Data into a SQL
Server Table

  • Once the
    database is created, the next step is to use DTS to import
    the spreadsheet into the database as a table. The first time
    you do this you can have DTS create the table for you,
    using the column names from the Excel spreadsheet. After you
    run DTS for the first time, when you import additional
    performance data at a later
    time, you will only need to append the data to the preexisting
    data in the table. Use the following steps to use DTS to import the performance data from
    the Excel spreadsheet into a newly created table.
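Conceptually, the create-then-append behavior of the DTS package looks like this sketch (Python, with SQLite standing in for SQL Server and DTS; the table name and file layout are only illustrative, not part of the original article):

```python
import csv
import sqlite3

def import_perf_data(conn, csv_path, table="SERVER1"):
    """Mimic the DTS behavior described above: the first run creates the
    table from the file's column headings; later runs append rows."""
    with open(csv_path, newline="") as f:
        reader = csv.reader(f)
        header = next(reader)
        cols = ", ".join(f'"{c}"' for c in header)
        # First run: create the table from the column names.
        conn.execute(f'CREATE TABLE IF NOT EXISTS "{table}" ({cols})')
        marks = ", ".join("?" * len(header))
        # Every run: append the sample rows to the table.
        conn.executemany(f'INSERT INTO "{table}" VALUES ({marks})', reader)
    conn.commit()
```

Running it a second time against the same connection simply adds more rows, just as the modified "append" DTS package does.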


  • Now start the DTS
    Import Wizard and specify
    "Microsoft Excel 8.0" as the data source and point to
    the Excel spreadsheet that contains your performance data.


  • For the
    destination,
    specify the "Microsoft OLE DB Provider
    for SQL Server", your SQL Server name, and the name of
    your database. Use
    whichever authentication method is appropriate for you.


  • Next, choose
    the "Copy table(s) from the source
    database" option from the wizard.


  • This next step
    requires the most work, and there are several different
    approaches you can take, all of which will work. The one I
    am showing you here is just one possible approach. Use your
    own experience and preferences to determine exactly how you
    want to proceed.


  • The first
    thing I recommend is to give the destination table the same
    name as the SQL Server that is being logged. This way, you
    can keep a separate
    table of performance data for each SQL Server you monitor, and by giving
    each table the name of the server, you won’t get confused
    about which data belongs to which server.


  • Next, click
    on the "Transform" button, which displays the
    column mappings. You may want to choose different Data Types
    for the table that will be created rather than choosing the
    ones selected by the DTS Wizard. I use all of the default
    Data Type settings except for one. For the Time column, I
    change the default Data Type from smalldatetime to Char(11).


Import the Excel Spreadsheet Data into a SQL Server Table

  • Once the DTS
    package has been defined, save it. Once the package
    is saved, it can be run and the data
    imported from the Excel spreadsheet into a newly created
    table.

  • The next time
    you import data into your table, you will need to modify the
    DTS package so that it appends the data to the
    existing table instead of creating a new one.


  • Now that the
    data is in SQL Server for the first time, you may want to
    add indexes to your table to boost query performance, as there aren’t any by default.
    Other than that, you are ready to use the data in the table
    any way you want. You can analyze it directly using
    Transact-SQL, or export it back out to Excel for analysis.
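As a taste of what that analysis looks like, a simple aggregate query already gives you a daily trend. The sketch below uses Python with SQLite standing in for SQL Server; the table, columns, and values are made up for illustration, but in Transact-SQL the SELECT would look much the same:

```python
import sqlite3

# SQLite stands in for SQL Server; the table and its values are invented.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE SERVER1 (Date TEXT, Time TEXT, UserConnections REAL)")
conn.executemany("INSERT INTO SERVER1 VALUES (?, ?, ?)", [
    ("01/02/01", "09:00:00.0", 40.0),
    ("01/02/01", "17:00:00.0", 60.0),
    ("01/03/01", "09:00:00.0", 80.0),
])
# No indexes exist by default; one on Date helps date-range trend queries.
conn.execute("CREATE INDEX ix_server1_date ON SERVER1 (Date)")
# Average user connections per day -- a basic trend-analysis query.
trend = conn.execute(
    "SELECT Date, AVG(UserConnections) FROM SERVER1 "
    "GROUP BY Date ORDER BY Date"
).fetchall()
print(trend)  # [('01/02/01', 50.0), ('01/03/01', 80.0)]
```

Queries like this, run over weeks or months of stored samples, are what turn raw Performance Monitor logs into a trend you can act on.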


Develop a Routine

All the steps
above are worthless unless you regularly import your Performance
Monitor log files into SQL Server. You need to develop a
routine, such as daily, weekly, or monthly, where you import
your Performance Monitor logs into SQL Server. If you have a knack
for programming, you may want to write your own code to automate
the tasks. However you decide to do it, you need to develop a
regular routine and stick to it. 


Stay Tuned
for Part Three

As I think you can
see from reading this article, exporting data from Performance
Monitor and importing it into SQL Server is not hard, although
it is more hassle than it should be.

In the next installment
(number three of four parts), we will take a look at how you can use Microsoft Excel to analyze
the performance data that is now in SQL Server. Check back in August for part
three.

