Automated Server Information Gathering – Part 5

A Daily Report

This article continues the series on Automatically Gathering Server Information. If you want to read the previous articles, click on one of these links:
Getting a Count of Database Users – Part 1
Automatically Gather Database Size – Part 2
Automatically Gathering Log Space – Part 3
Getting Organized – Part 4

Introduction

Now that I have a few solutions for gathering information, I have realized that information overload is rapidly becoming a problem. In the last article, I discussed a way to start getting organized with a dedicated database. However, having to check a bunch of tables in one database on a few servers is not a whole lot better than checking them in various databases on a few servers. Reading through the information being recorded every day is time consuming, and time is something I do not have. After all, what good is the information if you do not read it or use it? So, what can you do?

The Problem

Well, when I first started to gather information about my SQL Server, I had one server and it wasn't a big deal. Then I got two more servers pretty quickly and it started to become an issue. Then I got a new job with over ten servers, and I really needed a way to manage the information. There was no way I could afford to spend a couple of hours each day manually checking on all the information I had automated the gathering of.

First Try

I first wrote a script that contained a query for each piece of information I had gathered and saved it. I would then call it up in ISQL (this was the old days), run the saved script, and pipe the output to a file. This got old really fast, even with some scheduling.
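For what it's worth, the scheduled job amounted to a single command-line call along these lines. The server name, the trusted-connection switch, and the file names here are placeholders of mine rather than the original script:

isql -S MYSERVER -E -i dailychecks.sql -o dailychecks.rpt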

Then I wrapped an exec master..xp_sendmail around each query and set these up on a schedule. After all, email is easy and I use it every day. Well, if we implemented this with the information that I have written about, we'd be up to three emails per day per server, and I have a few more pieces of information I think are important. This also got old really fast.
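One of those scheduled calls looked roughly like the sketch below. The recipient address and the DBSizeHistory table (with its db, spaceused, and entrydt columns) are stand-ins of my own for the tables built in the earlier articles:

EXEC master..xp_sendmail
    @recipients = 'dba@mycompany.com',
    @subject    = 'Database Size',
    @query      = 'SELECT db, spaceused, entrydt
                     FROM DBSizeHistory
                    WHERE entrydt >= CONVERT(char(10), GETDATE(), 101)'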

Then I got smart (or so I thought). I combined them all into one procedure. Cool! Now I get one email with the output of all the queries in it. It looked something like this:

DB            AvgUsers   EntryDt
------------  ---------  ----------
Master        2          10-01-2000
MSDB          1          10-01-2000
Main          10         10-01-2000
GreatPlains   3          10-01-2000

DB            SpaceUsed
------------  ---------
Master        25
Msdb          11
Main          120
GreatPlains   210

DB            Logspace   Percent
------------  ---------  -------
Master        1          10
Msdb          1          10
Main          15         20
GreatPlains   210        40

As you can see, this is not the easiest output to read, even with the column headers and after I filter out pubs, tempdb, model, and the other databases that I am not concerned about monitoring. This also quickly became cumbersome, especially as more queries were added to the stored procedure.

The Solution

Well, as you can see, I do keep trying to make things better. Most of my knowledge has come through mistakes and trial and error (the lesson being that perseverance appears to be equal to genius, at least in my case). I finally hit upon this idea after being asked to cc my boss on this information. I thought it was pretty cool, so let me know what you think.

I decided that I needed to format this output better and create a real report. However, I was not in the mood to learn a report writer, nor did I want to spend any budget dollars on software. Access would work, but then I would need to go outside my SQL Server as well as manually generate the report. So my brilliant idea was...

Create a table to hold the report.

But not just any old table. A
special table that I will share with you now. Here is the T-SQL code that I wrote:

CREATE TABLE DBARpt
  ( srvr    char(40),    -- server the row came from
    typ     char(20),    -- which section of the report the row belongs to
    entrydt datetime,    -- when the row was written
    txt     char(80)     -- one formatted line of the report
  )

Tada!!!!!!

OK, OK, it's not much to look at, but we are just getting started. This table is designed to work like a banded report writer: we write each line of the report as we need it, then generate the report at the end and email it to the appropriate people.

Creating The Report

To fill this table, I wrote a series of stored procedures that execute queries against the tables where I stored the information. For example, for the space used by the log, I wrote a stored procedure that ran on a schedule every morning and checked the data that was being gathered about log space. This stored procedure ran a query and inserted the results into the DBARpt table. Before and after running the query, it inserted other rows to make the report prettier. The shape of the space-used procedure, written against the table above, is shown below.
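What follows is a reconstruction rather than the original listing: the LogSpaceHistory table and its db, logspace, and percentused columns are my stand-ins for the log space table from Part 3, and the procedure name is my own.

CREATE PROCEDURE dbo.DBARpt_LogSpace
AS
BEGIN
    DECLARE @srvr char(40)
    SELECT @srvr = @@SERVERNAME

    -- header rows for this section of the report
    INSERT DBARpt (srvr, typ, entrydt, txt)
    VALUES (@srvr, 'LogSpace', GETDATE(), 'Log Space Used')

    INSERT DBARpt (srvr, typ, entrydt, txt)
    VALUES (@srvr, 'LogSpace', GETDATE(), 'DB                  LogSpace  Percent')

    -- one formatted line per database gathered this morning
    INSERT DBARpt (srvr, typ, entrydt, txt)
    SELECT @srvr, 'LogSpace', GETDATE(),
           CONVERT(char(20), db) + CONVERT(char(10), logspace) + CONVERT(char(10), percentused)
      FROM LogSpaceHistory
     WHERE entrydt >= CONVERT(char(10), GETDATE(), 101)

    -- blank line to separate this section from the next
    INSERT DBARpt (srvr, typ, entrydt, txt)
    VALUES (@srvr, 'LogSpace', GETDATE(), ' ')
END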

One thing that I will mention here: I have always kept the stored procedures that insert data for each section separate from one another. Why? Well, I learned pretty quickly (after being burned) that a single piece of bad data could sometimes crash the stored procedure. All the queries after that point failed to run, and it became a real pain. Instead, I learned to let each section insert its own data, so if one section failed, it did not cause the whole process to fail. I try to prevent dependencies from occurring where I can, so as to maximize the fault tolerance of any procedure.

Another lesson learned was to keep the information gathering separate from the report creation. There are two reasons here. One is the same as above: prevent dependencies. If the information gathering process fails, the report generation process can note this through a lack of data and make the appropriate entries. The second reason is that I think it is fundamentally better programming to separate distinct functionality into separate modules. I keep the gathering process encapsulated in its own stored procedure and the report writing process in its own procedure. This way, as I tweak the report, I prevent bugs from creeping into the information gathering process, which is already working. It also lets me reuse either piece of functionality without requiring the other to be present.
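To round out the picture, the report-writing step can be as simple as mailing the day's rows from DBARpt in order. This too is only a sketch; the DBA database name, the procedure name, and the recipient address are placeholders of mine:

CREATE PROCEDURE dbo.DBARpt_Send
AS
BEGIN
    -- xp_sendmail runs the query in its own session, so the table gets
    -- its full three-part name (or use @dbuse to set the database)
    EXEC master..xp_sendmail
        @recipients = 'dba@mycompany.com',
        @subject    = 'Daily DBA Report',
        @no_header  = 'TRUE',
        @width      = 80,
        @query      = 'SELECT txt
                         FROM DBA.dbo.DBARpt
                        WHERE entrydt >= CONVERT(char(10), GETDATE(), 101)
                        ORDER BY entrydt'
END

If lines within a section end up with identical timestamps, an identity column on DBARpt is an easy way to guarantee they come back in insert order.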

Conclusion

I realize most of you are probably still reeling from that amazing, database guru-like table I created :), but I hope to have presented two points in this article. One: you should automate the information reporting process as well as the gathering process. I think you will find that time spent here pays great dividends later on. Two: you can use a table as a staging area for automated reports that you send to anyone. I have used this technique to create quick status reports for various business processes that are much more appreciated and better received than a simple query from xp_sendmail. They also require much less development (and $$$) than some third-party packages.

For those of you still needing it, there is more enhancement to come for this process. Getting a single email from each server is still not a great solution. I am working to build a fairly bulletproof solution for multiple servers, and that will be one of the next few articles. I also need to better explain what information to gather and what not to gather. After all, what good is a nice, neatly formatted report if it's ten pages long? But I will save that for another day.

As always, I welcome feedback, especially if you see some
way to improve this process that I may have missed. If you have ideas for other
information that you think is important, let me know. I may not have thought of
it and will probably tackle the project and spread the word.

Steve Jones
November 2000
