Log Explorer

Introduction

How to read the transaction log was a question that I heard asked quite often at TechEd 2000
last summer. Most of the SQL Server team mentioned that work was underway with some third party
software vendors, but would provide no more information. Platinum (now CA) had developed a log reader for v6.5, but
nothing was available for SQL Server 7 or 2000.

I was excited to see a log reader released by Lumigent Technologies and was anxious to take a look.
Unfortunately, I was buried at work when I heard about this product and did not get a chance to use
it for a few months. After the start of the new year, I downloaded an evaluation copy of v1.5 and decided to
give it a try.

Installation

After filling out a short form, I downloaded the 3MB file and ran the installation program.
This is the same version that you purchase, but it is limited to 14 days without a license key. 14 days is
a little short (to me), but it should be enough for someone to determine whether they can use this product.

The installation is very straightforward. It asks whether to install the server and/or client pieces. I chose both
and pointed the product at a SQL Server 7 server that is used for development. This version (1.5) supports
SQL Server 2000. The installer asked for a license key and included a note to leave this
blank if the installation is an eval. I also downloaded and installed v1.3, which uses the same setup. Version 1.3,
however, does not support SQL Server 2000.

Once setup is complete, I usually check the box to read the release notes. I have found that this often
prevents me from running into some gotcha that is already documented. The release notes start by mentioning
that Log Explorer is the leading transaction analysis tool. As far as I know it is the only one, but score a point for
humility. The release notes cover the basics and mention a few known bugs and behaviors in the product. I do like
that the notes are in HTML format and include contact information for the company.

Using Log Explorer

When I first ran the program, it asked me to connect to the server, with the option of using NT
or SQL authentication. Once connected, I chose the database and whether to read the online log
or a backup file. The program then displayed a reminder that this is an evaluation
copy and is good for 14 days only. I chose the Northwind database and also received the following dialog box:

I like this informative message. It lets me know what the problem is as well as a couple of potential
items to check. More software should include messages like this.

I decided to check Northwind and see what the problem was. SQL Server 2000 has a Recovery setting instead of
the "truncate log on checkpoint" option. Since Northwind is a sample database, it uses the "Simple"
recovery model by default, which essentially has this option set. Instead of altering Northwind (I use it for testing and writing),
I created a small DBA database and ran a full database backup as soon as I created it to prepare it for Log Explorer.
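
For reference, preparing a database this way comes down to something like the script below. The database name, backup path, and the ALTER DATABASE syntax (the SQL Server 2000 form) are just my choices for illustration; on 7.0 the equivalent is turning off the "trunc. log on chkpt." option with sp_dboption.

-- illustrative only: create a small database, keep the full log,
-- then take a full backup so the log chain starts cleanly
create database DBA
go
alter database DBA set recovery full    -- SQL Server 2000 syntax
-- on SQL Server 7.0: exec sp_dboption 'DBA', 'trunc. log on chkpt.', 'false'
go
backup database DBA to disk = 'C:\Backup\DBA_full.bak'
go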

I also had to reset the startup account for this SQL Server to my network account. I am a local administrator,
so this should be no problem. After stopping and restarting SQL Server, however, I received a connection error when
trying to connect from Log Explorer. Neither stopping and starting SQL Server again nor restarting my computer seemed
to help. I finally uninstalled and reinstalled the product (with a reboot in between), and this seemed to fix
the connection.

When I connected, I ran the following script:

create table MyTest( 
	MyID int identity( 1, 1),
	Name char( 20),
	Updated datetime
 )

declare @i int
select @i = 1
while @i < 100
 begin
	insert MyTest( Name, Updated)
	 values ( 'ID:' + cast( @i as char( 3)), getdate())
	select @i = @i + 1
 end

select * from MyTest

This basically created a number of rows in a test table. When I chose "Browse" from the Log
Explorer menu, I received an error that there were no rows that matched my selection. This
is a little disconcerting since I had just inserted a number of rows. By going to the second tab and
choosing "View Recyled Data" I did get a screen that looks like this:

When I select "View Log" I see my transactions listed.
I decided to undo one of the transactions. I chose the transaction for ID 199 and right-clicked on the line.
I got a dialog box asking where to save the script. I picked a location

and then saw the following screen:

This script would undo my change. A simple cut and paste would indeed undo this transaction.
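
Conceptually, the undo of a single insert is just a keyed delete of the row that was added; a sketch of the idea (mine, not the tool's literal output) looks like this:

-- conceptual sketch only: reversing an insert is a delete keyed to that row
delete MyTest
 where MyID = 199
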
Next I tried a rollback. I got this screen, which allowed me to choose a time.

I chose 10:00am, which
was a time between the two executions of my script above. This generated the following script.
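
The rollback idea is similar: going back to a point in time reverses, newest first, every logged operation after that time. For my test that is just a series of deletes of the rows inserted by the second run of the script, roughly like this sketch (the identity values are illustrative, not the generated output):

-- conceptual sketch: undo each insert logged after 10:00am, newest first
delete MyTest where MyID = 198
delete MyTest where MyID = 197
-- ... and so on, back to the first row inserted after 10:00am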

Next I deleted a number of rows.

delete MyTest where MyID > 220

This deleted 77 rows. I then refreshed the view in Log Explorer and saw a single transaction
that spans 77 deletes.

Suppose I only wanted to delete rows above 222; I had accidentally removed the
rows with MyID 221 and 222. I selected the row with MyID = 221 and right-clicked "Undo Op". This produced

This is exactly what I would like to see. This alone is probably worth the cost of the product.
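
Conceptually, reversing a single delete means re-inserting the row with the values it held, presumably with the original identity preserved. A rough sketch (my values, not Lumigent's actual output):

-- conceptual sketch: reversing a delete re-inserts the row, preserving MyID
set identity_insert MyTest on
insert MyTest( MyID, Name, Updated)
 values ( 221, 'ID:21', '2001-02-05 09:45:00.000')
set identity_insert MyTest off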

Next I updated a few rows.

update MyTest
 set Name = 'test', Updated = getdate()
 where MyID > 199

update MyTest
 set Name = 'My ID: 200', Updated = getdate()
 where MyID = 200

I then refreshed the browse view and could see my two transactions. Here are the screen shots:

I show these two because they show how an update appears on the screen.

With the last row selected, I chose "Revision History" and received the following screen.

>From this screen, I can right-click the row and select "rollback". This should allow me to restore the row
to its original state. At least that is what I would think, and Log Explorer gave this screen:

This script would actually roll the row back to the original values it contained
after the first insert. I am not sure how handy this particular function is, but by selecting a particular
row, I can generate the rollback script to get the row back to that state, which is a very handy
feature. I can see using this in places where I make a data change, quickly find out that the
change is not correct, and want to roll it back immediately.
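
To make that concrete, rolling a single row back to an earlier revision comes down to an update that restores the values the row held at that point, something like this sketch (the values here are illustrative):

-- conceptual sketch: restore one row to the values of an earlier revision
update MyTest
 set Name = 'ID:100', Updated = '2001-02-05 09:45:00.000'
 where MyID = 200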

Annoying Things

There are a few things about this product that are rather annoying. I feel this is a nice product
that fulfills a great need for SQL Server, but it seems a little unpolished. Most of
these complaints relate to how I use Windows, but they are conveniences that most of the
programs I use do get right.

  • When connecting to a SQL Server, the "Connect" button does not automatically receive focus
    if I hit the <Enter> key. I have to tab to the button or use the mouse. For someone who
    uses the keyboard heavily, this is really annoying.
  • Once I connect, the next screen has the focus on the "Attach" button rather than the database name box.
  • The tab order on the "Attach" screen does not flow down the screen. It jumps around. This is really annoying.
  • Some options are only available through a right click. Since I usually use the keyboard, it would be nice to
    see all the right-click options duplicated in the menu.

Things I like

  • Color coding of transactions. This really helps to see what is occurring in the log.
  • Unobtrusive. It is nice to see this product work by reading the log and then generating
    code to reverse the entries like an accounting system rather than undoing the transactions through
    SQL Server. I hate having extra software installed on my server and this appears to be a fairly
    minimal addition.

Conclusion

I think this is a great product. The $1500/server price seems a bit expensive for anyone with a few servers, but
one accidental delete could easily pay for this product. I do have to admit that at first glance I was a little
skeptical that this product would provide me benefits. With my small databases (the largest is 1.5GB), three
SQL Servers, and a good knowledge of my data, I was not sure that I could save $4500 in lost work
with this product. However, after some experimentation, I can see it easily saving me quite a bit of time.
Right now, if someone accidentally deleted a row this morning, I could get the existing data from my QA
site quickly. However, if that site were not up to date for some reason, I would need a few hours to restore the database
to another server, find the data, and move it. I believe that Log Explorer could accomplish this task more quickly, and it would
also be a simpler process than restoring the database to another server. If I had a 10 or 20GB database, I might not
even be able to restore it without taking over a QA or development server for a day.
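
For comparison, the manual route I describe comes down to restoring last night's backup under a new name (shown on the same server here, for simplicity, and using my MyTest table as the example) and then copying the lost rows back. All names, paths, and keys below are illustrative:

-- restore the backup as a separate database, then copy the lost row back
restore database DBA_Recover
 from disk = 'C:\Backup\DBA_full.bak'
 with move 'DBA' to 'C:\Data\DBA_Recover.mdf',
      move 'DBA_log' to 'C:\Data\DBA_Recover_log.ldf'
go
set identity_insert MyTest on
insert MyTest( MyID, Name, Updated)
 select MyID, Name, Updated
  from DBA_Recover..MyTest
  where MyID = 200    -- the accidentally deleted row
set identity_insert MyTest off
go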

I know a product like this has been long awaited by SQL Server 7/2000 DBAs. I think Lumigent has done
a great job and I recommend the product. It does need some refinement and enhancement, but it does a great
job of decoding the transaction log. I will be spending some more time with this product, tracking how a series of
transactions involving referential integrity is handled, and will expand this review.

As always, I welcome feedback on this article.

Steve Jones
February 2001
