Performance Testing – SQL Server 2008 versus SQL Server 2005

It’s New. Is It Faster?

Back in May of 2005, I wrote an article
comparing the performance of SQL Server 2005 to SQL Server 2000. With that
particular workload, I saw an improvement of 18.5% in the time to process the
workload. Now that SQL Server 2008 has shipped, I figured it was time to run a
similar test for the new version.

One really nice thing about the new version is that it can be
installed side by side with 2005 on the same system without any
issues. For that earlier test, I actually installed and uninstalled
the complete software for each test run. Painful.

Let’s get right to the testing details and then to the
results.

System Details

For this test, I had both SQL Server 2008 and 2005 loaded on
the same system. Here’s the setup:

  • Intel Core2 at 2.13 GHz
  • 2 GB RAM
  • Windows XP Professional SP2
  • 2 physical hard drives
  • 1 disk controller
  • Disk 1: OS, SQL Server, Logs
  • Disk 2: Data

For each test, I rebooted the system and started only the necessary
software. Only the required database engine was running.

Test Schema

Each table has a clustered primary key on its ID column. (A sketch
of the schema DDL follows the index list below.)

Non-Clustered Indexes:

  • Vendor: non-unique index on BusinessName
  • Card: unique index on CardNumber, SecurityCode
  • Card: non-unique index on SecurityCode, CardNumber
  • Card: non-unique index on SecureString
  • Purchase: non-unique index on CardID
  • Purchase: non-unique index on VendorID
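
Here is a minimal sketch of what that DDL might look like. The key
columns and indexes come from the description above; the non-key
column types and sizes, the IDENTITY choice, and any columns beyond
these are my assumptions, not the actual test schema.

    CREATE TABLE Vendor (
        VendorID     int IDENTITY(1,1) NOT NULL,
        BusinessName varchar(100)      NOT NULL,
        CONSTRAINT PK_Vendor PRIMARY KEY CLUSTERED (VendorID)
    );
    CREATE INDEX IX_Vendor_BusinessName ON Vendor (BusinessName);

    CREATE TABLE Card (
        CardID       int IDENTITY(1,1) NOT NULL,
        CardNumber   varchar(20)       NOT NULL,
        SecurityCode varchar(10)       NOT NULL,
        SecureString varchar(100)      NOT NULL,
        CONSTRAINT PK_Card PRIMARY KEY CLUSTERED (CardID)
    );
    CREATE UNIQUE INDEX IX_Card_Number_Code ON Card (CardNumber, SecurityCode);
    CREATE INDEX IX_Card_Code_Number   ON Card (SecurityCode, CardNumber);
    CREATE INDEX IX_Card_SecureString  ON Card (SecureString);

    CREATE TABLE Purchase (
        PurchaseID int IDENTITY(1,1) NOT NULL,
        CardID     int NOT NULL,
        VendorID   int NOT NULL,
        CONSTRAINT PK_Purchase PRIMARY KEY CLUSTERED (PurchaseID)
    );
    CREATE INDEX IX_Purchase_CardID   ON Purchase (CardID);
    CREATE INDEX IX_Purchase_VendorID ON Purchase (VendorID);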

Test Details

The test scripts are built to mimic an OLTP workload. Three tables
are involved: Vendor, Card, and Purchase.

The “Executive Summary”

The tests use multiple simultaneous calls to a SQL script
that calls stored procedures in a loop to first load and then query the data.

The Gory Details

The “driver” for this test is a simple framework I built
years ago called Hummer. It uses a .bat file to run some initial
setup scripts, then starts n simultaneous processes, each running
the same script. The idea is to simulate multiple clients fighting
for database resources. Each script includes an occasional call to
DELAY, which lets the processes share the database resources and
better simulates a real OLTP workload.
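
In T-SQL, that DELAY is a one-line wait; presumably something like
this (a sketch, not Hummer’s actual code):

    WAITFOR DELAY '00:00:01';  -- pause one second so competing processes get a turn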

There were five tests with different parameters. You can see
all the details in the chart below. Let’s look at Test Run 2 as an
example.

The .bat script performs these steps:

  • Drop and recreate the database.
  • Create the tables and indexes.
  • Create the stored procedures.
  • Set all the control parameters.
  • Execute DBCC FREESYSTEMCACHE ('ALL').
  • Execute DBCC DROPCLEANBUFFERS. (More on these two steps below.)
  • Start 16 processes, each running the main test script.
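
Those two DBCC steps are what let each run start cold, so neither
version benefits from a warm cache. The commands are as named
above; the CHECKPOINT shown first is a common companion step that
the step list does not mention:

    CHECKPOINT;                    -- flush dirty pages to disk first
    DBCC FREESYSTEMCACHE ('ALL');  -- evict plans and other entries from the server caches
    DBCC DROPCLEANBUFFERS;         -- evict clean data pages from the buffer pool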

The CREATE DATABASE script creates the Data file with an initial
size of 400 MB and the Log with an initial size of 20 MB, each set
to grow by 10%. After Test Run 1, the Data file had not grown, but
the Log file had grown to 48 MB (2008) and 53 MB (2005).

As stated in the System Details section above, the OS, SQL
Server, and Log were all on drive C:, and the Data was on drive D:.
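
Putting those settings together, the CREATE DATABASE statement
presumably looks something like this (the database name and file
paths are my placeholders):

    CREATE DATABASE PerfTest
    ON PRIMARY (
        NAME = PerfTest_Data,
        FILENAME = 'D:\SQLData\PerfTest_Data.mdf',  -- Data on drive D:
        SIZE = 400MB,
        FILEGROWTH = 10%
    )
    LOG ON (
        NAME = PerfTest_Log,
        FILENAME = 'C:\SQLLogs\PerfTest_Log.ldf',   -- Log on drive C:
        SIZE = 20MB,
        FILEGROWTH = 10%
    );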

The main test script performs these steps (a condensed T-SQL sketch
follows the list):

  • Delay 1 second on start-up (to allow all processes to start).
  • Loop 200,000 times
    • Read control parameters (SP call). If finished with test, stop.
      (This also allows you to change the parameters while the test is running.)
    • Every 300 loops, delay 1 second.
    • For the first 20,000 loops, create a Vendor record (SP call).
    • For the first 20,000 loops, create a Card record (SP call).
    • For the first 40,000 loops, create a Purchase record (SP call).
    • Each loop then performs one operation, weighted over every 10 loops:
      • 1 of 10 times: insert a Purchase record.
      • 1 of 10 times: update a Vendor record.
      • 8 of 10 times: read a Card record.
  • End loop
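
The flow and counts in this sketch come straight from the steps
above; the stored procedure names and their (omitted) parameters
are my placeholders, not Hummer’s actual names:

    DECLARE @i int, @op int, @stop bit;
    SET @i = 1;

    WAITFOR DELAY '00:00:01';                      -- let all processes start

    WHILE @i <= 200000
    BEGIN
        EXEC GetControlParams @stop OUTPUT;        -- placeholder SP; can end the test early
        IF @stop = 1 BREAK;

        IF @i % 300 = 0 WAITFOR DELAY '00:00:01';  -- every 300 loops, pause one second

        IF @i <= 20000 EXEC CreateVendor;          -- first 20,000 loops
        IF @i <= 20000 EXEC CreateCard;
        IF @i <= 40000 EXEC CreatePurchase;        -- first 40,000 loops

        SET @op = @i % 10;                         -- weight the mixed work 1:1:8
        IF @op = 0 EXEC InsertPurchase;            -- 1 of 10: insert
        ELSE IF @op = 1 EXEC UpdateVendor;         -- 1 of 10: update
        ELSE EXEC ReadCard;                        -- 8 of 10: read

        SET @i = @i + 1;
    END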

It is important to understand that each process inserts 20,000
Vendor records, 20,000 Card records, and 40,000 Purchase records.
With 16 processes, the totals inserted in the initial set are
320,000 Vendor, 320,000 Card, and 640,000 Purchase records. For
Test Run 2, the loops do this work:

  • First 20,000 loops: create a Vendor, Card, and Purchase record,
    and either do an insert, an update, or a read.
  • Next 20,000 loops: create a Purchase record, and either do an
    insert, an update, or a read.
  • Last 160,000 loops: either do an insert, an update, or a read.

After the first 40,000 loops, all insert, update, or read
activity is on some random record in the set.
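
A cheap way to hit a random record, and presumably what the read
step does (the idiom and the procedure name are my assumptions;
320,000 is the loaded Card count from above):

    DECLARE @CardID int;
    SET @CardID = 1 + ABS(CHECKSUM(NEWID())) % 320000;  -- random ID in the loaded range
    EXEC ReadCard @CardID;                              -- placeholder SP from the sketch above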

Test Parameters

(Chart: control parameters for the five test runs.)

Results

(Chart: results for the five test runs.)

The attached results files include output for each process
plus Perfmon graphs from each run, one at the beginning and one as the test
finished. Pull up the Perfmon charts. My eye tells me that SQL Server
2008 is taking noticeably more CPU in every test. Again, this is running the
exact same workload on the exact same hardware.

Conclusion

These tests show that, for the workloads tested and without
utilizing any SQL Server 2008-specific features, SQL Server 2008 is not
significantly faster or slower than SQL Server 2005.

Disappointed? I was. I had gotten great results when testing
2005 against 2000. But 2008 is a different kind of release than 2005 was.

Does this mean that you shouldn’t move to 2008? Certainly
not. Just don’t sell the upgrade to your boss on the promise of
performance improvements from the version change alone.

Thoughts or comments? Drop a note in the forum. Also,
all of the code to run these tests is linked here.
If you want to rerun these tests on different hardware or with different
parameters, please do. It would be great if you could share the results in the forum. We will all
benefit from understanding the performance characteristics of the new version
of our favorite database platform.

Download the files for this article.

Rob Garrison is a database architect at Fiserv in Portland, Oregon specializing in OLTP design and development. He has over twenty years of IT experience, with over ten years in database development and architecture.
