Performance Testing SQL Server 2008's Change Data Capture functionality
February 27, 2009
If you haven't read the first CDC article, consider reading that now before reading this one. It lays the groundwork for what is covered here.
Introduction: CDC Performance Tests
The purpose here is to test the performance of SQL Server 2008's Change Data Capture (CDC) functionality. How does it perform compared to a system with no change-tracking functionality?
If you have read some of my recent performance test articles, many of the details below will be familiar. This test used a different system and SQL Server 2008 only, so read the Test System Details section and then skip down to the results. The test framework has also been updated to use a Common database that stores the run parameters and test results.
It is important to note that the tests perform no SELECTs: the goal here is to measure the cost of capturing changes, not of reading them.
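For context, enabling CDC in SQL Server 2008 is done with the system procedures `sys.sp_cdc_enable_db` and `sys.sp_cdc_enable_table`. The actual scripts are in the article's download; this is just a hedged sketch, and the database name and role choice are assumptions:

```sql
-- Sketch: enabling CDC for this test's tables. Database name is assumed;
-- table names follow the article (dbo.Vendor, dbo.Card, dbo.Purchase).
USE SqlCreditTest;
EXEC sys.sp_cdc_enable_db;

EXEC sys.sp_cdc_enable_table
    @source_schema = N'dbo',
    @source_name   = N'Vendor',
    @role_name     = NULL;   -- no gating role for a test environment
-- Repeat sys.sp_cdc_enable_table for dbo.Card and dbo.Purchase.
```

Note that `sys.sp_cdc_enable_db` creates the change tables and capture jobs inside the same database, which is relevant to the cost discussion below.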
Test System Details
For this test, I used this setup:
4 64-bit Dual-Core AMD Processors
The test scripts are built to mimic an OLTP workload. There are three tables involved: Vendor, Card, and Purchase.
Each table has a clustered primary key on the ID column.
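The article only specifies the clustered primary key on ID; the remaining columns below are illustrative placeholders, not the actual schema (which is in the download):

```sql
-- Illustrative shape only: the article states just the clustered PK on ID;
-- the other columns here are placeholders.
CREATE TABLE dbo.Vendor (
    ID   int IDENTITY(1, 1) NOT NULL,
    Name varchar(64)        NOT NULL,
    CONSTRAINT PK_Vendor PRIMARY KEY CLUSTERED (ID)
);
-- dbo.Card and dbo.Purchase follow the same pattern, with Purchase
-- referencing the Vendor and Card rows it belongs to.
```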
The Executive Summary
The tests use multiple simultaneous calls to a SQL script that calls stored procedures in a loop to first load and then insert and/or update data.
The Gory Details
The driver for this test is a simple framework I built years ago called Hummer. It uses a .bat file to run some initial setup scripts, then starts n simultaneous processes, each running the same script. The idea is to simulate multiple clients contending for database resources. Each script includes an occasional call to DELAY, which helps the processes share the database resources and better simulates a real OLTP workload.
There were multiple tests with different parameters; you can see all the details in the chart below. Let's look at Test Run 2 as an example.
The .bat script performs these steps:
The CREATE DATABASE script creates the Data file with an initial size of 400 MB and the Log with an initial size of 20 MB, each set to grow 10%. After Test Run 2, here are the Data and Log sizes:
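A CREATE DATABASE statement matching the stated sizes might look like the following sketch; the database name, logical file names, and paths are assumptions, while the sizes and growth settings come from the article:

```sql
-- Sketch matching the stated settings: 400 MB data file, 20 MB log file,
-- each growing by 10%. Names and paths are assumptions.
CREATE DATABASE SqlCreditTest
ON PRIMARY (
    NAME = SqlCreditTest_Data,
    FILENAME = 'C:\Data\SqlCreditTest.mdf',
    SIZE = 400MB,
    FILEGROWTH = 10%
)
LOG ON (
    NAME = SqlCreditTest_Log,
    FILENAME = 'C:\Data\SqlCreditTest_Log.ldf',
    SIZE = 20MB,
    FILEGROWTH = 10%
);
```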
The main test script performs these steps:
(After the first 20,000 loops, all insert or update activity is on some random record in the set.)
It is important to understand that we're inserting 10,000 Vendor records, 10,000 Card records, and 20,000 Purchase records per process. So, across the five processes, the total number of records inserted in the initial set is 50,000, 50,000, and 100,000.
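The shape of each process's main loop might be sketched as follows. The procedure name and loop structure are hypothetical; only the record counts, the random-record updates, and the occasional DELAY call are stated in the article:

```sql
-- Hypothetical shape of one test process's looping work. Procedure names
-- are placeholders; the real scripts are in the article's download.
DECLARE @i int = 1;
WHILE @i <= 20000
BEGIN
    EXEC dbo.Purchase_Insert;          -- Vendor/Card inserts interleaved similarly
    IF @i % 100 = 0
        WAITFOR DELAY '00:00:00.100';  -- occasional DELAY to share resources
    SET @i += 1;
END;
-- After the initial load, inserts/updates target random records in the set.
```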
The elapsed time is for the looping work only, not the database creation or any of the other setup work.
The attached results files include sample output files. Remember that every run uses the exact same workload on the exact same hardware.
The expected pattern is that using CDC is costly. That is not surprising at all when you understand that the CDC changes are recorded in the same database as the base changes. The average penalty for CDC with full recovery is 10.51%; with simple recovery, it is 11.10%.
I expected to see a pattern showing that UPDATEs were more expensive. This did not turn out to be true in these tests.
As a side note, the comparison of full recovery to simple recovery is very interesting. I expected a fairly significant drop in elapsed time when the databases were switched to simple recovery, but that did not turn out to be true: the average time actually went up for the simple-recovery tests (by 0.69 to 1.27%).
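Switching recovery models between runs uses standard ALTER DATABASE syntax; the database name here is an assumption:

```sql
-- Switching the test database between recovery models between runs.
-- Database name is an assumption.
ALTER DATABASE SqlCreditTest SET RECOVERY FULL;
ALTER DATABASE SqlCreditTest SET RECOVERY SIMPLE;
```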
It is clear from the results that there is some variability in the numbers. It would be nice to see a perfectly consistent increase or decrease, but there are just too many variables involved, no matter how strictly you control the test environment. What you have to look at is patterns, and it's clear here that adding CDC to this workload caused an increase of roughly 10%.
All of the code is included, so feel free to modify it and rerun the tests to suit your specific needs.
Thoughts or comments? Drop a note in the forum.
Download the files for this article.