Load testing involves simulating a full production workload in a non-production environment. Performance analysts measure application behavior under high transaction volumes, large numbers of concurrent users, and a full-sized database. The objective is to determine whether the application suite and its associated databases can handle large workloads.
Load tests give the database administrator (DBA) quite a lot of valuable information and may make the difference between poor and acceptable application performance. But what about a big data environment? Are there any gotchas or traps associated with big data?
Load Testing is a Best Practice
Many software development projects produce software that moves into the production environment without a formal load test. This is unfortunate! As noted above, load testing can uncover performance issues before the application reaches production. The DBA must insist on load testing a big data application, even if some shortcuts are necessary.
There are several categories of load testing. Depending upon your enterprise standards and the type or size of project, this may include one or more of the following:
- Integration testing. Verify that application modules work together correctly.
- Systems testing. Confirm the application works well with other applications.
- Recovery testing. Can the application recover correctly and within service level agreements after a failure?
- Security testing. Verify that users have necessary and sufficient security to perform functions.
- Stress testing. Test various combinations of transaction loads, including the number of concurrent users, the volume of input data, and the transaction rate against the database. In some cases it is necessary to determine the load point at which the application or database fails (sometimes called test-to-failure).
Usually, the application team and the release manager agree on the testing schedule and the level of testing to be performed before releasing the application to production. Which test scenarios are actually executed may depend on several factors, including staff availability and budget, the project delivery date, and the availability of a suitable database environment for testing.
But What about Big Data?
Big data means exactly that: lots and lots of data. This may also be accompanied by large transaction volumes and an analytics suite that analyzes the data. A load test requires a production-like environment, and that means storing a lot of data in your test system. If you are using a hybrid hardware/software appliance to do analytics (one common solution is the IBM DB2 Analytics Accelerator, or IDAA), you will need one for the test system as well. Apart from these two requirements, load testing a big data application follows the same principles as normal load testing.
The DBA looks for certain symptoms during a load test. These symptoms may be caused by underlying database configuration issues, transaction data access issues, or other application-related items. For a big data implementation, the DBA most commonly looks for the following:
- Deadlocks and timeouts. These symptoms may be caused by one or more separate issues. The most common in load testing is the “initialization problem”. Because the application is processing data for the first time, many transactional tables may be empty (e.g., customers, current orders, account activity). Since table and index I/O is done at the block level (usually by the 4K page), many active table rows and index entries will be stored on the same page. Simultaneous insert and update activity against different rows on the same page may then cause deadlocks or timeouts. As the tables and indexes grow, this issue tends to disappear. Later occurrences of these symptoms are most likely due to poor application design, transactions that are too long (i.e., they do far too many database updates per unit of work), or poor database design.
In a big data environment, deadlocks and timeouts usually indicate one of the following problems: (a) queries accessing the data store are not executing quickly enough, indicating a low-performance access path or a poorly-coded query; (b) queries are accessing (and locking) data concurrently with update or load processes; (c) data load or update processes are running too long, locking out queries. In most cases the fix is a mixture of query performance tuning and data partitioning that separates load processes from queries (a partitioning sketch appears after this list).
- Hot spots. These are physical areas of data where multiple concurrent transactions access the same table row or rows. One common cause is new table rows being stored at the physical end of the table, perhaps because it is transactional data with ascending keys such as customer number or order number. While this may help performance by clustering the data in key sequence, the pages at the end of the table will be referenced very frequently, both by new row inserts and by queries. The same is true for the indexes that support these keys.
In a big data environment, hot spots generally do not occur. Analytical queries typically access historical data, while concurrent update processes load current data into another part of the database. Still, the DBA should watch for this: it might indicate poorly-scheduled load processes or queries accessing the wrong data.
- Growth patterns. The DBA will usually have a pre-defined purge process for the big data implementation. Old, stale, or unneeded data needs to be purged so that the big data database doesn’t get too big! However, many times the application development team does not define or create a purge process. The usual reason: “We won’t be purging for a year, so there’s plenty of time to get it defined.” Regrettably, in many cases that year passes with the purge process still undefined, and the DBA is left to deal with a combination of huge tables and application performance problems. Far better to deal with this before production implementation, and the load test is a good place for the DBA to demonstrate how data volume can affect performance (a partition-based purge sketch appears after this list).
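To make the partitioning fix concrete, here is a minimal sketch. It assumes DB2 for Linux, UNIX, and Windows range-partitioning syntax, and the schema, table, and column names (BIGDATA.SALES_HISTORY, SALE_DATE, and so on) are hypothetical, not taken from any real system. The idea is that load jobs write only to the current partition while analytical queries read the historical ones.

```sql
-- A minimal sketch, assuming DB2 for LUW range-partitioning syntax.
-- Schema, table, and column names are illustrative only.
CREATE TABLE bigdata.sales_history (
    sale_id   BIGINT        NOT NULL,
    sale_date DATE          NOT NULL,
    cust_id   INTEGER       NOT NULL,
    amount    DECIMAL(11,2)
)
PARTITION BY RANGE (sale_date) (
    PARTITION p2022     STARTING FROM ('2022-01-01') ENDING AT ('2022-12-31'),
    PARTITION p2023     STARTING FROM ('2023-01-01') ENDING AT ('2023-12-31'),
    PARTITION p_current STARTING FROM ('2024-01-01') ENDING AT ('2024-12-31')
);

-- Nightly loads touch only P_CURRENT, so the locks they take do not
-- block analytical queries reading the historical partitions.
```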
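The purge process from the last item can then ride on the same partitioning scheme: rolling data out a partition at a time is largely a metadata operation, far cheaper than deleting millions of rows. Again a hedged sketch, assuming DB2 for LUW syntax and the hypothetical table above:

```sql
-- A minimal purge sketch, assuming DB2 for LUW partitioned tables.
-- Detaching a partition removes a year of data without the logging
-- overhead of millions of individual DELETE statements.
ALTER TABLE bigdata.sales_history
    DETACH PARTITION p2022 INTO bigdata.sales_archive_2022;

-- The detached rows survive as an ordinary table, so they can be
-- unloaded to cheaper storage before the table is dropped.
```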
The Load Test Begins
In preparation for the load test, the DBA needs to have several processes ready. Here are the most common.
- Reorgs and RunStats. The DBA should be prepared to run index and tablespace reorgs and runstats before the load test begins (sample utility commands appear after this list). This sets up a production-like initial condition: data is in place, organized similarly to production, and data distribution statistics are up-to-date.
- Metrics for resource constraints. The most common resource constraints encountered when load testing a big data environment are waits for disk reads and writes, and the memory consumed by query processing. Decide ahead of time how these will be measured and what criteria define acceptable database service levels (a sample monitoring query appears after this list).
- Query performance monitoring. The point of most big data implementations is to have historical data available for analysis. This analysis may involve complex SQL and access to several large tables (the big data). Business analysts expect the analytics software (and SQL) to perform well; after all, a query that takes a week to execute will probably consume a lot of resources and produce an answer too late to be useful.
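For the reorg and runstats preparation, the exact commands depend on the platform. The sketch below assumes DB2 for LUW utilities run from the command line processor, against the hypothetical BIGDATA.SALES_HISTORY table used earlier:

```sql
-- A minimal pre-test setup sketch, assuming DB2 for LUW utilities;
-- the table name is hypothetical.

-- Reorganize the table and its indexes so physical clustering
-- resembles production.
REORG TABLE bigdata.sales_history;
REORG INDEXES ALL FOR TABLE bigdata.sales_history;

-- Refresh distribution statistics so the optimizer chooses
-- production-like access paths during the load test.
RUNSTATS ON TABLE bigdata.sales_history
    WITH DISTRIBUTION AND DETAILED INDEXES ALL;
```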
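For the read/write wait metrics, one option on DB2 for LUW is the MON_GET family of table functions. A hedged sketch; the interpretation in the comments is a rule of thumb, not a service-level criterion:

```sql
-- A minimal monitoring sketch, assuming the DB2 for LUW
-- MON_GET_BUFFERPOOL table function (NULL, -2 = all buffer pools,
-- all members). Physical reads that are a large fraction of logical
-- reads suggest the test is constrained by disk I/O rather than CPU.
SELECT bp_name,
       pool_data_l_reads AS logical_reads,
       pool_data_p_reads AS physical_reads,
       pool_read_time    AS read_wait_ms
FROM TABLE(MON_GET_BUFFERPOOL(NULL, -2)) AS t
ORDER BY pool_read_time DESC;
```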
After the Test
After reviewing the results, the DBA will probably take one or more actions to address the issues found. Since the most common issues are performance- and resource-related, the DBA will usually attempt to mitigate constraints through a process called resource constraint balancing. For example, a slowly executing query can be considered a response time constraint. The DBA may choose to add one or more new indexes to the tables it accesses, spending one resource (disk space for the indexes) to relieve another (query execution time).
Typical actions the DBA takes may include the following (sketched in the example after this list):
- Adding new table indexes, or removing unneeded ones;
- Changing the table locking parameters to lock more (or fewer) rows during query execution;
- Tuning SQL statements using DB2’s Explain command, or perhaps a software tool;
- Partitioning or re-partitioning tables.
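Here is a hedged sketch of the first three actions, assuming DB2 for LUW syntax, the hypothetical BIGDATA.SALES_HISTORY table from earlier, and (for EXPLAIN) that the explain tables have already been created:

```sql
-- Minimal sketches of typical post-test tuning actions, assuming
-- DB2 for LUW; all object names are illustrative only.

-- Add an index supporting a slow query's predicates.
CREATE INDEX bigdata.ix_sales_cust_date
    ON bigdata.sales_history (cust_id, sale_date);

-- Lock individual rows rather than the whole table, reducing
-- contention between concurrent loads and queries.
ALTER TABLE bigdata.sales_history LOCKSIZE ROW;

-- Capture the access path for a suspect statement; format the
-- captured plan afterward with: db2exfmt -d <dbname> -1
EXPLAIN PLAN FOR
    SELECT cust_id, SUM(amount)
    FROM bigdata.sales_history
    WHERE sale_date >= '2024-01-01'
    GROUP BY cust_id;
```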
In a big data environment, load testing is an essential tool for finding problems before they occur in production. The smart DBA insists on load testing as a condition for certifying the application as production-ready. And with big data implementations becoming more and more common, load testing should become a standard best practice in your IT enterprise.