Oracle Database 11g: SQL Plan Management, Part 2 - Page 2

By Jim Czuprynski

Posted Feb 28, 2008

SPM Scenario #2: Deploying a New Application

Whenever I’ve deployed a new application against my production databases, I know that I’m essentially at the mercy of my application developers and quality assurance analysts. Even though I’ve insisted on tight database object naming standards, strict adherence to PL/SQL best practices, and extensive testing of SQL statements using various combinations of bind variable values, I also know that these guidelines are sometimes unrealistic. When my shop’s application development team and QA resources can’t seem to agree upon a set of standards, or if my senior IT management decides that we’re spending too much time testing, I know there’s a good chance that a newly-developed in-house application will wreak havoc on the performance of a (hitherto) well-tuned Oracle database.

Another vector for potentially poor database performance is those marvelous third-party applications that we’re sometimes forced to implement by executive fiat. Every experienced Oracle DBA has encountered this situation at least once: An outside application is causing extremely poor performance, and upon investigation, it turns out the application has been written using what I like to call “agnostic SQL” by an offshore development team whose marching orders are to ensure that this application can run on any database in any OS environment. The end result, of course, is SQL whose source code is unavailable for direct tuning because it’s buried deep in the application layer.

The good news is that Oracle 11g does offer some hope for avoiding degraded performance when deploying a brand-new application in either circumstance:

  • For an “in-house” application, I only need to capture that application’s most typical SQL statements into a SQL Tuning Set while it’s undergoing evaluation in the QA environment.
  • For a third-party application, I can ask the vendor to send me all (or even just the most commonly executed) SQL statements before the application is implemented so that I can capture and evaluate them ahead of time. Alternatively, I can simply capture the third-party application’s most common SQL statements directly from the Library Cache while they’re being executed in a QA or development environment. (A sketch of one way to perform this capture follows this list.)
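
Neither capture route is shown in a listing on this page, so here is a minimal sketch of what pulling such a workload into a SQL Tuning Set and then promoting it to SQL Plan Baselines might look like. The tuning set name (STS_SFA_DEPLOY), the ten-minute polling window, and the filter on the SFA parsing schema are illustrative choices of mine, not values taken from the article’s listings:

    DECLARE
        plans_loaded PLS_INTEGER;
    BEGIN
        -- Create a SQL Tuning Set to hold the new application's workload
        DBMS_SQLTUNE.CREATE_SQLSET(
            sqlset_name => 'STS_SFA_DEPLOY'
           ,description => 'Representative SQL for the new SFA application'
        );

        -- Poll the Library Cache for 10 minutes, sampling once per minute,
        -- keeping only statements parsed by the SFA schema
        DBMS_SQLTUNE.CAPTURE_CURSOR_CACHE_SQLSET(
            sqlset_name     => 'STS_SFA_DEPLOY'
           ,time_limit      => 600
           ,repeat_interval => 60
           ,basic_filter    => q'[parsing_schema_name = 'SFA']'
        );

        -- Promote the captured statements' plans to accepted SQL Plan Baselines
        plans_loaded := DBMS_SPM.LOAD_PLANS_FROM_SQLSET(
            sqlset_name => 'STS_SFA_DEPLOY'
        );
        DBMS_OUTPUT.PUT_LINE('Plans loaded: ' || plans_loaded);
    END;
    /

Either way, the plans end up in the SMB as accepted baselines, ready to be packed into a staging table and shipped to production as described below.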

Once I’ve captured the new application’s SQL statements, I can use the existing QA or development database to capture their corresponding SQL Plan Baselines, and then transfer those baselines directly to the production database’s SMB. The end result? Those execution plans will already be present when the application is eventually deployed to production, relieving the CBO of the responsibility for building execution plans for these statements “on the fly.” Best of all, if better execution plans do eventually arise for these statements, the CBO will automatically evolve those improved plans.

Preparing the Simulation. Before I can simulate this scenario, I have some setup tasks to perform. I’ve gathered them in Listing 2.5:

  • I’ll first clear any SQL Plan Baselines from the SMB of my current Oracle 11g database using function DBMS_SPM.DROP_SQL_PLAN_BASELINE. This function accepts the SQL handle and plan name as arguments and removes the corresponding SQL Plan Baseline from the SMB. I’ve written an anonymous PL/SQL block that uses a cursor to capture all SQL Plan Baselines whose SQL text contains a comment like SPM_2, and then feeds the corresponding SQL handle / plan name combinations to the function (a sketch of such a block follows this list). This method yields fine-grained control over the contents of the SMB; I’ll use it several times throughout these scenarios to selectively “depopulate” the SMB when necessary.
  • I’ll then construct the components that will form the basis for a new Sales Force Administration application. As shown in SFA_Setup.sql, I’ll create a new schema owner (SFA), grant it appropriate system and object privileges, and finally create and populate several new tables within this schema.
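
Listing 2.5 itself is not reproduced on this page, but the anonymous block described in the first bullet might look roughly like the following sketch; the SPM_2 filter comes from the article, while the variable names are my own:

    DECLARE
        plans_dropped PLS_INTEGER;
    BEGIN
        -- Walk every baseline whose SQL text carries the SPM_2 identifying comment
        FOR bl IN (SELECT sql_handle, plan_name
                     FROM dba_sql_plan_baselines
                    WHERE sql_text LIKE '%SPM_2%')
        LOOP
            -- Drop that specific SQL handle / plan name combination from the SMB
            plans_dropped := DBMS_SPM.DROP_SQL_PLAN_BASELINE(
                sql_handle => bl.sql_handle
               ,plan_name  => bl.plan_name
            );
        END LOOP;
    END;
    /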

Capturing SQL Plan Baselines. I’ve illustrated the steps to capture a simulated SQL workload for this new application in Listing 2.6:

  • To simulate the capture of a sample workload in a testing environment, I’ll first ensure that the Library Cache and Database Buffer Cache of my current Oracle 11g database are empty before I execute the code shown in SPM_2_2.sql. These six queries – all labeled with a comment (SPM_2_2) for easier identification – show several different ways a user might decide to combine information about my sales force at the Region, District, and Territory level from the new application’s SFA schema with historical sales information in the Sales History (SH) schema. Note that I’m also using bind variables generously in these queries so that I’ll have the opportunity to evaluate other execution plans in future demonstrations.
  • Once the workload is generated and these six queries have been parsed into the Oracle 11g database’s Library Cache, it’s relatively simple to capture their SQL Plan Baselines into the SMB via function DBMS_SPM.LOAD_PLANS_FROM_CURSOR_CACHE. This function’s filtering capabilities allow me to capture only those SQL statements whose text contains the comment SPM_2_2 (see the sketch that follows this list).
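
Listing 2.6 is likewise not reproduced here, but the capture step boils down to a cache flush, the SPM_2_2 workload itself, and a single call to DBMS_SPM.LOAD_PLANS_FROM_CURSOR_CACHE. The sketch below assumes only the SPM_2_2 comment filter from the article:

    -- Flush both caches so the workload is parsed and executed from scratch
    ALTER SYSTEM FLUSH SHARED_POOL;
    ALTER SYSTEM FLUSH BUFFER_CACHE;

    -- ... run the six SPM_2_2 queries (SPM_2_2.sql) at this point ...

    DECLARE
        plans_loaded PLS_INTEGER;
    BEGIN
        -- Capture baselines only for cursors whose SQL text contains SPM_2_2
        plans_loaded := DBMS_SPM.LOAD_PLANS_FROM_CURSOR_CACHE(
            attribute_name  => 'SQL_TEXT'
           ,attribute_value => '%SPM_2_2%'
        );
        DBMS_OUTPUT.PUT_LINE('Baselines captured: ' || plans_loaded);
    END;
    /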

Exporting SQL Plan Baselines From a Test Environment. As shown in Listing 2.7, I’ll export the captured SQL Plan Baselines from my simulated testing environment:

  • First, I’ll use procedure DBMS_SPM.CREATE_STGTAB_BASELINE to create a SQL Plan Management staging table.
  • I’ll then use function DBMS_SPM.PACK_STGTAB_BASELINE to populate that staging table with only those SQL Plan Baselines that were created by the SYS user.
  • Finally, I’ll invoke DataPump Export to export the metadata and contents of that staging table. (A sketch of all three steps follows this list.)
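
Listing 2.7 is not shown on this page; a bare-bones version of those three steps might look like the following. The staging table name (SPM_STGTAB), its owner (SYSTEM), and the DataPump dump file and directory names are placeholders of my own, not names taken from the article:

    BEGIN
        -- Step 1: create the SQL Plan Management staging table
        DBMS_SPM.CREATE_STGTAB_BASELINE(
            table_name  => 'SPM_STGTAB'
           ,table_owner => 'SYSTEM'
        );
    END;
    /

    DECLARE
        plans_packed NUMBER;
    BEGIN
        -- Step 2: pack only those baselines created by SYS into the staging table
        plans_packed := DBMS_SPM.PACK_STGTAB_BASELINE(
            table_name  => 'SPM_STGTAB'
           ,table_owner => 'SYSTEM'
           ,creator     => 'SYS'
        );
        DBMS_OUTPUT.PUT_LINE('Baselines packed: ' || plans_packed);
    END;
    /

    -- Step 3: export the staging table with DataPump Export (run from the OS shell)
    expdp system DIRECTORY=DATA_PUMP_DIR DUMPFILE=spm_stgtab.dmp TABLES=SYSTEM.SPM_STGTAB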

Importing SQL Plan Baselines Into a Production Environment. To conclude this scenario, I’ll simulate the deployment of the SQL Plan Baselines to a production environment (see Listing 2.8):

  • First, I’ll simply import the SQL Plan Baseline staging table back into my target production environment.
  • Since I’m using the same Oracle 11g database to simulate both test and production environments, I’ll once again use function DBMS_SPM.DROP_SQL_PLAN_BASELINE to clear out any SQL Plan Baselines that contain a comment of SPM_2.
  • I’ll then reconstitute those baselines directly within my production database’s SMB via function DBMS_SPM.UNPACK_STGTAB_BASELINE. (A sketch of the import and unpack steps follows this list.)
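
Listing 2.8 is not reproduced here either; a skeletal version of the import and unpack steps follows, reusing the placeholder names from the export sketch above. In this single-database simulation, the drop-baseline block from the setup step would run between the import and the unpack:

    -- Import the staging table into the "production" database (run from the OS shell)
    impdp system DIRECTORY=DATA_PUMP_DIR DUMPFILE=spm_stgtab.dmp TABLES=SYSTEM.SPM_STGTAB

    DECLARE
        plans_unpacked NUMBER;
    BEGIN
        -- Reconstitute the packed baselines directly into this database's SMB
        plans_unpacked := DBMS_SPM.UNPACK_STGTAB_BASELINE(
            table_name  => 'SPM_STGTAB'
           ,table_owner => 'SYSTEM'
           ,creator     => 'SYS'
        );
        DBMS_OUTPUT.PUT_LINE('Baselines unpacked: ' || plans_unpacked);
    END;
    /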

Proof of Concept. I’ve verified the successful “unpacking” by querying data dictionary view DBA_SQL_PLAN_BASELINES. (I used the same query shown in Listing 1.3 in the prior article in this series.) The SQL Plan Baselines for my six target queries appear in the resulting output in Listing 2.9, tagged with an ORIGIN value of MANUAL-LOAD to indicate they originated via DBA intervention instead of the automatic SQL Plan Baseline capture method.
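
Listing 1.3 is not repeated on this page, but a verification query along the same lines, with a column list of my own choosing, would be:

    SELECT sql_handle
          ,plan_name
          ,origin
          ,enabled
          ,accepted
      FROM dba_sql_plan_baselines
     WHERE sql_text LIKE '%SPM_2_2%'
     ORDER BY sql_handle, plan_name;

The six captured statements should show an ORIGIN of MANUAL-LOAD, with ENABLED and ACCEPTED both set to YES.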

Next Steps

I’ve demonstrated how SQL Plan Baselines can effectively limit (if not eliminate) unexpected SQL statement regression during database upgrades, and how to capture SQL Plan Baselines before brand-new application code is deployed to limit plan regression as well. In the next and final article in this series, I’ll illustrate another scenario in which SQL Plan Management can help prepare the way for an upcoming database upgrade without having to export SQL Plan Baselines between Oracle 10g and 11g databases. I’ll also discuss how to:

  • Control the automatic evolution of SQL Plan Baselines
  • Employ manual methods to override plan evolution
  • Manage and monitor the automatic purging of outmoded SQL Plan Baselines

References and Additional Reading

While I’m hopeful that I’ve given you a thorough grounding in the technical aspects of the features I’ve discussed in this article, I’m also aware that newer or more complete documentation may have become available since this article was published. I therefore strongly suggest that you take a close look at the corresponding Oracle documentation on these features to obtain a crystal-clear understanding before attempting to implement them in a production environment. Please note that I’ve drawn upon the following Oracle Database 11gR1 documentation for the deeper technical details of this article:

B28274-01 Oracle Database 11gR1 Performance Tuning Guide

B28279-02 Oracle Database 11gR1 New Features Guide

B28419-02 Oracle Database 11gR1 PL/SQL Packages and Types Reference
