Real Application Testing
How to ensure the quality of databases, their services, and content?

By Torsten Zimmermann



Ensuring application quality and performance is one of the biggest challenges companies face as they deploy more applications and move more of their services online. It is also increasingly expensive as the pace of new application releases, upgrades, and patches continues to grow and increases the need for testing.

And in terms of cloud solutions, the service quality – as well as the software quality – is gaining more attention. The question is: how to ensure that the quality meets the customer requirements? Oracle’s Real Application Testing certainly seems to deliver on the software quality front!

Companies have invested huge amounts in new applications to deliver better and more cost-effective services to their customers. But poor software quality can put these investments at risk. Studies have shown that more than 40 % of software applications are released with critical defects. And the cost of fixing those defects in production is up to 100 times more expensive than in the development phase. In a customer satisfaction survey performed for Siebel customers, a correlation was shown between the amount of testing performed and customer satisfaction with the application. In every measure, including overall product effectiveness, reliability, speed, and scalability, customers that did more testing and more formalized testing had better results than those that did not test. This is why Application Quality Management is very important for maintaining business agility and ensuring service levels while also reducing costs and risk.

Typical Testing Challenges

Due to ever-increasing and more stringent customer requirements, software testing is going to become more complex and expensive. Test managers and test engineers are therefore often faced with the following testing challenges:

▪ To verify whether applications are able to handle the actual circumstances (e.g. workloads, amount of data within the database, concurrent requests, user activities/interaction, etc.) of the production environment, the respective conditions have to be reproduced within the testing environment. But this test requirement is hard to fulfill with common test techniques within the test environment, and application testing on production environments represents a "no-go" with regard to testing best practices.

▪ Current process analysis of testing activities in various IT projects across several industries, with execution times ranging from months to years, shows increasing time and effort (human and hardware resources). This result can be attributed to the following two reasons:

▪ Today, the QA activities have to deliver answers to more quality characteristics compared with the past. As a consequence, more testing tasks have to be performed.

▪ Due to more complex failure/error situations compared with the past, the time required for application testing increases because of higher analysis efforts and the execution of more test cycles.

Figure 1. Overview of the Application Quality Management Solution

Overview of Oracle's Application Quality Management in Terms of Testing

Oracle now offers several testing tools with advanced features to operate Real Application Testing within the latest Oracle Enterprise Manager version 12c. These tools belong to the Application Quality Management (AQM) solutions, which can be accessed via Oracle Enterprise Manager. They provide high-quality testing for all tiers of the application stack to help companies identify application quality and performance issues prior to deployment, reduce costs, and ensure a positive experience for application end users.

Oracle Enterprise Manager's AQM solutions provide a unique combination of test capabilities which enable users to:

▪ Test infrastructure changes: Real Application Testing is designed and optimized for testing database tier infrastructure changes using real production workloads to validate database performance in your test environment.

▪ Test application changes: Application Testing Suite provides a complete end-to-end application testing solution that allows you to automate functional and regression testing, execute load tests, and manage the testing process.

▪ Manage test data and enable secure production-scale testing: the Data Masking Pack helps to achieve security and compliance objectives by obfuscating sensitive data in your production databases so they can be leveraged by non-production users in test environments.

Oracle Application Testing Suite is an integrated test solution that provides end-to-end testing capabilities for ensuring application quality, performance, and reliability. Application Testing Suite (ATS) accelerators make it the most comprehensive solution for testing Oracle packaged applications, such as E-Business Suite, Siebel, Fusion, JD Edwards, PeopleSoft, and Hyperion, and allow faster scripting and more reliable and robust scripts than any other solution on the market, with script reduction times of up to 50 % and overall test effort reduction of up to 25 %. Application Testing Suite includes a suite of products for automated functional testing, load testing, and test management. By automating test cases, testing and tuning application performance, and better managing test processes, ATS can help you deliver higher quality applications while also increasing the efficiency of testing teams. There are three separately licensed products in the Oracle Application Testing Suite:

▪ Oracle Test Manager: for documenting and managing the overall test process, including test requirements, test planning, test cases, test execution, and issues. Oracle Test Manager provides support for manual testing, unit testing, and automated testing.

▪ Oracle Functional Testing: for automating functional and regression testing of Web applications, Oracle packaged applications, and Web services.

▪ Oracle Load Testing: for automated load and performance testing of Web applications, Oracle packaged applications, databases, and Web services.

Introduction of Real Application Testing

One of the highlights of Oracle's Application Quality Management solution is the Real Application Testing feature. It provides the highest quality load testing solution for the database stack, complementing ATS's capabilities. Real Application Testing can be used for testing existing applications for system changes related to the database stack or below. Some typical examples of supported system changes that happen in a database operational environment are: operating system and hardware upgrades, storage subsystem changes, database upgrades and/or patches, RAC instance addition, conversion to RAC, migration to Exadata V2, and database parameter or optimizer-related changes.

Figure 2. Real Application Testing Modules: Application Replay, SQL Performance Analyzer, and Database Replay

Real Application Testing includes two solutions to test the effect of system changes on real-world applications:

▪ SQL Performance Analyzer (SPA) to assess the impact of system changes on SQL response time by identifying any variations in SQL execution plans and performance statistics resulting from the change.

▪ Database Replay to effectively test system changes in test environments by replaying a full production workload on the test system to help determine the overall impact of the change on the workload.

Database Replay and SPA together provide a comprehensive, flexible, and end-to-end solution for assessing the impact of database stack-related changes. They enable businesses to fully assess the outcome of a system change in a test environment, take any corrective action if necessary, and then introduce the change safely to production systems, minimizing the undesirable impact on them. The functionality of Real Application Testing is accessible both from Oracle Enterprise Manager and command-line APIs, but resides within the Oracle database.

SQL Performance Analyzer

Changes in SQL execution plans due to routine system changes, such as optimizer statistics refresh, schema changes, upgrades, or patch set application, often severely impact production system performance and stability. Therefore, the ability to perform fine-grained SQL response-time assessment with SPA and to fix any regressions is important to the smooth functioning of any application. SPA runs the SQL statements in an isolated and serial manner in before-change and after-change environments and provides a detailed change impact report showing which SQL statements have remained the same, improved, or regressed.

SPA functionality is integrated with database tuning solutions like SQL Tuning Advisor and SQL Plan Management. As a result, SPA completely automates and simplifies the manual and time-consuming process of identifying application SQL performance regressions in even extremely large SQL workloads. Furthermore, the identified SQL regressions can be automatically remediated, saving users a lot of time and effort.

The SPA report summarizes the change impact on the entire workload as well as the net impact on individual SQL statements. Figure 1 below shows Oracle Enterprise Manager’s SPA Task Result page for a completed SPA test run.
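The SPA workflow is also scriptable through the DBMS_SQLPA package. Below is a minimal sketch of a before/after comparison; the tuning set and task names (STS_PROD, SPA_UPGRADE) and the choice of elapsed_time as the comparison metric are illustrative assumptions, not values prescribed by the article.

-- Minimal SPA sketch; assumes an existing SQL Tuning Set STS_PROD
DECLARE
  l_task VARCHAR2(64);
BEGIN
  -- Create a SPA task on top of the captured SQL Tuning Set
  l_task := DBMS_SQLPA.CREATE_ANALYSIS_TASK(sqlset_name => 'STS_PROD',
                                            task_name   => 'SPA_UPGRADE');
  -- Before-change trial
  DBMS_SQLPA.EXECUTE_ANALYSIS_TASK(task_name      => 'SPA_UPGRADE',
                                   execution_type => 'TEST EXECUTE',
                                   execution_name => 'before_change');
  -- ... apply the system change (e.g. the database upgrade) here ...
  -- After-change trial
  DBMS_SQLPA.EXECUTE_ANALYSIS_TASK(task_name      => 'SPA_UPGRADE',
                                   execution_type => 'TEST EXECUTE',
                                   execution_name => 'after_change');
  -- Compare both trials, here by elapsed time
  DBMS_SQLPA.EXECUTE_ANALYSIS_TASK(task_name        => 'SPA_UPGRADE',
                                   execution_type   => 'COMPARE PERFORMANCE',
                                   execution_params => DBMS_ADVISOR.ARGLIST(
                                       'execution_name1',   'before_change',
                                       'execution_name2',   'after_change',
                                       'comparison_metric', 'elapsed_time'));
END;
/
-- Retrieve the change impact report
SELECT DBMS_SQLPA.REPORT_ANALYSIS_TASK('SPA_UPGRADE', 'TEXT', 'TYPICAL', 'SUMMARY')
  FROM dual;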

Application Replay

The Oracle Application Replay Pack offers test automation based on middle-tier or HTTP requests and enables realistic testing of planned changes to any part of the application stack, from the application server down to disk, by re-creating the production workload on a test system. Using the Application Replay feature, workload on the production system can be captured and replayed on a test system with the exact timing, concurrency, and transaction characteristics of the original workload. This enables assessment of the impact of any change, including new contention points, SQL execution plan regressions, or undesired results. Extensive analysis and reporting is provided to help identify any potential problems, such as new errors encountered and performance divergence. The Oracle Application Replay Pack can help assess various system changes, such as application server upgrades, hardware updates, and operating system or configuration changes.

Database Replay

Database Replay provides full workflow coverage and uses a real production-scale workload, which results in the highest quality of testing. Database Replay allows you to capture a production workload with negligible performance overhead and to replay it on a test system with the exact timing, concurrency, and transaction characteristics of the original workload.

By replaying a real production-scale workload, Database Replay provides a complete assessment of the impact of the change, including identifying undesired results, new contention points, or performance regressions. It also provides extensive analysis and reporting to help identify potential problems, such as new errors encountered and performance divergence. Thus the task of assessing a system change using Database Replay can be reduced from months to days.

The Database Replay procedure has to be performed in four steps:

1. Workload capture: When workload capture is enabled on the production system, all external client requests directed to the Oracle Database are tracked and stored in binary files, called capture files, on the file system. The user specifies the location of the capture files as well as the duration of the capture. These files contain all relevant information needed for replay, such as SQL text, bind values, wall clock time, system change number, etc. The capture is database-wide and RAC-aware and, hence, needs to be initiated on one instance only. The progress of the capture and the system activity can be controlled via Enterprise Manager.

2. Workload processing: Once captured, the workload files have to be processed before replay. This creates the necessary metadata needed for replaying the workload. It is important that the processing be done once on the same database version as the replay system. Once processed, the captured workload can be replayed repeatedly on the same version.

Figure 3. Main steps of Database Replay: Workload Capture, Workload Preprocessing, Workload Replay, and Analysis and Reporting





3. Workload replay: After the workload has been processed, it is ready for replay. The test system should have the change applied, and the database should reflect the same application data state as at the start of the capture. A special client called the Replay Client is provided to issue the workload to the database server with the same timing and concurrency characteristics as on the capture system. A calibration tool is provided to help determine the number of replay clients, since multiple replay clients may be needed to drive large workloads. The workload files have to be made available to the database server as well as to all the replay clients.

4. Analysis and reporting: Extensive reports are provided to enable detailed analysis of capture and replay. Any errors or divergence in data encountered during replay are reported. It is highly recommended to have an application-level validation script to assess the state of the application data after replay. Reports using the AWR data are also available for detailed performance analysis. AWR data is automatically exported into the workload directory after the replay, allowing comparison between replays.
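These four steps map directly onto the Database Replay command-line APIs mentioned earlier. The following is a minimal sketch using the DBMS_WORKLOAD_CAPTURE and DBMS_WORKLOAD_REPLAY packages together with the wrc replay client; the directory object and capture/replay names (RAT_DIR, cap_prod, rep_test) and the one-hour duration are illustrative assumptions.

-- 1. Capture on the production system (names are illustrative)
CREATE DIRECTORY rat_dir AS '/u01/app/oracle/rat';
EXEC DBMS_WORKLOAD_CAPTURE.START_CAPTURE(name => 'cap_prod', dir => 'RAT_DIR', duration => 3600);
-- ... capture stops after the duration, or can be ended explicitly:
EXEC DBMS_WORKLOAD_CAPTURE.FINISH_CAPTURE;

-- 2. Process the capture files once, on the same version as the replay database
EXEC DBMS_WORKLOAD_REPLAY.PROCESS_CAPTURE(capture_dir => 'RAT_DIR');

-- 3. Replay on the test system: initialize, prepare, start the replay clients
EXEC DBMS_WORKLOAD_REPLAY.INITIALIZE_REPLAY(replay_name => 'rep_test', replay_dir => 'RAT_DIR');
EXEC DBMS_WORKLOAD_REPLAY.PREPARE_REPLAY(synchronization => TRUE);
-- From the operating system shell: calibrate, then start one or more replay clients
--   wrc mode=calibrate replaydir=/u01/app/oracle/rat
--   wrc system/<password> mode=replay replaydir=/u01/app/oracle/rat
EXEC DBMS_WORKLOAD_REPLAY.START_REPLAY;

-- 4. Analysis and reporting: see the report sketch in Example 1 below.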

Use Case Scenarios

Some use case scenarios (based on testing projects performed at customer sites) are explained in the following sections to demonstrate how Real Application Testing can improve software quality and provide better application coverage by using real workload.

Example 1: Upgrade of the Database Version

Challenge

A production environment is still on an older database version. IT management wants to upgrade to the latest database release in order to be on supported software releases and compliant with the corporate security standards. It is important to maintain the committed performance and quality requirements after the upgrade. Any unplanned outages are expensive and may cause consumers to choose other providers.

Solution

The solution in this example uses SQL Performance Analyzer as well as Database Replay. IT management needs to identify whether any performance regressions due to plan changes need to be addressed, and whether any performance issues or errors caused by bugs or code changes arise in the new release.

Approach

First, performance issues due to plan changes should be addressed. Every time a new version is introduced, there are changes in the optimizer that may cause a few SQL statements to have sub-optimal plans. It is important to address SQL performance regression before we perform concurrency testing with Database Replay. This enables SQL to be eliminated as a source of performance problems. The following steps have to be performed:

▪ Capture a SQL Tuning Set (STS) from production. It is recommended to capture several STSs from different types of workload, e.g. daytime activity, night batches, and month-end processing.

▪ Run SPA trial against production or production copy to establish a baseline.

▪ Upgrade the database.

▪ Run SPA trials against the upgraded database.

▪ Identify regressed SQL statements with the SPA report.

▪ Tune regressed SQL statements. Hint: the Tuning Advisor offers additional insight and recommendations, such as SQL profiles, new indexes, or fresh statistics. If these actions do not help, it is possible to revert to the old plans for the regressed SQL using SQL Plan Management.

▪ Run SPA trial to validate performance improvements. Always implement one change at a time and validate it with a new SPA trial. Some tuning recommendations may inadvertently impact other SQL statements, so it is important to test one change at a time to understand cause and effect.

▪ Implement tuning advice and run SPA trials until all regressions are solved in an iterative process.
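The STS capture in the first step, and the plan revert mentioned in the tuning hint, can be scripted as follows. This is a minimal sketch under illustrative assumptions: the tuning set name STS_PROD, a one-hour capture window polled every five minutes, and loading all plans from the STS as accepted SQL plan baselines.

-- Capture a SQL Tuning Set from the cursor cache (names and durations illustrative)
BEGIN
  DBMS_SQLTUNE.CREATE_SQLSET(sqlset_name => 'STS_PROD');
  DBMS_SQLTUNE.CAPTURE_CURSOR_CACHE_SQLSET(sqlset_name     => 'STS_PROD',
                                           time_limit      => 3600,  -- total capture time in seconds
                                           repeat_interval => 300);  -- polling interval in seconds
END;
/

-- Revert regressed statements to their old plans via SQL Plan Management:
-- load the captured plans as accepted SQL plan baselines
DECLARE
  l_plans PLS_INTEGER;
BEGIN
  l_plans := DBMS_SPM.LOAD_PLANS_FROM_SQLSET(sqlset_name => 'STS_PROD');
END;
/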

DOs and DON'Ts

1. Real Application Testing will reduce the time spent testing, but it is still important to create a testing project plan in which scope, objectives, limitations, and success criteria are well defined.

2. Identify the latest RAT bundle patch for both the capture and the replay environment. Patch information can be found in My Oracle Support note #560977.1. It should be considered a mandatory patch.

3. Run the Workload Analyzer to identify whether there are any non-supported workloads or any issues that should be considered before replay.

4. Always start with SQL Performance Analyzer to identify any performance regression.

5. Solve all identified regressions before performing database replay.

6. Capture SQL Tuning Sets for the same period as for the database replay.

7. Identify critical workloads; perform the capture of each one as a separate capture.

8. For database replay, do not start the capture during peak workload; start it some time before the peak occurs.

9. If a batch workload should be captured, always start some time before the batch is initiated.

10. Start with smaller capture periods, such as 30 minutes to 1 hour; once all issues have been resolved, extend to longer periods.

11. Avoid really long captures, such as several days. They are difficult to monitor, and there is a big risk that even a 0.1 % error rate will be overwhelming to diagnose. One insert that fails can cause a huge amount of cascading errors.

12. Do not replay a complete workload on a smaller system than the one from which it has been captured. The replay will reproduce the production workload. If the test system is smaller, consider using replay filters to limit the workload.

13. Consider Flashback Database as the restore method between replays.

14. Use both the out-of-the-box SPA and Database Replay reports and your own application-dependent reports.



Now that all regressions have been resolved, the database replay can be initiated. Best practice is to start with short replays and then progress to longer replays.

▪ Capture production workload for an interesting time period.

▪ Perform a backup of the production system. Make sure that you can restore the system to a point in time that aligns with the start of the capture. The capture report contains the SCN (System Change Number) for the start of the capture. Background: each time a commit is performed, the database increments the SCN, which is stored in block headers, log files, and control files. When the database is shut down, all artifacts need to be consistent; otherwise the database would be restored with mismatching transactions.

▪ Process the captured workload for the version on which you are going to replay.

▪ Check the result from the Workload Analyzer. The Workload Analyzer identifies common problems that can occur during replay, helping to detect and correct issues before the replay starts.

▪ Restore production database to test system.

▪ If replay is done on the upgraded system, then implement tuning recommendations from the SPA trials.

▪ Replay the workload. Replay can be done in different synchronization modes. Use the one that suits your application by validating it with application validation scripts.

▪ Analyze the result. Use the replay report, the compare period report, ADDM (Automatic Database Diagnostic Monitor), and standard AWR (Automatic Workload Repository) reports.

▪ Analyze any errors to find their root causes and appropriate bug fixes.

▪ It is not uncommon for the replay to have errors or fail on the first attempt. This is most likely due to an incorrect test system setup, new issues discovered with the system change, and so on.

▪ Identify patches and replay again until the expected requirements have been met.
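The replay report used in this analysis can be fetched through DBMS_WORKLOAD_REPLAY. A minimal sketch, assuming the illustrative directory object RAT_DIR from the earlier sketch and that the replay of interest is the most recent one:

-- Fetch the text replay report for the latest replay in RAT_DIR
SET SERVEROUTPUT ON
DECLARE
  l_cap_id    NUMBER;
  l_replay_id NUMBER;
  l_report    CLOB;
BEGIN
  -- Load capture/replay metadata from the replay directory
  l_cap_id := DBMS_WORKLOAD_REPLAY.GET_REPLAY_INFO(dir => 'RAT_DIR');
  -- Pick the most recent replay of that capture
  SELECT MAX(id) INTO l_replay_id
    FROM dba_workload_replays
   WHERE capture_id = l_cap_id;
  -- Text version of the replay report (errors, divergence, performance)
  l_report := DBMS_WORKLOAD_REPLAY.REPORT(replay_id => l_replay_id,
                                          format    => 'TEXT');
  DBMS_OUTPUT.PUT_LINE(SUBSTR(l_report, 1, 32000));
END;
/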

Results

SPA identified the individual SQL statements with performance issues. Database Replay identified concurrency-related performance issues. All the issues identified should be corrected before the upgrade system change is deployed in production. The comprehensive testing and analysis based on Real Application Testing is able to recreate most of the production workload and its characteristics on the test system. Therefore, no major surprises should come to light when the new database version is deployed in production.

Example 2: Consolidation and validation of new hardware

Challenge

There are 80 separate Oracle databases supporting 95 business services on 15 servers within a production environment. These run 5 different operating systems and 6 different versions of the database. The environment is difficult to maintain. Many servers are now at the end of their support and need to be replaced with newer hardware.

IT management wants to consolidate the databases into one standard platform. The new estate should contain about 20 production databases on 6 servers. This will help IT management to be more compliant with security updates and to reduce maintenance costs.

Approach

The first objective is to find cost-efficient hardware that would be able to support the production workload. The hardware vendor offers two different hardware configurations with different architectures that it claims are sufficient to meet the requirements.

Each platform has different specifications concerning CPU, I/O, etc., based on different cost models. But which platform would represent the best alternative? A mere comparison of performance figures is insufficient to find comprehensive answers to that question. The respective hardware validation was therefore done by using Real Application Testing on the most important production database.

The table below explains the differences between the SPA and Database Replay methods:

Test focus
  SPA: single SELECT analysis
  Database Replay: entire database workload analysis

Repeat?
  SPA: yes
  Database Replay: yes, but the respective configuration is needed

Installation
  SPA: no installation needed; patches may need to be installed (please refer to Oracle Support Note 560977.1)
  Database Replay: no installation needed; patches may need to be installed (please refer to Oracle Support Note 560977.1)

Usage before 11g
  SPA: supported in the following cases: from 9.x to 10.x, from 9.x to 11g, from 10.x to 10.x, from 10.x to 11g, from 11g to 11g (please refer to Oracle Support Note 560977.1)
  Database Replay: supported in the following cases: from 9.2.0.8 to 11g, from 10.2.0.x to 11g (please refer to Oracle Support Note 560977.1)

Workload type
  SPA: SELECT statements, DML via the respective package
  Database Replay: entire database workload, e.g. SELECT, PL/SQL, DML, DDL, transactions, etc.

Test prerequisites
  SPA: the SQL Tuning Set (STS) has to be executed
  Database Replay: none

Test results
  SPA: single execution path analysis
  Database Replay: Capture and Replay Report, Compare Period Report, AWR and ASH Reports

Technical Comparison between SPA and Database Replay


To avoid any SQL performance impact, SPA is used to identify performance regressions. The approach is the same as in Example 1; in other words, the same activities have to be executed as listed there.

After a successful trial in which no performance regression remains, the workload should be replayed with Database Replay. All database analyses are done using AWR reports and primarily focus on database time. In parallel, OS statistics should be monitored throughout the complete survey:

▪ Execute a first iteration of the replay to ensure that the test contains no major errors. The parameter sync=false may not work for all applications. Therefore, it is recommended to try a few replay parameters prior to the planned testing, such as sync=true/false, together with application validation scripts. The parameter combination which works best for the application should be used. Additionally, the think_time or connect_time parameters can be adjusted to increase or decrease the load on the system. Our first test starts with a configuration suggested as sufficient by the hardware vendor.

▪ The workload is replayed on each platform to detect any performance changes compared to production. By picking the top 10 statements from the AWR statistics, such as CPU, I/O, and elapsed time, the efficiency of each configuration can be measured.

▪ TASK 1/Identification of the CPU bottleneck: This point can be assessed by replaying the workload and decreasing the number of CPUs for each platform until the wait time starts to increase (visible in the AWR report). This method allows the minimal configuration to be detected.

▪ TASK 2/Platform scalability: The captured workload has to be performed with SCALEUP_MULTIPLIER (see the PREPARE_REPLAY sketch after this list). This parameter will multiply the number of users and, from each shadow session, replay the read-only part of the workload multiple times. In addition, the parameters connect_time and think_time can be reduced to replay the same workload faster.

▪ Modification of the SCALEUP_MULTIPLIER (2, 4, or 8) helps to determine the scalability of each configuration. This method helps to identify the platform with the best price-performance ratio.

▪ TASK 3/Reduction of the number of databases through consolidation onto fewer servers: Verify whether multiple applications can coexist in the same database. This verification can be performed by using consolidated replay. With consolidated replay it is possible to replay multiple captures against the same database. This is done in the following way:

▪ Capture SQL Tuning Sets for the databases that should be consolidated.

▪ Run SQL Performance Analyzer for each individual database that should be consolidated to identify SQL regressions.

▪ Capture interesting workload from the databases that should be consolidated.

▪ Replay each capture in isolation to identify and fix any problem concerning performance or errors.

▪ Run a consolidated replay. Each successful iteration adds an additional capture to the replay schema. The performance values from the server as well as the database side have to be monitored and assessed. The replay report highlights any new errors or performance issues seen during the replay.
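The replay parameters used in these tasks map onto DBMS_WORKLOAD_REPLAY.PREPARE_REPLAY. A minimal sketch with illustrative values (unsynchronized replay, connect and think times halved, scale-up factor 4); the replay name and directory are carried over from the earlier sketch and remain assumptions:

-- Prepare a scale-up replay; all values are illustrative
BEGIN
  DBMS_WORKLOAD_REPLAY.INITIALIZE_REPLAY(replay_name => 'rep_scaleup',
                                         replay_dir  => 'RAT_DIR');
  DBMS_WORKLOAD_REPLAY.PREPARE_REPLAY(
    synchronization     => FALSE, -- corresponds to sync=false above
    connect_time_scale  => 50,    -- 100 = connect pacing as captured
    think_time_scale    => 50,    -- halve user think time
    scale_up_multiplier => 4);    -- replay the read-only workload 4x per session
END;
/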

Results

IT management could identify which platform alternative represents the best price-performance ratio. Furthermore, the applications can be migrated and consolidated without undue risk. The executed tests prove that all applications can coexist in the same database without interfering with each other, and that enough resources are available for all databases and applications.

Example 3: Optimizer Statistics Refresh

Challenge

A large web shop is reluctant to update table statistics, since earlier attempts to refresh statistics resulted in a few SQL regressions and sub-optimal performance. As a result, the system has been running for a long period with stale statistics, but performance has degraded during the last month.

Advantages of Real Application Testing

In comparison to common software testing, Real Application Testing provides the following advantages, which have been realized in several IT projects:

1. By using real production workload for testing, organizations have identified and corrected issues in testing before production deployment. This has led to improved performance, SLAs, and stability of production systems.

2. By using Real Application Testing on an ongoing basis, organizations have been able to reduce the burden on DBAs who previously had to be involved in firefighting operational issues in the production environment. Now these resources are directed towards more proactive and strategic parts of the business.

3. DBAs now have a test infrastructure at their disposal to test their changes without the overhead of having to duplicate an entire application infrastructure. Database Replay does not require the set-up overhead of having to recreate a middle tier or a web server tier. Thus, DBAs and system administrators can rapidly test and upgrade data center infrastructure components with the utmost confidence, knowing that the changes have truly been tested and validated using production scenarios.

4. Another major advantage of Database Replay is that it does not require the DBA to spend months getting a functional knowledge of the application and developing test scripts. With a few point-and-clicks, DBAs have a full production workload available at their fingertips to test and roll out any change. This cuts down testing cycles from many months to days or weeks, and brings significant cost savings to businesses as a result.

5. The new testing approach ensures smooth production go-lives, with no surprises in the productive environment due to undetected weaknesses.


The database administrator now needs to gather new statistics to maintain performance, but new performance problems may arise as a result.

Approach

Since Oracle RDBMS 11gR1, the functionality to gather statistics in a pending mode has been available. This means new statistics will be gathered, but the respective information is not published and therefore not available to the optimizer. SQL Performance Analyzer can use pending statistics to validate the performance. Enterprise Manager 12c contains a guided SPA workflow that can be used to validate new statistics. Since SPA does not change the database content, it is safe to validate performance in the production environment. The following steps have to be performed (see also the sketch after this list):

▪ Capture a SQL Tuning Set (STS) from production. It is recommended to capture several STSs from different types of workload, e.g. daytime activity, night batches, and month-end processing.

▪ Gather new statistics in pending mode.

▪ Run SPA guided workflow “Optimizer Statistics” in production.

▪ Identify regressed SQL statements with the SPA report.

▪ Tune regressed SQL statements. Hint: the Tuning Advisor offers additional insights, such as SQL profiles, new indexes, or fresh statistics. If these actions do not help, then the old plan can be loaded into SQL Plan Management.

▪ Run SPA trial to validate performance improvements. Always implement one change at a time and validate it with a new SPA trial. Some advice can cause other statements to regress, and by implementing several pieces of advice at once it is hard to assess which change affected the result.

▪ Implement tuning advice and run SPA trials until all regressions are solved in an iterative process.

▪ Make sure that the new statistics do not cause any performance regression. If the results fulfill the requirements, the new statistics can be published.
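The pending-statistics mechanism described above is driven by DBMS_STATS. A minimal sketch for a single table; the schema and table names (SH.SALES) are illustrative assumptions:

-- Gather statistics in pending mode instead of publishing them immediately
BEGIN
  DBMS_STATS.SET_TABLE_PREFS(ownname => 'SH', tabname => 'SALES',
                             pname   => 'PUBLISH', pvalue => 'FALSE');
  DBMS_STATS.GATHER_TABLE_STATS(ownname => 'SH', tabname => 'SALES');
END;
/

-- A session (e.g. a SPA trial) can test against the pending statistics with:
ALTER SESSION SET optimizer_use_pending_statistics = TRUE;

-- After successful validation, publish the pending statistics ...
EXEC DBMS_STATS.PUBLISH_PENDING_STATS(ownname => 'SH', tabname => 'SALES');
-- ... or discard them if regressions cannot be resolved:
-- EXEC DBMS_STATS.DELETE_PENDING_STATS(ownname => 'SH', tabname => 'SALES');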

Results

The database administrator has been able to validate the new statistics as well as to identify and tune SQL statements that would have regressed after implementing fresh statistics. He can be confident that the fresh statistics will not have any negative impact on this business-critical system.

Conclusions

Ensuring the quality and performance of your enterprise applications requires a comprehensive approach to application quality management.

This requires thorough testing of all tiers of the application stack prior to deployment. This includes testing both applications and infrastructure, for new application deployments as well as upgrades of existing applications. Comprehensive testing requires validating both application functionality and performance under real-world operating conditions. And, to maximize efficiency, it is important to have an effective framework in place to plan and manage your test processes and to leverage test automation to reduce the need for manual testing.

Oracle Enterprise Manager provides a comprehensive set of Application Quality Management solutions which include Application Testing Suite, Real Application Testing, and Data Masking Pack. This best-of-breed Oracle AQM offering allows you to test the entire stack from application to database. It provides the only load testing solution on the market that combines both real and synthetic workload testing. And it enables high quality and secure testing with the lowest risk of change. Oracle AQM solutions provide a heterogeneous testing solution that is also optimized for testing Oracle applications. Oracle AQM will help you reduce test cycle times and costs while increasing the quality and performance of your applications. ◼

> about the author

Since 1985, Torsten Zimmermann has been developing software applications for business and administration. After completing his degree as "Diplom-Wirtschaftsinformatiker" (1993), he became familiar with quality management within the software lifecycle. Since 1995, he has been contributing to international projects in the area of software quality and quality/test management at various corporations, including BMW, DaimlerChrysler, German Railways, Hewlett-Packard, Hoffmann-La Roche, and Logica. Over the years, he has become an expert at the European level. He has developed a risk-based testing approach which was published in the professional publication "QZ", among others, and is now established as state-of-the-art knowledge in the area of software quality assurance. His accumulated experience led to the creation of T1 TFT (Test Framework Technologies, 2001), opening a new area of testing systems. Torsten Zimmermann is now developing new approaches to enhanced test concepts and test frameworks, such as T2 TFT (2004) and T3 TFT (2006). In cooperation with a network of universities, this work is creating new solutions for rule- and model-based testing systems. Within this, he is also considering near-shore/off-shore concepts in terms of software development. In his role as a speaker at conventions and specialist author in well-known magazines, he periodically presents his conclusions, results, and concepts at national and international level.

Script example: Request a SQL Tuning Advisor report

-- SQL Tuning Advisor Report, TASK_NAME from TUNINGTASK.sql
/* Level: BASIC, TYPICAL, ALL
   Section to specific areas: SUMMARY, FINDINGS, PLAN, INFORMATION, ERROR, ALL */
set long 10000000 longchunksize 1000000 linesize 200 head off feedback off echo off
variable report clob
execute :report := DBMS_SQLTUNE.REPORT_TUNING_TASK(task_name => 'TASK_TUNING_SPA',-
                     type => 'TEXT',-
                     level => 'TYPICAL',-
                     section => 'ALL');
print report

-- with views:
set pause on
select TASK_NAME, RECOMMENDATION_COUNT from dba_advisor_tasks order by execution_end desc;

-- if there are recommendations, generate the implementation script:
select dbms_sqltune.script_tuning_task('TASK_TUNING_SPA') from dual;