Validata Performance Tester delivers flexible, scalable performance testing across platforms
TRANSCRIPT

Contents
• Validata Advanced Testing Suite
• Validata Testing Approach
• Validata Testing Methodology
• Validata Performance Tester Overview
• Benefits
• Business Challenges
• Validata Performance Tester Case Study
Testing the performance of web-based applications can easily miss the mark. It’s easy to design unrealistic scenarios. Easy to collect and measure irrelevant performance data. And, even if you manage to design sound scenarios and collect the right data, it’s easy to use the wrong statistical methods to summarize and present the results.
Traditional performance testing approaches involve performance testing teams very late in the implementation lifecycle. Applications are tested and tuned only at the latest stages of the project, so business needs go unmet as the environment constantly changes.
Deep, flexible and efficient testing coverage therefore cannot be achieved with traditional testing tools.
Validata Advanced Testing Suite (ATS) provides a full end-to-end automated testing capability that adapts easily to changes in the application under test, ensuring higher quality and reduced costs and effort.
Validata ATS is a truly integrated testing and business process management solution, and the first model-driven test automation tool for Functional, Technical and Continuous Regression Testing.
Validata focuses on analytics (both context and content), providing root cause analysis that links requirements to testing. Full reporting is available on demand from the Executive Dashboard Module.
Project Success
Efficient Testing
• Reduced testing time - less time to develop, a shortened application life cycle, and faster time to market
• Reduced QA cost - the upfront cost of automated testing is easily recovered over the lifetime of the product; automated tests run at much lower cost, many times faster, and with fewer errors
Effective Testing
• Greater coverage - the productivity gains delivered by automated testing enable broader and more complete testing; greater coverage reduces the risk of malfunctioning or non-compliant software
• Improved testing productivity - test suites can be run earlier and more often
Improved Process
• Consistent test procedures - ensures process repeatability and resource independence, and eliminates manual errors
• Replicated testing - replicating tests across different platforms is easier using automation
• Results reporting - automated testing produces convenient reporting and analysis with standardized measures, allowing more accurate interpretation
Better Use of Resources
• Using testing resources effectively - testing is a repetitive activity; automating testing processes lets machines complete the tedious, repetitive work while resources are diverted to other tasks
• Test team members can focus on quality
Validata Performance Tester fulfills organizations' needs for performance testing of web-based applications. It is fully integrated with Validata ATS; incorporates SWIFT, ARC IB and other Internet Banking applications; and is designed to deliver a faster and more cost-effective approach to testing the reliability and scalability of critical IT systems. The specialized T24 adapters and the pre-built test scenario library accelerate the performance testing element of a project by 75%.
Objectives of Performance Testing:
• Ensure that the system provides adequate response times (verify performance requirements)
• Determine the maximum number of concurrent users (current system capacity)
• Meet end-user expectations
• Determine the optimal hardware and application configuration
• Identify performance bottlenecks
• Verify the scalability of the system
• Assess the impact on site performance of any hardware or software changes, new features, or functions
Validata ATS can perform parallel testing on multiple environments using its unique test engine adapter.
Performance Tester
Load Testing
Simulates the expected number of users, with average user interaction times, over a short period of time, under the load conditions that will occur in a live production environment.
Focuses on:
• The number of users accessing the server
• The combination of business transactions that are executed
• The impact on different environment components
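The "combination of business transactions" in a load test is typically expressed as a workload mix. As a minimal sketch (not Validata's mechanism; the transaction names and shares below are hypothetical), each simulated user's next transaction can be drawn by weighted random selection:

```python
import random

# Hypothetical transaction mix: transaction name -> share of total load.
MIX = {"funds_transfer": 0.50, "balance_enquiry": 0.30, "statement": 0.20}

def next_transaction(rng=random):
    """Pick the next transaction type according to the configured mix."""
    return rng.choices(list(MIX), weights=list(MIX.values()), k=1)[0]

# Over many draws the observed mix approaches the configured shares.
random.seed(0)
draws = [next_transaction() for _ in range(10_000)]
observed_share = draws.count("funds_transfer") / len(draws)
```

Defining the mix as data (rather than hard-coding it into scripts) is what lets the same scenario be re-weighted when the business-critical transaction profile changes.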
Stress Testing
Simulates worst-case scenarios for a short period of time.
Focuses on:
• Locating the point at which server performance breaks down
• Steadily increasing the number of simulated users until a breaking point is reached
• Identifying performance issues that might not otherwise be seen
• Verifying that the web site/application will perform as expected under peak conditions
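The "steadily increase simulated users until a breaking point" loop can be sketched as follows. This is an illustration, not Validata's implementation: the transaction is a stand-in (a real test would hit the system under test over HTTP or a socket), and the SLA threshold is a made-up number.

```python
import random
import time
from concurrent.futures import ThreadPoolExecutor

def transaction() -> float:
    """Stand-in for one business transaction; returns elapsed seconds.
    A real stress test would issue a request to the system under test."""
    start = time.perf_counter()
    time.sleep(random.uniform(0.001, 0.003))  # simulated service time
    return time.perf_counter() - start

def ramp_until_breaking_point(start_users=10, step=10, max_users=100,
                              sla_seconds=0.5):
    """Increase simulated concurrent users stepwise; stop when the mean
    response time breaches the SLA. Returns (last passing level,
    breaking level or None if the SLA was never breached)."""
    last_ok = None
    for users in range(start_users, max_users + 1, step):
        with ThreadPoolExecutor(max_workers=users) as pool:
            times = list(pool.map(lambda _: transaction(), range(users)))
        if sum(times) / len(times) > sla_seconds:
            return last_ok, users
        last_ok = users
    return last_ok, None

passing, breaking = ramp_until_breaking_point()
```

The level at which the SLA first fails is the breaking point; the last passing level is the verified capacity.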
Online Testing
• Socket-based transactions (interfaces): ATM, POS and Mobile
• Browser-based transactions (HTTP): module executions, mixed executions
• IB application transactions (HTTP): module executions, mixed executions
Batch Offline Testing
• Batch file transactions
• COB testing (daily, monthly, quarterly): report generation and interest accruals, account statements, noise transaction generation (T24 Browser)
Requirements Analysis
• Identify the required stakeholders, business analysts and infrastructure managers
• Organize and gather the business requirements
• Convert the business requirements into performance requirements and metrics
• Run workshops for knowledge transfer
Test Planning
• Collect the business-critical transactions
• Determine the required test volumes
• Prepare entry and exit criteria
• Prepare the testing schedules and estimations
• Check infrastructure availability
Test Design
• Identify the pre-test and post-test procedures
• Determine the test customization requirements and prerequisites
• Isolate monitoring requirements and the metrics to be collated
• Create and review the test cases
• Create and review the workload scenarios
Test Execution
• Execute smoke testing
• Set up the required environment monitors
• Execute the test and collate the results
• Share the test results with the project team
• Schedule the next execution cycle after resolution of the issues
Reporting
• Correlate test results from different test cycles
• Prepare the test summary document
• Present the test summary to the stakeholders
• Sign off
Performance Testing per Application
• Mix of critical transactions for performance testing
• Creation of cycles per transaction or group of transactions
• Cloning of cycles for multiple executions
• Execution of cycles with Validata pre-built T24 adapters
• Pre-built performance test cases
Transaction-Based Metrics

Metric to be Captured | Comments
Throughput (per sec) | Transactions per second
Response Times / Elapsed Times | Time taken to process a transaction
Types of Errors | Totals for the different types of errors for a particular test
Count of Errors | Total errors for a particular test
Transaction Count | Total transactions processed for the time period

Server-Based Metrics

Metric to be Captured | Comments
CPU Usage | User %, System %, Idle %, Wait %, Logical CPU
Memory Usage | Used %, Used in GB, Memory available
Disk I/O | Disk Read KB/sec, Disk Write KB/sec, IO/sec
Network Activity | MB/sec, Packets/sec, Size of packets, Bandwidth used
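The transaction-based metrics above can be derived from a flat log of transaction records. A minimal sketch, assuming each record is an (elapsed seconds, error type or None) pair collected over a known test window (the record format is an assumption, not Validata's):

```python
def summarize(records, window_seconds):
    """Compute throughput, response time and error metrics from a list
    of (elapsed_seconds, error_type_or_None) records collected over
    window_seconds of test time."""
    count = len(records)
    errors = [e for _, e in records if e is not None]
    error_types = {}
    for e in errors:  # totals per error type for this test
        error_types[e] = error_types.get(e, 0) + 1
    elapsed = [t for t, _ in records]
    return {
        "throughput_per_sec": count / window_seconds,
        "mean_response_time": sum(elapsed) / count if count else 0.0,
        "transaction_count": count,
        "error_count": len(errors),
        "error_types": error_types,
    }

stats = summarize([(0.2, None), (0.4, None), (0.6, "timeout"), (0.3, None)],
                  window_seconds=2.0)
```

Here 4 transactions over a 2-second window yield a throughput of 2 per second and a mean response time of 0.375 seconds, with one "timeout" error.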
Challenges
• Aggressive project plan
• Performance testing of a consistently changing environment
• Need for a performance testing solution in which the test cases could easily be updated to reflect the new environment
Solution Outline
• Developed a performance testing strategy and plan covering all aspects of environment testing
• Designated their resources, collaborating with those of the bank and Temenos, to manage the process, development and execution of the tests
• Deployed the Validata Performance Tester solution to produce all management reporting for project progress and defect management
• Delivered the full solution on a fixed-fee basis
Benefits Realized
• Easily manage all performance testing processes from start to finish
• Identify and overcome design issues and performance bottlenecks
• Effectively concentrate their resources by exploiting the product and resources provided by Validata
• Efficiently manage monthly testing costs
• Enjoy a cost-efficient solution that incorporated both resources and product
With a network of over 40 branches and many Internet Banking customers, Mauritius Commercial Bank (MCB) required a robust environment to continue providing customers the level of service they had enjoyed prior to the implementation of Temenos T24. MCB therefore identified the need for a performance testing tool to help validate the configuration of the environment.
Distribution and Scalability Executions
• Scalability from 50 to 2,500 virtual users
• Continuous executions of one hour
• Executions per module
• Mixed transaction executions
• Achieve full test coverage
• Decrease total performance testing time by up to 60-70%
• Maximum reusability of test assets, with minimum effort to maintain them
• Scriptless creation of scenarios, achieving 100% automation
• Truly de-skilled, reducing turnaround time by 50%
• Less time to prepare; faster time to market by 50%
• On the cloud: remote access and multiple-site support