The Four Hats of Load and Performance Testing with Special Guest Mentora
DESCRIPTION
Performance testing may be the most critical function for assuring business success and continuity under unexpected application stress conditions. Professionals in this domain develop several key skills to model realistic workloads, develop robust scripts, monitor complex environments, and deliver actionable results. In this webinar, hear how good teams effectively utilize the skills associated with the four hats of performance testing:
- Business Analyst, for effective test planning
- Developer, for creating maintainable scripts
- Systems Engineer, to identify and configure resource monitors
- Data Analyst, to interpret and report results
Dan Downing, Managing Principal at Mentora, is a veteran performance tester, teacher, author, and presenter with 30 years of enterprise testing expertise. Join Dan and fellow test industry veteran Brad Johnson, SOASTA's VP of Product, as they explore these four key areas where skills and expert tools must intersect to deliver speed and quality in today's fast-moving companies.
About the presenters:
Dan Downing, Managing Principal, Application Testing, Mentora
Dan leads the Enterprise Application Performance Testing practice and serves as the principal consultant for quality assessments and large enterprise projects. He has 30 years of technical and leadership experience as a programmer, sales engineer, product manager, and senior manager, and has led enterprise load testing projects for a variety of industries. Dan is widely regarded as a subject matter expert in load testing and created the 5-Steps of Load Testing methodology taught at Mercury Interactive. He is a frequent presenter at software quality conferences such as STAR, STPCon, and the Workshop on Performance and Reliability, for which he is one of the organizers.
Brad Johnson, VP Product, SOASTA
Brad Johnson has been supporting testers since the turn of the last century as head of monitoring and test products at Compuware, Mercury Interactive, and Borland. He joined the new school of testing in 2009 when he signed on with SOASTA to deliver cloud testing on the CloudTest platform to a skeptical and established software testing market. Now, with the experience of tens of thousands of tests, and hundreds of companies embracing the cloud and using the same for mobile test automation, he's helping expand the horizons of testers everywhere.
TRANSCRIPT
© 2013 SOASTA. All rights reserved.
The Four Hats of Load & Performance Testing
How to plan, execute & deliver actionable results that matter!
A webinar presented with Mentora
Utilizing diverse skills for effective, realistic performance tests
In This Webinar
TODAY’S PRESENTERS
Dan Downing: Managing Principal, Mentora - @dandowning_ma
Brad Johnson: VP Product Marketing, SOASTA - @bradjohnsonsv
Ed Salazar: Sr. Performance Engineer, SOASTA
Agenda:
• Poll question
• Understanding and illustrating the "four hats"
• CloudTest demonstration
Questions: Submit in chat box during event
o First End-to-End Quality as a Service Platform
• 1st Cloud-Based Load Testing Solution
• 1st and Largest Global Test Cloud (17 Countries, 54 Locations, 800K Cloud Servers)
• 1st Continuous Mobile Test Automation Solution
• 1st “real-time” Real User Monitoring (RUM) Solution for web and mobile apps
o Over 400 Global Corporate Customers
• 20,000 Mobile Developers and Testers use SOASTA Cloud Services
• Over 2,500 Mobile and Web Apps have been Tested with SOASTA.
o Award Winning & Patented Technology
• Industry Leader: Gartner Magic Quadrant & IDC Cloud Testing
• Wall Street Journal Top 50 Hottest Companies three years running
o Global Offices
• San Francisco, New York, London, Mumbai, Shanghai & Tokyo
SOASTA: Turning the Spotlight on the User Experience
About Mentora
• Thought leadership in performance testing since 2001
• Acquired by Forsythe in 2012
• Specialize in performance testing of large-scale ecommerce, mobile and enterprise ERP systems
• SOASTA delivery partner
We bring the tools, infrastructure and subject matter expertise needed for any project.
The Four Hats of Performance Testing
o Business Analyst: for effective test planning
o Developer: for creating realistic, maintainable scripts
o Systems Engineer: to identify and configure resource monitors
o Data Analyst: to interpret and report results
They all must fit!
Discovering Requirements: Capturing business needs for the PROJECT ROADMAP
Business Analyst
o Understand business drivers and derive test objectives in dialogue with the business and technical sponsors
o Select, prioritize and quantify the use cases to be tested with the business SMEs
o Define the approach, Key Performance Indicators and "success"
o Recruit the team: "It takes a small village to execute a successful performance test"
o List the key activities, owners and schedule
SKILLS: Translate business risk to performance requirements, project management
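As a sketch of what "quantify the use cases" can mean in practice, a concurrency target can be derived from business-level traffic figures with Little's Law. All numbers below are illustrative assumptions, not figures from the webinar:

```python
# Derive a concurrency target from business-level traffic figures
# using Little's Law: concurrency = arrival_rate * avg_time_in_system.
# Both inputs are illustrative assumptions.

peak_sessions_per_hour = 10_000   # peak traffic agreed with the business SMEs (assumed)
avg_session_seconds = 360         # average visit length (assumed)

arrival_rate = peak_sessions_per_hour / 3600            # sessions per second
concurrent_users = arrival_rate * avg_session_seconds   # Little's Law

print(f"Target load: {concurrent_users:.0f} concurrent virtual users")
# Prints: Target load: 1000 concurrent virtual users
```

Numbers like these become the workload targets that the Developer scripts against and the Data Analyst reports against.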
Key Performance Indicators: Determining what to test
Scalability • Throughput • Capacity • Workload Achieved
Business Analyst
Consult Reality: Real users and real experiences speak volumes
Business Analyst
Consult Reality: Real users and real experiences speak volumes
Business Analyst
Is this a good SLA?
Is 12 seconds a problem?
Did we test for Chrome 30?
At what point do users really leave?
Did we test for Canada?
Did we know this page was so popular?
Development Standards
o Project directory
• Use cases, scripts, script data, load scenarios, results, analyses/reports
o Naming
• Scripts: e.g., UC1_Browse, UC2_Search, UC3_AddToCart
• Dynamic value parameters: p_searchPhrase, p_productId
• Transaction timers: common ones (home, login, logout) and script-specific steps (UC1_01_enterPhrase, UC1_02_search, UC1_03_prodDetail)
o Script data -- Group related data values into a file
• environment.dat (env, url, port); login.dat (loginId, pswd)
o Script modularity
• Initialization, end: one-time steps
• Business logic: logically grouped, iterative navigation and action steps
SKILLS: Structure a team development project
Building reusable, logical, readable tests
Developer
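The naming and modularity conventions above might look like the following tool-agnostic Python sketch. The use case, parameter, and timer names are adapted from the slide's examples; the timing helper itself is hypothetical, standing in for whatever the load tool provides:

```python
import time
from contextlib import contextmanager

# Tool-agnostic sketch of the modular script structure described above:
# one-time initialization and teardown around iterative, data-driven
# business logic. Timer names follow the UC<n>_<step>_<action> convention
# and parameters use the p_ prefix; the transaction() helper is hypothetical.

timings = {}

@contextmanager
def transaction(name):
    """Record elapsed time under a named transaction timer."""
    start = time.perf_counter()
    yield
    timings[name] = time.perf_counter() - start

def initialize(login_row):
    """One-time step: authenticate with a row from login.dat (stubbed)."""
    with transaction("login"):
        pass  # log in with login_row["loginId"] / login_row["pswd"]

def uc2_search(p_searchPhrase):
    """UC2_Search: logically grouped navigation and action steps."""
    with transaction("UC2_01_enterPhrase"):
        pass  # type p_searchPhrase into the search box (stubbed)
    with transaction("UC2_02_search"):
        pass  # submit the search (stubbed)

def teardown():
    """One-time end step."""
    with transaction("logout"):
        pass

initialize({"loginId": "user1", "pswd": "secret"})
for phrase in ["red shoes", "blue hat"]:   # data-driven iteration
    uc2_search(phrase)
teardown()
```

Keeping the one-time steps, the iterative business logic, and the data files separated this way is what makes the scripts reusable and the later analysis readable.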
Visual, Object Oriented Test Development: The best tests use minimal (no) coding
Developer
Network Topology and Monitors
o Create a logical diagram of the system-under-test and the load generating environment
o Define the component configuration, operating system, software
o Choose monitoring tools
o Select and configure monitoring points
o Execute test
SKILLS: Overlay applications onto hardware, Unix and Windows performance monitor configuration
Knowing what to monitor, and when
Systems Engineer
Environment Topology
Systems Engineer
[Diagram series, built up across several slides: the System Under Test, with Load Injection and Monitoring layers.]
Monitoring Points
Systems Engineer
[Diagram: monitoring points across the environment's Load Injection and Monitoring layers: CPU, IO, JVM heap, and DB connection pool; CPU, IO, memory, and queued requests; CPU, IOPS, and DB locking contention; bandwidth throughput.]
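One way to make a monitoring plan like this concrete is a simple configuration map that travels with the project directory. The tier-to-counter assignments below are assumptions based on a typical web/app/database topology, not a mapping stated in the webinar:

```python
# Capture the monitoring plan as a configuration map: which counters to
# collect at each monitoring point. Tier-to-counter assignments are
# assumptions based on a typical web/app/database topology.
monitoring_points = {
    "web_server": ["CPU", "IO", "memory", "queued requests"],
    "app_server": ["CPU", "IO", "JVM heap", "DB connection pool"],
    "database":   ["CPU", "IOPS", "DB locking contention"],
    "network":    ["bandwidth throughput"],
}

def counters_for(tier):
    """Return the counters configured for a monitoring point (empty if none)."""
    return monitoring_points.get(tier, [])

print(counters_for("database"))   # Prints: ['CPU', 'IOPS', 'DB locking contention']
```

Writing the plan down this way makes it reviewable by the Systems Engineer and repeatable across test runs.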
CloudTest Architecture
Systems Engineer
[Diagram series, built up across several slides: performance testers access the CloudTest platform (Main, Database, Analytics), whose "Conductor" drives Load Generator and Analytics servers across the SOASTA Global Test Cloud (East Coast on AWS, San Francisco on GoGrid, Chicago on Rackspace, Tokyo on AWS, Amsterdam on Azure, Virginia on IBM) against the SUT/AUT: Load Balancer, Web Servers, App Servers, Caches, and Database. Monitoring data sources: Native (SSH), JMX, PerfMon, CA Introscope, AppDynamics, New Relic, CloudWatch, …. Access for all.]
Visualize, Analyze, Interpret, Report
o Collect: Response times, errors, resources, anecdotes
o Aggregate: average, max, 95th percentile, end-to-end, at varying granularities
o Visualize: Response times, resources, bandwidth “over load”
o Interpret: Make observations, create and test hypotheses, support with data, draw conclusions
o Assess: Compare to acceptable results, make recommendations
o Report: Executive summary, supporting detail; assemble stakeholders and do read-outs
SKILLS: Pattern recognition, Excel pivots, visualization tools, basic statistics
Getting to actionable results is why we test.
Data Analyst
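The aggregations above can be sketched in a few lines; the sample response times below are illustrative, and the 95th percentile uses the nearest-rank method:

```python
import math

# Aggregate response-time samples (seconds) into the summary statistics
# named above: average, max, and the 95th percentile (nearest-rank method).
# The sample data is illustrative.
samples = [0.8, 1.1, 0.9, 1.4, 2.0, 0.7, 1.2, 5.6, 1.0, 1.3,
           0.9, 1.1, 1.0, 1.2, 0.8, 1.5, 1.1, 3.1, 0.9, 1.0]

def percentile(data, pct):
    """Nearest-rank percentile: smallest value covering pct% of the sample."""
    ordered = sorted(data)
    rank = math.ceil(pct * len(ordered) / 100)
    return ordered[rank - 1]

average = sum(samples) / len(samples)
maximum = max(samples)
p95 = percentile(samples, 95)

print(f"avg={average:.2f}s  max={maximum:.2f}s  p95={p95:.2f}s")
# Prints: avg=1.43s  max=5.60s  p95=3.10s
```

The gap between the average (1.43 s) and the maximum (5.6 s) is exactly why percentiles matter: a single outlier barely moves the mean but dominates the tail the users actually feel.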
Testing is ALL About Results: And the best time is REAL TIME
Cloud Test Demonstration
Data Analyst
Summary
Great performance testing needs to master four sets of complementary skills – recruit your team accordingly!
o Without solid business requirements, much hard work may yield little value
o Poorly designed scripts make results analysis harder and maintenance difficult
o Testing without monitoring is like flying an airplane without instruments
o Well-interpreted, actionable results deliver the business value – this is where you earn your stripes
Skills Matter
Thanks

Contact SOASTA: [email protected] | 866.344.8766
Follow us: twitter.com/cloudtest, facebook.com/cloudtest
Additional Resources: Knowledge Center, White Papers, Webinar Recordings, Case Studies, CloudLink Community, Support, Tutorials, Video
www.SOASTA.com
Get our free products: www.soasta.com/FREE

Contact Mentora: [email protected] | 866.636.8672
Follow us: twitter.com/MentoraGroup, Blog.Mentora.com, www.mentora.com
For whitepapers, presentations and resources visit Mentora.com/Resources