EFFECTIVE PERFORMANCE REPORTING USING APACHE JMETER JULY 31, 2012


DESCRIPTION

This topic focuses on effective reporting and its associated challenges when using JMeter. It covers the importance of metrics and KPIs for effective performance reporting, followed by a brief overview of JMeter's built-in listeners (reporting elements) such as the Aggregate and Graph listeners. The third and final part covers the shortcomings of these listeners and the use of third-party/external reporting tools that provide enhanced reporting (ant + xslt). The new BlazeMeter reporting plugin is introduced as a quick, ready-to-use solution for JMeter reporting. Sub-topics:

* Importance of effective performance test reporting
* Typical performance testing metrics
* JMeter reporting entities (listeners)
* Shortcomings of the existing JMeter reporting elements
* Generating advanced JMeter reports using ant + xslt
* Building reporting tools/frameworks
* How the BlazeMeter reporting plugin can alleviate the challenges of JMeter reporting
* Details of the BlazeMeter reporting plugin

TRANSCRIPT

1. EFFECTIVE PERFORMANCE REPORTING USING APACHE JMETER, JULY 31, 2012

2. THE LOAD TESTING CLOUD: A DEV-TEST CLOUD SERVICE 100% COMPATIBLE WITH THE OPEN-SOURCE APACHE JMETER

3. AGENDA
   * Performance Attributes
   * Understanding Performance KPIs
   * Creating Load Test Reports
   * JMeter Reporting Elements
   * Generating Advanced JMeter Reports
   * BlazeMeter Reporting Plugin

4. PERFORMANCE ATTRIBUTES
   * Speed / responsiveness: How fast does the page load? How quickly can the system process a transaction?
   * Scalability: Can the application handle the expected end-user load? Does the application throughput degrade as the user load increases?

5. PERFORMANCE ATTRIBUTES
   * Efficiency and capacity planning: Are you using the right resources? Can your infrastructure carry the load?
   * Reliability / availability / recoverability: What is the mean time between failures (MTBF)? Does the application recover after a crash? Does it lose user data after a crash?

6. UNDERSTANDING PERFORMANCE KPIS
   [Diagram: server-side system and platform metrics (CPU, memory, disk / IO, network for the DB, app server and application) alongside application metrics (response time, throughput, error rate) and browser rendering metrics (total rendering time, heavy images/CSS/JS, DNS lookup), all measured against user load from the end user across the Internet.]

7. UNDERSTANDING PERFORMANCE KPIS
   * Response time: Total response time = network latency + application latency + browser rendering time. Measured from the end-user perspective as the time taken to completely respond to a request (TTFB, TTLB), across the web server, app server and DB server.
   * Throughput: Throughput = transactions / second. Transactions are specific to applications; in its simplest form it is requests / sec.
   * Error: Defined in terms of the success of the request, either an error at the HTTP level (404, 501) or an application-level error.
   (A sketch that computes these KPIs from a JMeter results file follows after the transcript.)

8. CREATING LOAD TEST REPORTS
   1. Capture: application metrics (response time, throughput, errors) and server metrics (CPU / memory / disk / IO, network, application platform).
   2. Correlate: application metrics (user load vs. response time, user load vs. throughput, user load vs. errors) and system metrics (user load vs. server metrics, network and platform).
   3. Plot / tabulate: tables (response time avg/min/max/percentiles/std dev, throughput average, errors as success % and types) and graphs / charts (scatter / line).
   4. Trends / overlay: thresholds.
   5. Customize / summarize: response time trends, throughput trends and threshold violations; overall performance, important trends, threshold violations.
   6. Compare: utilization (server metrics) trends.

9. SAMPLE REPORT ELEMENTS (SNAPSHOTS)
   Photo credits: http://msdn.microsoft.com/en-us/library/bb924371.aspx and sanitized past projects.

10. JMETER REPORTING ELEMENTS (LISTENERS)
   * JMeter listeners are JMeter elements that display performance test metrics / output.
   * Various types of listeners (raw / aggregated / graphical).
   * JMeter doesn't have an inherent capability to measure system metrics.*
   * Useful for basic analysis.

11. GENERATING ADVANCED JMETER REPORTS
   * JMeter report using an XSLT stylesheet: the stylesheet is under the extras folder; the .jtl output must be in XML format (jmeter.save.saveservice.output_format=xml); integrate using ant. (A Python sketch of the same transform appears after the transcript.)
   * Other reporting options: JMeter CSV results + Excel; process the results programmatically (perl / python etc.); the BlazeMeter Reporting Plug-in.
   Photo credits: http://www.programmerplanet.org/pages/projects/jmeter-ant-task.php

12. WHAT HAPPENED TO LABEL A AND KPI B AT TIME C?
13. BLAZEMETER REPORTING PLUGIN: BENEFITS
   * Store a report per test run, including the script that was used to run the test, the logs and the JTL file.
   * Compare the results of two test runs.
   * See an improvement trend.
   * Compare the current run with previous runs in real time.
   * Share with co-workers.

14. KPIS AVAILABLE IN A JMETER TEST
   Response time: the time it takes a request to fully load. Indicates the performance level of the entire system under test (web server + DB). Represents the average response time during a specific minute of the test. (A sketch of this per-minute aggregation follows after the transcript.)

15. BLAZEMETER REPORTING PLUGIN: COMPARE TWO REPORTS

16. HTTP://BLAZEMETER.COM/
   "BlazeMeter - Startup Offers JMeter Cloud Load Testing at Scale"; "BlazeMeter - Code probing, not Angry Birds, will define cloud's success"; "BlazeMeter - Changing the Economics of Load Testing via the Cloud".
   THANK YOU!
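
The KPI definitions on slide 7 (average response time, throughput as requests per second, error rate) can be computed directly from a JMeter results file. Below is a minimal sketch, assuming a .jtl saved in CSV format with JMeter's default column names (timeStamp in epoch milliseconds, elapsed in milliseconds, success as true/false); the file name results.jtl is only a placeholder.

```python
# Sketch: compute the slide-7 KPIs (avg response time, throughput, error rate)
# from a JMeter .jtl saved in CSV format with the default headers.
import csv

def summarize(jtl_path="results.jtl"):
    elapsed, stamps, errors = [], [], 0
    with open(jtl_path, newline="") as fh:
        for row in csv.DictReader(fh):
            elapsed.append(int(row["elapsed"]))    # response time in ms
            stamps.append(int(row["timeStamp"]))   # request start, epoch ms
            if row["success"].lower() != "true":   # HTTP or assertion failure
                errors += 1

    if not elapsed:
        raise ValueError("no samples found in " + jtl_path)

    # Approximate test duration from first to last request start time.
    duration_s = (max(stamps) - min(stamps)) / 1000.0 or 1.0
    return {
        "samples": len(elapsed),
        "avg_response_ms": sum(elapsed) / len(elapsed),
        "min_response_ms": min(elapsed),
        "max_response_ms": max(elapsed),
        "throughput_rps": len(elapsed) / duration_s,   # requests / second
        "error_rate_pct": 100.0 * errors / len(elapsed),
    }

if __name__ == "__main__":
    for kpi, value in summarize().items():
        print(f"{kpi}: {value:.2f}" if isinstance(value, float) else f"{kpi}: {value}")
```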
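Slide 11 generates an HTML report by running the XSLT stylesheet from JMeter's extras folder through ant. As an alternative to the ant task, the same transform can be applied from Python; this is a sketch assuming lxml is installed, that the results were saved in XML format (jmeter.save.saveservice.output_format=xml), and that the stylesheet path below is adjusted to your own JMeter installation.

```python
# Sketch: apply the XSLT stylesheet from JMeter's extras/ folder to an
# XML-format .jtl and write an HTML report (alternative to the ant task).
from lxml import etree

def jtl_to_html(jtl_path="results.jtl",
                xsl_path="apache-jmeter/extras/jmeter-results-detail-report_21.xsl",
                out_path="report.html"):
    transform = etree.XSLT(etree.parse(xsl_path))   # compile the stylesheet
    result = transform(etree.parse(jtl_path))       # apply it to the XML .jtl
    with open(out_path, "w") as fh:
        fh.write(str(result))                       # serialized HTML report

if __name__ == "__main__":
    jtl_to_html()
```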
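Slide 14 defines the response-time KPI as the average response time during a specific minute of the test. A minimal sketch of that per-minute aggregation, under the same CSV-format assumptions as above:

```python
# Sketch: average response time per minute of the test (slide 14's KPI),
# computed from a CSV-format .jtl with the default headers.
import csv
from collections import defaultdict
from datetime import datetime, timezone

def response_time_per_minute(jtl_path="results.jtl"):
    buckets = defaultdict(list)                     # minute -> list of elapsed ms
    with open(jtl_path, newline="") as fh:
        for row in csv.DictReader(fh):
            minute = int(row["timeStamp"]) // 60000  # epoch ms -> whole minutes
            buckets[minute].append(int(row["elapsed"]))

    for minute in sorted(buckets):
        label = datetime.fromtimestamp(minute * 60, tz=timezone.utc).strftime("%H:%M")
        avg_ms = sum(buckets[minute]) / len(buckets[minute])
        print(f"{label}  avg response time: {avg_ms:.0f} ms "
              f"({len(buckets[minute])} samples)")

if __name__ == "__main__":
    response_time_per_minute()
```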