Beyond Traditional Mobile Testing
TRANSCRIPT
QA Challenges in Mobile Testing
Capgemini Quality Report (Mobile Testing, 2014)
• Diversity of platform OS
• Device fragmentation
• Different mobile app types
• Mobile testing tool availability & related knowledge
• Industry standards
• Need for skilled QA specialists in automation testing
• Device availability & cloud computing
12,000 Android devices in 2013
Mobile Apps Testing Strategy
STRATEGY
Selecting Target Devices
Simulators/Cloud Test Labs/Physical Devices
✓ Does your application consistently use device peripherals?
✓ Does your application use outside applications?
✓ What are the features in your application that will generate the most network activity?
✓ At what speed do users actually execute the commands in your application?
Connectivity Options
Manual vs. Automated Testing
Selecting a Test Automation Tool
Mobile Apps Testing Aspects
ASPECTS
Code Quality
Security Testing
Performance Testing
Interface Testing
Services Testing
Functional/Manual Testing
Test Automation
Usability/UX Testing
Compliance/Conformance Testing
Compatibility Testing
Operational Testing
Network Testing
Accessibility Testing
Installation Testing
Code Quality
Key dimensions: Performance, Maintainability, Extensibility
Platform: iOS
Tools/Libraries: OCLint, Clang, Simian (Similarity Analyzer)
Checks:
✓ Unused code
✓ Complicated code (cyclic dependency analysis, style violations, technical violations)
✓ Reuse analysis
✓ Copy-paste violations
✓ Code duplications
✓ Bad practices (best-practice violations, comments, coding standards)

Platform: Android
Tools/Libraries: FindBugs, Checkstyle, PMD, Sonar, Simian
Checks:
✓ Unused code
✓ Complicated code (cyclic dependency analysis, style violations, technical violations)
✓ Bad practices (best-practice violations, comments, coding standards)
✓ Multithreaded correctness
✓ Unnecessary object creation

Platform: Titanium & mobile web
Tools/Libraries: JSHint, JSLint, PMD
Checks:
✓ Syntax errors
✓ Bugs due to implicit type conversion
✓ Unused variables
✓ Unnecessary object creation
✓ Copy-paste detection
Performance Testing
• Mobile app performance testing is the process of determining the speed and effectiveness of a mobile app.
Key factors and tools of mobile performance testing
Factor: Memory Utilization
Tools/Libraries: NewRelic, Memory Usage, Mem Free, DDMS, Xcode Instruments (Allocations & Leaks)

Factor: CPU Performance
Tools/Libraries: NewRelic, CPU Usage Monitor, Micro CPU Monitor, Traceview, Xcode Instruments (Activity Monitor)

Factor: Bandwidth Usage
Tools/Libraries: Bandwidth Monitor, Up Down Meter Free, Network Monitor Mini, Xcode Instruments (Network)

Factor: Energy Consumption
Tools/Libraries: Xcode Instruments (Energy Diagnostics)

Factor: Response Time
Tools/Libraries: NewRelic
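The tools above are platform-specific, but the underlying measurement idea for two of the factors (response time and memory utilization) can be sketched with plain Python stdlib timing and allocation tracking. This is an illustrative sketch, not one of the listed tools; `operation_under_test` is a hypothetical stand-in for an app operation.

```python
# Illustrative sketch: measure response time and peak memory
# utilization of an operation, using Python's stdlib rather than
# the platform tools listed above (NewRelic, DDMS, Xcode Instruments).
import time
import tracemalloc

def operation_under_test():
    # Hypothetical stand-in for an app operation,
    # e.g. parsing a large server response
    return [str(i) for i in range(100_000)]

# Response time: wall-clock duration of the operation
start = time.perf_counter()
result = operation_under_test()
elapsed_ms = (time.perf_counter() - start) * 1000

# Memory utilization: peak allocation while the operation runs
tracemalloc.start()
operation_under_test()
_, peak_bytes = tracemalloc.get_traced_memory()
tracemalloc.stop()

print(f"response time: {elapsed_ms:.1f} ms, peak memory: {peak_bytes / 1024:.0f} KiB")
```

Real mobile profilers apply the same two primitives (timestamps around operations, allocation snapshots), just instrumented at the OS or runtime level.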
Performance Testing – Energy Consumption
What is the Connection Manager? The Connection Manager controls the radio connection and power utilization of a mobile device. It has three states:
IDLE (standby mode): no radio, zero radio energy
DCH (full power mode): high data throughput, but high radio energy
FACH (low power mode): low data throughput, low radio energy
Performance Testing – Energy Consumption (Contd.)
Understanding Connection Manager behavior
✓ When you make a connection, your phone goes from IDLE to DCH, and this transition can take up to two seconds.
✓ Once you are in DCH, the data flows like a river. If no more data comes through the connection, the radio drops to a lower-power, lower-bandwidth state, FACH.
✓ Small amounts of data can traverse the FACH radio state, but if lots of data begin flowing, it is an easy jump back to DCH.
✓ If no data flows in FACH for 12 seconds, the radio turns off (back to IDLE).
State transitions control end-user experience and device energy consumption (battery life).

Calculating Energy Consumption
What if your application pings the network every 30 seconds?
§ You now know that sending even one packet of data means that the phone radio is on for 19 seconds (2 s to reach DCH + 5 s in DCH + 12 s in FACH = 19 s)
§ That means the radio is on for about 2/3 of the time your app is running
§ And that means your users' little batteries are going to dry up faster
Demo: Performance Testing
➢ Demo 1: Mobile Response Time Capture
➢ Demo 2: Mobile Energy Consumption Capture
➢ Demo 3: Mobile Memory Utilization Capture
➢ Demo 4: Mobile CPU Performance Capture
➢ Demo 5: Mobile Crash Analytics
➢ Demo 6: Mobile Device/OS/Version Usage Capture
➢ Demo 7: Mobile Method Execution Time/Throughput Capture
➢ Demo 8: Mobile Global HTTP Request Health Analytics
➢ Demo 9: Mobile Network Failure Capture
➢ Demo 10: Mobile Traffic by App Version
Security Testing
Security Threat / Risk – Prevention Tips

Insecure data storage:
§ Store ONLY what is absolutely required
§ Never use public storage areas (e.g., SD card)
§ Leverage secure containers and platform-provided file encryption APIs
§ Do not grant files world-readable or world-writable permissions
§ Never save data on a public SD card

Insufficient transport layer protection:
§ Solid cryptography
§ Use SSL
§ Use 2-way SSL (enterprise apps)

Broken cryptography:
§ Encoding != encryption
§ Obfuscation != encryption
§ Serialization != encryption
§ Leverage battle-tested crypto libraries instead of writing your own
§ Introduce salt and pepper techniques (to defeat dictionary attacks and rainbow tables)
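A minimal sketch of the salt-and-pepper technique mentioned above, using Python's stdlib PBKDF2 (the recipe and the `PEPPER` value are illustrative assumptions, not the report's own code; a real app would keep the pepper out of source control entirely):

```python
# Salt-and-pepper password hashing sketch. The per-user salt is
# stored alongside the hash and defeats rainbow tables; the
# application-wide pepper never touches the database and makes a
# stolen table useless without the app's secret.
import hashlib
import hmac
import os
from typing import Optional

PEPPER = b"app-wide-secret"  # assumption: loaded from secure config in a real app

def hash_password(password: str, salt: Optional[bytes] = None) -> tuple:
    salt = salt if salt is not None else os.urandom(16)  # unique per user
    digest = hashlib.pbkdf2_hmac("sha256", password.encode() + PEPPER, salt, 100_000)
    return salt, digest

def verify(password: str, salt: bytes, expected: bytes) -> bool:
    _, digest = hash_password(password, salt)
    return hmac.compare_digest(digest, expected)  # constant-time comparison

salt, stored = hash_password("s3cret")
print(verify("s3cret", salt, stored))  # True
print(verify("wrong", salt, stored))   # False
```

The constant-time `hmac.compare_digest` also avoids a timing side channel in the comparison itself.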
Security Testing (Contd.)

Information disclosure:
§ Use MDM/MAM

Client-side injection (XSS, JavaScript, SQL):
§ Sanitize or escape untrusted data before rendering or executing it
§ Use prepared statements for database calls
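A minimal sketch of the prepared-statement tip, using Python's `sqlite3` as a stand-in for a mobile app's local database (the table and payload are illustrative):

```python
# Prepared statements vs. string concatenation against SQL injection.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

untrusted = "alice' OR '1'='1"  # classic injection payload

# Parameterized query: the driver binds the value as data, never
# splicing it into the SQL text, so the payload matches no row.
rows = conn.execute(
    "SELECT role FROM users WHERE name = ?", (untrusted,)
).fetchall()
print(rows)  # [] -- the injection attempt fails

# The unsafe alternative concatenates the payload into the SQL,
# turning it into "... WHERE name = 'alice' OR '1'='1'".
unsafe_sql = "SELECT role FROM users WHERE name = '" + untrusted + "'"
print(conn.execute(unsafe_sql).fetchall())  # [('admin',)] -- injected!
```

The same principle applies on-device: Android's `SQLiteDatabase.rawQuery` takes bind arguments, and iOS SQLite wrappers expose parameter binding.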
Side-channel data leakage:
§ Never log credentials or other sensitive data to system logs
§ Remove sensitive data before screenshots are taken
§ Carefully review any third-party libraries you introduce and the data they consume

Poor authentication & authorization (IMEI, UUID):
§ Never use a device ID as the sole authenticator

Denial of service:
§ Private services: use an access control list and configure acceptable IP addresses
§ Public services: run a comprehensive audit and identify untrusted IP addresses
Mobile Test Automation
Why Mobile Test Automation?
§ Reduces human involvement in repetitive or redundant tasks
§ Reliable: eliminates human error
§ Reusable
§ Better-quality software
§ Fast
§ Cost reduction

How to choose a Test Automation Tool?
§ Ease of integration
§ Compatibility
§ Performance
§ Maintainability
§ Affordability

Two kinds of automation tools are available to test mobile apps:
• Object-based mobile testing tools
• Image-based mobile testing tools
Mobile Test Automation Process
Test Requirements → Test Scenarios → Test Cases → Test Automation Scripts → Test Results
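The last step of that flow (test cases turned into automation scripts) can be sketched with Python's `unittest`; `LoginScreen` here is a hypothetical stand-in for the app object an object-based automation tool would expose, not a real framework API:

```python
# Sketch of test cases turned into automation scripts. In a real
# suite, LoginScreen would be driven through an automation framework
# rather than implemented inline.
import unittest

class LoginScreen:
    """Hypothetical app object exposed to the automation layer."""
    def login(self, user: str, password: str) -> str:
        # Returns the name of the screen the app navigates to
        return "home" if (user, password) == ("demo", "demo123") else "error"

class LoginTests(unittest.TestCase):
    # Test case: valid credentials navigate to the home screen
    def test_valid_login(self):
        self.assertEqual(LoginScreen().login("demo", "demo123"), "home")

    # Test case: invalid credentials show the error screen
    def test_invalid_login(self):
        self.assertEqual(LoginScreen().login("demo", "wrong"), "error")

if __name__ == "__main__":
    unittest.main(exit=False)
```

Object-based tools assert on app objects like this (IDs, labels, states); image-based tools would instead compare screenshots of the resulting screen.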
Demo: Mobile Test Automation
➢ Demo 1: iOS Test Automation
➢ Demo 2: MonkeyTalk – Cross-Platform Test Automation
Other Mobile Testing
Usability/UX Testing:
To make sure that the mobile app is easy to use and provides a satisfactory user experience to customers.
Compliance/Conformance Testing:
A methodology used in mobile engineering to ensure that a mobile app meets a defined set of standards.
Compatibility Testing:
This entails validating the application across different mobile devices, OS versions, screen sizes, and resolutions as per the requirements; checking how the app handles integration-server changes; and checking that the app is isolated from other apps on the device.
Other Mobile Testing (Contd.)
Interface Testing:
This covers validation of each screen, its layout, and its widget distribution (buttons, text inputs, etc.).
Operational Testing:
This entails checking that the app backs up necessary information; that it saves and recovers state if the battery dies; that data is not lost when the app is upgraded from the app store; that the app remains accessible when the user receives an alarm, call, message, reminder, etc.; and how much battery power the app uses while it is being accessed.
Network Testing:
Check behavior on weak networks (image loading, timeouts, etc.). Errors can be introduced into the network (error rate) to check how the app responds at higher error rates. "Network Link Conditioner" (OS X Lion) can be used to introduce an error rate on the network.
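One common app-side response to the high error rates a tool like Network Link Conditioner simulates is retry with exponential backoff; a minimal sketch, where `flaky_fetch` is a hypothetical stand-in for a real network call:

```python
# Retry-with-backoff sketch for a lossy network. The error rate is
# simulated here the way a link conditioner would inject it.
import random

def flaky_fetch(error_rate: float, rng: random.Random) -> str:
    """Hypothetical network call that fails error_rate of the time."""
    if rng.random() < error_rate:
        raise ConnectionError("simulated packet loss")
    return "payload"

def fetch_with_retry(error_rate: float, retries: int = 5, seed: int = 42) -> str:
    rng = random.Random(seed)  # seeded so the simulation is repeatable
    delay = 0.1
    for attempt in range(retries):
        try:
            return flaky_fetch(error_rate, rng)
        except ConnectionError:
            # A real app would time.sleep(delay) here before retrying
            delay *= 2  # exponential backoff eases load on a bad link
    raise ConnectionError(f"gave up after {retries} attempts")

print(fetch_with_retry(error_rate=0.5))  # usually succeeds within a few tries
```

Network testing then checks both sides: that the app eventually succeeds on a degraded link, and that it fails gracefully (no crash, a clear message) when the retry budget is exhausted.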
Other Mobile Testing (Contd.)
Accessibility Testing:
A type of system testing designed to determine whether individuals with disabilities will be able to use the system.
Installation Testing:
Validation of the application by installing and uninstalling it on target devices.