
Testing in the Web ‘Space’

Vonnie Faiumu Oct 2001

• “The business community in general is more aware of the benefits of effective and efficient test practices throughout the Software Delivery Life Cycle…”

• “…deliver a quality product in the shorter time frames expected by the business…”

Agenda

• Differences and Similarities

• Some extra considerations for web development

• Experiences from Project #1

• Some things I learned

• Experiences from Project #2

• Some thoughts to take to the next project….

What’s different?

• Customer expectation of rapid delivery

• High visibility throughout the lifecycle

• Broad user base (the world!)

• Broad range of user interfaces (operating systems and browsers)

• Large variation in user experience

• High user expectations (everyone’s an expert)

• Greater reliability needed (24 x 7)

What’s the same?

• Still need firm business requirements and reusable test scripts

• Interface with new and existing systems

• Non-functional testing is needed – e.g. Availability and Reliability

• Need to transition to Business as Usual

What’s in the Test Component?

• Strategy/Planning
– Agree the terminology
– Agree environments and responsibilities
– Evaluate use of Test Tools
– Plan approach
– Test phases – entry and exit criteria

• Detailed Test Plans
– What and who
– Test analysis, script creation, set up test data
– Code migration process – release management

• Test execution

• Analysis and reporting

• Handover to next phase

Extra considerations

• Large range of browsers and O/S

• Individual browser settings

• Web technology and protocols
– secure sessions – 128-bit encryption
– cookies – JavaScript off and on

• Navigation and links, search indexes (a link-check sketch follows this list)

• Content Management

• Usability is important – can’t train the users

• Repeatable test scripts – multiple browsers

• Load test – test to break if possible
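The navigation and link checks above lend themselves to simple automation. As a rough illustration only (not tooling from the original project), here is a minimal Python sketch that fetches one page, extracts its anchors and reports links that fail to load; the start URL is a placeholder.

from html.parser import HTMLParser
from urllib.error import HTTPError, URLError
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkCollector(HTMLParser):
    """Collect href targets from <a> tags on a single page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def check_links(start_url):
    # Fetch the start page and try each link it contains once.
    page = urlopen(start_url).read().decode("utf-8", errors="replace")
    collector = LinkCollector()
    collector.feed(page)
    for href in collector.links:
        target = urljoin(start_url, href)   # resolve relative links
        try:
            status = urlopen(target).getcode()
            print("OK  ", status, target)
        except (HTTPError, URLError) as exc:
            print("FAIL", exc, target)

if __name__ == "__main__":
    check_links("http://www.example.com/")   # placeholder start page

A script like this only covers static anchors; pages built by JavaScript or sitting behind a secure session still need manual or tool-assisted checks, which is part of why the browser-setting considerations above matter.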

Project #1

• Project Structure
– Internet Online Banking application
– Mixed development team on customer premises
– Front end and back end development
– Back end development – in house
– Content management and web functionality

• Formal and structured test processes

• Existing test tools

• Joined team near the start of development

Initial challenges

• Detailed Test Plans
– Multiple Requirements Documents at varying levels of detail – which is the master?
– Navigation – where does the back button go? (The wall!)
– Requirements documents can’t keep up with the ‘prototype’
– Error handling and messaging not confirmed

Items to be tested

• Functional Testing of Use Case scenarios

• Validation of web pages
– navigation and links
– page content and layout (as per design)

• Access via supported web browsers

• Performance and Load Testing

• Availability and reliability (connectivity and system stability)

• Exception handling – messages and alerts

Planned approach for test execution

• Use the back end emulator already used by developers

• Test bench – different O/S and browser combinations per PC

• Test different functional areas on different combinations to get full coverage early

• Testers to enter defects directly into SQA Manager

• Test scripts to be reusable and used for phases through to UAT

The plan goes awry…

• Test bench desktop setup

• Test entry criteria not met
– Not all development completed
– No documented unit test plans or results
– Testing resources not 100% available

• Unable to use emulator – needed to write stubs (a stub sketch follows this list)

• Error messages not finalised

• Page design not finalised

• All links not working
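The slide above notes that the back end emulator could not be used, so stubs had to be written. The original stubs are not described; the following minimal Python sketch only illustrates the idea of canned responses standing in for the back end so front-end scripts can keep running. The operations, account numbers and messages are invented for the sketch.

CANNED_BALANCES = {
    "00-1234-5678901-00": 1500.00,   # invented test accounts
    "00-1234-5678901-01": 250.75,
}

class BackEndStub:
    """Stand-in for the real banking back end during front-end testing."""

    def get_balance(self, account):
        # Unknown accounts return an error record so that the
        # exception-handling and messaging paths can be exercised too.
        if account not in CANNED_BALANCES:
            return {"status": "error", "message": "Account not found"}
        return {"status": "ok", "balance": CANNED_BALANCES[account]}

    def transfer(self, from_account, to_account, amount):
        # No state is kept; the stub only checks the shape of the request.
        if amount <= 0:
            return {"status": "error", "message": "Invalid amount"}
        return {"status": "ok", "reference": "STUB-0001"}

if __name__ == "__main__":
    stub = BackEndStub()
    print(stub.get_balance("00-1234-5678901-00"))
    print(stub.transfer("00-1234-5678901-00", "00-1234-5678901-01", 100.0))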

Way Forward

• Page content and layout excluded

• Staggered testing as code and stubs available

• Accept more frequent code releases

• Reduced OS/browser coverage

• Paper based defect sheet with centralised entry into SQA Manager

• Daily reporting and review of progress

‘Participated’ in End to End Testing

• Managed test execution – resourced with back end developers

• No visibility of back end system test results

• Still tweaking requirements

• Error messages still being defined

• Now in main test window – less control over environment and data – wider usage

• Dynamic update, therefore not easily repeated

Things that worked well

• Co-location with developers

• Early involvement with UAT personnel

• Test Document Repository and approval process

• Using SQA Manager from outset

• Test script format – Excel

• Daily Reporting during test execution

• QA process - encouraged stocktake and review

What caused the most pain?

• Incomplete business requirements

• Test bench set up

• Test environment

• Identification and set up of test data

• Recovery and Reliability

• Changes in available testers

• That old chestnut - too many tests and never enough time…

Project #2

• Project Structure
– Internet application
– Development team on customer premises
– Front end and back end development
– Back end development – third party
– Content management and web functionality

• Less formal existing test processes

• No existing test tools

• Joined team during design phase

I wanted to achieve…

• Earlier review of business requirements

• Comprehensive plan earlier
– Clearly defined responsibilities for environments, test phases and tasks

• Early selection of test data

• Schedule and enforce proper reviews of
– Requirements
– Design
– Test cases etc

• Better use of resources – OS/browser combos

• More accurate schedule assessment for design and execution

Processes

• Requirements captured in Use cases

• Complex business rules recorded in separate documents

• Visio diagrams for page flow and menu items

• Word docs with page layout and link destinations defined

• Central repository for clarification of requirements

• Early agreement to keep documentation current

Planned Test Phases

• Code Verification and Unit Test – to an agreed standard checklist – developers

• Function Testing
– Front end, including end to end – test team
– Back end – existing IM support team
– Navigation and Boundary Testing

• Non-Functional Testing – test team in conjunction with the operational support team

• Usability Test – User Experience Analyst

• Process Test – process owners

• Production Test – ‘Comfort Test’ on the production environment – in lieu of acceptance test – BAU

Approach for Function Test

• Test cases and scripts grouped by Use Case

• VMware to host OS/browser combinations

• Test all functionality on one OS/Browser combination first

• Test major functionality over a range of other OS/Browsers

• Test data input pages and some navigation on alternate browsers as time allowed

O/S Browser Combinations

• Fully tested
– Win 95 with IE 4.0 and Netscape 4.0

• Part coverage – major functionality
– Win 98 with IE 5.0 and Netscape 4.5

• Sample coverage
– Win 95 with IE 3.0 and Netscape 3.0
– Win 98 with IE 4.0, IE 5.5 and Netscape 4.5
– Win 2000 with IE 5.5 and Netscape 4.5

• Extra coverage
– NT – development environment
– Mac – because a lot of people use it
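One way to make the staged coverage above repeatable is to drive the scripts from a small data table. The Python sketch below is illustrative only: the test group names are invented, and the combinations simply mirror the tiers on this slide.

# Coverage tiers: which groups of scripts run on which OS/browser combination.
FULL = ["all use cases", "navigation and links", "boundary tests", "input pages"]
MAJOR = ["login", "payments", "transfers"]          # invented group names
SAMPLE = ["login", "key input pages"]

COVERAGE = [
    ("Win 95",   "IE 4.0",       FULL),
    ("Win 95",   "Netscape 4.0", FULL),
    ("Win 98",   "IE 5.0",       MAJOR),
    ("Win 98",   "Netscape 4.5", MAJOR),
    ("Win 95",   "IE 3.0",       SAMPLE),
    ("Win 98",   "IE 5.5",       SAMPLE),
    ("Win 2000", "IE 5.5",       SAMPLE),
]

def build_run_list():
    """Expand the matrix into individual (OS, browser, test group) runs."""
    return [(os_name, browser, group)
            for os_name, browser, groups in COVERAGE
            for group in groups]

if __name__ == "__main__":
    for os_name, browser, group in build_run_list():
        print(f"{os_name:8} {browser:13} -> {group}")

Laying the tiers out as data makes gaps in coverage easy to see and lets exactly the same run list be repeated when a new build arrives.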

Some Challenges

• Development delayed - test functional areas as they become available

• Changes identified from Usability testing
– required change in design
– re-development
– rework in test scripts

• Metrics and reporting requirements - not defined

• Performance requirements - what does acceptable mean?

• Meaningful Volume testing without tools….

Tests by developers and test team

• Developers
– Code Verification and Unit Test, including:
– 128-bit encryption
– secure session / public pages

• Test Team
– functional scripts (front and back end)
– navigation and links
– boundary testing on all input pages (see the boundary sketch after this list)
– final test of major functionality on the production environment configuration, and back end test
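Boundary testing on input pages, mentioned above, follows a standard pattern: exercise values at, just below and just above each limit. A minimal Python sketch of that pattern follows; the validation rule (a payment amount between 0.01 and 10,000.00) is invented for the example, since the real limits live in the business rules documents.

def is_valid_amount(value):
    # Invented rule for illustration: amounts must be between 0.01 and 10,000.00.
    return 0.01 <= value <= 10000.00

BOUNDARY_CASES = [
    (0.00, False),      # just below the lower bound
    (0.01, True),       # lower bound
    (9999.99, True),    # just below the upper bound
    (10000.00, True),   # upper bound
    (10000.01, False),  # just above the upper bound
]

def run_boundary_cases():
    for value, expected in BOUNDARY_CASES:
        result = is_valid_amount(value)
        outcome = "PASS" if result == expected else "FAIL"
        print(f"{outcome} amount={value} expected={expected} got={result}")

if __name__ == "__main__":
    run_boundary_cases()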

Tests by developers and test team (continued)

• Non-Functional Testing
– Application Availability testing was carried out by the test team, assisted by the development and operational support teams.
– Volume testing was conducted by the development team, simulating incremental users across the production environment using Perl scripts. Results were monitored using the BCM monitoring tools set up in the IM team for Business as Usual monitoring and reporting.
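The volume testing above used Perl scripts to simulate incremental users. As a rough illustration of the same ramp-up idea only (not the project’s scripts), a small Python sketch follows; the URL, step sizes and timings are placeholders, and a script like this should only ever be pointed at a test environment you own.

import threading
import time
from urllib.request import urlopen

TARGET = "http://test-environment.example/login"   # placeholder URL
STEP = 5              # simulated users added per increment
STEPS = 4             # number of increments
REQUESTS_PER_USER = 10

def simulated_user(results):
    for _ in range(REQUESTS_PER_USER):
        start = time.time()
        try:
            urlopen(TARGET, timeout=10).read()
            results.append(time.time() - start)   # response time in seconds
        except OSError:
            results.append(None)                  # record a failed request

def run_ramp():
    threads, results = [], []
    for step in range(1, STEPS + 1):
        for _ in range(STEP):
            t = threading.Thread(target=simulated_user, args=(results,))
            t.start()
            threads.append(t)
        print(f"Increment {step}: {len(threads)} simulated users started")
        time.sleep(5)   # let the new load settle before the next increment
    for t in threads:
        t.join()
    ok = [r for r in results if r is not None]
    print(f"{len(ok)}/{len(results)} requests succeeded")

if __name__ == "__main__":
    run_ramp()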

Tests done by project team

• Usability Test – ease of use – designed and conducted by the User Experience Analyst

• Process Test – processes were documented by the business, verified by peer review, and validated in training

• Production Test – ‘Comfort Test’ – on ‘go live’, a limited number of tests were run against the production environment

Tests done by other groups

• High Availability and Load Balancing external to the firewalls

• Back Up and Recovery

• Network Availability and Security testing

• Ethical hack

• New input and enquiry screens for back end

• Reporting and Statistics – not defined – passed to BAU

• Green screen maintenance on back end system for new data fields

Things That Worked Well

• Co-location with rest of team – one room

• Early identification of test responsibilities

• Central repository for clarification of requirements

• Work co-operatively with developers (not over the fence)

• VMware – flexible – some overheads

• Make early contact with operational group – non-functional testing

• Use of same Defect Management Tool from Testing through to Business as Usual

Lessons Learned

• Inspection of Use Cases and navigation – enforces standards

• Encourage walk throughs at all stages

• Usability test early

• Plan for a requirements revision delay from Usability

• More analysis of the functional groupings – the mapping is not always one use case to one test case

• Test cases first – objectives, scenarios etc – leave detailed scripts until later in the phase

Some facts of life….

• You don’t always have sole control over the test environment (or the data)

• Online update means test data is dynamic (not static)

• Availability and Reliability testing needs the production environment to be in place…..

• Ethical hack needs live environment with external web address…..

… Some facts of life….

• Support staff of interfacing systems have their own Business as Usual issues - the project is not always their priority

• Email addresses need somewhere to land

• Don’t forget handover to BAU

• Lots of people use Macs – e.g. cyber cafes

• You can’t do everything

• ….. yourself

• In parallel development/test phases, a half hour code migration takes more than half an hour…

What about Portals then?

• A website multiplied by n (n being the number of external contributing systems)

• which includes
– Clarification and agreement of requirements
– Synchronisation between test windows
– Defect management and prioritisation
– Release scheduling throughout test phases
– …

Any Questions or Discussion?

Test Tool Selection

Considerations

• Ease of implementation and use

• In which phase can the best value be gained?
– Development, IST, Regression
– Record and playback, Stress test, Broken links

• Cost to business

• Resource overheads to maintain vs benefit to business