Testing: Basics of Testing. Presented by: Vijay C. G., Glister Tech
Posted on 31-Dec-2015
Software Reliability
• Probability that a software system will not deviate from the required behavior for a specified time under specified conditions
• Reliability-enhancing techniques
– Fault avoidance
– Fault detection
– Fault tolerance
Fault Avoidance
• Development methodologies
– Avoid faults by minimizing their introduction into models and code
• Configuration management
– Avoid faults by minimizing undisciplined changes to the models
• Verification techniques
– Only useful for simple software projects
• Code reviews
Fault Detection
• Debugging
– Correctness debugging
– Performance debugging
• Testing
– Component testing
– Integration testing
– System testing
Fault Tolerance
• Allows a system to recover in the case of failure
– Atomic transactions
– Modular redundancy
What Is Testing?
1. The process of demonstrating that errors are not present
2. The systematic attempt to find errors in a planned way
Component
• Part of the system that can be isolated for testing
– Function(s)
– Object(s)
– Subsystem(s)
Failure
• A deviation between the specification of a component and its behavior
– Caused by one or more errors
Test Stub and Driver
• Stub provides a partial implementation of components on which the tested component depends
• Driver provides a partial implementation of a component that depends on the tested component
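As a minimal sketch of the stub/driver idea, the example below tests a hypothetical UserService component in isolation: DatabaseStub stands in for the component it depends on, and the driver code plays the role of the component that would normally call it. All names here are invented for illustration.

```python
class DatabaseStub:
    """Stub: partial implementation of the component the tested
    component depends on; returns canned data instead of querying
    a real database."""
    def find_user(self, user_id):
        return {"id": user_id, "name": "Alice"}  # canned response

class UserService:
    """The component under test; depends on a database component."""
    def __init__(self, database):
        self.database = database

    def greeting(self, user_id):
        user = self.database.find_user(user_id)
        return f"Hello, {user['name']}!"

def run_driver():
    """Driver: exercises UserService in place of the real caller
    (for example, a user interface layer)."""
    service = UserService(DatabaseStub())
    return service.greeting(42)

print(run_driver())  # -> Hello, Alice!
```

Once the real database component passes its own tests, the stub can be swapped out during integration testing.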
Correction
• A change made to a component in order to repair a fault
– May introduce additional faults
– Techniques for handling new faults:
• Problem tracking
• Regression testing
• Rationale maintenance
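Regression testing, listed above, means rerunning the existing test suite after every correction so that a fix in one place cannot silently break another. A minimal sketch, with a hypothetical discount function and suite:

```python
def discount(price, rate):
    """Corrected function: rate is a fraction (0.1 == 10%),
    not a percentage."""
    return round(price * (1 - rate), 2)

# Existing test cases, rerun in full after each correction:
regression_suite = [
    (discount, (100, 0.1), 90.0),
    (discount, (50, 0.0), 50.0),
    (discount, (80, 0.25), 60.0),
]

for func, args, expected in regression_suite:
    assert func(*args) == expected, (func.__name__, args, expected)
```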
Inspecting Components
• Code reviews find faults in components
• Can be formal or informal
• Can be more effective than testing, but finds different types of faults
• Can be time consuming
Inspecting Components (Cont.)
• Fagan’s inspection method
– Overview
• Author presents the purpose and scope of the component
– Preparation
• Reviewers become familiar with the implementation
– Inspection meeting
• Recommendations are presented
– Rework
– Follow-up
Inspecting Components (Cont.)
• Active design review
– Similar to Fagan’s method, but less formal
– Not all reviewers need to be involved
– Reviewers make recommendations during the preparation phase
– There is no inspection meeting
• Author meets with reviewers individually, if at all
• Reviewers fill out questionnaires
Equivalence Testing
• Black-box technique
• Possible inputs are partitioned into equivalence classes
• A typical input and an invalid input are selected for each class
• Test cases are written using the selected input values
Equivalence Testing (Cont.)
• Example: days in a month
– Class 1: 31-day months (1, 3, 5, 7, 8, 10, 12)
– Class 2: 30-day months (4, 6, 9, 11)
– Class 3: February
– Class 4: Leap years
– Class 5: Non-leap years
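The five classes above can be covered with one representative input each, plus one invalid input, instead of testing every month/year pair. A sketch, assuming a days_in_month function with this (hypothetical) signature:

```python
def days_in_month(month, year):
    if month in (1, 3, 5, 7, 8, 10, 12):     # Class 1: 31-day months
        return 31
    if month in (4, 6, 9, 11):               # Class 2: 30-day months
        return 30
    if month == 2:                           # Class 3: February
        leap = year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)
        return 29 if leap else 28            # Classes 4 and 5
    raise ValueError("month must be 1..12")  # invalid-input class

# One representative test case per equivalence class:
assert days_in_month(7, 2021) == 31    # Class 1
assert days_in_month(4, 2021) == 30    # Class 2
assert days_in_month(2, 2021) == 28    # Classes 3 and 5
assert days_in_month(2, 2020) == 29    # Classes 3 and 4
```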
Boundary Testing
• Special case of equivalence testing
• Deals with boundary conditions and special cases
• Detects “off by one” and “fence post” faults
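A boundary-testing sketch for a hypothetical grade function with boundaries at 0, 60, and 100: the test values sit exactly on and immediately next to each boundary, which is where off-by-one faults hide.

```python
def grade(score):
    if not 0 <= score <= 100:
        raise ValueError("score must be in 0..100")
    return "pass" if score >= 60 else "fail"

# Values on and adjacent to each boundary:
assert grade(0) == "fail"      # lower boundary
assert grade(59) == "fail"     # just below the pass threshold
assert grade(60) == "pass"     # exactly on the threshold
assert grade(100) == "pass"    # upper boundary
```

A faulty condition such as `score > 60` would pass a "typical" equivalence test like `grade(85)` but fail the `grade(60)` boundary case.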
Path Testing
• White-box technique
• Every path through the code is exercised
• Unable to detect omissions
• Does not work well for object-oriented programs
– Polymorphism makes paths dynamic
– Shorter, related functions must be tested together
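A minimal path-testing sketch: a function with one if/else has two paths, and the white-box suite below exercises both. Real path counts grow quickly; n independent branches can give up to 2**n paths.

```python
def absolute(x):
    if x < 0:        # path A: negative branch
        return -x
    return x         # path B: non-negative branch

assert absolute(-3) == 3   # covers path A
assert absolute(5) == 5    # covers path B
```

Note that even full path coverage cannot catch an omission, such as a missing special case the specification required but the code never mentions.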
State Testing
• Designed to work with object oriented systems
• Compares the resulting state of the system with the expected state
• Currently not used in practice
– Lengthy test sequences are required to put objects in the desired state
– Not yet supported by automated testing tools
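The core idea can be sketched in a few lines: drive an object through a sequence of operations, then compare its resulting state with the expected state. The Turtle class here is a hypothetical example.

```python
class Turtle:
    """Object whose state is its (x, y) position."""
    def __init__(self):
        self.x, self.y = 0, 0

    def move(self, dx, dy):
        self.x += dx
        self.y += dy

t = Turtle()
t.move(3, 0)   # sequence of operations to reach the desired state
t.move(0, 4)

# Compare the resulting state with the expected state:
assert (t.x, t.y) == (3, 4)
```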
Big Bang Testing
• All components are unit tested first
• All components are then integrated and tested as a whole
• Requires no stubs or drivers
• Difficult to pinpoint the cause of a failure
– Unable to distinguish failures in the interfaces from failures inside the components
Bottom-Up Testing
• Lower level components tested first
• Lower level components are integrated with components from the next level
• Process continues until all levels have been integrated
(Diagram: call hierarchy with main() at the top, calling drawShape(), which calls drawCircle() and drawSquare())
Top-Down Testing
• Upper level components tested first
• Upper level components are integrated with components from the next level
• Process continues until all levels have been integrated
(Diagram: the same call hierarchy, with main() at the top, calling drawShape(), which calls drawCircle() and drawSquare())
Sandwich Testing
• Combines bottom-up and top-down testing strategies
• System is divided into three layers
– Top layer is unit tested, and then top-down integration testing commences
– Bottom layer is unit tested, and then bottom-up integration testing commences
• Test stubs and drivers are not needed for the top and bottom layers
• Does not adequately test the target (middle) layer
Modified Sandwich Testing
• Each layer is unit tested first
– Top layer uses stubs for the target layer
– Target layer uses drivers for the top layer and stubs for the bottom layer
– Bottom layer uses drivers for the target layer
• Bottom-up and top-down testing can then reuse the test cases that were used in the individual layer tests
System Testing
• Functional testing
• Performance testing
• Pilot testing
• Acceptance testing
• Installation testing
Functional Testing
• Tests the functional requirements specified in the RAD (Requirements Analysis Document)
• Black-box technique
– Test cases are derived from the use cases
• Test cases should exercise both common and exceptional behavior
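A black-box sketch derived from a hypothetical "withdraw cash" use case: the common flow succeeds, and the exceptional flow (insufficient funds) raises an error. The tests exercise behavior only through the public interface, without looking at the implementation.

```python
class Account:
    def __init__(self, balance):
        self.balance = balance

    def withdraw(self, amount):
        if amount > self.balance:
            raise ValueError("insufficient funds")
        self.balance -= amount
        return self.balance

acct = Account(100)
assert acct.withdraw(30) == 70     # common behavior (main flow)

try:                               # exceptional behavior
    acct.withdraw(1000)
    assert False, "expected ValueError for insufficient funds"
except ValueError:
    pass
```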
Performance Testing
• Tests the design goals specified during system design and the nonfunctional requirements in the RAD
• Types of performance testing
– Stress testing
– Volume testing
– Security testing
– Timing tests
– Recovery tests
Pilot Testing
• A selected set of users installs and tests the system
• Phases
– Alpha test
• System is tested in the development environment
– Beta test
• System is tested in the target environment
Acceptance Testing
• Benchmark tests
• Competitor tests
– New system is tested against an existing system or a competitor’s product
• Shadow tests
– Both the new system and the old system are run in parallel and the results are compared
Installation Testing
• Performed in the target environment
• Repeats the test cases that were run during functional and performance testing
Test Planning
• Test cases should be developed early
– As soon as their models become stable
– Tests must be changed when the models change
• Parallelize tests whenever possible
– Component tests can all run at the same time
– Integration tests can start when some component tests have passed
Documentation
• Test plan
– Documents the scope, approach, resources, and schedule of the testing process
• Test case specification
– Contains the inputs, drivers, stubs, and expected results
– Documents the test procedure
Documentation (Cont.)
• Test incident report
– Records the differences between the actual output and the expected output for each execution of a test case
• Test summary report
– Lists all failures discovered during testing
Assigning Responsibilities
• Testers should not be the same developers who built the system
• Dedicated testing teams are used when quality requirements are stringent
• When quality requirements are less stringent, subsystem teams can test subsystems developed by other teams