TRANSCRIPT
11/10/09 EEC 521: Software Engineering 1
EEC 521: Software Engineering
Verification and Validation: Software Testing
Software Faults
• Quite inevitable
• Many reasons
– Software systems with a large number of states
– Complex formulas, activities, algorithms
– Customer is often unclear of needs
– Size of software
– Number of people involved
– …
Types of Faults
• Algorithmic: logic is wrong; typically caught by code reviews
• Syntax: wrong syntax, typos; caught by the compiler
• Computation/Precision: not enough accuracy
• Documentation: misleading documentation
• Stress/Overload: maximum load violated
• Capacity/Boundary: boundary cases are usually special cases
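A minimal sketch of a capacity/boundary fault, with hypothetical names: the off-by-one loop condition below skips the boundary element, exactly the kind of special case the slide warns about.

```java
public class BoundaryFault {
    // Buggy version: the loop bound "i < n - 1" skips the last of the
    // first n elements -- a classic boundary (off-by-one) fault.
    public static int sumFirstBuggy(int[] a, int n) {
        int s = 0;
        for (int i = 0; i < n - 1; i++) s += a[i];
        return s;
    }

    // Corrected version: sums exactly the first n elements.
    public static int sumFirst(int[] a, int n) {
        int s = 0;
        for (int i = 0; i < n; i++) s += a[i];
        return s;
    }

    public static void main(String[] args) {
        int[] a = {1, 2, 3};
        System.out.println(sumFirstBuggy(a, 3)); // prints 3 (wrong)
        System.out.println(sumFirst(a, 3));      // prints 6 (right)
    }
}
```

A test that only exercises n = 1 or n = 2 with small values can easily miss this; a test at the boundary (all n elements) exposes it immediately.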
Types of Faults
• Timing/Coordination: synchronization issues; very hard to replicate
• Throughput/Performance: system performs below expectations
• Recovery: system restarted from an abnormal state
• Hardware and related software: compatibility issues
• Standards: makes for difficult maintenance
Inspections
Process of analyzing and reviewing artifacts (code, documentation, user interfaces, etc.) with the intent of reducing the possibility of faults.
Manual Inspections
• Group reviews
– Owner presents artifact for review
– Facilitator convenes meeting
– Reviewers study artifact prior to meeting
– Group reviews artifact in meeting to provide feedback to owner
– Multiple iterations, if necessary
Automated Inspections
• Tools for static analysis
• Can identify several common errors
– Buffer overruns
– Data races
– Dead code (control flow)
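As an illustrative sketch of the dead-code case (names hypothetical): control-flow analysis can prove, without running the program, that the final branch below can never execute, because every `int` is either non-negative or negative.

```java
public class DeadCode {
    public static String classify(int x) {
        if (x >= 0) {
            return "non-negative";
        } else if (x < 0) {
            return "negative";
        } else {
            // Dead code: the two conditions above cover every int, so a
            // static analyzer can flag this branch as unreachable even
            // though the program compiles and runs normally.
            return "unreachable";
        }
    }

    public static void main(String[] args) {
        System.out.println(classify(5));  // prints "non-negative"
        System.out.println(classify(-5)); // prints "negative"
    }
}
```

The key point of the slide: the analyzer reaches this conclusion statically, for all inputs at once, rather than by testing individual runs.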
Software Testing
Testing is the process of exercising a program with the specific intent of finding errors prior to delivery to the end user.
Who Tests the Software?
• Developer: understands the system, but will test "gently" and is driven by "delivery"
• Independent Tester: must learn about the system, but will attempt to break it and is driven by quality
How to Test?
• No point in testing without a plan
– Too much wasted time and effort
• Need a strategy
– Dev team needs to work with Test team
– "Egoless Programming"
When to Test?
[Figure: the testing process. Component code goes through Unit Test (against design specs) to produce tested components; Integration Test combines tested components into integrated modules; Function Test checks the functioning system against system functional requirements; Performance Test checks other software requirements and yields verified, validated software; Acceptance Test checks the customer requirements specification and yields an accepted system; Installation Test runs in the user environment and results in the system in use.]
Verification and Validation
Verification:
Are we building the product right?
Validation:
Are we building the right product?
Validation and Verification

[Figure: Actual Requirements, SW Specs, and System. Validation relates the system to the actual requirements and includes usability testing and user feedback; verification relates the system to the SW specs and includes testing, inspections, and static analysis.]
V&V Activities
[Figure: V&V activities, grouped into validation and verification.]
You can't always get what you want... ever

• Correctness properties are undecidable: the halting problem can be embedded in almost every property of interest

[Figure: a decision procedure takes a property and a program as inputs and outputs "Pass" or "Fail".]
Decidability
• A problem is decidable if
– it has a solution,
– and the solution can be found in a finite amount of time
• Otherwise the problem is undecidable
Turing’s Proposition
• "There must be problems that are undecidable"
• Example: The Halting Problem (HP)
– "Given a program and an input to the program, determine if the program will eventually stop when it is given that input."
• Simply running the program with the given input is not a valid approach. Why? If the program never stops, we would wait forever and could never conclude "loops" in finite time.
HP is Undecidable [Turing, 1936]

• Suppose that there is a solution, called H
• Program P is just a string of characters, and so P can be provided as input to H

[Figure: H takes a Program P and an Input I and answers "Halts" or "Loops"; since P is a string, H can also be given Program P as both the program and the input.]
HP is Undecidable

• Construct a new program K:
– if H outputs "loops", then halt
– else loop forever
• K does the opposite of H

[Figure: K contains H; K passes its input Program P to H and then does the opposite of H's answer.]
HP is Undecidable
• Since K is a program, we can provide K as input to itself
– If H says K halts, then K itself would loop
– If H says K loops, then K will halt
• In either case, H gives the wrong answer for K
• Thus, H cannot work for all cases!
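The diagonalization above can be sketched as runnable code: we model the hypothetical decider H purely by its answer (true = "halts", false = "loops") and check that K, built to do the opposite, contradicts H whichever answer H gives. The names are illustrative.

```java
public class Diagonal {
    // K halts exactly when H claims that K loops forever.
    public static boolean kHalts(boolean hSaysKHalts) {
        return !hSaysKHalts;
    }

    public static void main(String[] args) {
        // Whatever H answers about K run on itself, K's actual
        // behavior is the opposite -- so H is wrong in both cases.
        for (boolean answer : new boolean[]{true, false}) {
            boolean actual = kHalts(answer);
            System.out.println("H says halts=" + answer
                + ", K actually halts=" + actual);
        }
    }
}
```

Since a correct H would have to be right about every program, including K, no such H can exist.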
Getting what you need ...

• Optimistic inaccuracy: we may accept some programs that do not possess the property (i.e., it may not detect all violations)
– testing
• Pessimistic inaccuracy: it is not guaranteed to accept a program even if the program does possess the property being analyzed
– automated program analysis techniques
• Simplified properties: reduce the degrees of freedom by simplifying the property to check
Example: Unmatched Semaphore Operations

• The original problem: static checking for a lock/unlock match is necessarily inaccurate when acquire and release are guarded by separate runtime conditions:

    if ( ... ) {
        ...
        lock(S);
    }
    ...
    if ( ... ) {
        ...
        unlock(S);
    }

• The simplified property: Java prescribes a more restrictive, but statically checkable construct, in which the lock is always released when the block exits:

    synchronized(S) {
        ...
    }
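The contrast above can be made concrete in runnable Java; here `ReentrantLock` stands in for the semaphore S, and the class and method names are illustrative rather than from the slides.

```java
import java.util.concurrent.locks.ReentrantLock;

public class LockPairing {
    public static final ReentrantLock S = new ReentrantLock();

    // Original problem: whether unlock() runs depends on control flow,
    // so a static checker must reason about every path through the code.
    public static void manual(boolean takeLock) {
        if (takeLock) S.lock();
        // ... critical work ...
        if (takeLock) S.unlock(); // pairing breaks easily if the two conditions drift apart
    }

    // Simplified property: a synchronized block is lexically scoped, so
    // acquire and release always match and can be checked statically.
    public static void structured() {
        synchronized (LockPairing.class) {
            // ... critical work ...
        }
    }

    public static void main(String[] args) {
        manual(true);
        structured();
        System.out.println("lock still held? " + S.isLocked()); // false here
    }
}
```

The design trade-off is exactly the slide's point: `synchronized` forbids some legal locking patterns (pessimistic inaccuracy on the original property) in exchange for a property the compiler can always verify.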
Terminology
• Safe: a safe analysis has no optimistic inaccuracy, i.e., it accepts only correct programs.
• Sound: an analysis of a program P with respect to a formula F is sound if the analysis returns true only when the program does satisfy the formula.
• Complete: an analysis of a program P with respect to a formula F is complete if the analysis always returns true when the program actually does satisfy the formula.
Views of Test Objects
• Black box or Closed box
– Testing based only on spec
• White box or Open box
– Testing based on actual source code
• Grey box
– Partial knowledge of source code
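As a small illustration (names hypothetical), the same function can be tested from both views: the black-box cases come only from the specification, while the white-box cases come from reading the branches in the source.

```java
public class AbsViews {
    // Function under test: absolute value.
    public static int abs(int x) {
        return x < 0 ? -x : x;
    }

    public static void main(String[] args) {
        // Black-box: cases chosen from the spec alone
        // ("the result is the magnitude of the input").
        check(abs(5), 5);
        check(abs(-5), 5);

        // White-box: cases chosen by reading the code -- cover both
        // branches of the conditional, the boundary x == 0, and the
        // overflow corner that only source inspection reveals:
        // -Integer.MIN_VALUE overflows back to Integer.MIN_VALUE.
        check(abs(0), 0);
        check(abs(Integer.MIN_VALUE), Integer.MIN_VALUE);

        System.out.println("all checks passed");
    }

    static void check(int actual, int expected) {
        if (actual != expected) {
            throw new AssertionError(actual + " != " + expected);
        }
    }
}
```

The overflow case shows why the two views complement each other: a spec-only tester has no particular reason to try `Integer.MIN_VALUE`, while a source-reading tester spots the unary negation and probes it.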