software testing


software testing (st):- 17-12-2007 http://qtp.blogspot.com/2007/08/qtp-tutorials-5-page-checkpoint.html

it is a process to identify the correctness and completeness of a software application and to measure its quality.

software testing is independent of environment.

quality of software application:-

meet customer requirements
meet customer expectations
cost of product
time to release.

software quality mainly depends on above factors.

meet customer requirements in terms of functionality.
meet customer expectations in terms of performance, ease of operation, security, etc.
cost of product (within budget).
time to release.

need of software testing:-

1) detecting defects
2) finding the reliability of an application
3) to stay in business
4) finding defects early, which reduces the cost of fixing them.

error: - an error is a mistake made by the developer at the coding phase.

ex: - syntax error.

bug: - a deviation between the expected value and the actual value, found by a test engineer.

defect: - when the developer accepts a reported bug to fix, it is called a defect (a real defect in the application).


types in software testing:-

manual testing:-

1) software development life cycle (sdlc)
2) software testing life cycle (stlc)
3) some testing terminology

automation testing:-

1) quicktest professional (qtp 9.2)
2) winrunner
3) loadrunner (performance tool)
4) test director (td) / quality center (qc) (test management tools).

roles & responsibility of testing unit:-

1) understanding the functionality of an application using the software requirement specification (s/wrs).

2) preparing test cases based on test scenarios.
a) test scenario: - it specifies end-to-end functionality in the application.
b) test case: - it describes the step-by-step process of how we validate the application functionality.

step name | description | i/p required | expected results | actual values | status
step 1 | open login window | - | 'ok' button disabled | enabled | fail
step 2 | verify cancel button | - | enabled | enabled | pass
step 3 | enter valid uid | uid = 'mindq' | ok button disabled | enabled | fail
step 4 | enter valid password | pwd = 'mercury' | ok button enabled | enabled | pass
step 5 | click 'ok' button | - | main window should exist | exists | pass
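each row compares the expected result with the actual result observed during execution; the status is pass only when they match. a tiny python sketch of that comparison (the rows here are illustrative values, not real tool output):

# each test step carries an expected result; the status is derived by
# comparing it with the actual result recorded during execution.
steps = [
    ("open login window",    "disabled", "enabled"),   # expected vs actual state of the ok button
    ("verify cancel button", "enabled",  "enabled"),
    ("enter valid password", "enabled",  "enabled"),
]

for description, expected, actual in steps:
    status = "pass" if expected == actual else "fail"
    print(f"{description}: expected={expected} actual={actual} -> {status}")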


3) creating automation test scripts w.r.t. the test cases (if automation is possible).
4) executing test cases & automation test scripts.
5) defect reporting & prioritizing defects.
6) participating in retesting & regression testing.

quality assurance (qa): - monitoring, measuring & improving the strength of the development process based on organization standards.

quality control (qc): - validating the output of the product w.r.t. the standards defined by the qa people.

s/w testing: - it is a process of validating the application w.r.t. customer requirements & expectations.

project: - when we develop an application based on a specific client's requirements; that application is used by that single client.

product: - when we develop an application based on market requirements; that application can be used by multiple end users.

software development life cycle (sdlc): -

to develop an application, organizations follow a systematic approach called the sdlc. it contains the following phases.

1) information gathering
2) analysis
3) design
4) coding or implementation
5) testing
6) release & maintenance.

18-12-2007


information gathering (client requirements gathered by the ba) → brs / urs / crs
↓
analysis (system analyst) → feasibility study (cost, time, resources) → s/wrs (frs + srs)
↓
design → tdd: hld (tree structure) & lld (data flow & e-r diagrams)
↓
coding (developer) → source code → white box testing
↓
testing (test engineer) → black box testing → user acceptance test (client / end user)
↓
release & maintenance → port testing

information gathering: - in this phase the business analyst or marketing people interact with the client, collect the basic requirements & prepare the brs / urs / crs documents. after preparation of the brs document, the system analyst or project manager will verify the feasibility factors in a feasibility analysis, like

a). financially, is it beneficial for the organization?
b). is it possible to deliver within the given time?
c). is it possible to develop with the available resources?

system analysis: - in this phase the system analyst takes the brs document as input & prepares the s/wrs document. the s/wrs document contains two sub-documents: -

1) functional requirement specification (frs): - it describes the complete functionality of the application.

2) system requirement specification (srs): - it describes the environment required to develop that application.

design: - taking the s/wrs document as input, the designer prepares the technical design document (tdd). it contains two sub-documents.

a). high level design (hld): - it describes the number of modules required to develop an application & the relation between those modules.

ex: - hms
     ├─ front desk
     │    ├─ reg
     │    └─ billing
     ├─ pharmacy
     └─ clinical

b). low level design (lld): - to construct each module it gives the complete information, like data flow diagrams & entity-relationship diagrams.

4) coding: - taking the tdd as input, the developer starts the physical construction of the application using source code.

5) testing: - after compiling the source code, the developer validates whether each program works properly using white box testing techniques. after receiving the application, the test engineer validates whether it works w.r.t. customer requirements using black box testing techniques. to collect feedback from end users we conduct a user acceptance test (uat).

6) release & maintenance: - after receiving the right feedback from the end user, we release the application to the client after conducting port testing.

port testing: - it is performed by the release team in the client environment to check whether the application works properly there. the release team includes a few developers, a few test engineers & a few hardware engineers.

maintenance: - after releasing the application, the client sends change requests based on current requirements; we test the modified application & release the modified version to the end users.

note: - the sdlc starts when the client realizes he needs software for his needs and ends when that application is no longer in use.

software development life cycle models: - based on the complexity of an application & its standards, organizations use any one of the following sdlc models.

waterfall model: - for small projects, routine projects & well-known requirements we prefer the waterfall model. it contains all the sdlc phases & is the oldest model.


disadvantages: -

1). during the development phase it won't allow any change requests.
2). there is no feedback mechanism between phases.

prototype model: -

information gathering → analysis → design → develop prototype → give demo to the client → collect feedback
  if the client is not satisfied: refine the prototype & demo again
  if the client is satisfied: coding → testing → release & maintenance

in general we prefer this model when the requirements are known but the implementation is unknown. in this approach we develop prototypes & give demos to the client; based on the feedback we follow the remaining sdlc phases.

disadvantages: - prototypes are not reusable.

note: - a prototype is a sample application without functionality.

spiral model: - when an application is too complex to develop under one sdlc we follow the spiral model. in this approach we take a critical module & follow one complete sdlc for it, then repeat the same for all the modules.

disadvantages: - time consuming & more planning is required.


fish model: - it gives the theoretical mapping between the sdlc & the stlc.

sdlc : i/f gathering | analysis | design | coding | testing | s/w changes
stlc : review | review | wbt | bbt | s/w testing
verification (reviews) + validation (wbt & bbt)

verification: - to verify the correctness & completeness of the documents we conduct verification. in this we conduct reviews, walkthroughs & inspections.

validation: - verifying the correctness of the output values w.r.t. the input data during execution of the program or application is called validation.

v-model: - v stands for verification & validation. in general organizations follow the v-model to develop an application.

i/f gathering (brs) ............ user acceptance testing
system analysis (srs) .......... system testing (bbt)
high level design .............. integration testing
low level design ............... unit testing (wbt)
              coding
(document reviews on the left arm give verification; the testing levels on the right arm give validation.)

* meeting point is coding level.


testing methodologies: -

to validate a software application, organizations use 3 testing methodologies.

black box testing: - validating the application functionality w.r.t. customer requirements without any programming knowledge.

white box testing: - testing the internal structure using programming knowledge is called white box testing. in general it is performed by developers.

gray box testing: - it is a combination of white box & black box testing techniques. to conduct gray box testing, both programming knowledge & functional knowledge are required.

ex: - in general to validate web application during construction time we use gray box testing.

levels of testing: -

unit level testing: - after constructing a small unit, the developer verifies whether that unit works properly. in this approach he verifies the following factors.

1). statement coverage: - verify whether all the statements participate during execution of the program.

2). loop coverage: - verifying whether loops terminate properly.

3). program technique coverage: - verify whether the developer used the right technique to achieve the required performance.

4). output coverage: - verify that unit or component is giving expected output or not.

*5). cyclomatic complexity: - it is a metric to measure the number of independent paths to perform one transaction. ex: - a graph with nodes a → b → (c or d) → e → f and a direct edge a → e has the independent paths aef, abcef & abdef, so its cyclomatic complexity is 3.
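the same idea can be seen in code. a minimal sketch (a hypothetical function, not taken from the notes) whose control-flow graph matches the example above: one decision nested inside another gives three independent paths, so the cyclomatic complexity is 3.

# node a: entry
def apply_discount(order_total, is_member):
    discount = 0.0
    if order_total > 0:                  # decision 1: if false, go straight to e (path a-e-f)
        if is_member:                    # decision 2 at node b
            discount = 0.10              # node c (path a-b-c-e-f)
        else:
            discount = 0.05              # node d (path a-b-d-e-f)
    return order_total * (1 - discount)  # nodes e and f: single exit

# 2 decisions + 1 = 3 independent paths = cyclomatic complexity of 3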


integration testing (it): - after unit level testing, verifying the interfaces between units or components or modules is called integration testing. to conduct integration testing the developer uses the high level design documents. in integration testing we use two techniques.

1). big-bang approach: - when the construction of all the modules is completed, the developer verifies the interfaces between those modules in a single stage; that approach is called the big-bang approach.

2). incremental approach: - it is a level-by-level testing technique. in this approach, based on the availability of modules, we verify the interfaces. it uses the following techniques.

a). top-down approach: - when some of the sub modules are still under construction, to verify the interfaces between the constructed modules we use a temporary program called a “stub”. a stub is a temporary program which sends control back to the main module with intermediate results.

main
├─ sub1
└─ stub (in place of sub2, still under construction)

b). bottom-up approach: - when the main module is still under construction, to verify the interfaces between the constructed sub modules we use a temporary program called a “driver”. a driver is a temporary program which gives connectivity to the sub modules.

driver (in place of main, still under construction)
├─ sub1
├─ sub2
├─ sub3
└─ sub4

c). hybrid or sandwich approach: - when the main module & some of the sub modules are still under construction, to verify the interfaces between the constructed modules we use both a stub & a driver.


driver (in place of main)
├─ sub1
├─ sub2
├─ stub (in place of sub3)
└─ sub4
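as a concrete illustration, here is a minimal python sketch of a stub and a driver (the module names "billing" and "registration" are invented for the example, not from the notes).

# top-down: the billing sub module is still under construction, so a stub
# returns a canned intermediate result and hands control back to main.
def billing_stub(order):
    return {"order": order, "amount": 0.0}       # hard-coded value, real logic pending

def main_module(order):
    invoice = billing_stub(order)                # later this call goes to the real billing module
    return f"invoice for {invoice['order']}: {invoice['amount']}"

# bottom-up: the main module is still under construction, so a driver wires
# the constructed sub modules together just to exercise their interfaces.
def registration(name):
    return {"patient": name, "reg_no": 101}

def driver():
    record = registration("smith")
    print(main_module(record["patient"]))        # temporary calling sequence used only for testing

driver()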

note: - white box testing, unit testing & integration testing are called structural testing.

system testing: - after integration testing, the development team releases a build to the test engineers. the test engineers then validate the functional factors & non-functional factors w.r.t. customer requirements & expectations; this is called system testing.

functionality testing: - in this test we verify following factors.

1). changes in properties of objects: - verify whether object properties change correctly based on the operations performed in the application.

ex: - when we enter a value in the edit box of the open order window, the ok button should be enabled.

2). error handling coverage: - verify whether the application invokes popup messages to prevent negative operations.

ex: - when we enter invalid data in the user id edit box, the login window should invoke a popup message.

3). input-domain coverage: - validating input objects w.r.t. customer-expected data. to validate input objects the test engineer uses two techniques to prepare test data.

a). boundary value analysis (bva): - to validate whether the developer has coded the right conditions in the source code, the test engineer concentrates on the boundary values & their adjacent values.

ex : - 4 ≤ x ≤ 50


bva (size / range)
min = 4 = pass          max = 50 = pass
min - 1 = 3 = fail      max - 1 = 49 = pass
min + 1 = 5 = pass      max + 1 = 51 = fail

b). equivalence class partitioning (ecp): - we prepare test data based on the type of characters, in terms of valid & invalid classes.

ecp (type)
valid: 0 - 9
invalid: a - z, A - Z, special characters

note: - bva & ecp together are called a data matrix, which is used to validate input objects.

ex: - prepare a data matrix to validate a name edit box which allows 4 - 16 characters, alphabets only.

data matrix
bva (size)                  ecp (type)
min = 4 = pass              valid: a - z, A - Z
max = 16 = pass             invalid: 0 - 9, special characters
min - 1 = 3 = fail
max - 1 = 15 = pass
min + 1 = 5 = pass
max + 1 = 17 = fail
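a minimal sketch of generating & checking such a data matrix in python (the helper names are invented; the rules are the ones from the example above: 4 - 16 characters, alphabets only).

import re

# boundary value analysis on the size of the field (4 - 16 characters)
def bva_sizes(min_len=4, max_len=16):
    return {"min": min_len, "max": max_len,
            "min-1": min_len - 1, "max-1": max_len - 1,
            "min+1": min_len + 1, "max+1": max_len + 1}

# equivalence class partitioning on the type of characters:
# alphabets are the valid class, digits & special characters are invalid
def is_valid_name(value):
    return bool(re.fullmatch(r"[A-Za-z]{4,16}", value))

for label, size in bva_sizes().items():
    print(label, size, "pass" if is_valid_name("a" * size) else "fail")

print("ecp invalid classes:", is_valid_name("abc123"), is_valid_name("ab@#cd"))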


ex:- prepare data matrix for age edit box which allows 18 – 80 numeric content only.

data matrix
bva (range)                 ecp (type)
min = 18 = pass             valid: 0 - 9
max = 80 = pass             invalid: a - z, A - Z, special characters
min - 1 = 17 = fail
max - 1 = 79 = pass
min + 1 = 19 = pass
max + 1 = 81 = fail

ex: - prepare data matrix to validate mobile number edit box which allows 10 digits.

min = 10 & max = 10

data matrix
bva (size)                  ecp (type)
min = 10 = pass             valid: 0 - 9
max = 10 = pass             invalid: a - z, A - Z, special characters
min - 1 = 9 = fail
max - 1 = 9 = fail
min + 1 = 11 = fail
max + 1 = 11 = fail

• verify that the first digit is not allowed to be zero.
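a short sketch of that rule in python (assumed from the matrix above: exactly 10 digits & the first digit must not be zero).

import re

def is_valid_mobile(value):
    # exactly 10 digits, and the first digit must not be zero
    return bool(re.fullmatch(r"[1-9][0-9]{9}", value))

print(is_valid_mobile("9876543210"))   # pass: 10 digits, first digit non-zero
print(is_valid_mobile("0876543210"))   # fail: first digit is zero
print(is_valid_mobile("987654321"))    # fail: min - 1, only 9 digits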

4). calculation or output coverage: - verify the correctness of the output values based on the input data.

ex: - verify total text field content w.r.t tickets & price text field content.


5). back end or database coverage: - verify the database content based on the front-end operations performed in the application.

ex: - insert, update & delete operation.

ex: - verify insert button functionality in sample window.

sample window                       back end (database table)
name: [ abc ]                       s.no | c.name
[ insert ]                          1    | smith
                                    2    | rk
                                    3    | abc

step 1: - collect the table content before the insert operation.
step 2: - perform the insert operation using the insert button.
step 3: - collect the content from the table again.
step 4: - compare the data; if the only mismatch is the newly inserted record, give the status as passed.
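a minimal sketch of those four steps (it uses an in-memory sqlite table purely for illustration; the table & column names are invented).

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("create table sample (s_no integer primary key, c_name text)")
conn.executemany("insert into sample (c_name) values (?)", [("smith",), ("rk",)])

def insert_button(name):
    # the front-end operation under test: clicking insert adds one row
    conn.execute("insert into sample (c_name) values (?)", (name,))

before = set(conn.execute("select c_name from sample"))    # step 1
insert_button("abc")                                        # step 2
after = set(conn.execute("select c_name from sample"))      # step 3

# step 4: the only difference should be the newly inserted record
print("pass" if after - before == {("abc",)} else "fail")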

6). link coverage: - in a web application we verify whether all the links are working properly. for any application, functionality testing is mandatory. in general organizations maintain a separate functional testing team to verify the functionality factors.

to conduct functionality testing we can use automation tools like qtp, winrunner, silk, rational robot, testpartner, etc.

note: - in general test engineers spend 70% of time to validate functionality factors.


non-functional testing: - in this test we verify the non-functional factors w.r.t. customer expectations, using the following testing techniques.

1). usability testing or user interface testing: - verifying the user-friendliness of an application is called usability testing. in this test we verify the following factors.

a). look and feel factors like background colour, font size, spell checking, etc.
b). alignment of controls.
c). sharp navigation.
d). whether the help documents are meaningful.

2). performance testing: - in this test we verify the speed of processing to complete a transaction. in performance testing we use 3 techniques.

a). load testing or scalability testing: - verify whether the application supports the customer-expected load on customer-expected configuration systems.

b). stress testing: - estimating the peak load limit of an application is called stress testing.

note: - in general, to conduct load testing & stress testing we use automation tools like loadrunner or webload.

c). data volume testing: - verify maximum storage capacity in an application database.

3). security testing: - verify the privacy of user operations. in this testing the test engineer concentrates on two factors.

a). authorization: - verify whether the application allows valid users & prevents invalid users.

b). access control: - verify whether the application provides the right services to valid users.

4). recovery testing or reliability testing: - verify whether the application is able to come back to its normal state from an abnormal state with the help of recovery procedures, & also estimate the recovery time.

ex: - an atm: after inserting the debit card the power goes off (abnormal state); the recovery procedure should bring it back to the normal state.

normal state → abnormal state → recovery procedure → normal state


5). compatibility testing or portability testing: - verify whether the application supports the customer-expected operating systems, network environments, browsers, etc. in compatibility testing we use two techniques.

a). forward compatibility: - in this test we verify whether the application supports future versions of operating systems.

b). backward compatibility: - we verify whether the application supports previous versions of operating systems.

ex: - for a s/w application built on w-2000: backward → w-95; forward → xp, vista.

6). configuration testing: - verify whether the application supports hardware devices of different technologies.

ex: - different technology printers.

*7). inter-system testing or end-to-end testing: - verify how well our software coexists with already existing software using common resources. in this approach we perform one transaction from the login session to the logout session.

8). installation testing: - in this test we verify the following factors.
a). license availability.
b). whether the setup programs work properly.
c). memory space required.
d). the user interface during installation.

9). sanitation testing: - finding extra features in the application which are not specified in the client requirements.

10). comparative testing or parallel testing: - to estimate the strengths & weaknesses of an application, we compare our product with competitive products in the market.

note: - we also compare older versions with newer versions.


4). user acceptance testing: - to collect feedback from the end user we conduct a user acceptance test. it is performed by people other than the testing team, like the client, end user or project manager, etc.

based on the type of application we use two techniques under acceptance testing.

ά test                      β test
project                     product
real user                   virtual user
production environment      work environment

testing terminologies: -

1). sanity testing: - after receiving the application from the development team, the test engineer verifies the basic functionalities of the application to estimate its stability. to conduct sanity testing, test engineers prepare checklists.

2). smoke testing: - before releasing the application, the development team verifies the basic functionalities to estimate the stability of the application.

develop setup program / s/w application
↓
smoke testing (development team); if it fails, back to development
↓ pass
release application to the test team
↓
sanity testing (test engineer); if it fails, reject the build back to development
↓ pass
system testing

note: - in general test engineers spend 4 - 5 hours on sanity testing. whenever we receive an application from the development team we conduct sanity testing.

(figure: sanitation testing vs sanity testing vs smoke testing)

3). adhoc testing: - when the requirements are not clear, test engineers use past experience to validate the application functionality. in this approach test engineers collect domain knowledge from other resources & based on that prepare test cases.

4). jump or monkey testing: - due to lack of time, test engineers validate only the main activities of an application; this is called jump or monkey testing. in this approach we select high-priority test cases to validate the application functionality.

5). exploratory testing: - testing conducted by the test engineer while learning the application functionality, due to lack of domain knowledge, is called exploratory testing.

6). re-testing: -

s/wrs → test engineer prepares test cases → execute on the application → bug report → developer resolves the bug → modified build → re-execute all failed test cases → pass / fail

re-executing the failed test cases on the modified version to ensure the bug fix is called re-testing.

7). regression testing: - to estimate the impact of modifications on earlier passed functionalities, the test engineer re-executes passed test cases on the modified version.

s/wrs → test engineer prepares test cases → execute on the application → bug report → developer resolves the bug → modified build → re-execute some of the previously passed test cases → pass / fail
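a toy sketch of the difference between the two activities (the data structure is invented for illustration): re-run the failed cases to confirm the fix, and re-run previously passed cases around the modified module to catch side effects.

test_results = {
    "tc01": {"module": "login",   "status": "fail"},
    "tc02": {"module": "login",   "status": "pass"},
    "tc03": {"module": "compose", "status": "pass"},
}
modified_modules = {"login"}   # modules touched in the new build

# re-testing: every failed case is executed again on the modified build
retest = [tc for tc, r in test_results.items() if r["status"] == "fail"]

# regression: previously passed cases related to the modified modules
regression = [tc for tc, r in test_results.items()
              if r["status"] == "pass" and r["module"] in modified_modules]

print("re-testing:", retest)        # ['tc01']
print("regression:", regression)    # ['tc02']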

8). manual testing: - validating the application functionality without using any automation tool is called manual testing.

analyze requirements → prepare test scenarios → prepare test cases → execute test cases → analyze results

in this approach, after analyzing the requirements the test engineer identifies test scenarios for the features to test & prepares a step-by-step process for how to validate the application functionality.

after receiving the application, the test engineer performs the operations & verifies the actual behavior of the application w.r.t. the test cases.

9). automation testing: - sometimes test engineers use automation tools to validate the functionality of an application.

analyze requirements → prepare test scenarios → identify automation test scenarios → prepare automation test scripts → execute test scripts → analyze results

in automation testing the test engineer identifies automation test scenarios (repeatable test cases); for those scenarios test engineers prepare automation test scripts based on the availability of a tool.

after preparation of the automation test script we execute it. during execution the automation tool performs the operations, verifies the actual behavior of the application & provides the test results. the test engineer has to analyze those results based on the requirements.

advantages of automation testing: -
1). test scripts are reusable.
2). speed in test execution.
3). reliability.
4). consistency.

testing process: -

in general organization maintains separate functionality testing team to validate functionality factors.

information gathering (brs)
↓
development team: analysis (s/wrs) → design & coding (tdd, source code) → unit & integration testing → release initial build
testing team: analysis → test planning → test case design
↓
level 0 (sanity testing / tat / quick test) on the initial build
↓
prepare automation test scripts
↓
set test batches based on different functionalities (independent batches)
↓
select one batch & start execution
  if mismatches are found: report defects, suspend execution; on the modified build conduct retesting & regression
  if no mismatches: continue with the remaining test cases
↓
final regression test → uat


testing process: - it involves following phases.

1). test initialization: - in this phase the project manager initializes the testing process; in this approach he analyzes the following factors.

a). need analysis of the requirements.
b). functional point analysis.
c). risk analysis:
    - schedule risk
    - technology risk
    - resource risk
    - support risk
d). effort estimation.
e). prepare the test strategy.

2). test planning: - in this phase the project manager or test lead prepares the test plan documents. they contain the following information.
a). team formation & task allocation.
b). application understanding.
c). clarification.
d). training needs.

3). test case design: - after the test plan documents, test engineers prepare test cases. in this phase we perform the following operations.

a). understanding the functionality using the s/wrs & use cases.
b). preparing test scenarios.
c). preparing test data.
d). preparing test cases.
e). preparing the traceability matrix.

4). test execution: - a). sanity testing b). system testing c). re-testing d). regression testing

5). defect management: -

a). defect reporting. b). defect analysis. c). defect tracking.


6). test closure: - a). prepare final report. b). project debrief. c). warehouse of documents. d). project analysis report. e). final test report.

1). test initialization: - in general the testing process starts with test initialization. in this phase the project manager prepares the test strategy document after the s/wrs documents are frozen. to prepare the test strategy document the project manager analyzes

→ scope of testing
→ budget
→ test engineers required
→ risks in the project during testing.

2). test planning: - taking the test strategy document & the software requirements, the test lead prepares the test plan documents. they describe

→ what to test?
→ how to test?
→ who will test?
→ when to test?

the test plan document is a road map for the test engineer to conduct testing. it contains the following components.

introduction: - it specifies objective of testing & scope of project.

features to be tested: - it describes the requirements that need to be tested under the current test plan document.

features not to be tested: - it describes the requirements & modules not to be tested under the current test plan document.

pass or fail criteria: - it specifies the conditions, based on the application, under which testing is considered passed.

entry & exit criteria: - it specifies pre condition & post condition to test an application.

ex: - to conduct performance testing functionality testing should pass.

test strategy: - it describes test factors to test. ex: - correctness, authorization, access to control, data integrity, performance.


testing approach: - (how to test?) it specifies testing techniques to validate test factors.

roles & responsibility: - (who to test?)

it specifies test engineer name & allocated modules to test.

automation tools: - it specifies the automation tools available in the organization to conduct testing.

schedule: - it specifies the time to test the application functionality.

tester deliverables: - it specifies the documents the test engineer should prepare in each phase of testing.

training needs: - it lists the trainings required for the testing team during the testing phase.

ex: - lack of domain knowledge. lack of automation knowledge. lack of communication.

i/p: s/wrs, brs, test strategy
process: team formation, task allocation, risk analysis, training needs, prepare test plan docs, review of test plan docs
o/p: finalized test plan docs

after preparation of test plan document test lead will send test plan documents to all the project members like test engineers, project manager & customer.

3). test case design: - after the training sessions, test engineers prepare test cases for their allocated modules. to prepare test cases the test engineer uses the s/wrs & use cases.

before writing test cases the test engineer should identify test scenarios; based on them we prepare the test cases to validate the application.

in general a functionality test engineer prepares two types of test cases.

a). functionality test cases: - to verify the functionality factors, the test engineer prepares positive-flow test cases & negative-flow test cases.

b). gui test cases: - to validate the user-friendliness of an application we prepare gui test cases.

to prepare test cases the test engineer should analyze the following factors: -

1). analyze the initial conditions to test the application.
2). analyze the functionality factors in the given requirement.
3). analyze the dependencies of the functionality.
4). prepare test data based on the input values.
5). analyze the output of the functionality.
6). analyze the flow of the functionality.

ex: - write test cases for successfully inserting a record in the sample window.

the sample window contains a name edit box, an age list box & insert & clear buttons. the name edit box allows 4 - 16 alphanumeric lower-case characters; the age list box contains 18 - 80 numeric content.

test case1: - verify label of window as sample.

test case 2: - verify customer expected objects are available or not. name edit box, age list box, insert & clear buttons.

test case 3: - verify the insert button is disabled after opening the sample window. verify the clear button is disabled after opening the sample window.

test case 4: - verify whether the cursor is positioned in the name edit box.

test case 5: - verify tab operation in the sample window.

test case 6: - successful inserting value in the name edit box.


data matrix
bva (size)                  ecp (type)
min = 4 = pass              valid: 0 - 9, a - z
max = 16 = pass             invalid: A - Z, special characters
min - 1 = 3 = fail
max - 1 = 15 = pass
min + 1 = 5 = pass
max + 1 = 17 = fail

test case 7: - verify the age list box contains numeric content in the specified range of values (18 - 80).

test case 8: - verify the clear button is enabled after entering a value in the name edit box.

test case 9: - verify the insert button is disabled when the mandatory fields are not filled.

test case 10: - verify the insert button is enabled after filling in the values in the mandatory fields.

test case 11: - verify that when we click the insert button with invalid data in the name edit box, it invokes an error message.

test case template: -

test case name:             test case id:
module:                     written by:
version id:                 written on:
project:                    priority:


test procedure: -

step name | description | i/p required | expected results | actual results | status
(the columns up to expected results are filled during test case design; actual results & status are filled during test case execution.)

priority: - it specifies the importance of the functionality w.r.t. customer requirements. it has classes like high, medium & low.

test procedure: - it describes the step-by-step process of how we validate the functionality in terms of description, test data & expected results.

ex: - write test cases for login window.

step no | description | expectation
tc 01 | verify the label name of the login window | login
tc 02 | enter a valid user name (mindq) & verify it in the user name text box | 'mindq'
tc 03 | enter a valid password (mercury) & verify its visibility in the password box | ******* (7 stars)
tc 04 | enter a valid user name & a valid password | successful login
tc 05 | enter a valid user name & an invalid password & click ok | error message displayed
tc 06 | enter an invalid user name & a valid password & click ok | error message displayed
tc 07 | enter an invalid user name & an invalid password & click ok | error message displayed


uid | pwd | result
valid | valid | pass
invalid | valid | fail
valid | invalid | fail
invalid | invalid | fail
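a minimal automated sketch of the same decision table (the authenticate function & the credentials are stand-ins invented for illustration, not the real application).

import unittest

VALID_UID, VALID_PWD = "mindq", "mercury"    # the valid credentials used in the example above

def authenticate(uid, pwd):
    # stand-in for the real login functionality under test
    return uid == VALID_UID and pwd == VALID_PWD

class LoginDecisionTable(unittest.TestCase):
    def test_valid_uid_valid_pwd(self):
        self.assertTrue(authenticate("mindq", "mercury"))    # pass

    def test_invalid_uid_valid_pwd(self):
        self.assertFalse(authenticate("xxxxx", "mercury"))   # error message expected

    def test_valid_uid_invalid_pwd(self):
        self.assertFalse(authenticate("mindq", "wrong"))     # error message expected

    def test_invalid_uid_invalid_pwd(self):
        self.assertFalse(authenticate("xxxxx", "wrong"))     # error message expected

if __name__ == "__main__":
    unittest.main()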

ex: - write test cases for pen drive.

test case 1: - verify storage capacity of pen drive.

test case 2: - verify whether it supports multiple operating systems.

test case 3: - verify standards of company.

test case 4: - verify data transfer functionalities like copy, paste, delete etc.

test case 5: - verify whether it provides any intimation when we insert it into the system.

test case 6: - verify whether it provides any intimation while it is in the system.

test case 7: - verify case of pen drive.

test case 8: - verify color of pen drive.

test case 9: - verify size of pen drive.

ex: - write test cases for water bottle.

step no | description | expectation
1 | verify the company name of the bottle | bisleri
2 | verify the price of the bottle | rs. 12
3 | verify the expiry date on the bottle | 1/1/2008
4 | verify the mfg date on the bottle | 1/1/2007
5 | verify the purity of the water | pure
6 | verify the quantity of the water | 1 liter
7 | verify the bottle seal | sealed
8 | verify the isi symbol | should exist
9 | verify the color of the bottle | white
10 | verify the color of the cap | white
11 | verify the logo on the cap | should exist

• the date (dd-mm-yy) field must exist.

ex: - test cases for the date field.

step no | description | expectation
1 | verify the len(date) field |
2 | verify the len(date of the month, i.e. dd) |
3 | verify the len(year) |
4 | verify the len(month of the year, i.e. mm) |
5 | verify the existence of the "-" symbol 2 times, like dd-mm-yy |
6 | verify date = 00-00-00 & tab out | should not accept
7 | enter date = 00-00-01 & tab out | should not accept
8 | enter date = 00-01-01 & tab out | should not accept
9 | enter date = 01-00-00 & tab out | should not accept
10 | enter date = 01-01-00 & tab out | should accept
11 | enter date = 01-01-01 & tab out | should accept
12 | enter date = 01-01-99 & tab out | should accept
13 | enter date = 01-val(>12)-99 & tab out | should not accept
14 | enter date = val(>31)-val(>12)-99 & tab out | should not accept
15 | enter date = val(between[01,31])-val(between[01,12])-val(between[01,99]) & tab out | should accept
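a small sketch of a dd-mm-yy check matching the expectations above (assumed rules: two-digit fields separated by "-", day 01 - 31, month 01 - 12, any two-digit year).

import re

def accepts_date(value):
    # expected format dd-mm-yy with exactly two "-" separators
    m = re.fullmatch(r"(\d{2})-(\d{2})-(\d{2})", value)
    if not m:
        return False
    dd, mm, yy = (int(g) for g in m.groups())
    # day and month must be non-zero and within range; any two-digit year is allowed
    return 1 <= dd <= 31 and 1 <= mm <= 12

for value in ["00-00-00", "01-01-00", "01-13-99", "32-13-99", "15-06-07"]:
    print(value, "accept" if accepts_date(value) else "not accept")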

ex: - write test cases for a mobile phone.

step no | description | expectation
1 | verify the company name on the mobile | nokia
2 | verify the model | 6600
3 | verify the color | silver
4 | verify the speaker on the phone | should exist on the top
5 | verify the display unit color | white
6 | verify the up & down arrows | should exist (one each)
7 | verify the key pad | should exist
8 | verify the numbers on the keys | between (0, 9)
9 | verify the alphabets on the keys | between (a, z)
10 | verify the special symbols on the keys | +, *, ...
11 | verify the power on & off button | should exist
12 | verify the battery placement | should exist
13 | verify the sim card placement | should exist
14 | verify the recharging connection socket | should exist
15 | verify the fm connection socket | should exist
16 | press the power-on button & verify the display unit | light should glow & "nokia welcomes you"
17 | now place a call with a valid number | dial tone should exist


traceability matrix

requirement | functionality / test scenario | test case type | test case ids
mail | login | functionality | tc 01, tc 02, tc 03
mail | compose | functionality | tc 04, tc 05, tc 06
mail | attachment | functionality | tc 07, tc 08, tc 09
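a tiny sketch of keeping the same matrix in code (the mapping & names are illustrative) so that the test cases covering a requirement can be looked up mechanically.

# requirement -> scenario -> test case ids (hypothetical mapping for a mail application)
traceability = {
    "mail": {
        "login":      ["tc01", "tc02", "tc03"],
        "compose":    ["tc04", "tc05", "tc06"],
        "attachment": ["tc07", "tc08", "tc09"],
    }
}

def cases_for(requirement):
    # flatten every scenario's test cases under the given requirement
    return [tc for cases in traceability.get(requirement, {}).values() for tc in cases]

print(cases_for("mail"))   # ['tc01', 'tc02', ..., 'tc09']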

after analyzing the test cases, if the test lead feels the written test cases are sufficient, he gives permission to validate the application w.r.t. the test cases.

4). test execution: - it is the process of validating the application functionality w.r.t. the test cases.

development team → build → testing team → sanity test / tat / bvt → pass → comprehensive test (system testing) → pass → uat → test closure
mismatches found during the comprehensive test → defect report → modified version → re-testing & regression testing

after receiving the application from the development team, test engineers verify the basic functionality of the application against the prepared checklist to check its stability.

during sanity testing, if the application fails we reject it back to the development team; after receiving the modified version we conduct sanity testing again. if it passes, we conduct comprehensive testing or system testing.

during comprehensive testing, if the test engineer finds any mismatches w.r.t. the test cases, we send the defects to the developer.

after receiving the modified version we conduct re-testing & regression testing after a sanity test. then we continue the remaining comprehensive testing.


if the test passes, we conduct a user acceptance test (uat) to collect feedback from the end user. when the end user gives the right feedback, the test lead analyzes the test report & then closes the testing process.

receiving application by test engineers: -

configuration management server (vss)
← check-in process: developers d1, d2, d3 release versions (v1.0, v1.1, ...) into vss (development environment, non-distributed)
→ check-out process: test engineers t1, t2, t3 receive the build over ftp into their local servers (testing environment, distributed)

after unit & integration testing, the development team prepares the set of programs, releases the application into the configuration management tool & gives an intimation to all project members through mail.

from the configuration management tool, test engineers receive the application into their local servers using the file transfer protocol (ftp), & then they conduct test execution.

a). copying a file into vss (visual sourcesafe) is called the check-in process.
b). receiving a file from vss is called the check-out process.

5). defect management: - after finding defects, the test engineer sends them to the test lead, who verifies the defects & then intimates the project manager. the project manager allocates that work to the team lead.


defect submission process: -

test engineer → (defect report) → test lead → project manager → team lead → developer; the defect is tracked in a bug tracking report at each step.

defect template: -

summary                          reproducible (yes / no)
defect id                        priority (high, medium, low)
detected by                      severity (high, medium, low)
detected on                      status (new / open / closed / rejected)
version                          module
project

test procedure: -

step name | description | i/p required | expected results | actual results | status

approved by: -                   resolved on: -
assigned to: -                   resolution type: -
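a small sketch of the same template as a data structure (the field names mirror the template; this is an illustration, not a real defect-tracking tool's schema).

from dataclasses import dataclass, field

@dataclass
class Defect:
    defect_id: str
    summary: str
    module: str
    detected_by: str
    detected_on: str
    priority: str = "medium"        # critical / high / medium / low
    severity: str = "medium"        # high / medium / low
    status: str = "new"             # new / open / closed / rejected
    reproducible: bool = True
    steps: list = field(default_factory=list)   # step-by-step reproduction procedure

bug = Defect("d-101", "ok button stays enabled for an invalid uid",
             "login", "t1", "18-12-2007",
             priority="high", severity="high",
             steps=["open login window", "enter an invalid uid", "observe the ok button"])
print(bug.status, bug.priority)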

reproducible defect: - when a defect can be raised at any time in any environment, it is called a reproducible defect.

priority: - it describes the importance of the functionality w.r.t. customer requirements. it has classes like critical, high, medium & low.

severity: - it describes the impact of the defect, or its seriousness, with respect to executing the remaining test cases. it has classes like high, medium & low.

resolution type: - this field is filled in by the developers after conducting the defect review; based on the resolution type the test engineers know whether the developer accepted the defect or not.

ex: - accepted, rejected, enhancement, deferred or hold.

status: - it indicates defect status.

bug life cycle: -

detect defect → check it is reproducible → defect report → new (test engineer) → open (developer) → defect resolved → re-testing → pass → closed (test engineer)
if re-testing fails → re-open → open → ...

statuses set by the test engineer: new, closed, re-open.
statuses set by the developer: open, rejected, deferred.
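a toy sketch of the allowed status movements in this life cycle (the statuses come from the notes; the exact transition map is an illustrative assumption).

# who can move a defect to which status next
transitions = {
    "new":      ["open", "rejected", "deferred"],   # developer reviews the reported defect
    "open":     ["closed", "re-open"],              # test engineer re-tests the fix: pass or fail
    "re-open":  ["open"],                           # developer takes the defect up again
    "deferred": ["open"],
    "closed":   [],
    "rejected": [],
}

def move(status, new_status):
    if new_status not in transitions[status]:
        raise ValueError(f"a defect cannot move from {status} to {new_status}")
    return new_status

status = "new"
for nxt in ["open", "re-open", "open", "closed"]:
    status = move(status, nxt)
print(status)   # closed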

latest defect: - defects identified at later stages are called latest defects.

defect age: - the time gap between when a defect is identified & when it is resolved.

test harness: - it is the combination of the test environment + the test documents.

test environment: the software application & customer-expected configuration systems.
test documents: test cases.

6). test closure: - after conducting the acceptance test, the test lead analyzes the following factors to stop the testing process.

a). all the test case execution is completed or not.
b). defects are resolved or not.
c). right feedback from the end user.
d). the defect ratio is less than the acceptance level.
e). passed out testing team.
f). time factor.

some important questions on manual testing: -

1). explain the need for software testing?
2). explain the difference between developer testing & test engineer testing?
3). explain the difference between verification & validation?
4). which process does your organization follow to develop an application?
5). explain the v-model? what is the meeting point in the v-model?
6). when do we prefer the spiral model to develop an application?
7). explain the testing methodologies?
8). explain cyclomatic complexity?
9). explain the difference between a stub & a driver?
10). who will conduct integration testing?
11). explain the difference between system testing & functionality testing?
12). explain compatibility testing?
13). explain performance testing?
14). explain inter-system testing?
15). explain the difference between smoke testing & sanity testing?
16). explain the difference between re-testing & regression testing?
17). do you prefer manual testing or automation testing?
18). explain the testing process in your organization?
19). did you participate in test plan preparation?
20). how will you prepare test cases?
21). explain the importance of priority in test case documents?
22). explain the test execution process?
23). how much time will you take to write one test case?
24). how many test cases can you execute in a day?
25). explain the bug life cycle? explain the difference between severity & priority?
26). how will you know the written test cases are sufficient to validate the application functionality?
27). how will you prepare test data to validate input objects?
28). explain the defect statuses?


29). how will you know whether the developer accepted your defects or not?
30). explain the acceptance test?
31). explain when we stop testing?
32). explain the scope of your project?
33). in which module did you conduct testing?
34). explain the defects you found in that module?
35). if a defect is not reproducible, then what information will you provide to the developer?
36). if the developer does not accept your defect, then what will you do?