Verification

Verification is the process of evaluating the products of a development phase to find out whether they meet the specified requirements. The objective of Verification is to make sure that the product being developed is as per the requirements and design specifications.

Validation

Validation is the process of evaluating software at the end of the development process to determine whether the software meets the customer's expectations and requirements. The objective of Validation is to make sure that the product actually meets the user's requirements, and to check whether the specifications were correct in the first place. The following activities are involved in Validation: testing such as black box testing, white box testing, gray box testing, etc.

Verification vs Validation:
- Verification is carried out by the QA team to check whether the implemented software is as per the specification documents; Validation is carried out by the testing team.
- Execution of code does not come under Verification; execution of code comes under Validation.
- The Verification process explains whether the outputs are according to the inputs or not; the Validation process describes whether the software is accepted by the user or not.
- Verification is carried out before Validation; Validation is carried out just after Verification.
- The following items are evaluated during Verification: plans, requirement specifications, design specifications, code, test cases, etc. The item evaluated during Validation is the actual product or software under test.
- The cost of errors caught in Verification is less than the cost of errors found in Validation.
- Verification is basically manual checking of documents and files such as the requirement specifications; Validation is basically checking of the developed program against the requirement specification documents and files.
Regression Testing
Regression testing is a type of software testing that intends to ensure that changes, like defect fixes or enhancements to the module or application, have not affected the unchanged parts.
Retesting
Retesting is done to make sure that the test cases which failed in the last execution pass after the defects behind those failures are fixed.
Regression Testing vs Retesting:
- Regression testing is not carried out for specific defect fixes; it is planned as specific-area or full regression testing. Retesting is carried out based on the defect fixes.
- In Regression testing you include the test cases which passed earlier; we can say that you check the functionality which was working earlier. In Retesting you include the test cases which failed earlier; you check the functionality which failed in an earlier build.
- Regression test cases are derived from the functional specification, the user manuals, user tutorials, and defect reports relating to corrected problems. Test cases for Retesting cannot be prepared before testing starts; in Retesting you only re-execute the test cases that failed in the prior execution.
- Automation is the key for regression testing: manual regression testing tends to get more expensive with each new release, so regression testing is the right time to start automating test cases. You cannot automate the test cases for Retesting.
- Defect verification does not come under Regression testing; defect verification comes under Retesting.
- The priority of Retesting is higher than that of Regression testing, so it is carried out before regression testing. Based on the availability of resources, Regression testing can be carried out in parallel with Retesting.
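The points above can be sketched as a tiny automated suite using Python's built-in unittest module. The `slugify` function, its behaviour, and the defect number are hypothetical stand-ins for the module under test.

```python
import unittest


def slugify(title):
    """Hypothetical code under test: turn a page title into a URL slug."""
    return "-".join(title.lower().split())


class SlugifyRegressionSuite(unittest.TestCase):
    # Regression tests: cases that passed in earlier builds, re-run on
    # every release to confirm that new changes broke nothing.
    def test_basic_title(self):
        self.assertEqual(slugify("Hello World"), "hello-world")

    def test_already_lowercase(self):
        self.assertEqual(slugify("already lower"), "already-lower")

    # Retest: a case that failed against a reported (made-up) defect and
    # is re-executed after the fix (collapsing repeated spaces).
    def test_defect_1234_repeated_spaces(self):
        self.assertEqual(slugify("Hello   World"), "hello-world")
```

Run with `python -m unittest <file>`; the retest then executes alongside the regression cases, matching the note that retesting can run in parallel with regression testing when resources allow.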
Priority:
Priority means how fast a defect has to be fixed; it is related to scheduling the resolution of the problem. Priority is largely related to the business or marketing aspect: it is a pointer towards the importance of the bug, and the priority status is set based on the customer requirements. Priority means how urgently the issue should be fixed. The product manager decides the priority for fixing a bug, and the product fixes are done based on project priorities. The Priority status is set by the tester to the developer, mentioning the time frame to fix a defect; if High priority is mentioned, then the developer has to fix it at the earliest.
Severity:
Severity is totally related to the quality standard, or adherence to the standard. Severity means how severely a defect is affecting the functionality; it is associated with standards. Severity is related to the technical aspect of the product and reflects how bad the bug is for the system. The severity type is defined by the tester based on the written test cases and functionality; it expresses how much of the product's functionality is affected.
The Test Engineer can decide the severity level of the bug, and the product fixes are done based on bug severity. We can also say that the Severity status is used to explain how badly the deviation is affecting the build.
High Priority & Low Severity:
On the home page of the company's web site, a spelling mistake in the name of the company is surely a High Priority issue. In terms of functionality it is not breaking anything, so we can mark it as Low Severity, but it makes a bad impact on the reputation of the company's site. So it is of the highest priority to fix this.
Low Priority & Low Severity:
1. A spelling mistake in a confirmation message: for example, "You have registered success" is shown instead of "successfully"; only "success" is written.
2. The developer missed removing a cryptic debug shortcut used while developing the application; if you press a certain key combination for 1 minute, the debug information appears (funny, na?).
User Acceptance Testing (UAT) is the software testing process where the system is tested for acceptability; it validates the end-to-end business flow. This type of testing is executed by the client in a separate environment (similar to the production environment) and confirms whether the system meets the requirements as per the requirement specification or not. Acceptance tests are "black box" tests, meaning UAT users are not aware of the internal structure of the code; they just specify the input to the system and check whether the system responds with the correct result.
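A black-box acceptance check can be sketched as below: the tester only supplies inputs and inspects outputs, with no knowledge of the implementation. The `register_user` function and its rules are hypothetical stand-ins for the system under test.

```python
def register_user(username, password):
    """Hypothetical system under test: returns a confirmation message."""
    if not username or len(password) < 8:
        return "Registration failed"
    return "You have registered successfully"


# Black-box checks: only the documented input -> expected output
# behaviour is used, never the internal structure of the code.
assert register_user("alice", "s3cretpass") == "You have registered successfully"
assert register_user("", "s3cretpass") == "Registration failed"
assert register_user("bob", "short") == "Registration failed"
print("acceptance checks passed")
```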
Entry criteria for UAT:
- The business requirements should be available.
- The development of the software application should be completed, and the different levels of testing, like Unit Testing, Integration Testing and System Testing, should be completed.
- All High Severity, High Priority defects should be verified; there should be no showstopper defects in the system.
- Check that all reported defects have been verified prior to the start of UAT.
- Check that the traceability matrix for all testing is completed.
- Before UAT starts, errors like cosmetic errors are acceptable, but they should be reported.
- After fixing all the defects, regression testing should be carried out to check that the defect fixes have not broken other working areas.
- A separate UAT environment, similar to production, should be ready before UAT starts.
- Sign-off should be given by the system testing team, stating that the software application is ready for UAT execution.
Check:
- if the load time of the website is realistic.
- if adequate text-to-background contrast is present.
- if the font size and spacing between the text is properly readable.
- if the website has its 404 page or any custom-designed Not Found page.
- if appropriate ALT tags are added for images.
- if the user effortlessly recognizes the website navigation.
- if the navigation options are understandable and short.
- if the number of buttons/links is reasonable.
- if the company logo is linked to the home page.
- if the style of links is consistent on all pages and easy to understand.
- if site search is present on the page and easy to access.
- if URLs are meaningful and user-friendly.
- if HTML page titles are explanatory.
- if critical content is above the fold.
- if emphasis (bold, etc.) is used sparingly.
- if the main copy is concise and explanatory.
- if major headings are clear and descriptive.
- if styles and colours are consistent.
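Some checklist items can be automated. As one hedged example, the "appropriate ALT tags are added for images" check can be sketched with Python's standard-library HTML parser; the page snippet here is a made-up example.

```python
from html.parser import HTMLParser


class AltTagChecker(HTMLParser):
    """Collects <img> tags that are missing a non-empty alt attribute."""

    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):
                self.missing_alt.append(attrs.get("src", "<no src>"))


page = """
<html><body>
  <img src="logo.png" alt="Company logo">
  <img src="banner.jpg">
  <img src="spacer.gif" alt="">
</body></html>
"""

checker = AltTagChecker()
checker.feed(page)
print("images missing alt text:", checker.missing_alt)
```

Here the checker flags both the image with no `alt` attribute and the one with an empty `alt`; whether an empty `alt` is acceptable (e.g. for decorative spacers) is a policy choice for the tester.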
Usability testing finds important bugs and potholes in the tested application which would not be visible to the developer. Using the correct resources, usability testing can assist in fixing all the problems that users would face before the application releases. Usability testing can be modified according to the requirements to support other types of testing, such as functional testing, system integration testing, unit testing, smoke testing, etc. Planned usability testing becomes very economical, highly successful and beneficial. Issues and potential problems are highlighted before the product is launched.
14. Severity (ranges from 1 to 5)
15. Status
16. Bug ID
17. Attachment
18. Test Case Failed (the test case that failed for the bug)
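The bug-report fields above can be sketched as a small data structure. The field names, defaults and allowed values below are illustrative assumptions, not the schema of any real bug tracker.

```python
from dataclasses import dataclass, field


@dataclass
class BugReport:
    bug_id: str
    severity: int               # ranges from 1 (cosmetic) to 5 (showstopper)
    priority: str               # e.g. "High", "Medium", "Low"
    status: str = "New"         # e.g. New, Assigned, Fixed, Retested, Closed
    failed_test_case: str = ""  # test case that failed for the bug
    attachments: list = field(default_factory=list)


# Example report for the earlier High Priority / Low Severity scenario.
bug = BugReport(
    bug_id="BUG-1234",
    severity=2,
    priority="High",
    failed_test_case="TC-17: company name spelling on home page",
)
bug.attachments.append("homepage_screenshot.png")
print(bug.status, bug.severity, bug.priority)
```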
What is the difference between Software Testing and Quality Assurance (QA)?
Software Testing involves the operation of a system or application under controlled conditions and evaluating the results. It is oriented to 'detection'. Quality Assurance (QA) involves the entire software development PROCESS: monitoring and improving the process, making sure that any agreed-upon standards and procedures are followed, and ensuring that problems are found and dealt with. It is oriented to 'prevention'.
V-Model

In the V-Model software development life cycle, the development and testing activities both start from the same information (the requirement specification document). Based on the requirement document, the developer team starts working on the design and, after completion of the design, on the actual implementation, while the testing team starts working on test planning, test case writing and test scripting. Both activities run in parallel with each other. The Waterfall model and the V-model are quite similar to each other. As it is the most popular software testing life cycle model, most organizations follow this model. The V-model is also called the Verification and Validation model. A testing activity is performed in each phase of the software testing life cycle. In the first half of the model, verification activities are integrated into each phase, such as reviews of the user requirements and the system design document, and in the second half the validation testing activities come into the picture. A typical V-model shows the software development activities on the left-hand side of the model, and on the right-hand side the actual testing phases that are performed. In this process, the "Do-Procedure" is followed by the developer team and the "Check-Procedure" is followed by the testing team, to meet the stated requirements. Different steps are followed in the V-model software development life cycle; however, here we will take the most common type of V-model as an example. The V-model typically consists of the following phases:
1. Unit Testing: preparation of unit test cases
2. Integration Testing: preparation of integration test cases
3. System Testing: preparation of system test cases
4. Acceptance Testing: preparation of acceptance test cases
Requirement Analysis:

Entry Criteria:
- The following documents should be available: the Requirements Specification, and the application architectural document (if applicable).
- Along with the above documents, the acceptance criteria should be well defined.

Activities:
- Prepare the list of questions or queries and get them resolved by the Business Analyst, System Architect, Client, Technical Manager/Lead, etc.
- Make out the list of all the types of tests to be performed, like Functional, Security, Performance, etc.
- Define the testing focus and priorities.
- List down the test environment details where the testing activities will be carried out.
- Check out the automation feasibility if required, and prepare the automation feasibility report.

Deliverables:
- List of questions with all answers resolved from the business, i.e. testable requirements.
- Automation feasibility report (if applicable).
Test Planning:

Entry Criteria:
- Requirements documents (updated versions of unclear or missing requirements).
- Automation feasibility report.

Activities:
- Define the objective and scope of the project.
- List down the testing types involved in the STLC.
- Test effort estimation and resource planning.
- Selection of a testing tool, if required.
- Define the testing process overview.
- Define the test environment required for the entire project.
- Prepare the test schedules.
- Define the control procedures.
- Determine roles and responsibilities.
- List down the testing deliverables.
- Define the entry criteria, suspension criteria, resumption criteria and exit criteria.
- Define the risks involved, if any.

Deliverables:
- Test Plan or Test Strategy document.
- Testing estimation document.
- Pre-requisite test data preparation for executing the test cases.
- Test automation scripts (if required).
Test Environment Setup:

Entry Criteria:
- Smoke test cases are available.
- Test data is available.

Activities:
- Analyze the requirements and prepare the list of software and hardware required to set up the test environment.
- Set up the test environment.
- Once the test environment is set up, execute the smoke test cases to check the readiness of the test environment.

Deliverables:
- Test environment, ready with test data.
- Results of the smoke test cases.
Test Execution:

Entry Criteria:
- Test Plan or Test Strategy document.
- Test cases.
- Test data.

Activities:
- Based on the test planning, execute the test cases.
- Mark the status of test cases, like Passed, Failed, Blocked, Not Run, etc.
- Assign a Bug ID to all Failed and Blocked test cases.
- Do retesting once the defects are fixed.
- Track the defects to closure.

Deliverables:
- Test case execution report.
- Defect report.
Test Closure:

Entry Criteria:
- Test case execution is completed.

Activities:
- Evaluate cycle completion criteria based on test coverage, quality, cost, time, critical business objectives, and the software.
- Prepare test metrics based on the above parameters.
- Prepare the test closure report.
- Share best practices for any similar projects in the future.

Deliverables:
- Test closure report.
- Test case execution report.
- Test metrics.
- Defect report.
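The "prepare test metrics" activity can be sketched as a small script over execution results. The test-case IDs, statuses, and the choice of metric below are invented for illustration.

```python
from collections import Counter

# Made-up execution results: (test case ID, status).
execution_results = [
    ("TC-01", "Passed"), ("TC-02", "Passed"), ("TC-03", "Failed"),
    ("TC-04", "Blocked"), ("TC-05", "Passed"), ("TC-06", "Not Run"),
]

# Count how many cases landed in each status.
counts = Counter(status for _, status in execution_results)

# Pass rate over the cases that actually ran (Passed + Failed).
executed = counts["Passed"] + counts["Failed"]
pass_rate = counts["Passed"] / executed * 100

print("status counts:", dict(counts))
print(f"pass rate over executed cases: {pass_rate:.1f}%")
```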
of the software. Failures may also arise because of human error in interacting with the software.
3. Cost of defects.
4. Rigorous testing is necessary during development and maintenance to identify defects, in order to reduce failures in the operational environment and increase the quality of the operational system.
5. We may also be required to carry out software testing to meet contractual or legal requirements, or industry-specific standards.
6. Testing helps to measure the quality of software in terms of the number of defects found, the tests run, and the system coverage achieved by the tests.
7. When testing finds defects and they are fixed, the quality of the software system increases.
8. The ISTQB glossary definition of quality covers not just the specified requirements but also user and customer needs and expectations.
Viewpoint on quality and how the software would be judged against it:

- Quality is fitness for use; quality can have subjective aspects, not just quantitative aspects. → We will ask the users whether they can carry out their tasks; if they are satisfied that they can, we will release the software.
- Quality is based on good manufacturing processes, and on meeting defined requirements. It is measured by testing, inspection, and analysis of faults and failures. → We will use a recognized software development process. We will only release the software if there are fewer than five outstanding high-priority defects once the planned tests are complete.
- Quality is measured by attributes of the product. → We will measure the attributes of the software, e.g. its reliability in terms of mean time between failures (MTBF), and release when they reach a specified level, e.g. an MTBF of 12 hours.
- Expectation of value for money, affordability, and a value-based trade-off between time, effort and cost aspects. We can afford to buy this software and we expect a return on investment. → We have time-boxed the testing to two weeks to stay within the project budget.
- Transcendent feelings: this is about the feelings of an individual or group of individuals towards a product or a supplier. → "We like this software! It is fun and it's the latest thing! So what if it has a few small problems? We want to use it anyway..."
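The MTBF metric mentioned in the product-attribute viewpoint can be computed as the average gap between consecutive failures. The failure timestamps below are invented sample data.

```python
from datetime import datetime

# Made-up log of failure timestamps for the system under observation.
failures = [
    datetime(2024, 1, 1, 0, 0),
    datetime(2024, 1, 1, 10, 0),
    datetime(2024, 1, 2, 2, 0),
    datetime(2024, 1, 2, 12, 0),
]

# MTBF = mean gap between consecutive failures, expressed in hours.
gaps = [
    (later - earlier).total_seconds() / 3600
    for earlier, later in zip(failures, failures[1:])
]
mtbf_hours = sum(gaps) / len(gaps)
print(f"MTBF: {mtbf_hours:.1f} hours")
```

With these sample timestamps the gaps are 10, 16 and 10 hours, giving an MTBF of 12.0 hours; a release rule like the one above would compare this figure against the specified threshold.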
11. The more rigorous our testing) the more defects we'll find.