
Software Testing Concepts

There are two ways of testing:

1. MANUAL TESTING.
2. AUTOMATION TESTING.
1). MANUAL TESTING

It is a process in which all the phases of the Software Testing Life Cycle, like Test Planning, Test
Development, Test Execution, Result Analysis, Bug Tracking and Reporting, are accomplished
manually with human effort.

Drawbacks of Manual Testing

1. More people are required.
2. Time consuming.
3. Less accuracy.
4. Human fatigue.
5. Simultaneous actions are almost impossible.
6. Repeating a task in exactly the same manner is not easy manually.

2). AUTOMATION TESTING


It is a process in which all the drawbacks of manual testing are addressed (overcome) properly and
which provides speed and accuracy to the existing testing phase.
Note:
Automation testing is not a replacement for manual testing; it is a continuation of manual testing
in order to provide speed and accuracy.

Drawbacks of Automation Testing


1. High cost.
2. Cannot automate all the areas.
3. Lack of expertise.

AUTOMATION TOOL
An automation tool is an assistant to the test engineer, which works based on the instructions and
information given to it.
Definitions
Project: It is something developed based on a particular customer requirement and used by that
particular customer only.

Product: Product is something that is developed based on the company’s specifications and used by
multiple customers.

Quality: Quality is defined as not only the fulfillment of the requirements but also the presence of
value (user friendliness).

Defect: Defect is defined as deviation from the requirements.

Testing: Testing is a process in which defects are identified, isolated (separated), subjected
(sent) for rectification, and it is ensured that the product is defect free in order to produce a quality
product in the end, and hence customer satisfaction.
(Or)

Testing is the process of executing a program with the intent of finding errors.
(Or)
Verifying and validating the application with respect to customer requirements.
(Or)
Finding the differences between customer expected and actual values.
(Or)
Testing should also ensure that a quality product is delivered to the customer.

Process of Developing Project in the Software Company.

BIDDING FOR THE PROJECT: Bidding for the project is defined as request for proposal, estimation and sign-off.

KICK-OFF MEETING:
It is an initial meeting conducted in the software company soon after the project is signed off, in order to discuss
the overview of the project and to select a project manager for the project.

Usually High Level Manager, Project Manager, Technical Manager, Quality Managers, Test leads and Project
leads will be involved in this meeting.

PIN (Project Initiation Note)


PIN is a mail prepared by the project manager and sent to the CEO of the software company in order to get
permission to start the project development.
SDLC (Software Development Life Cycle)

It contains 6 phases.

 INITIAL PHASE / REQUIREMENT PHASE.


 ANALYSIS PHASE.
 DESIGN PHASE.
 CODING PHASE.
 TESTING PHASE.
 DELIVERY AND MAINTENANCE PHASE.
Initial Phase

Task Interacting with the customer and gathering the requirements.


Roles BA (Business Analyst)
EM (Engagement Manager)
Process
First of all, the business analyst will take an appointment with the customer, collect the templates
from the company, meet the customer on the appointed date, gather the requirements with the
support of the templates, and come back to the company with a requirements document. Then the
engagement manager will check for extra requirements; if at all he finds any extra requirements,
he is responsible for the excess cost of the project. The engagement manager is also responsible for
prototype demonstration in case of confusing requirements.

Template
It is defined as a pre-defined format with pre-defined fields used for preparing a document
perfectly.

Prototype
It is a rough and rapidly developed model used for demonstrating to the client in order to gather
clear requirements and to win the confidence of the customer.

Proof
The proof of this phase is the requirements document, which is also known by the following names:
FRS - (Functional Requirement Specification)
BRS - (Business Requirement Specification)
CRS - (Client/Customer Requirement Specification)
URS - (User Requirement Specification)
BDD - (Business Design Document)
BD - (Business Document)
Note
Some companies may maintain the overall information in one document called the ‘BRS’ and the detailed
information in another document called the ‘FRS’. But most companies will maintain both kinds of
information in a single document.
Analysis Phase

Task Feasibility study.


Tentative planning.
Technology selection.
Requirement analysis.

Roles System Analyst (SA)


Project Manager (PM)
Team Manager (TM)
Process
(I) Feasibility study: It is a detailed study of the requirements in order to check whether all the
requirements are possible or not.
(II) Tentative planning: The resource planning and time planning are done tentatively in this section.
(III) Technology selection: The list of all the technologies that are to be used to accomplish the
project successfully will be analyzed and listed out here in this section.
(IV) Requirement analysis: The list of all the requirements, like human resources, hardware and
software, required to accomplish the project successfully will be clearly analyzed and listed out
here in this section.

Proof The proof of this phase is SRS (Software Requirement Specification).

Design phase

Tasks HLD (High Level Designing) LLD (Low Level Designing)

Roles HLD is done by the CA (Chief Architect). LLD is done by the TL (Technical Lead).
Process
The chief architect will divide the whole project into modules by drawing some diagrams, and the
technical lead will divide each module into sub-modules by drawing some diagrams using UML
(Unified Modeling Language).
The technical lead will also prepare the pseudo code.

Proof The proof of this phase is TDD (Technical Design Document).


Pseudo Code It is a set of English instructions used for guiding the developer to develop the
actual code easily.
Module
Module is defined as a group of related functionalities to perform a major task.
Coding Phase
Task Programming / Coding.
Roles Developers / Programmers.
Process
Developers will develop the actual source code by using the pseudo code and following the
coding standards like proper indentation, color coding, proper commenting, etc.

Proof The proof of this phase is SCD (Source Code Document).


Testing Phase
Task Testing.
Roles Test Engineer.
Process

 First of all the Test Engineer will receive the requirement documents and review them in order to understand
the requirements.

 If at all they get any doubts while understanding the requirements, they will prepare a Review Report
(RR) with the list of all the doubts.

 Once the clarifications are given and after understanding the requirements clearly they will take the test
case template and write the test cases.

 Once the build is released they will execute the test cases.

 After execution, if at all they find any defects, they will list them out in a defect profile document.

 Then they will send defect profile to the developers and wait for the next build.

 Once the next build is released they will once again execute the test cases.

 If they find any defects they will follow the above procedure again and again till the product is defect
free.

 Once they feel product is defect free they will stop the process.

Proof The proof of this phase is Quality Product.

Test case Test case is an idea of a Test Engineer based on the requirement to test a particular feature.

Delivery and Maintenance phase


Delivery
Task Installing application in the client environment.
Roles Senior Test Engineers / Deployment Engineer.

Process
The senior test engineers or deployment engineers will go to the client place and install the
application into the client environment with the help of the guidelines provided in the deployment
document.
Maintenance
After the delivery, if at all any problem occurs, then that problem becomes a task, and based on the
problem the corresponding role will be appointed. Based on the problem, the role will define the
process and solve the problem.
Where exactly testing comes in to picture?
How many sorts of testing are there?
There are two sorts of testing.
1. Unconventional testing 2. Conventional testing
Unconventional Testing
It is a sort of testing in which the quality assurance people will check each and every outcome document, right
from the initial phase of the SDLC.

Conventional Testing
It is a sort of testing in which the test engineer will test the application in the testing phase of SDLC.
TESTING METHODOLOGY (OR) TESTING TECHNIQUES

There are 3 methods:

 Black Box Testing.


 White Box Testing.
 Gray Box Testing
1 Black Box Testing
It is a method of testing in which one will perform testing only on the functional part of an application without
having any structural knowledge. Usually test engineers perform it.

2 White box Testing (Or) Glass box Testing (Or) Clear box Testing
It is a method of testing in which one will perform testing on the structural part of an application. Usually
developers or white box testers perform it.

3 Gray box Testing


It is a method of testing in which one will perform testing on both the functional part as well as the structural
part of an application.
Note:
The Test engineer with structural Knowledge will perform gray box testing.

LEVELS OF TESTING

There are 5 levels of testing.

1) Unit level testing


2) Module level testing
3) Integration level testing
4) System level testing
5) User acceptance level testing
1) Unit level testing

If one performs testing on a unit, then that level of testing is known as unit level testing. It is white box testing;
usually developers perform it.
Unit: It is defined as the smallest part of an application.
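The idea above can be illustrated with a short sketch (not from the original notes): here the "unit" is a single hypothetical function, `simple_interest`, and the unit-level test exercises it in isolation, the way a developer would during white box testing.

```python
# Hypothetical "unit": the smallest testable part of an application,
# here a single function that computes simple interest.
def simple_interest(principal, rate, years):
    if principal < 0 or rate < 0 or years < 0:
        raise ValueError("inputs must be non-negative")
    return principal * rate * years / 100

# A unit-level test exercises this one unit in isolation.
def test_simple_interest():
    assert simple_interest(1000, 5, 2) == 100.0
    try:
        simple_interest(-1, 5, 2)
        assert False, "expected ValueError for negative input"
    except ValueError:
        pass

test_simple_interest()
```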

2) Module level testing

If one performs testing on a module, that is known as module level testing. It is black box testing; usually test
engineers perform it.
3) Integration level testing

Once the modules are developed, the developers will develop some interfaces and integrate the modules with the
help of those interfaces. While integrating, they will check whether the interfaces are working fine or not. It is
white box testing and usually developers or white box testers perform it.

The developers will be integrating the modules in any one of the following approaches.
i) Top Down Approach (TDA)
In this approach the parent modules are developed first and then integrated with child modules.
ii) Bottom up Approach (BUA)
In this approach the child modules are developed first and then integrated with the corresponding parent
modules.
iii) Hybrid Approach
This approach is a mixed approach of both Top down and Bottom up approaches.
iv) Big bang Approach
Once all the modules are ready at a time integrating them finally is known as big bang approach.
STUB
While integrating the modules in top down approach if at all any mandatory module is missing then that module
is replaced with a temporary program known as STUB.

DRIVER
While integrating the modules in bottom up approach if at all any mandatory module is missing then that
module is replaced with a temporary program known as DRIVER.
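A minimal sketch of stubs and drivers, using hypothetical billing and tax modules (the names `billing_parent`, `tax_stub`, `real_tax` and `driver` are illustrative, not from the notes):

```python
# Top-down integration: the parent module is ready but the child
# "tax" module is missing, so a STUB (temporary program) stands in.
def tax_stub(amount):
    return 0.0  # canned value instead of real tax logic

def billing_parent(amount, tax_module=tax_stub):
    return amount + tax_module(amount)

# Bottom-up integration: the child module is ready but the parent
# is missing, so a DRIVER (temporary program) calls the child.
def real_tax(amount):
    return amount / 10  # assumed 10% tax

def driver():
    # Plays the parent's role just to exercise the child module.
    return real_tax(200.0)
```

Both are throwaway programs: once the real module arrives, the stub or driver is discarded.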
4) System level testing

Once the application is deployed into the environment, if one performs testing on the system, it is known as
system level testing. It is black box testing and usually done by the test engineers.

At this level of testing so many types of testing are done.

Some of those are

 System Integration Testing


 Load Testing
 Performance Testing
 Stress Testing etc….
5) User acceptance testing.

The same system testing done in the presence of the user is known as user acceptance testing. It is black box
testing, usually done by the test engineers.

ENVIRONMENT

Environment is a combination of 3 layers.


Presentation Layer.
Business Layer.
Database Layer.

Types of Environment
There are 4 types of environments.

1. Stand-alone Environment / One-tier Architecture.
2. Client-Server Environment / Two-tier Architecture.
3. Web Environment / Three-tier Architecture.
4. Distributed Environment / N-tier Architecture.

1) Stand-alone environment (Or) One-Tier Architecture

This environment contains all the three layers, that is, the Presentation layer, Business layer and Database layer,
in a single tier.

Client-Server Environment (Or) Two-Tier Architecture

In this environment two tiers will be there: one tier is for the client and the other tier is for the database server.
The presentation layer and business layer will be present in each and every client, and the database will be
present in the database server.
Web Environment

In this Environment three tiers will be there client resides in one tier, application server resides in middle tier
and database server resides in the last tier. Every client will have the presentation layer, application server will
have the business layer and database server will have the database layer.

Distributed Environment

It is the same as the web environment, but the business logic is distributed among application servers in order to
distribute the load.
Web Server: It is software that provides web services to the client.
Application Server: It is a server that holds the business logic.
Ex: Tomcat, WebLogic, WebSphere, etc.

SOFTWARE DEVELOPMENT MODELS

There are 6 models.

 Waterfall Model (or) Sequential Model


 Prototype Model
 Evolutionary Model
 Spiral Model
 Fish Model
 V - Model
1) Waterfall Model (or) Sequential Model

[Diagram: Waterfall model. The phases flow sequentially: Initial (requirements gathering, producing the BRS),
Analysis (system design, producing the SRS), Design (software design, producing the TDD and GUI design),
Coding (implementation, with unit, integration and module testing producing the UTR, ITR and MTR),
Testing (black box and system testing producing the STR, and UAT producing the UATR), and Delivery &
Maintenance (delivery to the client environment).]
Advantages:
It is a simple model and easy to maintain; project implementation is very easy.

Drawbacks:
Can’t incorporate new changes in the middle of the project development.

2) Prototype Model
[Diagram: Prototype model. Starting from unclear requirements in the BRS document, a hardware or software
prototype is built and demonstrated to the client; the requirements are refined with each demo until they are
clear, and the SRS document is baselined.]
Advantages:

Whenever the customer is unclear with the requirements, this is the best model to gather the clear requirements.

Drawbacks:
It is not a complete model.
It is a time-consuming model.
The prototype has to be built at the company’s cost.
The user may stick to the prototype and limit his requirements.

3) Evolutionary Model

Initial Req.

Development
Feed back
N With new req

User
Application User Values Acceptance

App is
Y Base lined
Advantages

Whenever the customer is evolving the requirements, this is the best suitable model.

Drawbacks

Deadlines are not clearly defined.
Project monitoring and maintenance is difficult.
4) Spiral Model

The spiral model is a software development process combining elements of both design and prototyping-in-stages,
in an effort to combine the advantages of top-down and bottom-up concepts. Also known as the spiral
lifecycle model, it is a systems development method (SDM) used in information technology (IT). This model of
development combines the features of the prototyping model and the waterfall model. The spiral model is
intended for large, expensive and complicated projects.
Advantages

 Estimates (i.e. budget, schedule, etc.) become more realistic as work progresses, because important issues
are discovered earlier.
 It is more able to cope with the (nearly inevitable) changes that software development generally entails.
 Software engineers (who can get restless with protracted design processes) can get their hands in and start
working on a project earlier.

 It is the best-suited model for highly risk-based projects.
Drawbacks
It is a time-consuming and costly model, and project monitoring and maintenance is difficult.
5) Fish Model
Verification:
Verification is a process of checking conducted on each and every role of an organization in order to check
whether they are doing their tasks in the right manner according to the guidelines or not, right from the start of
the process till the end of the process. Usually the documents are verified in this process of checking.
Validation
Validation is a process of checking conducted on the developed product in order to check whether it is working
according to the requirements or not.
[Diagram: Fish model. Verification activities (reviews of the BRS, SRS and TDD documents) run along the
upper flow, and validation activities (white box testing, black box testing, system testing and testing of software
changes) run along the lower flow, across the phases requirements gathering, analysis, design, coding, system
testing, and delivery & maintenance.]

Advantages
As both verification and validation are done, the outcome of the Fish model is a quality product.
Drawbacks: Time consuming and costly model.

6) V – Model

[Diagram: V-model. On the verification side, each development phase produces its document: the initial and
requirements phase (BRS, project plan, test plan), analysis (SRS), design (TDD) and coding (SCD). On the
validation side, each phase has a corresponding testing activity: requirements phase testing, design phase
testing, program phase testing, system testing, user acceptance testing on the software build, and port testing
and software-change testing during delivery and maintenance, all coordinated by the test management process.]
Advantages
As the verification and validation are done along with test management, the outcome of the V-Model is a
quality product.
Drawback
Time consuming and costly model.

TYPES OF TESTING

There are 18 types of testing.


1. Build Verification Testing.
2. Regression Testing.
3. Re – Testing.
4. α - Testing.
5. β - Testing.
6. Static Testing.
7. Dynamic Testing.
8. Installation Testing.
9. Compatibility Testing.
10. Monkey Testing
11. Exploratory Testing.
12. Usability Testing.
13. End – To – End Testing.
14. Port – Testing.
15. Reliability Testing
16. Mutation Testing.
17. Security Testing.
18. Adhoc Testing.
1) Sanity Testing / Build Verification Testing / Build Acceptance Testing
It is a type of testing in which one will conduct overall testing on the released build in order to check whether it
is proper for further detailed testing or not.

Some companies even call it Sanitary Testing and also Smoke Testing. But some companies will say that, just
before the release of the build, the developers will conduct overall testing in order to check whether the build
is proper for detailed testing or not; that is known as Smoke Testing. Once the build is released, the
testers will once again conduct overall testing in order to check whether the build is proper for further detailed
testing or not; that is known as Sanity Testing.
2) Regression Testing
It is a type of testing in which one will perform testing on already tested functionality again and again. This is
usually done in the following scenarios (situations).
Scenario 1:
Whenever the defects raised by the Test Engineer are rectified by the developer and the next build is released to
the testing department, the Test Engineer will test the defect functionality and its related functionalities once
again.
Scenario 2:
Whenever some new changes are requested by the customer, those new features are incorporated by the
developers and the next build is released to the testing department; then the test engineers will test the related
functionalities of the new features once again, even though they are already tested. That is also known as
regression testing.
Note: Testing the new features for the first time is new testing, but not regression testing.
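Scenario 1 can be sketched in code. Assuming a hypothetical `discount` function whose defect was fixed in a new build, the already-passed test cases are executed once again on that build:

```python
# Hypothetical function under test; a defect in the 10% member
# discount was rectified in the latest build.
def discount(price, is_member):
    return price - price / 10 if is_member else price

# Regression suite: test cases that already passed on earlier builds,
# executed once again on every new build.
regression_suite = [
    # (price, is_member, expected)
    (100.0, False, 100.0),
    (100.0, True, 90.0),
]

def run_regression():
    return ["pass" if discount(p, m) == exp else "fail"
            for p, m, exp in regression_suite]
```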
3) Re – Testing:
It is a type of testing in which one will perform testing on the same function again and again with multiple sets
of data in order to come to a conclusion whether the functionality is working fine or not.
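Re-testing can be sketched as a data-driven check: the same hypothetical functionality (`is_valid_pin`, an assumed 4-digit PIN rule, not from the notes) is exercised again and again with multiple sets of data:

```python
def is_valid_pin(pin):
    # Assumed rule: a PIN must be exactly 4 digits.
    return pin.isdigit() and len(pin) == 4

# The SAME functionality is tested again and again with multiple
# sets of data before concluding whether it works fine or not.
data_sets = ["1234", "0000", "12a4", "123", "12345", ""]

def retest():
    return [is_valid_pin(d) for d in data_sets]
```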

4) α - Testing:
It is a type of testing in which one (i.e., our Test Engineers) will perform user acceptance testing in our company
in the presence of the customer.
Advantages:
If at all any defects are found there is a chance of rectifying them immediately.
5) β - Testing:
It is a type of testing in which either third party testers or end users will perform user acceptance testing in the
client place before actual implementation.

6) Static Testing:
It is a type of testing in which one will perform testing on an application or its related factors without
performing any actions.
Ex: GUI Testing, Document Testing, Code reviewing, etc.

7) Dynamic Testing:
It is a type of testing in which one will perform testing on the application by performing some actions.
Ex: Functional Testing.

8) Installation Testing:
It is a type of testing in which one will install the application into the environment by following the guidelines
given in the deployment document; if the installation is successful, then one will come to the conclusion that the
guidelines are correct, otherwise the guidelines are not correct.

9) Compatibility Testing:
It is a type of testing in which one may have to install the application into multiple environments
prepared with different combinations of environmental components, in order to check whether the application is
compatible with these environments or not. This is usually done for products.

10) Monkey Testing:


It is a type of testing in which one will perform some abnormal actions intentionally on the application
in order to check its stability.
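A rough sketch of monkey testing, assuming a hypothetical `parse_quantity` function as the application under test: random, abnormal inputs are thrown at it, and the only check is that it never crashes.

```python
import random
import string

# Hypothetical application function under test.
def parse_quantity(text):
    text = text.strip()
    core = text[1:] if text.startswith("-") else text
    return int(text) if core.isdigit() else 0

def monkey_test(runs=500, seed=42):
    # Perform abnormal actions intentionally: feed random junk input
    # and check only that the application stays stable (no crash).
    rng = random.Random(seed)
    for _ in range(runs):
        junk = "".join(rng.choice(string.printable)
                       for _ in range(rng.randint(0, 20)))
        try:
            parse_quantity(junk)
        except Exception:
            return False  # instability found
    return True
```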

11) Exploratory Testing:


It is a type of testing in which usually the domain expert will perform testing on the application in parallel with
exploring the functionality, without having knowledge of the requirements.

12) Usability Testing:


It is a type of testing in which one will concentrate on the user friendliness of the application.

13) End – To – End Testing:


It is a type of testing in which one will perform testing on a complete transaction from one end to another end.

14) Port Testing:


It is a type of testing in which one will check whether the application is working comfortably or not after
deploying it into the original client environment.

15) Reliability Testing (or) Soak Testing:


It is a type of testing in which one will perform testing on the application continuously for a long period of time
in order to check its stability.

16) Mutation Testing:


It is a type of testing in which one will perform testing by making some changes.

For example, usually the developers will make many changes to the program and check its performance;
this is known as mutation testing.

17) Security Testing:


It is a type of testing in which one will usually concentrate on the following areas.
i) Authentication.
ii) Direct URL Testing.
iii) Firewall Leakage Testing.
I) Authentication Testing:
It is a type of testing in which a Test Engineer will enter different combinations of user names and passwords
in order to check whether only the authorized persons are accessing the application or not.
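A minimal sketch of authentication testing, with a hypothetical in-memory credential store standing in for the application's login (all names and credentials are made up for illustration):

```python
# Hypothetical credential store standing in for the application's login.
VALID_USERS = {"alice": "s3cret", "bob": "hunter2"}

def login(username, password):
    return VALID_USERS.get(username) == password

# Different combinations of user names and passwords, checking that
# ONLY authorized persons can access the application.
combinations = [
    ("alice", "s3cret", True),     # valid user, valid password
    ("alice", "wrong", False),     # valid user, invalid password
    ("mallory", "s3cret", False),  # unknown user, stolen password
    ("", "", False),               # blank credentials
]

def auth_test():
    return all(login(u, p) == expected for u, p, expected in combinations)
```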

ii) Direct URL Testing:


It is a type of testing in which a test engineer will specify the direct URLs of the secured pages and check
whether they can be accessed or not.

iii) Firewall leakage Testing:


It is a type of testing in which one will enter as one level of user and try to access pages unauthorized for that
level, in order to check whether the firewall is working properly or not.

18) Adhoc Testing:


It is a type of testing in which one will perform testing on the application in his own style after understanding
the requirements clearly.

SOFTWARE TESTING LIFE CYCLE

It contains 6 phases.

1. TEST PLANNING.
2. TEST DEVELOPMENT.
3. TEST EXECUTION.
4. RESULT ANALYSIS.
5. BUG TRACKING.
6. REPORTING.
1) TEST PLANNING
Plan:
Plan is a strategic document, which describes how to perform a task in an effective, efficient and optimized way.

Optimization:
Optimization is a process of reducing the input resources or utilizing them to their maximum and getting the
maximum possible output.

Test Plan:
It is a strategic document, which describes how to perform testing on an application in an effective, efficient and
optimized way. The test lead prepares the test plan.

CONTENTS OF THE TEST PLAN

1.0 INTRODUCTION.
1.1 Objective.
1.2 Reference Documents.
2.0 COVERAGE OF TESTING.
2.1 Features to be tested.
2.2 Features not to be tested.
3.0 TEST STRATEGY.
3.1 Levels of Testing.
3.2 Types of Testing.
3.3 Test Design Techniques.
3.4 Configuration Management.
3.5 Test Metrics.
3.6 Terminology.
3.7 Automation Plan.
3.8 List of Automated Tools.
4.0 BASE CRITERIA.
4.1 Acceptance Criteria.
4.2 Suspension Criteria.
5.0 TEST DELIVERABLES.
6.0 TEST ENVIRONMENT.
7.0 RESOURCE PLANNING.
8.0 SCHEDULING.
9.0 STAFFING AND TRAINING.
10.0 RISKS AND CONTINGENCIES.
11.0 ASSUMPTIONS.
12.0 APPROVAL INFORMATION.
1.0 INTRODUCTION.
1.1 Objective.
The main purpose of the document is clearly described here in this section.

1.2 Reference Document.


The list of all the documents that are referred to prepare the test plan will be listed out here in this section.

2.0 COVERAGE OF TESTING.


2.1 Features to Be Tested
The list of all the features within the scope is mentioned here in this section.

2.2 Features Not To Be Tested


The list of all the features that are not planned for testing, based on the following criteria, is mentioned here
in this section.
 Out of scope features
 Low risk areas
 Future functionalities.
 The features that are skipped based on the time constraints.

3.0 TEST STRATEGY


It is defined as an organization level term, which is used for testing all the projects in the organization.
TEST PLAN
It is defined as a project level term, which describes how to test a particular project in an organization.
Note:
Test strategy is common for all the projects, but the test plan varies from project to project.

3.1 Levels of Testing


The list of all the levels of testing that are maintained in the company is listed out here in this section.

3.2 Types of Testing


The list of all the types of testing that are followed by the company is listed out here in this section.
3.3 Test Design Technique
The list of all the techniques that are followed by the company during test case development is listed
out here in this section.
Ex: BVA (Boundary Value Analysis)
ECP (Equivalence Class Partitioning)
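As an illustration of these two techniques (the 18-60 age range is an assumed requirement, not from the notes):

```python
# Assumed requirement: an "age" field accepts 18 to 60 inclusive.
LOW, HIGH = 18, 60

def bva_values(low, high):
    # Boundary Value Analysis: test at and just around each boundary.
    return [low - 1, low, low + 1, high - 1, high, high + 1]

def ecp_classes(low, high):
    # Equivalence Class Partitioning: one representative value per class.
    return {
        "invalid_below": low - 5,
        "valid": (low + high) // 2,
        "invalid_above": high + 5,
    }
```

Here `bva_values(LOW, HIGH)` yields [17, 18, 19, 59, 60, 61]; under ECP, one representative value is assumed to behave like every other value in its class.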

3.4 Configuration Management

3.5 Test Metrics


The list of all the tasks that are measured and maintained in terms of metrics is clearly mentioned here in this
section.

3.6 Terminologies
The list of all the terms and their corresponding meanings is listed out here in this section.

3.7 Automation plan


The list of all the areas that are planned for automation in the company is listed out here in this section.

3.8 List of Automated Tools


The list of all the automated tools that are used in the company is listed out here in this section.

4.0 BASE CRITERIA


4.1 Acceptance Criteria.
When to stop testing in a full-fledged manner, thinking that enough testing is done on the application, is clearly
described here in this section.

4.2 Suspension Criteria.


When to stop testing suddenly and suspend the build will be clearly mentioned here in this section.

5.0 TEST DELIVERABLES.


The list of all the documents that are to be prepared and delivered in the testing phase is listed out here in
this section.

6.0 TEST ENVIRONMENT.


The customer-specified environment that is about to be used for testing is clearly described here in this
section.

7.0 RESOURCE PLANNING.


Who has to do what is clearly described here in this section.

8.0 SCHEDULING.
The starting dates and the ending dates of each and every task are clearly described here in this section.

9.0 STAFFING AND TRAINING.


How much staff is to be recruited and what kind of training is to be provided are clearly planned and mentioned
here in this section.

10.0 RISKS AND CONTINGENCIES.


The list of all the potential risks and the corresponding solution plans are listed out here in this section.

Risks
 Unable to deliver the software within the deadlines.
 Employees may leave the organization in the middle of the project development.
 Customer may impose tighter deadlines.
 Unable to test all the features within the time.
 Lack of expertise.
Contingencies
 Proper advance planning.
 People need to be maintained on the bench.
 What is not to be tested has to be planned properly.
 Severity- and priority-based execution.
 Proper training needs to be provided.

11.0 ASSUMPTIONS.
The list of all the assumptions that are to be assumed by a test engineer will be listed out here in this section.

12.0 APPROVAL INFORMATION.


Who will approve what is clearly mentioned here in this section.

2. TEST DEVELOPMENT.

TYPES OF TEST CASES


Test cases are broadly divided into two types.
1) G.U.I Test Cases.
2) Functional test cases.
Functional test cases are further divided into two types.
1. Positive Test Cases.
2. Negative Test Cases.

GUIDELINES TO PREPARE GUI TEST CASES:


 Check for the availability of all the objects.
 Check for the alignment of the objects, if at all the customer has specified it in the requirements.
 Check for the consistency of all the objects.
 Check for spelling and grammar.
 Apart from these guidelines, anything we test without performing any action will fall under GUI test
cases.

GUIDELINES FOR DEVELOPING POSITIVE TEST CASES.


 A test engineer must have a positive mindset.
 A test engineer should consider the positive flow of the application.
 A test engineer should use valid input from the point of view of the functionality.

GUIDELINES FOR DEVELOPING THE NEGATIVE TEST CASES:


 A test engineer must have a negative mindset.
 He should consider the negative flow of the application.
 He should use at least one invalid input for a set of data.
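Putting both sets of guidelines together in a sketch, with a hypothetical `username_ok` rule (4 to 12 alphanumeric characters, an assumed requirement) as the functionality under test:

```python
def username_ok(name):
    # Hypothetical rule: 4 to 12 alphanumeric characters.
    return name.isalnum() and 4 <= len(name) <= 12

# Positive test cases: valid inputs from the positive flow.
positive_cases = ["john", "tester007", "abcd"]
# Negative test cases: at least one invalid input per data set.
negative_cases = ["", "ab", "john doe", "x" * 13, "user!"]

def run_cases():
    all_positive_pass = all(username_ok(n) for n in positive_cases)
    all_negative_rejected = not any(username_ok(n) for n in negative_cases)
    return all_positive_pass and all_negative_rejected
```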

Test Case Template:


1. Test Objective :
2. Test Scenario :
3. Test Procedure :
4. Test Data :
5. Test Cases :

1. Test Objective:
The purpose of the document is clearly described here in this section.

2. Test Scenarios:
The list of all the situations that are to be tested is listed out here in this section.

3. Test Procedure:
Test procedure is a functional level term, which describes how to test the functionality. So in this section one
will describe the plan for testing the functionality.

4. Test Data:
The data that is required for testing is made available here in this section.

5. Test Cases:
The list of all the detailed test cases is listed out here in this section.
Note:
Some companies maintain all the above five fields individually for each and every scenario, but some
companies maintain them commonly for all the scenarios.

3. TEST EXECUTION.
During the test execution phase the test engineer will do the following.

1. He will perform the action that is described in the description column.


2. He will observe the actual behavior of the application.
3. He will document the observed value under the actual value column.
4. RESULT ANALYSIS.
In this phase the test engineer will compare the expected value with the actual value and mention the result as
pass if both match; otherwise, the result is mentioned as fail.
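A minimal sketch of result analysis (the test case IDs and values are made up for illustration):

```python
# Each executed test case records the expected value and the
# observed actual value; result analysis compares the two columns.
executed = [
    {"id": "TC01", "expected": "Login successful", "actual": "Login successful"},
    {"id": "TC02", "expected": "Error message shown", "actual": "Page crashed"},
]

def analyse(results):
    for tc in results:
        tc["result"] = "pass" if tc["expected"] == tc["actual"] else "fail"
    return [tc["result"] for tc in results]
```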

5. BUG TRACKING.
Bug tracking is a process in which the defects are identified, isolated and managed.

DEFECT PROFILE DOCUMENT


Defect ID:
The sequential defect numbers are listed out here in this section.

Steps to Reproduce:
The list of all the steps that are followed by a test engineer to identify the defect is listed out here in this
section.
Submitter:
The name of the test engineer who submits the defect will be mentioned here in this section.

Date of Submission:
The date on which the defect is submitted is mentioned here in this section.

Version Number:
The corresponding version number is mentioned here in this section.

Build Number:
Corresponding build number is mentioned here is this section.

Assigned to:
The project lead or development lead will mention the name of the corresponding developer to whom the defect
is assigned.
Severity:
How serious the defect is, is described in terms of severity. It is classified in to 4 types.

1. FATAL Sev1 S1 1
2. MAJOR Sev2 S2 2
3. MINOR Sev3 S3 3
4. SUGGESTION Sev4 S4 4
FATAL:
If at all the problems are related to navigational blocks or unavailability of functionality, then such
problems are treated as fatal defects.
Note: These are also called show-stopper defects.
MAJOR:
If at all the problems are related to the working of the features, then such problems are treated as
major defects.
MINOR:
If at all the problems are related to the look and feel of the application, then such problems are treated
as minor defects.
SUGGESTIONS:
If at all the problems are related to the value additions of the application, then such problems are treated as
suggestions.
Priority:
The sequence in which the defects have to be rectified is described in terms of priority. It is classified into 4
types.
1. CRITICAL 2. HIGH 3. MEDIUM 4. LOW
Usually the FATAL defects are given CRITICAL priority, MAJOR defects are given HIGH priority, MINOR
defects are given MEDIUM priority and SUGGESTION defects are given LOW priority, but depending upon
the situation the priority may be changed by the project lead or development lead.
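The default severity-to-priority mapping above can be written down as a simple lookup. The severity and priority names come from the text; the function itself is an illustrative sketch, since in practice the lead can override the default:

```python
# Default priority suggested for each severity level, as described in the
# text. Real projects may override this on a case-by-case basis.
DEFAULT_PRIORITY = {
    "Fatal": "Critical",
    "Major": "High",
    "Minor": "Medium",
    "Suggestion": "Low",
}

def default_priority(severity):
    return DEFAULT_PRIORITY[severity]

print(default_priority("Fatal"))   # Critical
print(default_priority("Minor"))   # Medium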
Ex: -
Low Severity High Priority Case:
In the case of a customer visit, all the look and feel defects, which are usually less severe, are given the highest
priority.

High Severity Low Priority Case:


If at all some part of the application is not available because it is under development, the test engineer
will still treat them as FATAL defects, but the development lead will give less priority to those defects.

BUG LIFE CYCLE

[Flow diagram: a newly found defect starts as New/Open and is sent to the developer, who decides whether it
is really a defect. If not, the status becomes Testers Mistake (Rejected) or As Per Design; if he cannot decide,
the defect is put on Hold. If it is accepted and rectified, the status becomes Fixed for Verification. In the next
build the tester re-tests: if the defect is really rectified the status is set to Closed, otherwise to Reopen, and
the cycle repeats until no new or open defects remain, at which point testing stops.]
New / Open:
Whenever the defect is found for the first time, the test engineer will set the status as New / Open. Some
companies, however, set the status as only New at this stage, and once the developers accept the defect
they set the status as Open.

Reopen and Closed:

Once the defects are rectified by the developer and the next build is released to the testing department,
the testers will check whether the defects are rectified properly or not.

If they find them rectified, they will set the status as Closed; otherwise they will set the status as Reopen.

Fixed for Verification / Fixed / Rectified:
Whenever a defect raised by the test engineer is accepted and rectified by the developers, they will set the
status as Fixed.
Hold:
Whenever the developer is confused whether to accept or reject the defect, he will set the status as Hold.

Testers Mistake / Testers Error / Rejected:

Whenever the developer is confident that it is not at all a defect, he will set the status as Rejected.

As Per Design (this is a rare case):

Whenever some new changes are incorporated without informing the test engineers, the test engineers will
raise them as defects, but the developers will set the status as 'As Per Design'.
Error:
It is a problem related to the program.
Defect:
If the test engineer with respect to the functionality identifies a problem then it is called defect.
Bug:
If the developer accepts the defect, that is called as Bug.

Fault / Failure:
If the customer identifies the problem after delivery, it is called a Fault / Failure.
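The bug life cycle statuses described above can be modeled as a small state machine. The status names come from the text; the exact transition table below is an illustrative assumption, since different companies allow different transitions:

```python
# Illustrative bug life cycle: which status changes are allowed from each
# status. Status names follow the text; the table itself is an assumption.
TRANSITIONS = {
    "New/Open": {"Fixed", "Hold", "Rejected", "As Per Design"},
    "Hold": {"Fixed", "Rejected"},          # developer eventually decides
    "Fixed": {"Closed", "Reopen"},          # tester verifies the fix
    "Reopen": {"Fixed"},                    # developer rectifies again
}

def move(current, new):
    """Apply a status change, rejecting transitions not in the table."""
    if new not in TRANSITIONS.get(current, set()):
        raise ValueError(f"illegal transition {current} -> {new}")
    return new

status = "New/Open"
status = move(status, "Fixed")    # developer rectifies the defect
status = move(status, "Reopen")   # tester finds it not properly rectified
status = move(status, "Fixed")    # developer rectifies it again
status = move(status, "Closed")   # tester confirms the rectification
print(status)  # Closed
```

Trying to move a Closed defect anywhere raises an error, which mirrors the idea that the cycle ends once the tester confirms the fix.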

6. BUG REPORTING.
1). Classical Bug Reporting Process:

[Diagram: the test engineers (TE1, TE2, TE3) mail the defects to the test lead, the test lead forwards them to
the project lead, and the project lead distributes them to the developers (Dev1, Dev2, Dev3).]

Drawbacks: 1. Time consuming. 2. Redundancy. 3. No security.


2). Common Repository Oriented Bug Reporting Process:

[Diagram: the test engineers (TE1, TE2, TE3) place the defects in a common repository, which the test lead
(TL), the project lead (PL) and the developers (Dev1, Dev2, Dev3) all access.]

Drawbacks: 1. Time consuming. 2. Redundancy.

3). Bug Tracking Tool Oriented Bug Reporting Process:

[Diagram: the test engineers (TE1, TE2, TE3), the test lead (TL), the project lead (PL) and the developers
(Dev1, Dev2, Dev3) all work through a bug tracking tool (BTT).]

Bug Tracking Tool:

It is a software application that can be accessed only by authorized persons and is used for managing the
complete bug tracking process by providing all the facilities along with a defect profile template.
Note:

At the end of the testing process, usually the test lead will prepare the test summary report, which is also
called the test closure report.
TEST DESIGN TECHNIQUES:

While developing the test cases, if at all the test engineer finds some areas complex, then to overcome that
complexity the test engineer will usually use test design techniques.
Generally two types of techniques are used in most of the companies.

1. Boundary Value Analysis (BVA).
2. Equivalence Class Partitioning (ECP).
1). Boundary Value Analysis (BVA).
Whenever the engineers need to develop test cases for a range kind of input, they will go for boundary
value analysis. This technique says to concentrate on the boundaries of the range.
Usually they test with the following values (LB = lower bound, MV = middle value, UB = upper bound):
LB-1, LB, LB+1, MV, UB-1, UB, UB+1
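The seven boundary values can be generated mechanically. A minimal sketch, assuming an integer range [LB, UB] and taking the middle value as the rounded midpoint:

```python
# Generate the seven BVA test values for an integer range [lb, ub]:
# LB-1, LB, LB+1, MV (middle value), UB-1, UB, UB+1.
def bva_values(lb, ub):
    return [lb - 1, lb, lb + 1, (lb + ub) // 2, ub - 1, ub, ub + 1]

# For the e-mail text box example below (4 to 20 characters):
print(bva_values(4, 20))  # [3, 4, 5, 12, 19, 20, 21]
```

These are exactly the character counts (3 ch through 21 ch) used in the e-mail text box example later in this section.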

2). Equivalence Class Partitioning (ECP).

Whenever the test engineer needs to develop test cases for a feature which has a larger number of validations,
then one will go for equivalence class partitioning, which says to first divide the inputs into classes (valid and
invalid) and then prepare the test cases.

Ex: Develop the test cases for an E-Mail text box whose validations are as follows.

Requirements:
1. It should accept minimum 4 characters and maximum 20 characters.
2. It should accept only small (lowercase) characters.
3. It should accept only the @ and _ special symbols.

Boundary Value Analysis:

LB-1: 3 ch | LB: 4 ch | LB+1: 5 ch | MV: 12 ch | UB-1: 19 ch | UB: 20 ch | UB+1: 21 ch


Equivalence Class Partitioning (ECP):

Valid: 4 char, 5 char, 12 char, 19 char, 20 char, a-z, @, _

Invalid: 3 char, 21 char, A-Z, 0-9, all the special symbols apart from @ and _, alphanumeric, blank space,
decimal numbers
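The valid and invalid classes above follow directly from the three stated requirements, so the check itself is small. A minimal validator sketch (the rule set is exactly the one stated; the function name is illustrative):

```python
import string

# Characters the e-mail text box accepts: lowercase letters plus @ and _.
ALLOWED = set(string.ascii_lowercase) | {"@", "_"}

def is_valid(value):
    """Apply the three stated rules: 4-20 chars, lowercase, only @ and _."""
    return 4 <= len(value) <= 20 and all(ch in ALLOWED for ch in value)

assert is_valid("abcd")                      # minimum length, all lowercase
assert is_valid("ab@zx")                     # @ symbol allowed
assert is_valid("abcdabcdabcdabcdz@_x")      # maximum length (20)
assert not is_valid("abc")                   # too short (3 characters)
assert not is_valid("ABCD")                  # uppercase not allowed
assert not is_valid("abcd abcd")             # blank space not allowed
assert not is_valid("12345")                 # digits not allowed
```

Each assertion corresponds to one of the valid or invalid equivalence classes in the table above.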

Test Case Document:

Case ID | Type | Test Case Description | Expected Value
1 | +ve | Enter the value as per the VIT | It should accept.
2 | -ve | Enter the value as per the IIT | It should not accept.

Valid Input Table (VIT):
1. abcd
2. ab@zx
3. abcdabcd@ab_
4. abcdabcddcbaaccd_@z
5. abcdabcdabcdabcdz@_x
6. abcdabcdabcdabcd_xyz

Invalid Input Table (IIT):
1. abc
2. ABCD
3. ABCD123
4. 12345.5
5. abcd abcd abcd abcd
6. abcdabcd-----abc*#)

PROCESS OF DEVELOPING LOGIN SCREEN TEST CASES.

Use Case

It is a description of the functionality of a certain feature of an application in terms of actors, actions and
responses.
Preparation of use case

Input information required to prepare the use cases:

Snap Shot:
[Mock-up of the login screen: User Name and Password text boxes, a Connect To field, and OK, Clear and
Cancel buttons.]

Functional Requirements

1. The login screen should contain user name, password and connect to fields, and login, clear and cancel
buttons.
2. The connect to field is not a mandatory field, but it should allow the user to select a database option
whenever he requires it, so that he can connect to the mentioned database while logging in.
3. Upon entering a valid username and valid password and clicking on the login button, the corresponding
page must be displayed.
4. Upon entering some information into any of the fields and clicking on the clear button, all the fields must
be cleared and the cursor should be placed in the user name field.
5. Upon clicking on the cancel button, the login screen must be closed.
Special Requirements / Business Rules / Validation.

1. Initially, whenever the login screen is invoked (opened), the login and clear buttons must be disabled.
2. The cancel button must always be enabled.
3. Upon entering a user name and password, the login button must be enabled.
4. Upon entering some information into any of the fields, the clear button must be enabled.
5. The tabbing order must be: User name, Password, Connect to, Login, Clear and Cancel.

Use case Template:

Name of the Use case :


Brief description of the Use case :
Actors involved :
Special Requirements :
Pre - Conditions :
Post – Conditions :
Flow of events :

Use case Document:


Name of the Use case : Login Use Case
Brief description of the Use case : This use case describes the functionalities of all the
features of the login screen.
Actors involved : Normal User, Admin User.

Special Requirements There are two types of special requirements.

1. Implicit Requirements.
2. Explicit Requirements.
Implicit Requirements:
The requirements that are added by the business analyst, even without the customer specifying them, in order
to increase the value of the application are known as implicit requirements.
Explicit Requirements:
The customer specified special requirements are known as explicit requirements.
Implicit Requirements:
1. Whenever the login screen is invoked initially the cursor should be placed in the user name field.

2. Upon entering invalid user name and valid password and clicking on login button an error message should
be displayed as follows.

“Invalid Username Please Try Again”.

3. Upon entering valid user name and invalid password and clicking on login button an error message should
be displayed as follows.
“Invalid Password Please Try Again”.

4. Upon entering both invalid username and password and clicking on login button an error message should be
displayed as follows.

“Invalid Username and Password Please Try Again”.


Explicit Requirements:
1. The login screen should contain user name, password and connect to fields, and login, clear and cancel
buttons.

2. The connect to field is not a mandatory field, but it should allow the user to select a database option
whenever he requires it, so that he can connect to the mentioned database while logging in.

3. Upon entering a valid username and valid password and clicking on the login button, the corresponding
page must be displayed.

4. Upon entering some information into any of the fields and clicking on the clear button, all the fields must
be cleared and the cursor should be placed in the user name field.

5. Upon clicking on the cancel button, the login screen must be closed.

Pre - Conditions:
Login screen must be available.
Post – Conditions:
Either home page or admin page for valid users and error message for invalid users.
Flow of events:
Main Flow

Action: Actor invokes the application.
Response: Login screen is displayed with the following fields: username, password, connect to, login, clear
and cancel.

Action: Actor enters valid username, valid password and clicks on the login button.
Response: Authentication; either home page or admin page is displayed depending upon the actor who
logged in.

Action: Actor enters valid username, valid password, selects a database option and clicks on the login button.
Response: Authentication; either home page or admin page is displayed with the mentioned database
connection, depending upon the actor who logged in.

Action: Actor enters invalid username, valid password and clicks on the login button.
Response: Go to alternative flow table 1.

Action: Actor enters valid username, invalid password and clicks on the login button.
Response: Go to alternative flow table 2.

Action: Actor enters invalid username, invalid password and clicks on the login button.
Response: Go to alternative flow table 3.

Action: Actor enters some information into any of the fields and clicks on the clear button.
Response: Go to alternative flow table 4.
Alternative Flow Table 1. (Invalid Username)

Action: Actor enters invalid username, valid password and clicks on the login button.
Response: Authentication fails; an error message is displayed: “Invalid Username Please Try Again”.

Alternative Flow Table 2. (Invalid Password)

Action: Actor enters valid username, invalid password and clicks on the login button.
Response: Authentication fails; an error message is displayed: “Invalid Password Please Try Again”.

Alternative Flow Table 3. (Invalid Username and Password)

Action: Actor enters invalid username, invalid password and clicks on the login button.
Response: Authentication fails; an error message is displayed: “Invalid Username and Password Please Try
Again”.

Alternative Flow Table 4. (Clear Click)

Action: Actor enters some information into any of the fields and clicks on the clear button.
Response: All the fields are cleared and the cursor is placed in the user name field.

Alternative Flow Table 5. (Cancel Click)

Action: Actor clicks on the cancel button.
Response: Login screen is closed.

Guidelines to be followed by a test engineer once the use case document is given to him.

1. Identify the module to which the use case belongs.
2. Identify the functionality of the use case with respect to the total functionality.
3. Identify the functional points and prepare the functional point document.
4. Identify the actors involved in the use case.
5. Identify the inputs required to perform testing.
6. Identify whether the use case is linked with any other use case.
7. Identify the pre-conditions.
8. Identify the post-conditions.
9. Understand the main flow of the use case.
10. Understand the alternative flows of the use case.
11. Understand the special requirements.
12. Document the test cases for the main flow of the application.
13. Document the test cases for the alternative flows of the application.
14. Document the test cases for the special requirements.
15. Prepare the cross-reference matrix (or) traceability matrix.

Functional points

The point where a user can perform some action is known as functional point.

The sequence of the documents prepared during the test process:

FRS / UCD -> FPD (Functional Point Document) -> MTCD (Main Test Case Document) -> DTCD (Detailed
Test Case Document) -> DPD (Defect Profile Document)

[Diagram: the functional points — user name entry, password entry, DB selection, login click, clear click,
cancel click — are derived from the FRS/UCD, grouped into main test cases (e.g. validate login click, clear
click, cancel click) and expanded into detailed test cases and defect profiles.]

Traceability matrix (or) Cross-reference matrix.

It is a document which contains a table of linking information, used for tracing back to the reference in any
kind of questionable or confusing situation.

UCD | FPD | MTCD | DTCD | DPD
26 | 6 | 3 | 38 | 1
32 | 8 | 4 | 42 | 3

Requirements Traceability Matrix (UCID -> TCID): 1 -> 2; 2 -> 4.
Defects Traceability Matrix (TCID -> DPD): 38 -> 1; 42 -> 2; 34 -> 3.
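A traceability matrix like the one above can be built by grouping test cases by the requirement (use case) they cover, which makes uncovered requirements easy to spot. A minimal sketch with illustrative IDs (not from a real project):

```python
# Build a requirements traceability matrix: use-case ID -> covering test cases.
# The tcid/ucid values below are illustrative.
test_cases = [
    {"tcid": 38, "ucid": 1},
    {"tcid": 42, "ucid": 1},
    {"tcid": 34, "ucid": 2},
]

def traceability(test_cases):
    matrix = {}
    for tc in test_cases:
        matrix.setdefault(tc["ucid"], []).append(tc["tcid"])
    return matrix

print(traceability(test_cases))  # {1: [38, 42], 2: [34]}
```

The same grouping idea, with defect IDs instead of use-case IDs, gives the defects traceability matrix.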

Test Case Template:

TC ID | Type | Description | Expected Value | Actual Value | Result
1 | GUI | Check for the availability of all the objects as per the Login Obj Tab. | All the objects must be
available as per the login obj tab. | All the objects are available as per the login obj tab. | Pass
2 | GUI | Check for the consistency of all the objects. | All the objects must be consistent with each other. |
All the objects are consistent with each other. | Pass
3 | GUI | Check for the spellings of all the objects as per the Login Obj Tab.xls. | All the objects must be
spelled properly as per the login obj tab. | All the objects are spelled properly. | Pass
4 | GUI | Check for the enable property of the login, clear and cancel buttons. | Initially login and clear must
be disabled and the cancel button must be enabled. | Login, clear and cancel buttons are enabled. | Fail
5 | GUI | Check for the initial position of the cursor. | Initially the cursor must be positioned in the user name
field. | Initially the cursor is present in the username field. | Pass
6 | Positive | Enter some information into the user name and password fields and check for the enabled
property of the login button. | Login button must be enabled. | Login button is enabled. | Pass
7 | Positive | Enter some info into any of the fields and check for the enabled property of the clear button. |
Clear button must be enabled. | Clear button is enabled. | Pass
8 | Positive | Enter user name and password as per the VIT and click on the login button. | Corresponding
page must be displayed as per the VIT. | Corresponding pages are displayed as per the valid inputs table. | Pass
9 | Positive | Enter username and password as per the VIT, select a database option and click on login. |
Corresponding page must be displayed as per the VIT with the mentioned database connection. |
Corresponding pages are displayed as per the VIT with the mentioned database connection. | Pass
10 | Positive | Enter some information into any of the fields and click the clear button. | All the fields must be
cleared and the cursor should be placed in the user name field. | All the fields are cleared but the cursor is not
placed in the user name field. | Fail
11 | Positive | Click on the cancel button. | Login screen should be closed. | Login screen is closed. | Pass
12 | Positive | Check for the tabbing order of all the objects. | Tabbing order must be: User name, Password,
Connect to, Login, Clear, Cancel. | Tabbing order is: Username, Password, Connect to, Login, Clear,
Cancel. | Pass
13 | Negative | Enter username and password as per the IVIT and click on the login button. | Corresponding
error messages should be displayed as per the IVIT. | Corresponding error messages are not displayed as per
the IVIT. | Fail
14 | Negative | Enter some information only into the user name field and check for the enabled property of
the login button. | Login button must be disabled. | Login button is enabled. | Fail
15 | Negative | Enter some information only into the password field and check for the enabled property of
the login button. | Login button must be disabled. | Login button is enabled. | Fail

(The Severity, Priority and Reference columns of the template are left blank in this example.)
LOGIN OBJ TAB

Sno Obj Name Type

1 User name Text Box

2 Password Text Box

3 Connect To Combo box

4 Login Button

5 Clear Button

6 Cancel Button

Valid Inputs Table (VIT):

Sno | User Name | Password | Expected Page | Actual Value | Result
1 | Suresh | qtp | Admin | Admin | Pass
2 | Raja | rani | Home page | Home page | Pass
3 | Chiru | mbbs | Home page | Home page | Pass
4 | Praveen | puppy | Home page | Home page | Pass
5 | NTR | illu | Home page | Home page | Pass
6 | Admin | Admin | Admin | Admin | Pass

Invalid Inputs Table (IVIT):

Sno | User Name | Password | Expected Page | Actual Value | Result
1 | Suresh1 | qtp | Invalid User name Plz try again | Admin page | Fail
2 | Raja2 | rani | Invalid User name Plz try again | Invalid User name Plz try again | Pass
3 | Chiru | DADA | Invalid password Plz try again | Invalid password Plz try again | Pass
4 | Praveen | TOPI | Invalid password Plz try again | Invalid password Plz try again | Pass
5 | NTR1 | illu4 | Invalid user name & password Plz try again | Invalid user name & password Plz try again | Pass
6 | Admin5 | Admin6 | Invalid user name & password Plz try again | Invalid user name & password Plz try again | Pass
Defect Profile (common fields for all entries below: Submitter — Sri Balaji; Version No — 1.0.0; Build No — 1):

Defect 1 (submitted 11-Feb-08):
Description: Initially the Login and Clear buttons are enabled instead of being disabled.
Steps for Reproducibility: Not applicable.

Defect 2 (submitted 11-Feb-08):
Description: Upon clicking on the clear button all the fields are cleared, but the cursor is not placed in the
user name field.
Steps for Reproducibility: 1. Enter some information into any of the fields. 2. Click on the clear button.
3. Observe that the cursor is not placed in the username field after clearing all the fields.

Defect 3 (submitted 11-Feb-08):
Description: Upon entering Suresh1 as username and qtp as password and clicking on the login button, the
admin page is displayed instead of an error message.
Steps for Reproducibility: 1. Enter Suresh1 into the user name field. 2. Enter qtp into the password field.
3. Click on the login button. 4. Observe that the admin page is displayed instead of an error message.

Defect 4 (submitted 14-Feb-08):
Description: Upon entering information only into the user name field, the login button is enabled instead of
being disabled.
Steps for Reproducibility: 1. Enter some information into the username field. 2. Check the enabled property
of the login button. 3. Observe that the login button is enabled instead of being disabled.

Defect 5 (submitted 11-Feb-08):
Description: Upon entering information only into the password field, the login button is enabled instead of
being disabled.
Steps for Reproducibility: 1. Enter some information into the password field. 2. Check the enabled property
of the login button. 3. Observe that the login button is enabled instead of being disabled.

(The Assigned To, Severity, Priority and Status columns are left blank in the source table.)
Defect Profile Document:
Defect Id: -
The list of defect numbers is mentioned here in this section.

Defect Description: -
What exactly the defect is, is clearly described here in this section.

Steps for reproducibility: -


The list of all the steps that are followed by the test engineer to identify the defect will be listed out here in
this section
Submitter: -
The name of the test engineer who has submitted the defect will be mentioned here in this section.

Date Of Submission: -
The date on which the defect is submitted is mentioned here in this section.

Version No: -
Corresponding Version number is mentioned here in this section.

Build No: -
Corresponding build number is mentioned here in this section.
Assigned To: -
The development lead will fill in the name of the developer to whom the defect is assigned.
Severity: -
How serious the defect is, is defined in terms of severity. Severity is classified into four types:

1. Fatal (Sev1) or S1 or 1
2. Major (Sev2) or S2 or 2
3. Minor (Sev3) or S3 or 3
4. Suggestion (Sev4) or S4 or 4

1. Fatal:
If at all the problems are related to navigational blocks or unavailability of functionality, then such types of
defects are treated to be fatal defects.

[Example: in an Add screen with Val1, Val2 and Result fields, clicking Next leads to an unavailable page — a
navigational block.]
2. Major: -

If at all the problems are related to the working of major functionalities, then such types of defects are treated
to be major defects.

[Example: the Add feature returns -10 as the result of adding Val1 = 10 and Val2 = 20 — a wrong result from
a major functionality.]

3. Minor: -

If at all the problems are related to the look and feel of the application, then such types of defects are treated
to be minor defects.

[Example: the Result field of the Add screen has a bad look and feel.]
4. Suggestions: -

If at all the problems are related to the value of the application, then such type of defects are treated to be
suggestions.

[Example: when alphabets are entered into a positive-integer box, the application shows only “Invalid Entry
Plz Try again”; suggesting a more helpful message would add value.]
Priority: -

Priority defines the sequence in which the defects have to be rectified. It is classified into
four types:

1. Critical (Pri1) or P1 or 1
2. High (Pri2) or P2 or 2
3. Medium (Pri3) or P3 or 3
4. Low (Pri4) or P4 or 4

Usually the Fatal defects are given critical priority, Major defects are given High priority, Minor defects are
given Medium Priority and suggestions are given Low Priority, but depending up on the situations the
priority will be changing.

I - Case:
Low Severity-High Priority Case: -
Upon a customer visit to the company, all the look and feel defects are given the highest priority.
II - Case:
High Severity-Low Priority Case: -

Whenever 80% of the application is released to the testing department while 20% is missing, the test
engineers will treat the missing parts as fatal defects, but the development lead will give least priority to
those defects as the features are under development.
