
Manual Testing
Software Quality:
Software has quality only when it meets customer requirements, customer satisfaction & customer expectations. Meeting customer requirements refers to producing the proper output; meeting customer expectations refers to extra characteristics such as a good interface, speed, privacy, security, ease of operation & good functionality.
Non-technical factors also affect quality: cost of the product & time to market.
Software Quality Assurance:
SQA comprises the concepts a company follows to develop its software. An SQA team is responsible for monitoring & measuring the strength of the development processes.
Software Project:
A set of problems assigned by the client, to be solved by software people through the software engineering process, is called a software project. In short: the problem, the people & the process make the project. A software-related problem solved by software engineers through a software engineering process is a software project.
Software Development Life Cycle / Life Cycle Development:
Stages involved in software project development:
1) Information gathering: customer requirements
2) Analysis: customer requirements v/s solutions
3) Design: dividing the project into modules & coupling them
4) Coding: physical construction of the project
5) Testing
6) Maintenance

Information gathering stage:


In this stage, the Business Analyst studies the requirements of the client / customer and prepares the Business Requirement Specification (BRS) document.
Analysis:
In this stage, the Senior Analyst prepares the Software Requirement Specification (S/w RS) document with respect to the corresponding BRS document. This document consists of two sub-documents: the System Requirement Specification (SRS) & the Functional Requirement Specification (FRS). The SRS contains details about software & hardware requirements; the FRS contains details about the functionality to be used in the project.
Designing:
In the designing phase, designers create two documents: the High Level Document (HLD) & the Low Level Document (LLD). The HLD consists of the main modules of the project from root to leaf and points to multiple LLDs. An LLD consists of the sub-modules of a main module along with data flow diagrams, ER-diagrams, etc. These documents are prepared by technical support people or designers, called internal designers.

Black box testers should have knowledge of customer requirements
Black box testing tests against the BRS & SRS
Testing external interfaces is black box testing
Testing internal interfaces is white box testing
White box testing is done w.r.t design documents

Testing:


V MODEL TESTING
[Diagram: the V-Model pairs each development stage with a testing activity.]
Development side: Information gathering -> Design & Coding -> Install build -> Maintenance
Testing side: Assessment of development plan -> Prepare test plan -> Requirement phase testing -> Design phase testing -> Program phase testing -> Functional & system testing -> User acceptance testing -> Test documentation -> Port testing -> Test software changes -> Test efficiency

* The test plan is developed based on the development plan.
Formula for test efficiency: DRE = A / (A + B)
DRE = Defect Removal Efficiency
A -> bugs found at the testing side
B -> bugs found at the client side
DRE 0.8 to 0.9 -> good
DRE 0.7 to 0.8 -> requires improvement
DRE < 0.7 -> poor
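The formula translates directly into code. A minimal sketch (the helper names are invented for illustration, not part of the original notes):

# Defect Removal Efficiency: DRE = A / (A + B)
# A = bugs found by the testing team, B = bugs found at the client side.

def dre(bugs_in_testing: int, bugs_at_client: int) -> float:
    """Return DRE as a fraction between 0.0 and 1.0."""
    total = bugs_in_testing + bugs_at_client
    if total == 0:
        raise ValueError("no defects reported at all")
    return bugs_in_testing / total

def rating(value: float) -> str:
    """Map a DRE value onto the bands used in these notes."""
    if value >= 0.8:
        return "good"
    if value >= 0.7:
        return "requires improvement"
    return "poor"

if __name__ == "__main__":
    score = dre(90, 10)  # 90 bugs caught in testing, 10 escaped to the client
    print(score, rating(score))  # 0.9 good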
Refinement Form of V-Model


BRS / URS / CRS <-> User Acceptance Testing
S/w RS <-> Black box testing
HLDs <-> Integration testing
LLDs <-> Unit testing
Coding (base of the V)

As the refinement form of the V-Model above shows, small & medium scale organizations maintain a separate testing team only for the Functional & System testing stage.
1) Reviews during Analysis

In general, the software development process starts with information gathering & analysis. In this stage, Business Analyst category people prepare the BRS & S/w RS documents and, after completing document preparation, they conduct reviews on the documents for completeness & correctness. This review focuses on the factors below:
1) Are they complete?
2) Do they meet the right requirements of the client / customer?
3) Are they achievable w.r.t technology?
4) Are they reasonable w.r.t time & cost?
5) Are they testable?

2) Reviews during Design


After completion of the analysis phase & its reviews, project-level designers start the logical design of the application in terms of external & internal design (HLDs & LLDs). In this stage, they conduct reviews for the completeness & correctness of the design documents. This review focuses on the factors below:
1) Are they understandable?
2) Do they meet the right requirements of the client / customer?
3) Are they complete?
4) Are they followable?
5) Do they handle errors?

3) During Unit Testing

After completion of design & its reviews, software programmers start coding, turning the logical design into the physical construction of the software. During this coding stage, programmers conduct Unit Testing through a set of white box testing techniques. Unit Testing is also known as Module / Component / Program / Micro testing.
White box Testing:
There are three possible white box testing techniques:
1) Execution Testing
Basic path coverage: execution of all possible blocks in a program
Loops coverage: termination of loop statements
Programmer technique coverage: fewer memory cycles & CPU cycles
2) Operation Testing: running the application on customer-expected platforms

3) Mutation Testing


Mutation means a change in the program. White box testers make a change in the program to estimate the test coverage on that program. Mutation testing can decide whether the test coverage is adequate or not.
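As a toy illustration of the idea (hypothetical, not from the notes): mutate one operator in a program and check whether the existing test cases notice.

# A minimal, hypothetical illustration of mutation testing: we "mutate"
# an operator in a function and check whether the existing tests detect it.

def add(a, b):
    return a + b            # original program

def add_mutant(a, b):
    return a - b            # mutant: '+' changed to '-'

def test_suite(fn) -> bool:
    """Our existing test cases; returns True if all pass."""
    return fn(2, 3) == 5 and fn(0, 0) == 0

original_passes = test_suite(add)            # True: the build is good
mutant_killed = not test_suite(add_mutant)   # True: coverage is adequate

print("mutant killed:", mutant_killed)
# If the mutant had passed the suite, the test coverage would be inadequate.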
4) Integration Testing
After completing development & testing of the dependent modules, programmers combine them to form a system. During this integration, they conduct integration testing on the combined modules w.r.t the HLD.
There are three approaches to conduct Integration testing
a) Top-Down Approach

[Diagram: Main module calling Sub 1 & Sub 2, with a stub standing in for an under-construction sub-module.]
In this approach, testing is conducted on the main module without conducting testing on some of its sub-modules. As the diagram shows, a stub is a temporary program used in place of an under-construction sub-module; it is known as the called program.


b) Bottom-Up Approach
[Diagram: a driver standing in for the main module, calling Sub 1 & Sub 2.]
In this approach, testing is conducted on the sub-modules without conducting testing on the main module. As the diagram shows, a driver is a temporary program used in place of the main module; it is known as the calling program.
c) Sandwich or Hybrid Approach
[Diagram: a driver above Sub 1 & Sub 2 and a stub below, standing in for the main module and an unfinished Sub 3.]


In this approach, testing is conducted by combining both the Top-Down & Bottom-Up approaches. A code sketch of stubs and drivers follows below.
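A minimal sketch of both temporary programs; the interest-calculation modules are invented for illustration, under the assumption that the main module adds interest computed by a sub-module.

# Hypothetical sketch of a stub (top-down) and a driver (bottom-up).

def interest_stub(balance):
    """Stub: temporary 'called program' standing in for an
    under-construction interest-calculation sub-module."""
    return 0.0  # canned answer

def main_module(balance, interest_fn):
    """Main module under test; the sub-module is passed in."""
    return balance + interest_fn(balance)

# Top-Down: test the main module alone, with the stub plugged in.
assert main_module(1000, interest_stub) == 1000.0

def interest_sub(balance):
    """The real sub-module, finished before the main module."""
    return balance * 0.05

def driver():
    """Driver: temporary 'calling program' standing in for main."""
    assert interest_sub(1000) == 50.0

# Bottom-Up: test the sub-module alone, with the driver calling it.
driver()
print("stub and driver sketches passed")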
* Build: the final integrated set of all modules, compiled into *.exe form, is called a build.
4) Functional & System testing (* imp)

After the final integration of modules into a system, Test Engineers plan to conduct Functional & System testing through black box testing techniques. These techniques are classified into four categories:
1) Usability Testing
2) Functional Testing
3) Performance Testing
4) Security Testing
Of the above, 1 & 2 are core level and 3 & 4 are advanced level.
During Usability testing, the testing team validates the user-friendliness of the screens.
During Functional testing, the testing team validates the correctness of customer requirements.
During Performance testing, the testing team estimates the speed of processing.
During Security testing, the testing team validates the privacy of user operations.
1) Usability Testing
In general, the testing team starts test execution with Usability testing. During this test, the testing team validates the user-friendliness of the build's screens. Usability testing applies two types of sub-test:
a) User Interface Test
Ease of use (understandable screens)
Look & feel (attractiveness & pleasantness)
Speed in interface (fewer events to complete a task, easy short navigation)


b) Manual Support Test
Context sensitiveness of user manuals
The manual support test is conducted at the end of all testing & before release.
2) Functional Testing
The major part of black box testing is Functional testing. During this test, the testing team concentrates on meeting customer requirements. Functional testing is classified into the sub-tests below.
a) Functional / Requirement Testing
During this test, Test Engineers validate the correctness of every functionality in terms of the coverages below:
Behavioral coverage (changes in object properties)
Input domain coverage (size & type of every input & output object)
Error handling coverage (preventing negative navigation)
Calculations coverage (correctness of outputs)
Back-end coverage (impact of front-end operations on back-end table contents)
Service levels coverage (order of functionalities)
b) Input Domain Testing
It is a part of Functionality testing; Test Engineers maintain special structures to define the size & type of every input object.
c) Recovery Testing
It is also known as Reliability testing. During this test, the testing team validates whether the application changes from an abnormal state back to the normal state.
d) Compatibility Testing
It is also known as Portability testing. During this test, the testing team validates whether the application build runs on the customer's expected

platforms or not. During this test, Test Engineers mostly find backward compatibility issues.
Forward compatibility -> the application is ready to run, but the operating system does not support it.
Backward compatibility -> the operating system supports it, but the application has internal coding problems that prevent it from running on the operating system.
e) Configuration Testing
It is also known as Hardware compatibility testing. During this test, the testing team validates whether the application build supports hardware devices of different technologies.
f) Inter-Systems Testing
During this test, the testing team validates whether the application build can co-exist with other existing software, and also tests whether any deadlock situations occur.

g) Installation Testing
During this test, the testing team installs the application build along with its supporting software onto customer-site-like configured systems. The team observes the factors below:
Setup program execution to start the installation
Easy interface during installation
Amount of disk space occupied after installation
h) Parallel / Comparative Testing
During this test, the testing team compares the application build with competitive products in the market.
i) Sanitation / Garbage Testing
During this test, the testing team tries to find extra features in the application build w.r.t customer requirements.


* Defects
During testing, the testing team reports defects to developers in the categories below:
1. Mismatches between expected & actual behavior
2. Missing functionality
3. Extra functionality w.r.t customer requirements
* Manual v/s Automation
When a tester conducts a test on the application build without using any testing tool, it is called manual testing; if a testing tool is used, it is called automation testing.
In the common testing process, Test Engineers apply test automation w.r.t test impact & criticality. Impact -> test repetition; criticality -> complexity of applying the test manually. For these two reasons, testing people use test automation.
j) Re-testing
The re-execution of a test with multiple test data to validate a function. E.g., to validate multiplication, Test Engineers use different combinations of input in terms of min, max, -ve, +ve, zero, int, float, etc.

k) Regression Testing
The re-execution of tests on a modified build to confirm the bug-fixing work & to check for any side effects. Test Engineers conduct this test using automation.
l) Error, Defect & Bug
A mistake in the code is an error. Due to errors in coding, Test Engineers find mismatches in the application build, called defects. If a defect is accepted by the developers to be solved, it is a bug.


Testing Documents


Test Policy -> Test Strategy -> Test Methodology -> Test Plan -> Test Cases -> Test Procedure -> Test Script -> Defect Report -> Final Test Summary Report
The hierarchy above shows the various levels of documents prepared during project testing. The Test Policy is documented by Quality Control. The Test Strategy & Test Methodology are documented by the Quality Analyst or Project Manager. The Test Plan, Test Cases, Test Procedure, Test Script & Defect Report are documented by Quality Assurance Engineers or Test Engineers.
The Test Policy & Test Strategy are company-level documents. The Test Methodology, Test Plan, Test Cases, Test Procedure, Test Script, Defect Report & Final Test Summary Report are project-level documents.


1) TEST POLICY:
This document is developed by Quality Control people (management). In this document, Quality Control defines the testing objective.

Test Policy Document
Address of the company
Test Definition: Verification & Validation
Testing Process: proper planning before testing starts
Testing Standards: one defect per 250 lines of code or per 10 function points (FP)
Testing Measurements: QAM, TTM, PCM
(Signed by the C.E.O)

QAM: Quality Assurance Measurements (how much quality is expected)
TTM: Testing Team Measurements (how much testing is over & how much is yet to complete)
PCM: Process Capability Measurements (carrying lessons from old projects into upcoming projects)
2) TEST STRATEGY:
This is a company-level document developed by Quality Analyst or Project Manager category people. It defines the testing approach.
Components:


a) Scope & Objective: definition & purpose of testing in the organization
b) Business Issues: budget control for testing
c) Test Approach: mapping between development stages & testing issues
d) Test Deliverables: required testing documents to be prepared
e) Roles & Responsibilities: names of the jobs in the testing team & their responsibilities
f) Communication & Status Reporting: required negotiation between the testing team & the development team during test execution
g) Automation & Testing Tools: purpose of automation & the possibility of adopting test automation
h) Testing Measurements & Metrics: QAM, TTM, PCM
i) Risks & Mitigation: possible problems that may arise in testing & solutions to overcome them
j) Change & Configuration Management: how to handle change requests during testing
k) Training Plan: required training sessions for the testing team before the testing process starts
Testing Issues:
1. Authorization: whether a user is valid or not to connect to the application
2. Access Control: whether a valid user has permission to use a specific service
3. Audit Trail: maintains metadata about user operations in our application
4. Continuity of Processing: inter-process communication
5. Correctness: meets customer requirements in terms of functionality
6. Coupling: co-existence with other existing software to share resources
7. Ease of Use: user-friendliness of the screens
8. Ease of Operation: installation, un-installation, dumping, uploading, downloading, etc.
9. File Integrity: creation of backups
10. Reliability: recovery from abnormal states
11. Performance: speed of processing
12. Portability: runs on different platforms
13. Service Levels: order of functionalities
14. Maintainability: whether our application build is long-term serviceable to our customer


15. Methodology: whether our testers follow standards or not during testing
3) TEST METHODOLOGY:
It is a project-level document. The methodology provides the required testing approach to be followed for the current project. At this level, the Quality Analyst selects the possible approach for the corresponding project's testing.

Test Initiation -> Test Planning -> Test Designing -> Test Execution -> Test Closure -> Test Reporting
PET Process:
The process involves experts, tools & techniques. It is a refinement form of the V-Model and defines the mapping between development & testing stages. Following this model, organizations maintain a separate team for Functional & System testing, while the remaining stages of testing are done by development people. This model was developed in HCL & recognized by the QA Forum of India.

TESTING PROCESS

[Testing process flow, reconstructed from the figure:]
Development side: Information Gathering (Business Requirement Specification) -> Analysis (Software Requirement Specification) -> Design -> Coding -> Unit & Integration testing
Testing side: Test Initiation -> Test Planning & Training -> Test Design -> Test Cases Selection
Level 0: Sanity / Smoke / Tester Acceptance Test / Build Verification Test
Test Automation: create test suites / test batches / test sets
Level 1: select a batch & start execution; if a mismatch is found, suspend the batch and send it to the developers for defect fixing & resolving; otherwise continue
Test Closure
Level 3: Final Regression / Releasing Testing / Pre-Acceptance / Post-Mortem testing
User Acceptance Testing -> Sign Off

4) TEST PLANNING:
After finalizing the possible tests for the current project, Test Lead category people concentrate on preparing the test plan document to define work allocation in terms of what, who, when & how to test. To prepare the test plan document, the test plan author follows the approach below:

Inputs: development documents & the Test Responsibility Matrix (TRM)
Steps: 1] Team Formation -> 2] Identify Tactical Risks -> 3] Prepare Test Plan -> 4] Review Test Plan
Output: System Test Plan

1] Team Formation:

In general, the test planning process starts with testing team formation. To define a testing team, the test plan author depends on the factors below:
1. Availability of testers
2. Test duration
3. Availability of test environment resources
2] Identify Tactical Risks:


After testing team formation, the plan author analyzes possible risks & their mitigations.
# Risk 1: Lack of knowledge of Test Engineers on that domain
# Soln 1: Extra training for Test Engineers
# Risk 2: Lack of resources
# Risk 3: Lack of budget (too little time)
# Soln 3: Increase team size
# Risk 4: Lack of test data
# Soln 4: Conduct tests on a past-experience basis, i.e. ad hoc testing, or contact the client for data
# Risk 5: Lack of development process rigor
# Soln 5: Report to the Test Lead for further communication between the testing & development PMs
# Risk 6: Delay of modified build delivery
# Soln 6: Extra hours of work are needed
# Risk 7: Lack of communication between Test Engineer -> Test team and Test team -> Development team
3] PREPARE TEST PLAN:
After completion of testing team formation & risk analysis, the test plan author concentrates on the test plan document, in IEEE format.
01) Test Plan ID: unique number or name, e.g. STP-ATM
02) Introduction: description of the project
03) Test Items: modules / functions / services / features, etc.
04) Features to be tested: modules selected for test design (preparing test cases for newly added modules)
05) Features not to be tested: which features are not to be tested and why (e.g. test cases already exist for old modules, so those modules need no new test cases)

Above, (3), (4) & (5) decide which modules to test -> What to test?
06) Approach: list of selected testing techniques to be applied to the above specified modules, in reference to the TRM (Test Responsibility Matrix)
07) Feature pass or fail criteria: description of when a feature passes or fails (conclusion reached after testing, when the environment is good)
08) Suspension criteria: possible abnormal situations that arise during the testing of the above features (conclusion reached during testing, when the environment is not good)
09) Test Environment: required software & hardware for testing the above features
10) Test Deliverables: required testing documents to be prepared by the testers during testing
11) Testing Tasks: necessary tasks to do before starting each feature's testing
Above, (6) to (11) specify -> How to test?
12) Staff & Training: names of the selected Test Engineers & their training requirements
13) Responsibilities: work allocation to every member of the team (dependent modules are given to a single Test Engineer)
14) Schedule: dates & times for testing the modules
Above, (12) to (14) specify -> Who & when to test?
15) Risks & Mitigations: possible testing-level risks & the solutions to overcome them
16) Approvals: signatures of the test plan author & Project Manager / Quality Analyst
4] Review Test Plan:


After completing the plan document preparation, the test plan author conducts a review of it for completeness & correctness. In this review, the plan author follows the coverage analysis below:
BRS-based coverage (What to test? review)
Risk-based coverage (Who & when to test? review)
TRM-based coverage (How to test? review)
5) TEST DESIGNING:

After completing test planning & the required training for the testing team, the testing team members start preparing the list of test cases for their responsible modules. There are three test case design methods to cover core-level testing (usability & functionality testing):
a) Business logic based test case design (S/w RS)
b) Input domain based test case design (E-R diagrams / data models)
c) User interface based test case design (MS-Windows rules)
a) Business Logic based Test Case design (S/w RS)
In general, Test Engineers prepare a set of test cases based on the use cases in the S/w RS. Every use case describes a functionality in terms of inputs, process & outputs; based on these use cases, Test Engineers prepare test cases to validate the functionality.


BRS -> S/w RS (use cases & functional specifications) -> HLDs -> LLDs -> Coding -> *.exe
Use cases -> test cases
From the model above, Test Engineers prepare test cases based on the corresponding use cases, & every test case defines a test condition to be applied.
To prepare test cases, Test Engineers study the use cases with the approach below:
Steps:
1) Collect the use cases of your responsible module
2) Select a use case & its dependencies from the list
2.1) Identify the entry condition (base state)
2.2) Identify the inputs required (test data)
2.3) Identify the exit condition (end state)
2.4) Identify the output & outcome (expected)
2.5) Identify the normal flow (navigation)
2.6) Identify the alternative flows & exceptions
3) Write test cases based on the above information
4) Review the test cases for completeness & correctness
5) Go to step (2) until all use cases are covered
Use Case I:


A login process takes a user id & password to validate users. During these validations, the login process allows a user id of alphanumeric characters, 4 to 16 characters long, & a password of lowercase alphabets, 4 to 8 characters long.
Case study:
Test Case 1) Successful entry of user id
BVA (size):
min    -> 4 chars  => pass
min-1  -> 3 chars  => fail
min+1  -> 5 chars  => pass
max-1  -> 15 chars => pass
max    -> 16 chars => pass
max+1  -> 17 chars => fail
ECP (type):
Valid: a-z, A-Z, 0-9
Invalid: special chars, blank
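The same BVA & ECP analysis can be checked mechanically. A minimal sketch against the user id rule above (the validator itself is hypothetical, written only to match the stated rule):

import string

# BVA test data for the user id rule (alphanumeric, 4 to 16 characters),
# plus ECP checks on the valid/invalid character classes.

MIN, MAX = 4, 16

def is_valid_user_id(value: str) -> bool:
    ok_chars = set(string.ascii_letters + string.digits)
    return MIN <= len(value) <= MAX and all(c in ok_chars for c in value)

# BVA on size: min-1, min, min+1, max-1, max, max+1
bva_cases = {3: False, 4: True, 5: True, 15: True, 16: True, 17: False}
for size, expected in bva_cases.items():
    assert is_valid_user_id("a" * size) is expected, size

# ECP on type: valid classes a-z, A-Z, 0-9; invalid: special chars, blank
assert is_valid_user_id("user01")
assert not is_valid_user_id("user@01")   # special character
assert not is_valid_user_id("")          # blank
print("all BVA/ECP checks passed")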

Test Case Format:
During test design, Test Engineers write the list of test cases in IEEE format:
01) Test Case ID: unique number or name
02) Test Case Name: name of the test condition to be tested
03) Feature to be tested: module / function / feature
04) Test Suite ID: the batch ID of which this case is a member
05) Priority: importance of the test case (low, medium, high)
P0 -> basic functionality
P1 -> general functionality (input domain, error handling, compatibility, etc.)
P2 -> cosmetic testing (UIT)
06) Test Environment: required software & hardware to execute this case


07) Test effort (person/hr): time to execute this test case, e.g. 20 minutes
08) Test duration: date & time
09) Test setup: required testing tasks to do before starting case execution (pre-requisites)
10) Test procedure: step-by-step procedure to execute the test case

Test Procedure Format:
1) Step No
2) Action
3) Input required
4) Expected
5) Actual
6) Result
7) Comments
Fields (1) to (4) are filled during test design; fields (5) to (7) are filled during test execution.
11) Test case pass or fail criteria: when this case passes or fails
Note: Test Engineers follow the list of test cases along with their step-by-step procedures only.
Example 1:
Prepare the test procedure for the test case 'Successful file save operation in Notepad'.

Step 1: Open Notepad. Expected: empty editor.
Step 2: Fill the editor with text. Expected: Save icon enabled.
Step 3: Click the Save icon, or open the File menu & select the Save option. Expected: Save dialog box appears with a default file name.
Step 4: Enter a file name & click Save. Input required: a unique file name. Expected: focus returns to Notepad & the file name appears in Notepad's title bar.


Note: For more examples refer to notes


b) Input Domain based Test Case design (E-R diagrams / Data Models)
In general, Test Engineers prepare most test cases based on the use cases or functional requirements in the S/w RS. These functional specifications provide functional descriptions with inputs, outputs & process, but they do not provide information about the size & type of input objects. To collect this type of information, Test Engineers study the data model (E-R diagram) of their responsible modules. During the data model study, a Test Engineer follows the approach below:
Steps:
1) Collect the data model of the responsible modules
2) Study every input attribute in terms of size, type & constraints
3) Identify the critical attributes, which participate in manipulations & retrievals
4) Identify the non-critical attributes, which are input / output only
Example:
A/C No: critical
A/C Name: non-critical
Balance: critical
Address: non-critical
5) Prepare BVA & ECP for every input object

DATA MATRIX
Input Attribute | ECP Valid | ECP Invalid | BVA (size/range) Minimum | BVA (size/range) Maximum
xxxx            | xxxx      | xxxx        | xxxx                     | xxxx

Note: In general, Test Engineers prepare step-by-step procedure-based test cases for functionality testing, and valid/invalid table-based test cases (the data matrix) for input domain testing of objects.
Note: For examples refer to notes
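A data matrix lends itself to table-driven checking. A minimal sketch, with hypothetical attributes and rules (the names and patterns below are illustrative, not from the notes):

import re

# One row per input attribute: an ECP character class plus BVA size limits.
DATA_MATRIX = {
    # attribute: (valid character class, min size, max size)
    "account_no":   (r"[0-9]", 6, 6),
    "account_name": (r"[a-z]", 1, 8),
}

def check(attribute: str, value: str) -> bool:
    """Validate a value against its data matrix row (size, then type)."""
    pattern, lo, hi = DATA_MATRIX[attribute]
    if not (lo <= len(value) <= hi):
        return False
    return re.fullmatch(f"(?:{pattern})+", value) is not None

assert check("account_no", "123456")
assert not check("account_no", "12a456")   # invalid type (ECP)
assert not check("account_no", "12345")    # below minimum size (BVA)
print("data matrix checks passed")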
c) User Interface based test case design (MS-Windows rules)
To conduct usability testing, Test Engineers prepare a list of test cases based on our organization's user interface standards or conventions, global user interface rules & the interests of customer-site people.
Example test cases:
1) Spelling check
2) Graphics check (screen-level alignment, font style, color, size & Microsoft's six rules)
3) Meaning of error messages
4) Accuracy of data displayed
5) Accuracy of data in the database resulting from user inputs; if the developer restricts the data at the database level by rounding / truncating, then the developer must also restrict the data in the front-end as well
6) Accuracy of data in the database resulting from external factors, e.g. file attachments
7) Meaningful help messages (manual support testing)

Review Test Cases:
After preparing all possible test cases for their responsible modules, the testing team concentrates on reviewing the test cases for completeness & correctness. In this review, the testing team applies coverage analysis:
1) BR-based coverage
2) Use-case-based coverage
3) Data-model-based coverage
4) User-interface-based coverage
5) TRM-based coverage


At the end of this review, the Test Lead prepares the Requirement Traceability Matrix or Requirement Validation Matrix (RTM / RVM):

Business Requirement | Source (Use Cases, Data model) | Test Cases
xxxx                 | xxxx                           | xxxx
xxxx                 | xxxx                           | xxxx
xxxx                 | xxxx                           | xxxx
The RTM / RVM defines the mapping between customer requirements & the test cases prepared to validate those requirements.
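In code form, an RTM can be as simple as a list of mappings scanned for uncovered requirements; a hypothetical sketch (requirement names invented for illustration):

# Each business requirement maps to its source and covering test cases.
rtm = [
    {"requirement": "BR-01 login",  "source": "Use case 1",
     "test_cases": ["TC-2", "TC-3", "TC-4"]},
    {"requirement": "BR-02 search", "source": "Use case 2",
     "test_cases": []},
]

# A requirement with no test cases is a traceability gap.
uncovered = [row["requirement"] for row in rtm if not row["test_cases"]]
print("requirements without test cases:", uncovered)   # ['BR-02 search']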
6) TEST EXECUTION:
After completing test case selection & review, the testing team concentrates on the build release from the development side & test execution on that build.
a) Test execution levels or phases: see the figure below for a clearer picture of these levels.
b) Test execution levels v/s test cases:
Level 0 -> P0 test cases
Level 1 -> all P0, P1 & P2 test cases, as batches
Level 2 -> selected P0, P1 & P2 test cases w.r.t the modification
Level 3 -> selected P0, P1 & P2 test cases w.r.t critical areas in the master build


A) Test execution levels or phases:

Initial build -> Level 0 (Sanity / Smoke) -> stable build -> Level 1 (Comprehensive) -> defect report -> defect fixing & resolving -> modified build -> Level 2 (Regression) -> Level 3 (Final Regression)

c) Build Version Control:
In general, Test Engineers receive builds from development in the model below:

Development server -> build (transferred via FTP, File Transfer Protocol) -> testing environment -> testers

During test execution, Test Engineers receive modified builds from the developers. To distinguish old & new builds, the development team maintains a unique version number for each build, understandable to the testing team. For this version controlling, developers use version control tools (e.g. Visual SourceSafe).
d) Level 0 (Sanity / Test Acceptance / Build Verification Test):

After receiving the initial build, Test Engineers concentrate on the basic functionality of that build to estimate its stability for complete testing. In this sanity testing, Test Engineers try to execute all P0 test cases to cover the basic functionality. If functionality is not working, or functionality is missing, the testing team rejects that build. If the testers decide the build is stable, they concentrate on executing all test cases to detect defects.


During this sanity testing, Test Engineers observe the factors below on that build:
1) Understandable
2) Operable
3) Observable
4) Consistent
5) Controllable
6) Simple
7) Maintainable
8) Automatable
Because of these 8 testability issues, the sanity test is also known as testability testing / octangle testing.
e) Test Automation:
If test automation is possible, the testing team concentrates on test script creation using the corresponding testing tools. Every test script consists of navigation statements along with the required checkpoints.
Stable build -> select tests to automate (all P0 & carefully selected P1 test cases) -> test automation
f) Level 1 (Comprehensive testing)
After completing sanity testing & the possible test automation, the testing team concentrates on forming test batches of dependent test cases. Test batches are also known as test suites / test sets. During the execution of these test batches, Test Engineers prepare a test log document; this document consists of three types of entries:
1) Passed (expected = actual)
2) Failed (any one expected != actual; any one expected value varies from the actual)
3) Blocked (the corresponding parent functionality failed)
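The three entry types follow directly from comparing expected & actual values; a hypothetical sketch:

# Passed when every expected value equals the actual one, Failed on any
# variance, Blocked when the parent functionality already failed.

def log_entry(expected: list, actual: list, parent_failed: bool = False) -> str:
    if parent_failed:
        return "Blocked"
    return "Passed" if expected == actual else "Failed"

print(log_entry(["saved"], ["saved"]))               # Passed
print(log_entry(["saved"], ["error"]))               # Failed
print(log_entry(["saved"], [], parent_failed=True))  # Blocked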


[Test case statuses during batch execution: In queue -> In progress -> Passed / Failed / Blocked / Skipped / Partial pass-fail -> Closed]

g) Level 2 (Regression testing)

During comprehensive test execution, Test Engineers report mismatches as defects to the developers. After receiving a modified build from the developers, Test Engineers concentrate on regression testing to confirm the bug-fixing work & to check for side effects.

The depth of regression depends on the resolved bug's severity: low, medium or high.

Case I:
If the development team resolves a bug whose severity is high, Test Engineers re-execute all P0, all P1 & carefully selected P2 test cases on the modified build.
Case II:
If the bug's severity is medium, then all P0, carefully selected P1 & some P2 test cases.
Case III:
If the bug's severity is low, then some P0, P1 & P2 test cases.
Case IV:
If the development team releases a modified build due to sudden changes in the project requirements, Test Engineers re-execute all P0, all P1 & carefully selected P2 test cases w.r.t that requirement modification.
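Cases I to III amount to a selection rule over the priority groups. A hypothetical sketch (the "carefully selected" subsets are simulated here with random sampling, which is an assumption, not the notes' method):

import random

def select_for_regression(cases_by_priority: dict, severity: str) -> list:
    """Pick regression test cases per the resolved bug's severity."""
    p0, p1, p2 = (cases_by_priority[p] for p in ("P0", "P1", "P2"))
    sample = lambda xs: random.sample(xs, (len(xs) + 1) // 2)  # "selected"
    if severity == "high":      # all P0, all P1, carefully selected P2
        return p0 + p1 + sample(p2)
    if severity == "medium":    # all P0, selected P1, some P2
        return p0 + sample(p1) + sample(p2)
    return sample(p0) + sample(p1) + sample(p2)  # low: some of each

suite = {"P0": ["t1", "t2"], "P1": ["t3", "t4"], "P2": ["t5", "t6"]}
print(select_for_regression(suite, "high"))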
h) Level 3 (Final Regression / Pre-Acceptance testing)

Gather regression requirements -> effort estimation -> plan regression -> final regression -> test reporting

7) TEST REPORTING:
Test reporting happens at every level: Level 0, Level 1, Level 2 & Level 3.
During comprehensive testing, Test Engineers report mismatches as defects to the developers in IEEE format:
1) Defect ID: unique number or name
2) Description: summary of the defect
3) Feature: the module / function / service in which the Test Engineer found the defect
4) Test Case Name: the corresponding failed test condition


5) Reproducible (Yes / No): Yes -> the defect appears every time during test execution; No -> the defect appears rarely
6) If yes, attach the test procedure
7) If no, attach a snapshot & strong reasons
8) Status: New / Reopen
9) Severity: seriousness of the defect w.r.t functionality (high / medium / low)
10) Priority: importance of the defect w.r.t the customer (high / medium / low)
11) Reported by: name of the Test Engineer
12) Reported on: date of submission
13) Assigned to: name of the responsible person in the development team -> PM
14) Build Version ID: the build in which the Test Engineer found the defect
15) Suggested fix (optional): the tester tries to offer a suggestion to solve this defect
(The fields below are filled in by the developers:)
16) Fixed by: PM or Team Lead
17) Resolved by: developer name
18) Resolved on: date of solving
19) Resolution type: see the list below
20) Approved by: signature of the Project Manager (PM)
Defect Age: the time gap between the reported-on & resolved-on dates
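The report fields map naturally onto a record type, with defect age derived from the two dates. A hypothetical sketch (Python 3.10+; field names chosen for illustration):

from dataclasses import dataclass, field
from datetime import date

@dataclass
class DefectReport:
    defect_id: str
    description: str
    feature: str
    severity: str              # high / medium / low, w.r.t functionality
    priority: str              # high / medium / low, w.r.t the customer
    status: str = "New"        # New / Reopen
    reported_on: date = field(default_factory=date.today)
    resolved_on: date | None = None

    def defect_age(self) -> int | None:
        """Days between the reported-on & resolved-on dates."""
        if self.resolved_on is None:
            return None
        return (self.resolved_on - self.reported_on).days

bug = DefectReport("DR-101", "Wrong total in invoice", "Billing",
                   severity="high", priority="high",
                   reported_on=date(2006, 1, 2), resolved_on=date(2006, 1, 5))
print(bug.defect_age())  # 3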
Defect submission process:
Test Engineer -> Test Lead -> Test Manager -> Project Manager -> Team Lead -> Developer
If a high-severity defect is rejected, the Quality Analyst steps in to mediate.

Defect Status Cycle:
New -> Open / Rejected / Deferred -> Closed -> Reopen (if the defect reappears)
Deferred => accepted, but the team is not interested in resolving it in this version
Defect Resolution:
After receiving a defect report from the testers, the developers review the defect & send a resolution type to the tester as a reply:
1) Duplicate: rejected because the defect is the same as a previously reported defect
2) Enhancement: rejected because the defect relates to a future requirement of the customer
3) H/w limitation: rejected because the defect arises from a limitation of the hardware devices
4) S/w limitation: rejected because the defect arises from a limitation of the software technology
5) Not applicable: rejected because the defect has no proper meaning
6) Functions as designed: rejected because the coding is correct w.r.t the design documents
7) Need more information: neither accepted nor rejected; the developers require extra information to understand the defect
8) Not reproducible: neither accepted nor rejected; the developers require the correct procedure to reproduce the defect
9) No plan to fix it: neither accepted nor rejected; the developers want extra time to fix it
10) Fixed: the developers accept it and will resolve it
11) Fixed indirectly: accepted, but not to be resolved in this version (deferred)
12) User misunderstanding: needs extra negotiation between the testing & development teams
Types of defects:
01) User interface bugs (low severity):
1) Spelling mistakes (high priority)
2) Improper alignment (low priority)
02) Boundary-related bugs (medium severity):
1) Doesn't allow a valid type (high priority)
2) Allows an invalid type as well (low priority)
03) Error handling bugs (medium severity):

1) Doesn't provide an error message window (high priority)
2) Improper meaning of the error message (low priority)
04) Calculation bugs (high severity):
1) Final output is wrong (low priority)
2) Dependent results are wrong (high priority)
05) Race condition bugs (high severity):
1) Deadlock (high priority)
2) Improper order of services (low priority)
06) Load condition bugs (high severity):
1) Doesn't allow multiple users to access / operate (high priority)
2) Doesn't allow the customer's expected load (low priority)
07) Hardware bugs (high severity):
1) Doesn't handle a device (high priority)
2) Wrong output from a device (low priority)
08) ID control bugs (medium severity):
1) Logo missing, wrong logo, version number mistake, copyright window missing, developer names missing, tester names missing
09) Version control bugs (medium severity):
1) Differences between two consecutive build versions
10) Source bugs (medium severity):
1) Mistakes in help documents (manual support)
8) TEST CLOSURE:
After completing all possible test cycle executions, the Test Lead conducts a review to estimate the completeness & correctness of the testing. In this review, the Test Lead covers the factors below with the Test Engineers:
1) Coverage analysis
a) BR-based coverage
b) Use-case-based coverage
c) Data-model-based coverage
d) User-interface-based coverage
e) TRM-based coverage

2) Bug density
a) Module A has 20% of the bugs found

b) Module B has 20% of the bugs found
c) Module C has 40% of the bugs found
d) Module D has 20% of the bugs found
3) Analysis of deferred bugs
Whether the deferred bugs are really deferrable or not.
At the end of this review, the testing team concentrates on the high-bug-density modules, or on all modules if time is available.
9) User Acceptance Testing (UAT)
The organization's management concentrates on UAT to collect feedback. There are two approaches to conduct this testing:
1. Alpha (α) test
2. Beta (β) test
10) Sign Off
After completing user acceptance testing & the resulting modifications, the Test Lead concentrates on creating the final test summary report. It is part of the Software Release Note (S/w RN). This final test summary report consists of the documents below:
1) Test Strategy / Methodology (TRM)
2) System Test Plan
3) Requirement Traceability Matrix (RTM)
4) Automated Test Scripts
5) Bug Summary Reports

A bug summary report is laid out horizontally with the columns:
Bug Description | Feature | Found By | Status (closed / deferred) | Comments
CASE STUDY ON A PROJECT TESTING PROCESS

X is a client/server product. It runs on a single computer with Windows 2000 as the OS. This product provides a facility to search for matching records in an existing database, w.r.t the given search keys.


[Architecture: client -> DSN -> DB, all on the local host running Windows 2000]
This product maintains a default administrator to create new users, and every valid user can search data in the database.
Activity flow diagram:

Login -> Admin -> new user creation
Login -> Admin (or) valid user -> search records (search keys -> existing DB)
Login -> invalid user -> re-login
FUNCTIONAL POINTS:
Login takes a user id and password.
The user id and password allow lowercase alphabets, 4 to 8 characters long.
New user ids are created by the administrator only.

The new user creation window allows a unique user id and password, with Create and Cancel buttons.
Search records is open to valid users only.
The search records window maintains the search keys below:
Customer id: 6-digit number (collected from the design document)
First name: one character, upper or lower case
Last name: one to eight lowercase characters
Date of birth: dd\mm\yy, numeric
Age: from and to, numeric
The search records window allows the combinations of fields below to search records:
Last name only
Last name with first name
Last name with first name and d-o-b
Last name with first name and age
Customer id only
The search records window consists of Start Search and Stop Search buttons.
Allows last name as full or partial.
Allows customer id as full or partial with * as a wild card, e.g. 2286*, 86*.
Refresh the search records window using Alt+Ctrl.
Displays the matched records in a pop-up window with an OK button after search completion.
Returns a message like 'too many matches to display' when the matched records exceed 1000.
Test methodology (by the PM):
Testing factors mapped to testing stages; each of the factors below is applied at the System testing stage:
Authorization, Access control, Audit trail, Continuity of processing, Correctness, Coupling, Data integrity, Ease of use, Ease of operation, Reliability, Portability, Performance, Service levels, Maintainability, Methodology
Note: the resulting test responsibility matrix is 11 by 1 (11 selected factors, one testing stage), because:
The X product maintains its own metadata, so it does not share resources with other applications.
This application is going to run on Windows 2000 only.
It is a stand-alone application (one user at a time can operate it), so there is no need to test load and stress.
System test plan (by the test lead):
1. Test plan ID: STP_X
2. Introduction: X is a product that provides a facility to connect to an existing database and search for matching records to retrieve.
3. Test items: user creation; login by user; search records
4. Features to be tested: user creation; login by user; search records
5. Features not to be tested: none
6. Approach: to apply the PM-selected factors from the TRM, the black box testing techniques below are suitable, in the test lead's judgment:
User interface testing
Manual support testing
Functionality testing
Input domain testing
Recovery testing
Sanitation testing
Installation testing
Security testing
Compliance testing


7. Entry criteria:
Are the necessary documents available?
Is the X product ready for release from the developers?
Is the supporting database available to search the required records?
Is the test environment ready?
8. Suspension criteria:
A database disconnect may require suspension of testing.
Suspension of testing is mandatory when the record-search process goes infinite.
If the admin fails at new user creation, a decision can be made to continue testing the search records module using the admin or other existing valid users.
9. Exit criteria:
Ensure that the X product provides the required services.
Ensure that all test documents are complete and up to date.
All high-severity bugs are resolved.

10. Test deliverables (by the Test Engineer):
Test cases
Test procedures
Automated test scripts
Test log
Defect report
11. Test environment:
Client PC or local host
OS: Windows 2000
DB server: Oracle / SQL Server / MS-Access
Connectivity: DSN / thin drivers
12. Testing tasks:
Availability of the admin account with its password
Database populated with records
Valid users are able to connect to the database to search records
13. Staff and training needs:
Test Engineer / QA Engineer: Kurugonda Srinaiah
14. Responsibilities:


Document / Report          | Responsibility             | Effort
Test cases with procedures | K. Srinaiah                | 12 hours
Defect report              | K. Srinaiah                | every day during testing
Test completion reports    | Test Lead with K. Srinaiah | at the end of testing

15. Schedule:
Task                      | Effort   | Start date | End date
Test design               | 12 hours | 28-12-2005 | 29-12-2005
Implement test and review | 4 hours  | 29-12-2005 | 29-12-2005
Execute test              | 24 hours | 30-12-2005 | 01-01-2006
Evaluate test             | 8 hours  | 02-01-2006 | 02-01-2006

16. Risks and mitigations:
1. Lack of documentation:
Contact the business analyst
Ad-hoc testing
Contact customer-site people if possible
2. Lack of time:
Over-time
3. Delays in delivery:
Over-time


4. Lack of development process rigor:
Contact the test lead to motivate the developers
Over-time to complete tasks at the right time
17. Approvals: signatures of the PM and the test lead.

Test case 1:
Test case 2: Successful entry of user id
BVA (size):
min = 4 chars: pass
max = 8 chars: pass
min-1 = 3 chars: fail
min+1 = 5 chars: pass
max-1 = 7 chars: pass
max+1 = 9 chars: fail
Test case 3: Successful entry of password
BVA (size):
min = 4 chars: pass
max = 8 chars: pass
min-1 = 3 chars: fail
min+1 = 5 chars: pass
max-1 = 7 chars: pass
max+1 = 9 chars: fail
Test case 4: Successful login operation
User id       | Password | Criteria
admin         | valid    | pass
admin         | invalid  | fail
other valid   | valid    | pass
other valid   | invalid  | fail
other invalid | invalid  | fail
blank         | value    | fail
valid         | blank    | fail
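Test case 4 is a decision table, which can be replayed against a toy login function; a hypothetical sketch (the user data is invented for illustration):

# Decision-table checks for the login operation.
USERS = {"admin": "secret01", "kumar": "letmein1"}   # assumed sample data

def login(user_id: str, password: str) -> bool:
    return bool(user_id) and bool(password) and USERS.get(user_id) == password

table = [
    ("admin", "secret01", True),    # admin, valid password      -> pass
    ("admin", "wrongpwd", False),   # admin, invalid password    -> fail
    ("kumar", "letmein1", True),    # other valid user, valid    -> pass
    ("kumar", "wrongpwd", False),   # other valid user, invalid  -> fail
    ("nobody", "whatever", False),  # invalid user               -> fail
    ("", "secret01", False),        # blank user id              -> fail
    ("admin", "", False),           # blank password             -> fail
]

for user_id, password, expected in table:
    assert login(user_id, password) is expected, (user_id, password)
print("all login decision-table checks passed")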

Test case 5: Successful selection of the new user creation option
Test case 6: Unsuccessful selection of this option because the current user is not the admin
Test case 7: Successful entry of user id in the create user window (as in Test case 2)
Test case 8: Successful entry of password in the create user window (as in Test case 3)
Test case 9: Successful creation of a new user by the admin
Test case 10: Unsuccessful new user creation because the given user id is not unique
Test case 11: Successful closing of the new user creation window using Cancel (after entering the user id and/or password)
Test case 12: Successful selection of the search records option
Test case 13: Successful entry of customer id
BVA (size): min = max = 6 digits
ECP (type):
Valid: 0-9; full or partial id with a trailing * wild card (e.g. 2286*)
Invalid: a-z, A-Z, special characters except *, blank space; * at the start or in the middle (e.g. *23, 1*23X)
Test case 14: Successful entry of first name
BVA (size): min = max = 1 char: pass; 0 chars: fail
ECP (type): valid: a-z, A-Z; invalid: 0-9, special characters
Test case 15: Successful entry of last name
BVA (size): min = 1, max = 8 chars: pass
ECP (type): valid: a-z; invalid: A-Z, 0-9, special characters

Test case 16: Successful entry of DOB
BVA (range): day: min 01, max 31; month: min 01, max 12; year: min 00, max 99
ECP (type): valid: 0-9; invalid: a-z, A-Z, special characters
Test case 17: Successful entry of 'age from'
BVA (range): min 01, max 22
ECP (type): valid: 0-9; invalid: a-z, A-Z, special characters

Test case 18: Successful entry of 'age to' (as above)
Test case 19: Successful display of matched records with a full last name and all other fields blank


Test case 20: Successful display of matched records with full last name and first name
Test case 21: Successful display of matched records with full last name, first name and dob
Test case 22: Successful display of matched records with full last name, first name and age
Test case 23: Unsuccessful search operation due to an invalid combination of filled fields
Test case 24: Unsuccessful search operation due to an invalid dob
Test case 25: Unsuccessful search operation due to 'age from' greater than 'age to'
Test case 26: Successful display of records with the search key as a partial last name and the other fields blank
Test case 27: Successful display of records with the search keys as a partial last name and first name
Test case 28: Successful display of records with the search keys as a partial last name, first name and dob
Test case 29: Successful display of records with the search keys as a partial last name, first name and age
Test case 30: Successful display of records with the search key as a customer id
Test case 31: Successful display of records with the search key as a partial customer id with * as a wild card
Test case 32: Unsuccessful display of records due to no matching records in the database w.r.t the given search keys
Test case 33: Unsuccessful display of records due to 'too many records to display' when the number of matched records is greater than 1000
Test case 34: Successful refresh of the search window with Alt+Ctrl
Test case 35: Successful termination of the search operation when the Stop Search button is clicked


Test case 36: Successful closing of the records window by clicking OK after searching
Test case 37: Spelling check on every screen (usability testing)
Test case 38: Graphics check on every screen, e.g. alignment, font, size, labels, color, etc. (usability testing)
Test case 39: Accuracy of data displayed, e.g. dob in dd\mm\yy (usability testing)
Test case 40: Meaningful help messages and error messages (usability testing)
