SOFTWARE QUALITY
1. Meet customer requirements in terms of functionality
2. Meet customer expectations in terms of performance, reusability, compatibility
3. Cost to purchase by customers
4. Time to release by development organization
Testing
Agile Model: Used when customer requirements change frequently. E.g. a mobile application.
Naga Durga Rao Koya nagkoya@gmail.com M: 9989093551
Note 1:
All software development models are derived from the waterfall model (linear sequential model).
Note 2:
All of the above software development process models maintain a single stage of testing, and that stage is conducted by the same development people.
FISH MODEL
(Multiple stages of development and testing)
SRS:
The software requirement specification defines the functional requirements to be developed and the system requirements to be used (it converts non-technical information into technical information; derived from the BRS). E.g. Bank deposit = addition (functional requirement) + some languages (system requirements).
Review:
Design:
HLD:
The high-level design document defines the overall architecture of the system from the root module (e.g. login) to the leaf modules (e.g. logout). This document is also known as the architecture design or external design.
LLD:
The low-level design document defines the internal architecture of a corresponding module or functionality. This document is also known as the internal design document. E.g. a website like Yahoo.
Prototype:
A sample model of the software is called a prototype. It consists of the interface (screens) without having functionality.
Coding:
Program:
It indicates a set of executable statements; some statements in a program take inputs, some statements perform processing, and other statements display output.
A module (unit) is a combination of programs, and the software is a combination of modules.
System Testing:
Black Box Testing:
It is a system-level testing technique. The responsible testing people use these techniques to validate external functionality.
Build:
The executable form of a system (e.g. A.exe) is called a build; alternatively, the finally integrated set of all modules is called a build.
V-MODEL
V stands for Verification and Validation. This model defines a conceptual mapping between development stages and testing stages:

BRS/CRS/URS --------------------- User Acceptance Testing
(Analysis reviews)
SRS --------------------- System Testing
(Design reviews)
Coding (programmers)
2. Reviews In Design:
After completion of analysis and its testing, designer category people develop the HLD and LLD. To verify the completeness and correctness of those documents, the same designer category people conduct a review meeting. In this review they concentrate on the below factors.
3. Unit Testing:
After completion of design and its reviews, programmers concentrate on coding to physically construct a software build. In this phase programmers write programs and verify those programs using white box testing techniques. They are of four types.
[Flow graph: an IF whose True and False branches each lead to another IF — three decision points, so cyclomatic complexity = 4]
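As a hedged illustration (not from the original notes), here is a small Python function with the same structure — three IF decisions, giving a cyclomatic complexity of 3 + 1 = 4, which means at least four test cases are needed to cover all independent paths:

```python
def classify(x, y):
    """Three decision points => cyclomatic complexity = 3 + 1 = 4."""
    if x >= 0:              # decision 1
        if y >= 0:          # decision 2 (True branch of decision 1)
            return "both non-negative"
        return "x only"
    if y >= 0:              # decision 3 (False branch of decision 1)
        return "y only"
    return "both negative"

# One test case per independent path — four in total:
paths = [classify(1, 1), classify(1, -1), classify(-1, 1), classify(-1, -1)]
```

Each of the four calls exercises a different path through the three decisions, matching the complexity count.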
d. Mutation Testing:
Mutation means a change in a program. Programmers perform deliberate changes in programs and run the tests repeatedly. Through this test repetition, programmers verify the completeness and correctness of those tests on the program.
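A minimal sketch of the idea (hypothetical functions, not from the notes): if a test still passes after a deliberate change (a "mutant"), that test is incomplete; a good test "kills" the mutant:

```python
def add(a, b):
    return a + b

def add_mutant(a, b):
    return a - b  # deliberate change ("mutant")

def weak_test(fn):
    return fn(0, 0) == 0   # passes for both versions => incomplete test

def strong_test(fn):
    return fn(2, 3) == 5   # fails on the mutant => better test

weak_kills = not weak_test(add_mutant)      # mutant survives the weak test
strong_kills = not strong_test(add_mutant)  # mutant killed by the strong test
```

The surviving mutant tells the programmer that the weak test needs to be strengthened.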
4. Integration Testing:
After completion of dependent programs' development and unit testing, programmers connect them to form a complete software build. During this integration of programs, programmers verify the interfaces between every two programs or modules. There are four approaches to integrating modules: top-down, bottom-up, hybrid and big-bang.
a. Top-down Approach:
In this model programmers interconnect the main module to sub modules. In place of an under-construction sub module, programmers use a temporary program called a stub (or called program).
[Diagram: Main module on top, connected to Sub1 and Sub2 below]
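A hedged Python sketch of the stub idea (the names are illustrative): the main module is exercised while a not-yet-built sub module is replaced by a stub returning a canned value:

```python
def sub1_stub(amount):
    # Temporary "called program": the real interest calculation is not built yet.
    return 0.0  # canned value

def main_module(amount, interest_fn):
    # Main module under test; interest_fn is Sub1 (real or stub).
    return amount + interest_fn(amount)

# Top-down integration test: Main is exercised with the stub in place of Sub1.
total = main_module(1000, sub1_stub)
```

When the real Sub1 is ready, it simply replaces the stub in the same call.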
b. Bottom-up Approach:
In this model programmers interconnect sub modules without using the under-construction main module. In place of that under-construction main module, programmers use a temporary program called a driver (or calling program).
[Diagram: a driver in place of the Main module, calling Sub1 and Sub2]
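The mirror-image sketch (again with illustrative names): the completed sub modules are exercised by a temporary driver standing in for the unbuilt main module:

```python
def sub1(x):
    return x * 2   # completed sub module

def sub2(x):
    return x + 3   # completed sub module

def driver():
    # Temporary "calling program" in place of the unbuilt Main module:
    # exercises the Sub1 -> Sub2 interface from the bottom up.
    return sub2(sub1(5))

result = driver()
```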
c. Hybrid Approach:
This approach is a combined form of the top-down and bottom-up approaches. It is also known as the sandwich approach.
[Diagram: Main module at the top, with a driver and a stub in the middle layers, integrating Sub1, Sub2 and Sub3 from both directions]
d. Big-Bang Approach:
In this model programmers interconnect all programs only after completion of the entire coding.
5. System Testing:
After completion of integration testing, the development people release the software build to the separate testing team. This separate testing team validates that software build w.r.t customer requirements. In this level of testing, the separate testing team uses black box testing techniques. These techniques are classified into 3 categories:
a. Usability Testing
b. Functional Testing
c. Non-Functional Testing
a. Usability Testing:
Generally system test execution starts with usability testing. During this test, test engineers validate the user-friendliness of every screen in our application build. This usability testing is also known as Accessibility Testing. It consists of two sub techniques.
Screens of Build:
Ease of use (understandable screens)
Look & Feel (attractive)
Speed in interface. E.g. short navigations.
Conclusion:
Generally the technical writers of our company develop user manuals before releasing the software to the customer site. For this reason, manual support testing comes into the picture at the end of system testing.
[Diagram: Usability Testing followed by Functional & Non-Functional Testing]
b. Functional Testing:
It is a mandatory (compulsory) testing level in system testing. During functional testing, test engineers concentrate on "meet customer requirements". Functional testing is classified into two sub techniques: functionality testing and sanitation testing.
1. Functionality Testing:
During this test, test engineers verify whether our build's functionalities are working correctly or not. In this testing, test engineers concentrate on the below coverages.
Input Domain Coverage (verify the size & type of every input object's values)
2. Sanitation Testing:
It is also known as Garbage Testing. During this test, test engineers find extra functionalities in our application build w.r.t customer requirements.
Note:
A defect means not only a missing functionality or a mistake in functionality, but also an extra functionality.
c. Non-Functional Testing:
During non-functional testing, the testing team concentrates on the extra characteristics of the software build that satisfy customer-site people.
1. Compatibility Testing:
It is also known as Portability Testing. During this testing, test engineers validate whether our application build runs on the customer's expected platform or not. Platform means the operating system, compilers, browsers and other system software.
2. Configuration Testing:
It is also known as Hardware Compatibility
Testing. During this test, test engineers are running our application build with
various technologies of hardware devices to estimate hardware compatibility.
3. Recovery Testing:
It is also known as Reliability Testing. During this test, test engineers validate whether our application build changes from an abnormal state back to a normal state or not.
[Diagram: build recovering from an Abnormal state to a Normal state]
4. Inter-System Testing:
It is also known as Interoperability Testing (or) End-to-End Testing. During this test, test engineers validate whether our application build can co-exist with other software applications to share common resources.
[Diagram: an E-Seva server with WBA, EBA, TBA and ITA applications sharing a local DB server and an existing news component as the common resource]
5. Security Testing:
It is also known as penetration testing. During this test, test engineers validate the below three factors:
a. Authorization Testing
b. Access Control (Authentication) Testing
c. Encryption / Decryption Testing
a. Authorization Testing:
In authorization testing, test engineers validate whether our application build allows valid users and prevents invalid users or not.
[Diagram: client–server security — the sender encrypts the original text of a request into cipher text, the receiver decrypts the cipher text back into the original text; the server's response is encrypted and decrypted the same way]
Note:
In the above security testing, the authorization and access control tests are reasonable to be applied by test engineers, but the encryption / decryption test is conducted by separate people in a security team.
7. Load Testing:
Load means the number of concurrent users accessing our application build. The execution of our application build under the customer-expected configuration and the customer-expected load to estimate performance is called Load Testing.
8. Stress Testing:
The execution of our application build under the customer-expected configuration and various load levels to estimate stability is called Stress Testing.
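A hedged, minimal Python sketch of the difference (no real server is involved; the user counts are illustrative): load testing measures response time at the expected number of concurrent users, while stress testing raises the load level step by step to probe stability:

```python
import threading
import time

def fake_request(results, i):
    # Stand-in for one concurrent user's request.
    start = time.perf_counter()
    time.sleep(0.01)               # simulated server work
    results[i] = time.perf_counter() - start

def run_load(n_users):
    # Execute n_users concurrent "requests" and report the worst response time.
    results = [0.0] * n_users
    threads = [threading.Thread(target=fake_request, args=(results, i))
               for i in range(n_users)]
    for t in threads: t.start()
    for t in threads: t.join()
    return max(results)

expected_load = run_load(10)                           # load test: expected load
stress_levels = [run_load(n) for n in (10, 50, 100)]   # stress: rising load levels
```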
9. Installation Testing:
Complete Installation
Overall Functionality
1. Monkey Testing:
During this style of testing, testing people concentrate only on the main activities of the software due to lack of time for testing. This style of testing is also known as Chimpanzee Testing (or) Random Testing.
2. Buddy Testing:
In this style of testing, a test engineer groups with a developer to conduct testing while coding, due to lack of time for testing. Buddy means a programmer and a tester working as a group.
3. Exploratory Testing:
Generally the testing team conducts system testing depending on the functional and system requirements in the SRS. If the SRS does not give complete information about the requirements, then test engineers depend on past experience, discussions on other similar projects, browsing, etc. to collect complete information about the requirements. This style of testing is called exploratory testing.
4. Pair Testing:
5. Be-Bugging:
In this model development people add known bugs into the coding and release it to the testing team. This type of defect seeding (or feeding) is useful to estimate the efficiency of the testing people. It is also known as defect seeding (or) defect feeding.
TESTING-TERMINOLOGY
1. Test Strategy:
It is a document and it defines the required testing approach to be followed by the testing people.
2. Test Plan:
It is a document and it provides work allocation in terms of schedule
3. Test Case:
It defines a test condition to validate functionality in terms of
completeness and correctness.
4. Test Log:
It defines the result of a test case in terms of passed / failed, after
execution of the test case on our application build.
6. Re-Testing:
It is also known as Data Driven Testing (or) Iterative Testing. Generally test engineers repeat a test on the same application build with multiple input values. This type of repetition is called Re-Testing.
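The repetition with multiple input values can be sketched as a small data-driven loop (a hypothetical login check; the field rule assumed is a 4–16 character lowercase alphanumeric user id):

```python
def accepts_user_id(user_id):
    # Hypothetical rule: 4-16 lowercase alphanumeric characters.
    return 4 <= len(user_id) <= 16 and user_id.isalnum() and user_id.islower()

# Re-testing / data-driven testing: the same test, many input values.
test_data = ["abcd", "user123", "ab", "A" * 5, "x" * 17]
results = {value: accepts_user_id(value) for value in test_data}
```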
7. Regression Testing:
The re-execution of selected test cases on a modified build, to ensure that a bug fix works without any side effects, is called Regression Testing (change testing).
[Diagram: testing process flow — initial build, test reporting, test closure]
18
Naga Durga Rao Koya nagkoya@gmail.com M: 9989093551
19
3. Test Approach: The selected list of test factors (or test issues) to be applied by the testing team on the corresponding software build. This selection depends on the requirements in that software, the scope of those requirements, and the risks involved in that project's testing.
4. Roles & Responsibilities: The names of the jobs in the testing team and their responsibilities.
6. Test Automation & Testing Tools: The purpose of automation and available tools
in our organization
7. Defect Reporting & Tracking: The required negotiation between the testing team and the development team to review and resolve defects during testing.
9. Risks and Assumptions: The expected list of problems and the solutions to overcome them.
10. Change and Configuration Management: Managing the development and testing deliverables for future reference.
11. Training Plan: The required no. of training sessions for testing team to understand
customer requirements (or) Business logic.
12. Test Deliverables: Names of test documents to be prepared by testing team during
testing.
E.g. Test plan, test cases, test log, defect reports, and summary reports.
Compliance Testing:
Checks whether our project team is following our company standards or not.
Case Study:
In the above example the Project Manager / Test Manager finalized 9 testing topics / issues to be applied by the testing team on the project / software build.
2. Test Planning:
After completion of test strategy finalization, the test lead category people develop the test plan documents. In this stage the test lead category people prepare the system test plan and divide that plan into module test plans. Every test plan defines "What to Test?", "How to Test?", "When to Test?" and "Who to Test?".
To develop these test plans, test lead category people follow the below approach.
Test Strategy
System test plans are compulsory, module test plans are optional
Case study:
Format:
1. Test Plan ID: The title of test plan documents for future reference
3 to 5: "What to Test?"
9. Entry Criteria: Test engineers are able to start test execution after meeting the below criteria.
10. Suspension Criteria: Sometimes test engineers stop test execution temporarily due to
12. Staff & Training Needs: The selected test engineers' names and the required no. of training sessions for them.
13. Responsibilities: The mapping between the names of the test engineers and the requirements in our project.
15. Risks & Assumptions: The list of analyzed risks and the assumptions to overcome them.
3. Test Design:
After completion of test planning, the corresponding selected test engineers concentrate on test design, test execution & test reporting.
Generally the selected test engineers start the testing job with test design in every project. In this test design, every test engineer studies all requirements of the project and prepares test cases for the selected requirements only, w.r.t the test plan.
In test design, test engineers use three types of test case design methods to prepare test cases for their responsible requirements.
Test Case:
Every test case defines a unique condition. Every test case is self-standing and self-cleaning to improve understandability; in test design, test engineers start every test case with the English words "verify" or "check". Every test case is traceable to a requirement in your project.
[Diagram: documents feeding test design — S.R.S, H.L.D & L.L.D, coding → .exe (build) → test execution]
From the above diagram, test engineers prepare the maximum number of test cases depending on the functional & system requirements in the S.R.S. In this type of test case writing, test engineers follow the below approach.
Step 4: Review the test case titles for completeness and correctness.
Step 6: Go to step 2 until all specifications are studied and test cases written.
Functional Specification: 1
A login process allows a user id and password to authorize users. The user id takes alphanumerics in lower case, from 4 to 16 characters long. The password takes alphabets in lowercase, from 4 to 8 characters long.
Valid: a–z, 0–9
Invalid: A–Z, special characters, blank field
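The data matrix above can be sketched as a hedged Python check (the rules assumed are straight from the specification: user id of 4–16 lowercase alphanumerics, password of 4–8 lowercase alphabets):

```python
import re

USER_ID = re.compile(r"^[a-z0-9]{4,16}$")   # lowercase alphanumeric, 4-16 chars
PASSWORD = re.compile(r"^[a-z]{4,8}$")      # lowercase alphabets, 4-8 chars

def login_fields_valid(user_id, password):
    return bool(USER_ID.match(user_id)) and bool(PASSWORD.match(password))

# Equivalence classes: valid values pass; A-Z, specials and blank fields fail.
ok = login_fields_valid("user99", "secret")
bad_upper = login_fields_valid("User99", "secret")
bad_blank = login_fields_valid("", "secret")
```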
Decision Table:
Functional Specification – 2:
In an insurance application users can apply for different types of policies. When a user selects type A insurance, the system asks for the age of that user. The age value should be greater than 16 years and less than 80 years.
Valid: 0–9
Invalid: A–Z, a–z, special characters, blank field
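Boundary value analysis for the age rule (greater than 16, less than 80) can be sketched as follows (a hypothetical helper, not from the notes — the test values sit at and just inside both boundaries):

```python
def age_accepted(age):
    # Specification: greater than 16 and less than 80.
    return 16 < age < 80

# Boundary value analysis: test at and around both boundaries.
boundary_results = {age: age_accepted(age) for age in (16, 17, 79, 80)}
```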
Functional Specification – 3:
A door opens when a person comes in front of the door, and the door closes when that person goes inside.
Decision Table
Test Case 3: Verify door operation when that person is standing in the middle of the door
Functional Specification – 4:
A computer shut down operation
Functional Specification – 5:
In a shopping application users purchase different types of items. In this purchase order our system allows the user to select an item no. and to enter a quantity up to 10. The purchase order returns the total amount along with the price of one item.
Valid: 0–9
Invalid: A–Z, a–z, special characters, blank field
Functional Specification 6:
Washing machine operation
Functional Specification 7:
In an E-Banking application, users connect to the bank server through an Internet connection. In this application users fill in the below fields to connect to the bank server.
Password: 6-digit number
Area code: 3-digit number, optional
Data matrix for each of the above numeric fields:
Valid: 0–9
Invalid: A–Z, a–z, special characters, blank field
Functional Specification 8:
A computer restart operation
Functional Specification 9:
Money withdrawal from an ATM machine
Format:
4. Test Suite Id: The name of the test batch of which this test case is a member (a group of dependent test cases).
6. Test Environment: The required hardware & software to execute the test case on our application build.
7. Test Effort: The expected time to execute the test case on the build (ISO standards). E.g. 20 minutes is an average time manually; by using a tool, 5 minutes.
8. Test Duration: Approximate date & time.
9. Precondition (or) Test setup: The necessary tasks to do before starting the test case execution.
11. Test Case Pass (or) Fail Criteria: When this case is passed & when this case is failed.
Note:
1. The above 11-field test case format is not mandatory, because some fields' values are common to most test cases & some fields' values are easy to remember or derive.
2. Generally test cases cover objects and operations (more than one object). If our test case covers an object's input values, then test engineers prepare a Data Matrix.
3. If our test case covers an operation or function, then test engineers prepare a Test Procedure from Base-State to End-State.
Functional Specification: 10
A login process allows a User ID & Password to authorize users. The user id takes alphanumerics in lower case, from 4 to 16 characters long. The password object takes alphabets in lower case, from 4 to 8 characters long.
4. Priority: P0
6. Data Matrix:
[Diagram: two test case design flows compared — BRS → SRS → Test Cases → Coding (functional & system specification based test case design) vs BRS → SRS → Use Cases → Coding (use case based test case design)]
From the above diagrams, the test team receives "use cases" from project management to prepare test cases. Every use case describes a functionality with all the required information. Every use case follows a standard format, unlike a theoretical functional specification.
Formats:
1. Use Case Name: The name of use case for future reference
4. Related Use Cases: Names of related use cases, which have dependency with this
use case
Conclusion:
From the above use case format, project management provides documentation for every functionality with complete details. Depending on those use cases, test engineers prepare test cases using the IEEE-829 format.
1. Ease of use
2. Look & Feel
3. Speed in Interface
4. User manuals correctness (Help Documents)
Note:
Generally test engineers prepare the maximum number of test cases depending on the functional & system specifications in the SRS; the remaining test cases are prepared using the application build, because the functional & system specifications do not provide complete information about every small issue in our project.
Sometimes the testing people use "Use Cases" instead of the functional & system specifications in the SRS.
4. Test Execution:
In test execution, test engineers concentrate on test case execution and on defect reporting and tracking. In this stage the testing team conducts a small meeting with the development team for version controlling and establishment of the test environment.
1. Version Control:
During test execution, development people assign a unique version no. to software builds after performing the required changes. This version numbering system is understandable to the testing people.
For this build version controlling, the development people use version control software.
Ex. VSS – Visual SourceSafe
[Diagram: build flow between development and testing — initial build → Level-0 (sanity / smoke); stable build → Level-1 (comprehensive / real) → defect reporting → bug fixing → modified build → bug resolved → Level-2 (regression)]
1. Level-0 (Sanity / Smoke Testing): Generally the testing people start test execution with Level-0 testing. It is also known as Sanity / Smoke Testing, Tester Acceptance Testing (TAT), Build Verification Testing or Testability Testing.
In this testing level, test engineers concentrate on the below 8 factors through operating the corresponding initial build:
Understandable
Operatable
Observable
Controllable
Consistency
Simplicity
Maintainable
Automatable
Passed: All expected values of the test case are equal to all actual values of that build.
Failed: Any one expected value varies from any one actual value of that build.
Blocked: Execution of dependent test cases is postponed to the next cycle (after the modified build) due to wrong parent functionality.
From the above diagram, test engineers conduct regression testing on the modified build w.r.t the modifications mentioned in the "release note".
On the modified build:
Case 1: If the severity of the bug resolved by the development team is high, then test engineers re-execute all P0, all P1 and carefully selected maximum P2 test cases on that modified build w.r.t the modifications mentioned in the release note.
Case 2: If the severity of the resolved bug is medium, then test engineers re-execute all P0, carefully selected P1 and some P2 test cases.
Case 3: If the severity of the resolved bug is low, then test engineers re-execute carefully selected P0, P1 and P2 test cases.
Case 4: If the testing team received a modified build due to sudden changes in customer requirements, then test engineers re-execute all P0, all P1 and maximum P2 test cases.
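The severity-driven cases above can be sketched as a hedged selection helper (the selection fractions and test case names are illustrative assumptions, not from the notes):

```python
def select_regression_cases(cases, severity):
    """cases: list of (name, priority) with priority in {"P0", "P1", "P2"}.
    Returns the subset to re-execute on the modified build."""
    p0 = [c for c in cases if c[1] == "P0"]
    p1 = [c for c in cases if c[1] == "P1"]
    p2 = [c for c in cases if c[1] == "P2"]
    if severity == "high":            # all P0, all P1, maximum P2
        return p0 + p1 + p2[: max(1, len(p2) * 3 // 4)]
    if severity == "medium":          # all P0, selected P1, some P2
        return p0 + p1[: max(1, len(p1) // 2)] + p2[: max(1, len(p2) // 4)]
    return p0[:1] + p1[:1] + p2[:1]   # low: carefully selected few

cases = [("tc1", "P0"), ("tc2", "P0"), ("tc3", "P1"), ("tc4", "P2"), ("tc5", "P2")]
high = select_regression_cases(cases, "high")
low = select_regression_cases(cases, "low")
```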
5. Test Reporting:
During Level-1 & Level-2 test execution, test engineers report mismatches between the test case expected values and the build's actual values as defect reports to the development team.
In this test reporting, the development people receive defect reports from the testing team in a standard format. This format is followed by every test engineer in test execution to report defects.
3. Build Version: The version no. of the current build, in which the test engineers detected this defect.
4. Feature: The name of the module / function in whose area the test engineers found this defect.
5. Test Case Name: The name of the failed test case, in whose execution the test engineer found this defect.
7. Reproducible: Yes — the defect appears every time in test case execution; No — the defect appears rarely in test case execution.
High — not able to continue testing without resolving this defect (show stopper)
13. Assigned To: The name of responsible person at development side to receive
this defect report.
14. Suggested Fix (Optional): Reasons to accept and resolve this defect.
Resolution Type:
After receiving a defect report from the testing team, the responsible development people conduct a review meeting and send a resolution type to the responsible testing team.
There are 12 types of resolutions; they are:
2. Duplicate: The reported defect is rejected, because this defect was raised due to limitations of hardware devices.
5. Not Applicable: The reported defect is rejected, because the defect has an improper meaning.
7. Need More Information: The reported defect is neither accepted nor rejected, but the developers require more information to understand the defect.
8. Not Reproducible: The reported defect is neither accepted nor rejected, but the developers require the correct procedure to reproduce that defect.
9. No Plan To Fix It: The reported defect is neither accepted nor rejected, but the development people require some extra time.
10. Open: The reported defect is accepted & the development people are ready to resolve it through changes in the coding.
11. Deferred: The reported defect is accepted but postponed to a future release, due to low severity & low priority.
12. User Direction: The reported defect is accepted, but the developers provide some valid information about that defect to the customer-site people through our application build's screens.
[Diagram: defect life cycle under the P.M — New → Open / Reject / Deferred; Open → Closed → Re-open]
Types of Defects
Generally the black box testing techniques find the below types of defects during system testing.
Note 1:
Generally the test engineers decide the severity & priority of a defect during reporting, but the priority of the defect is modifiable by the development team.
Note 2:
Generally the development people postpone / defer low severity & low priority defects.
6. Test Closure:
1. Coverage Analysis:
2. Defect Density:
[Diagram: defect density per module — A 20%, B 20%, C 40% (need for regression), D 20%; flow: select high defect density modules → effort estimation → test reporting → plan regression → regression testing]
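Defect density (each module's share of the total defects found) can be sketched as below; the module names and defect counts are illustrative:

```python
def defect_density(defects_by_module):
    # Percentage share of the total defects, per module.
    total = sum(defects_by_module.values())
    return {m: 100 * n // total for m, n in defects_by_module.items()}

density = defect_density({"A": 10, "B": 10, "C": 20, "D": 10})

# Modules with high defect density are the candidates for planned regression.
needs_regression = [m for m, pct in density.items() if pct >= 40]
```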
from real customers / model customers. There are two ways to conduct user acceptance testing: Alpha Testing and Beta Testing.
8. Sign Off:
After completion of user acceptance testing and its modifications, project management defines the release team & C.C.B (Change Control Board). In both teams a few developers & test engineers are involved along with the project manager. In this sign-off stage the testing team submits all prepared testing documents to project management.
Test Strategy
Test Plan
Test Case Titles (Scenarios)
Test Case Documents
Test Log
Defect Reports
[Diagram: the 20-80 / 80-20 rule of time distribution]
b. Sufficiency:
Requirements coverage
Testing techniques coverage
Based on these, our PM takes a decision on whether the testing time is sufficient or not.
a. Test Effectiveness:
Requirements coverage
Testing techniques coverage
b. Defect Escapes:
Type-phase analysis (what type of defect, in what phase).
Defect removal efficiency = A / (A + B)
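With A read as the defects found by the testing team and B as the defects that escaped to the customer (an assumed reading of the formula, consistent with the usual definition), the metric can be computed as:

```python
def defect_removal_efficiency(found_in_testing, escaped_to_customer):
    # DRE = A / (A + B), expressed here as a percentage.
    a, b = found_in_testing, escaped_to_customer
    return 100.0 * a / (a + b)

dre = defect_removal_efficiency(90, 10)  # 90 defects caught, 10 escaped
```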
In my company the testing process starts with "TEST INITIATION". In this stage the Project Manager prepares the test methodology for the corresponding project. He decides the reasonable tests depending upon the requirements and releases the test strategy document to the Test Lead. To prepare the "TEST PLAN", my Test Lead studies the BRS, SRS, design docs, development plan and test strategy. He goes to the HR Manager to talk about team formation. After completion of testing team formation and risk analysis, he prepares the complete "TEST PLAN". The Test Lead decides the schedule of the different tests, i.e. what to test? when to test? how to test? who to test? After completion of the test plan and its review, he takes approval from the Project Manager and provides training to the selected test engineers.
Understandable
Operatable
Observable
Controllable
Consistency
Simplicity
Maintainable
Automatable
Passed: All expected values of the test case are equal to all actual values of that build.
Failed: Any one expected value varies from any one actual value of that build.
Blocked: Execution of dependent test cases is postponed to the next cycle (after the modified build) due to wrong parent functionality.
In the regression testing level, we make sure all the fixed bugs are correct and there are no side effects, i.e. old functionalities are not affected by the new changes. After completion of all reasonable tests and closure of defects, the management concentrates on User Acceptance Testing. In this testing level they collect feedback from real or model customers. After completion of UAT, my test lead prepares the final test summary report. This report is submitted to the customers by the TL or PM.