
Project name

Document name here


Version 1.0 | 15-Oct-2006
Copyright Information

This document is the exclusive property of XYZ (XYZ), and the recipient agrees not to copy, transmit, use or disclose the confidential and proprietary information in this document by any means without the express written consent of XYZ. By accepting a copy, the recipient agrees to adhere to these conditions, to respect the confidentiality of XYZ's practices and procedures, and to use this document solely for responding to XYZ's operations methodology.

© XYZ, 2005. All rights reserved.

Project Name

Table of Contents

Copyright Information ............................................................. 2
1 Introduction .................................................................... 5
1.1 Purpose of this document ...................................................... 5
1.2 Scope of this document ........................................................ 5
1.3 References .................................................................... 5
1.4 Definitions, Abbreviations and Acronyms ....................................... 5
2 Build Strategies ................................................................ 6
3 Testing Strategy ................................................................ 6
3.1 Testing Approach .............................................................. 6
3.2 Test Considerations ........................................................... 7
3.3 Test Completeness ............................................................. 7
3.4 Test Setup .................................................................... 7
3.5 Features not to be tested ..................................................... 8
3.6 Risks that impact testing ..................................................... 8
3.7 Testing tools ................................................................. 8
3.8 Assumptions/Limitations/Dependencies .......................................... 9
3.9 Acceptance criteria ........................................................... 9
4 Test Case Design & Development .................................................. 9
4.1 Test Case Design Instructions ................................................. 9
4.1.1 System Test cases ........................................................... 9
4.2 Test Case Design Deliverables ................................................. 9
5 Test Execution .................................................................. 10
5.1 Unit testing .................................................................. 10
5.1.1 Environment ................................................................. 10
5.1.2 Entry Criteria .............................................................. 10
5.1.3 Responsibility .............................................................. 10
5.1.4 Exit Criteria ............................................................... 10
5.2 Functional Testing - Module Wise .............................................. 10
5.2.1 Environment ................................................................. 10
5.2.2 Entry Criteria .............................................................. 10
5.2.3 Responsibility .............................................................. 10
5.2.4 Exit Criteria ............................................................... 10
5.3 System testing ................................................................ 11
5.3.1 Environment ................................................................. 11
Test Plan. Version 1.0 | Updated: Page 3 of 15


Proprietary and Confidential. © XYZ

5.3.2 Entry Criteria .............................................................. 11
5.3.3 Responsibility .............................................................. 11
5.3.4 System Test Execution - Specific Instructions ............................... 11
5.3.5 Exit Criteria ............................................................... 12
6 Test Results Tracking ........................................................... 12
7 Test Schedule ................................................................... 14
8 Revision History ................................................................ 14
Office address .................................................................... 15
Web - www.XYZ.com ................................................................. 15



1 Introduction

1.1 Purpose of this document


This document describes the plan for testing releases of the Project Name application. It will be referenced and used by the XYZ development team.
This document is intended to meet the following objectives:

• Define the strategy, responsibilities and schedule for the overall testing.
• Identify the project and software artifacts that should be tested.
• List the scope of testing.
• List the deliverables of the test phases.
• Identify the risks and assumptions in the project, if any.
• Define the entry and exit criteria for each testing phase.

1.2 Scope of this document


This document describes the test plan for the testing that will be carried out for Project Name. This includes:

• Module testing
• Integration testing
• System testing

1.3 References
No.  Reference/Prior Reading                     Location of Availability
1.   Requirements (Use Case Specifications)     Project folder repository
2.   Prototypes (.html files)                   Project repository

1.4 Definitions, Abbreviations and Acronyms


Acronym Description

UCS Use Case Specifications

IE5 Internet Explorer Version 5


2 Build Strategies
The build process will be documented by the development team and provided to the test team to set up the software.

3 Testing Strategy

3.1 Testing Approach

The following table lists the approach for the verification activities.
Test Level: Module Testing
Description/Environment: Each individual module will be tested once by developers and a test engineer.
Strategy/Test Type: Functionality Testing.
Source for Test Case Identification: Use Cases from the Project Repository.
Techniques for Identification of Test Cases: Understanding the use cases of the individual modules.

Test Level: Integration Testing
Description/Environment: The application should be tested fully along with interdependent modules, once the individual modules are integrated.
Strategy/Test Type: Functionality Testing.
Source for Test Case Identification: Use Cases from VSS.
Techniques for Identification of Test Cases: Understanding the use cases of the individual modules and dependent modules.

Test Level: System Testing
Description/Environment: System testing on the QA environment.
Strategy/Test Type: System Testing.
Source for Test Case Identification: All test cases written for module and integration testing are executed.
Techniques for Identification of Test Cases: Test cases will be identified based on the integration aspects to avoid duplication of tests.

Detailed defect analysis will be done for the reported defects. The metrics to be collected during the test life cycle are:

• Defect Location Metrics – Defects raised against each module will be plotted on a graph to indicate the affected module.
• Severity Metrics – Each defect has an associated severity (Critical, High, Medium or Low), indicating how much adverse impact the defect has or how important the affected functionality is. The number of issues raised against each severity will be plotted on a graph. By examining the severity of a project's issues, discrepancies can be identified.
• Defect Classification Metrics – Each defect will be classified into a defect type (such as Enhancement, Unclear Requirements or Logical Defect). The number of defects raised against each defect type will be plotted on a graph. By examining this, problem areas in the implementation stage can be identified.
• Defect Closure Metrics – To indicate progress, the number of raised and closed defects over time will be plotted on a graph.
• Defect Severity Trend – The number of defects found in different builds will be plotted on a graph against severity. By examining the severity of defects raised for different builds, trends for discrepancies can be identified.


These metrics will be collected and presented as a test summary report after each test cycle. All testing-related documents will be managed using the configuration management tool VSS, alongside the development documents.
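As an illustration (not part of the plan itself), the location and severity metrics above could be tallied from a defect export before plotting; the module names and export format here are hypothetical:

```python
from collections import Counter

# Hypothetical defect export: (module, severity) pairs as they might come
# from the "TD" tracking tool; module names are illustrative only.
defects = [
    ("Admin", "Critical"), ("Admin", "High"), ("Address book", "Medium"),
    ("xyz", "High"), ("xyz", "Low"), ("Admin", "High"),
]

# Defect Location Metric: number of defects raised against each module.
by_module = Counter(module for module, _ in defects)

# Severity Metric: number of issues raised against each severity level.
by_severity = Counter(severity for _, severity in defects)

print(by_module["Admin"])   # 3
print(by_severity["High"])  # 3
```

The resulting counters map directly onto the bar charts the plan calls for.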

3.2 Test Considerations
Test Type and key factors to measure the test effectiveness:

Conformance/Functional: An adequate number of test cases will be generated from the understanding of functionality in the Use Cases and test scenarios, and from discussion with the BA and development team. Traceability mapping will be done against the use cases developed by the BA. All generated test cases will be executed for effective testing. Functional testing will be black-box testing, focusing on the overall functionality of the application. This method allows functional testing to uncover faults such as incorrect or missing functions, errors in interfaces, errors in data structures, and errors in program initialization or termination.

Interoperability: Interoperability testing involves testing the system after integration of the software systems as a whole.

Regression: Regression testing involves re-testing the application following modification, to ensure that faults or bugs have not been introduced or uncovered as a result of the changes made during system testing. Regression tests are designed for repeatability and will be used when testing a later version of the application.

Security: Security testing involves testing the system to make sure that unauthorized personnel or other systems cannot gain access to the system and the information or resources within it. Programs that check for access to the system via passwords are tested, along with any organizational security procedures established. Security testing on the application will be done with all possible roles.

Usability: The portion of the testing that verifies consistency of look and feel across the screens and how easily the user can navigate the system, e.g. effective use of arrows, tab keys and hot keys.

Performance: Performance testing involves testing whether the application responds quickly enough for the intended users. The focus is on the following tests:
1. Load Test
2. Stress Test

3.3 Test Completeness

Manual Testing: Test cases will be reviewed and baselined by the BA, development team and end users for correctness and completeness. The execution of all test cases will ensure test completeness.

3.4 Test Setup
Hardware setup – Server Side


Item     Description
Server   Any version of Windows 2000 and above;
         20 GB HDD, minimum of 512 MB RAM

Software Requirements – Server Side

Item             Description
Database Server  MySQL

Software Requirements – Client Side

Item            Description
Client Machine  Any version of Windows 98 and above supporting IE 5 and above;
                JBoss application server

3.5 Features not to be tested

NA

3.6 Risks that impact testing

1. Risk: Any of the deliverables by the development team not happening as per the project plan. Level: High. Contingency plan: Testing time may be lost; to compensate for this, additional resources would be added appropriately.
2. Risk: System failure and loss of data during the testing process. Level: High. Contingency plan: A database backup strategy should be in place so that loss of data can be prevented.
3. Risk: Functional specifications not updated with requirement modifications/additions. Level: High. Contingency plan: Have the BA/development team update the functional specifications and inform the test team.
4. Risk: Test data not available for all the test cases. Level: Medium. Contingency plan: Have the Business Analyst prepare the test data.

3.7 Testing tools

Purpose              Tool        Vendor/In-House
Defect Tracking      TD          XYZ Private Ltd.
Test Cases document  Excel 2000  Microsoft

Test Plan. Version 1.0 | Updated: Page 8 of 15


Proprietary and Confidential. © XYZ
Project Name

3.8 Assumptions/Limitations/Dependencies
• The testing environment should be ready in terms of application, machines, server setup, test data setup, etc.
• Functionality is delivered to schedule.
• Software is of the required quality for sanity/build verification testing.
• No major changes in the requirements will happen that would affect the test schedule.
• All "show-stopper" bugs receive immediate attention from the development team.
• All bugs found in a version of the software will be fixed and unit tested by the development team before the patch is moved to the test environment.
• Effective testing depends upon the review comments on test cases.
• The testing team will depend upon the BA, development team and client team for speedy resolution of all queries.
• All documentation will be up to date and delivered to the testing team.
• Functional and technical specifications will be signed off by the business.

3.9 Acceptance criteria
Type of document                                Name of the customer
Test Plan document is accepted by
Test case document is accepted by
Test execution results need to be accepted by

4 Test Case Design & Development

4.1 Test Case Design Instructions


4.1.1 System Test cases
The use cases will be reviewed for correctness and completeness. This will be achieved by reviewing and understanding the existing functionality in Project Name. System test cases will be developed manually for all the defined use cases; the test team will treat the use cases as the base and the functional specifications as a reference document. To ensure full coverage of the changes requested by the business in Project Name, the BA or development team will communicate any changes to the testing team members. The PM and end users will review the system test cases.

4.2Test Case Design Deliverables


Test cases will be developed by the test team and reviewed by the PM before test execution. In case of requirements change, refer to the updated SRS.


5 Test Execution

5.1 Unit testing
The unit test cases will be developed by the development team, which will also carry out the unit testing; the testing team will review these unit test cases.

5.1.1 Environment
The test environment is basically the developer’s environment, since the developer does the testing.

5.1.2 Entry Criteria


At least a small, testable piece of code should be available for unit testing. The entry criterion is the completion of a set of lines that enables the developer/tester to test the functionality for which the code has been written.

5.1.3 Responsibility
The developer will do the testing at this level.

5.1.4 Exit Criteria


Exit criteria are met once it is certain that the code paths are fully tested and do not have any issues.
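As a sketch of what a unit test at this level might look like, the function under test below is purely illustrative and not part of Project Name:

```python
import unittest

def add_contact(address_book, name, email):
    """Illustrative unit under test: add an entry to an address-book dict."""
    if not name or "@" not in email:
        raise ValueError("invalid contact")
    address_book[name] = email
    return address_book

class AddContactTest(unittest.TestCase):
    def test_valid_contact_is_added(self):
        book = add_contact({}, "Alice", "alice@example.com")
        self.assertEqual(book["Alice"], "alice@example.com")

    def test_invalid_email_is_rejected(self):
        with self.assertRaises(ValueError):
            add_contact({}, "Bob", "not-an-email")
```

Such tests would be run by the developer (e.g. with `python -m unittest`) before the code is handed to the test team.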

5.2 Functional Testing - Module Wise

The functional testing of each module will be performed by the testing team, wherein the test engineer checks the integration of the module with all necessary units. A top-down approach is used for this testing.
Note: The modules of the application are Admin, xyz, xyz1 and Address book.

5.2.1 Environment
Testing will be performed in the QA environment.

5.2.2 Entry Criteria


Unit testing should be done at least once before this testing.

5.2.3 Responsibility
S.no.  Testing Activity        Responsibility
1.     Test case designing     Test team
2.     Test cases review       PM and End users
3.     Test execution          Test team
4.     Test results reporting  Test team

5.2.4 Exit Criteria


All test cases have been executed and do not have any issues.


5.3 System testing
The objective of system testing is to ensure that the application addresses all the functional, user interface and security requirements. It is currently proposed to conduct one round of system testing and one final round of regression testing after all defects have been fixed.

5.3.1 Environment
Operating Systems: Windows XP
Application Server: jboss-4.0.2
Application DB: MySQL
5.3.2 Entry Criteria
All modules testing should be done at least once before System testing.

5.3.3 Responsibility
S.no.  Testing Activity        Responsibility
1.     Test case designing     Test team
2.     Test cases review       PM and End users
3.     Test execution          Test team
4.     Test results reporting  Test team

5.3.4 System Test Execution - Specific Instructions

Functional Testing: Using the system test cases developed for the functional requirements, and using valid and invalid data, verify the following:
• The expected results occur when valid data is used.
• The appropriate error or warning messages are displayed when invalid data is used.
• Each business rule is properly applied.

User Interface Testing: Verify the following:
• Data validation (field length validation, data type validation using a checklist).
• Application screens follow the standards.
• Navigation and usability through the target-of-test properly reflect the functional requirements, including window-to-window, field-to-field, and use of access methods (tab keys, horizontal and vertical scrolling, tool tips).
• Window objects and characteristics, such as size, position, state and focus, conform to standards.

Security Testing: Log in as users with different roles and privileges to check that they can access the functions and data they are allowed to access, and are denied access to those functions for which they do not have authorization.
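The valid/invalid-data checks above lend themselves to data-driven test cases. The field-validation rule below is a hypothetical stand-in for an application screen, not part of Project Name:

```python
# Hypothetical field-validation rule standing in for an application screen:
# a user name must be 1-20 characters, letters and digits only.
def validate_username(value):
    if not (1 <= len(value) <= 20):
        return "Field length must be between 1 and 20 characters."
    if not value.isalnum():
        return "Only letters and digits are allowed."
    return None  # valid data: no error message

# Data-driven cases: (input, expected error message or None for valid data).
cases = [
    ("alice99", None),                                          # valid data
    ("", "Field length must be between 1 and 20 characters."),  # length rule
    ("bad name!", "Only letters and digits are allowed."),      # data type rule
]

for value, expected in cases:
    actual = validate_username(value)
    assert actual == expected, f"{value!r}: expected {expected!r}, got {actual!r}"
```

Each case pairs an input with the expected result, so valid data, invalid data and the applicable business rule are all exercised by the same loop.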


5.3.5 Exit Criteria


All planned tests have been executed; all system test cases should be run in each round of testing.
The system will be accepted if it has:
• No severity 1 defects (P1): critical bugs (the system crashes and there is no workaround).
• No severity 2 defects (P2): high functional bugs (functionality is significantly impaired; either a task cannot be accomplished or a major workaround is necessary).
• A maximum of 10 open severity 3 and 4 defects. This should be part of the test report.
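As an illustrative sketch (not part of the plan), the acceptance rule above can be expressed as a check over the list of open defect severities:

```python
def release_acceptable(open_defects):
    """open_defects: severities (1-4) of defects still open at exit.

    Mirrors the exit criteria: no severity 1 or 2 defects open, and at
    most 10 open severity 3 and 4 defects combined.
    """
    if any(sev in (1, 2) for sev in open_defects):
        return False
    return sum(1 for sev in open_defects if sev in (3, 4)) <= 10

print(release_acceptable([3, 3, 4]))  # True
print(release_acceptable([2, 3]))     # False: an open severity 2 defect
print(release_acceptable([3] * 11))   # False: more than 10 sev 3/4 open
```

The same counts would appear in the test report that accompanies the exit decision.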

6 Test Results Tracking


The defect-tracking tool “TD” will be used for reporting and tracking bugs during testing.
The following table lists the different priorities that can be assigned to a defect and provides the criteria for assignment.

Blocker: Show stoppers. These defects prevent testing of major functionality and do not have a workaround.
e.g. Clicking a menu item returns a blank page when a major process/workflow needs to be tested.

High: Critical defects that do not allow major functionality to be tested but have a workaround.
e.g. Access rule conditions not checked; exceptions not handled properly; data not pre-populated.

Medium: Moderate defects that prevent testing of occasionally used or minor functionality.
e.g. No confirmation messages; non-critical field validations not performed.

Low: Minor defects that do not prevent functionality from being tested.

The process for tracking and resolving the bugs reported by the testing team will be as below.
1. The testing team will test modules in the test environment.

2. When a test engineer identifies a defect, he will create a defect in the defect-tracking tool. A clear description of the defect, along with the steps to reproduce it, will be logged in the tool. The status of the defect will be set to ‘Open’.

3. If necessary, the defect will be discussed in the defect prioritization meeting.

4. If the defect logged by the test engineer is the same as an existing defect, the defect status will be set to ‘Duplicate’.


5. If the defect logged by the test engineer is found not to be a defect after discussion, the defect status will be set to ‘Not a Defect’.

6. After the development team member has modified and tested the code, the defect status will be changed to ‘Fixed’.

7. The modified code will be released into the test environment, and the test team will be informed when a release is made. The development team will prepare a Release Notes document that lists the defects that have been fixed; the Release Notes will also include new functionality.

8. The test engineer will re-test defects in ‘Fixed’ status.

9. The test engineer will change the defect status to ‘Closed’ (if the defect has been fixed) or ‘Re-opened’ (if the defect still has not been fixed).
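The transitions in steps 1-9 can be sketched as a small state machine. This map is one reading of the workflow, not the TD tool's actual configuration:

```python
# Allowed status transitions inferred from the workflow above (an
# interpretation of the plan, not the TD tool's real configuration).
TRANSITIONS = {
    "Open": {"Work in Progress", "Duplicate", "Not a Defect", "Postponed"},
    "Work in Progress": {"Fixed"},
    "Fixed": {"Closed", "Re-opened"},
    "Re-opened": {"Work in Progress"},
}

def move(status, new_status):
    """Return new_status if the transition is allowed, else raise."""
    if new_status not in TRANSITIONS.get(status, set()):
        raise ValueError(f"illegal transition: {status} -> {new_status}")
    return new_status

# A defect fixed on the first retest: Open -> Work in Progress -> Fixed -> Closed.
status = "Open"
for nxt in ("Work in Progress", "Fixed", "Closed"):
    status = move(status, nxt)
print(status)  # Closed
```

Encoding the transitions this way makes illegal status changes (e.g. ‘Closed’ straight from ‘Open’) fail loudly rather than slipping through.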

The following describes each defect status and when it occurs during the workflow.

Open: Created by the test engineer and assigned to a development team member for resolution.
Not Reproducible: A defect found by the test engineer cannot be reproduced.
Fixed: The defect has been fixed by the development team member.
Closed: The test engineer has tested the changes made by the development team and confirmed that they work.
Re-opened: The test engineer has tested the changes made by the development team and found that the defect is still present.
Not a Defect: The defect is defined as ‘Not a defect’ after discussion with the team/BA/PM/client.
Duplicate: The defect logged by the test engineer is the same as an existing defect.
Postponed: The defect has been accepted by the development team member and will be resolved in a future version.
Work in Progress: The development team is in the process of fixing the defect.


7 Test Schedule

Iteration    Task                 Start Date     End Date
First        Integration Testing  18th May 2006  9th Jun 2006
Second       System Testing 1     12th Jun 2006  16th Jun 2006
Third        System Testing 2     30th Jun 2006  7th Jul 2006
UAT Release  UAT                  24th Jul 2006  31st Jul 2006

8 Revision History

Version  Date           Author(s)   Reviewer(s)  Change Description
1.0      18th-Aug-2006  Mahindra V
2.0


XYZ
Contact Information

Office address

Web: www.XYZ.com

