
Generic Project

Prepared for
Project Name
Prepared by
Company Name
Date
July 17, 2001

© 2001, COMPANY NAME. All rights reserved.

This documentation is the confidential and proprietary intellectual property of COMPANY NAME. Any unauthorized use, reproduction, preparation of derivative works, performance, or display of this document, or of the software represented by this document, without the express written permission of COMPANY NAME is strictly prohibited.

COMPANY NAME and the COMPANY NAME logo design are trademarks and/or service marks of an affiliate of COMPANY NAME. All other trademarks, service marks, and trade names are owned by their respective companies.
PROJECT NAME
Abacus Detail Test Plan

DOCUMENT REVISION INFORMATION

The following information is to be included with all versions of the document.

Project Name Project Number

Prepared by Date Prepared

Revised by Date Revised

Revision Reason Revision Control No.

Revised by Date Revised

Revision Reason Revision Control No.

Revised by Date Revised

Revision Reason Revision Control No.




DOCUMENT APPROVAL

This signature page indicates approval from the COMPANY NAME sponsor and the Client sponsor for the attached Abacus Detail Test Plan for the PROJECT NAME. All parties have reviewed the attached document and agree with its contents.

COMPANY NAME Project Manager:


Name, Title: Project Manager, PROJECT NAME

Date

CUSTOMER Project Manager:


Name, Title:

Date

COMPANY NAME/DEPARTMENT Sponsor:


Name, Title:

Date

COMPANY NAME Sponsor:


Name, Title:

Date

CUSTOMER NAME Sponsor:


Name, Title:

Date

COMPANY NAME Manager:


Name, Title:

Date



Table of Contents

1 Introduction
  1.1 Automated Testing DTP Overview
  1.2 Test Identification

2 Test Description
  1.3 Test Purpose and Objectives
  1.4 Assumptions, Constraints, and Exclusions
  1.5 Entry Criteria
  1.6 Exit Criteria
  1.7 Pass/Fail Criteria

3 Test Scope
  1.8 Items to be tested by Automation
  1.9 Items not to be tested by Automation

4 Test Approach
  1.10 Description of Approach

5 Test Definition
  1.11 Test Functionality Definition (Requirements Testing)
  1.12 Test Case Definition (Test Design)
  1.13 Test Data Requirements
  1.14 Automation Recording Standards
  1.15 Winrunner Menu Settings
  1.16 Winrunner Script Naming Conventions
  1.17 Winrunner GUIMAP Naming Conventions
  1.18 Winrunner Result Naming Conventions
  1.19 Winrunner Report Naming Conventions
  1.20 Winrunner Script, Result and Report Repository

6 Test Preparation Specifications
  1.21 Test Environment
  1.22 Test Team Roles and Responsibilities
  1.23 Test Team Training Requirements
  1.24 Automation Test Preparation
  1.25 Issues

7 Test Issues and Risks
  1.26 Risks

8 Appendices
  1.1 Traceability Matrix
  1.2 Definitions for Use in Testing
    1.2.1 Test Requirement
    1.2.2 Test Case
    1.2.3 Test Procedure
  1.3 Automated Test Cases
    1.3.1 NAME OF FUNCTION Test Case
  1.4 Glossary Reference

9 Project Glossary
  1.5 Sample Addresses for Testing
  1.6 Test Credit Card Numbers

Introduction 1

1.1 Automated Testing DTP Overview

This Automated Testing Detail Test Plan (ADTP) identifies the specific tests to be performed to ensure the quality of the delivered product. System/Integration Test ensures that the product functions as designed and that all parts work together. This ADTP covers information for Automated testing during the System/Integration Phase of the project and maps to the specification or requirements documentation for the project. This mapping is done in conjunction with the Traceability Matrix document, which is completed along with the ADTP and is referenced in this document.
This ADTP refers to the specific portion of the product known as PRODUCT NAME. It provides clear entry and exit criteria and identifies the roles and responsibilities of the Automated Test Team so that the team can execute the test.



The objectives of this ADTP are:
• Describe the test to be executed.
• Identify and assign a unique number for each specific test.
• Describe the scope of the testing.
• List what is and is not to be tested.
• Describe the test approach detailing methods, techniques, and tools.
• Outline the Test Design including:
– Functionality to be tested.
– Test Case Definition.
– Test Data Requirements.
• Identify all specifications for preparation.
• Identify issues and risks.
• Identify actual test cases.
• Document the design point or requirement tested for each test case as it is developed.

1.2 Test Identification

This ADTP is intended to provide information for System/Integration Testing for the PRODUCT
NAME module of the PROJECT NAME. The test effort may be referred to by its PROJECT
REQUEST (PR) number and its project title for tracking and monitoring of the testing progress.

Test Description 2

1.3 Test Purpose and Objectives

Automated testing during the System/Integration Phase, as referenced in this document, is intended to ensure that the product functions as designed, directly from customer requirements. The testing goal is to identify the quality of the structure, content, accuracy and consistency, selected response times and latency, and performance of the application as defined in the project documentation.

1.4 Assumptions, Constraints, and Exclusions

Factors that may affect the automated testing effort and increase the risk associated with the success of the test include:
• Completion of development of front-end processes



• Completion of design and construction of new processes
• Completion of modifications to the local database
• Movement or implementation of the solution to the appropriate testing or production environment
• Stability of the testing or production environment
• Load Discipline
• Maintaining recording standards and automated processes for the project
• Completion of manual testing through all applicable paths to ensure that reusable automated scripts
are valid

1.5 Entry Criteria

The ADTP is complete, excluding actual test results. The ADTP has been signed off by the appropriate sponsor representatives, indicating approval of the plan for testing.
The Problem Tracking and Reporting tool is ready for use. The Change Management and
Configuration Management rules are in place.
The environment for testing, including databases, application programs, and connectivity, has been defined, constructed, and verified.

1.6 Exit Criteria

In establishing the exit/acceptance criteria for Automated Testing during the System/Integration Phase of the test, the Project Completion Criteria defined in the Project Definition Document (PDD) provide a starting point. All automated test cases have been executed as documented, and the percentage of successfully executed test cases meets the defined criteria. Recommended criteria: no Critical or High severity problem logs remain open, all Medium problem logs have agreed-upon action plans, and the application has executed successfully to validate the accuracy of data, interfaces, and connectivity.

1.7 Pass/Fail Criteria

The results for each test must be compared to the pre-defined expected test results, as documented in
the ADTP (and DTP where applicable). The actual results are logged in the Test Case detail within the
Detail Test Plan if those results differ from the expected results. If the actual results match the
expected results, the Test Case can be marked as a passed item, without logging the duplicated results.
A test case passes if it produces the expected results as documented in the ADTP or Detail Test Plan
(manual test plan). A test case fails if the actual results produced by its execution do not match the
expected results. The source of failure may be the application under test, the test case, the expected



results, or the data in the test environment. Test case failures must be logged regardless of the source
of the failure.
Any bugs or problems will be logged in the DEFECT TRACKING TOOL.
The responsible application resource corrects the problem and tests the repair. Once this is complete,
the tester who generated the problem log is notified, and the item is re-tested. If the retest is successful,
the status is updated and the problem log is closed.
If the retest is unsuccessful, or if another problem has been identified, the problem log status is updated
and the problem description is updated with the new findings. It is then returned to the responsible
application personnel for correction and test.
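The comparison and logging flow above can be pictured in a short sketch. This is an illustrative Python outline only, under the assumption of simple string comparison; the names (TestCaseResult, log_defect) are hypothetical and do not represent the DEFECT TRACKING TOOL's actual interface.

    # Illustrative sketch of the pass/fail decision described above.
    from dataclasses import dataclass

    @dataclass
    class TestCaseResult:
        case_id: str
        expected: str   # expected results documented in the ADTP/DTP
        actual: str     # actual results observed during execution

    def evaluate(result: TestCaseResult) -> str:
        """Mark the case passed if actual matches expected; otherwise log a failure."""
        if result.actual == result.expected:
            # Expected and actual match: mark passed without duplicating the results.
            return "PASS"
        # A mismatch is logged regardless of whether the source of failure is the
        # application, the test case, the expected results, or the test data.
        log_defect(result.case_id, result.expected, result.actual)
        return "FAIL"

    def log_defect(case_id: str, expected: str, actual: str) -> None:
        # Placeholder for creating an entry in the DEFECT TRACKING TOOL.
        print(f"Defect logged for {case_id}: expected {expected!r}, got {actual!r}")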

Severity Codes are used to prioritize work in the test phase. They are assigned by the test group and are not modifiable by any other group. The standard Severity Codes used for identifying defects are:

Table 1 Severity Codes

Severity Code 1 – Critical: Automated tests cannot proceed further within the applicable test case (no workaround).

Severity Code 2 – High: The test case or procedure can be completed, but produces incorrect output when valid information is input.

Severity Code 3 – Medium: The test case or procedure can be completed and produces correct output when valid information is input, but produces incorrect output when invalid information is input. (For example, if the specification does not allow special characters, but the system lets the user continue after a special character is entered during the test, this is a Medium severity.)

Severity Code 4 – Low: All test cases and procedures passed as written, but minor revisions or cosmetic changes could be made. These defects do not impact functional execution of the system.
The use of the standard Severity Codes produces four major benefits:
• Standard Severity Codes are objective and can be easily and accurately assigned by those executing
the test. Time spent in discussion about the appropriate priority of a problem is minimized.
• Standard Severity Code definitions allow an independent assessment of the risk to the on-schedule
delivery of a product that functions as documented in the requirements and design documents.



• Use of the standard Severity Codes works to ensure consistency in the requirements, design, and
test documentation with an appropriate level of detail throughout.
• Use of the standard Severity Codes promotes effective escalation procedures.

Test Scope 3

The scope of testing identifies the items that will be tested and the items that will not be tested within the System/Integration Phase of testing.

1.8 Items to be tested by Automation

1. PRODUCT NAME
2. PRODUCT NAME
3. PRODUCT NAME
4. PRODUCT NAME
5. PRODUCT NAME

1.9 Items not to be tested by Automation

1. PRODUCT NAME
2. PRODUCT NAME



Test Approach 4
1.10 Description of Approach

The mission of Automated Testing is to identify recordable test cases through all appropriate paths of a website, create repeatable scripts, interpret test results, and report the results to project management. For the Generic Project, the automation test team will focus on positive testing and will complement the manual testing performed on the system. Automated test results will be generated, formatted into reports, and provided on a consistent basis to Generic project management.
System testing is the process of testing an integrated hardware and software system to verify that the
system meets its specified requirements. It verifies proper execution of the entire set of application
components including interfaces to other applications. Project teams of developers and test analysts
are responsible for ensuring that this level of testing is performed.
Integration testing is conducted to determine whether all components of the system are working together properly. This testing focuses on how well all parts of the web site hold together, whether the website works both internally and externally, and whether all parts of the website are connected. Project teams of developers and test analysts are responsible for ensuring that this level of testing is performed.
For this project, the System and Integration ADTP and Detail Test Plan complement each other.
Since the goal of the System and Integration phase testing is to identify the quality of the structure,
content, accuracy and consistency, response time and latency, and performance of the application, test
cases are included which focus on determining how well this quality goal is accomplished.
Content testing focuses on whether the content of the pages match what is supposed to be there,
whether key phrases exist continually in changeable pages, and whether the pages maintain quality
content from version to version.
Accuracy and consistency testing focuses on whether today’s copies of the pages download the same as
yesterday’s, and whether the data presented to the user is accurate enough.
Response time and latency testing focuses on whether the web site server responds to a browser request within certain performance parameters, whether response time after a SUBMIT is acceptable, and whether parts of the site are so slow that the user discontinues working. Although Loadrunner provides the full measure of this test, various ad hoc time measurements will be taken within certain Winrunner scripts as needed.
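The idea behind an ad hoc timing check is simply to timestamp a request and compare the elapsed time against a threshold. The sketch below is a Python illustration of that idea, not the Winrunner or Loadrunner mechanism; the URL and threshold are placeholders.

    import time
    import urllib.request

    def measure_response_time(url: str, threshold_seconds: float = 5.0) -> bool:
        """Return True if the page responds within the acceptable threshold."""
        start = time.monotonic()
        with urllib.request.urlopen(url) as response:   # placeholder URL supplied by caller
            response.read()                             # include download time in the measurement
        elapsed = time.monotonic() - start
        print(f"{url} responded in {elapsed:.2f}s")
        return elapsed <= threshold_seconds

    # Example (hypothetical test URL):
    # measure_response_time("http://xxxxx/xxxxx", threshold_seconds=5.0)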
Performance testing (Loadrunner) focuses on whether performance varies by time of day or by load
and usage, and whether performance is adequate for the application.
Completion of automated test cases is denoted in the test cases with indication of pass/fail and follow-
up action.



Test Definition 5
This section addresses the development of the components required for the specific test, including identification of the functionality to be tested by automation and the associated automated test cases and scenarios. The development of the test components parallels, with a slight lag, the development of the associated product components.

1.11 Test Functionality Definition (Requirements Testing)

The functionality to be tested by automation is listed in the Traceability Matrix, attached as an appendix.
For each function to undergo testing by automation, the Test Case is identified. Automated Test Cases
are given unique identifiers to enable cross-referencing between related test documentation, and to
facilitate tracking and monitoring the test progress.
As much information as is available is entered into the Traceability Matrix in order to complete the
scope of automation during the System/Integration Phase of the test.

1.12 Test Case Definition (Test Design)

Each Automated Test Case is designed to validate the associated functionality of a stated requirement.
Automated Test Cases include unambiguous input and output specifications. This information is
documented within the Automated Test Cases in Appendix 8.5 of this ADTP.

1.13 Test Data Requirements

The automated test data required for the test is described below. The test data will be used to populate the databases and/or files used by the application/system during the System/Integration Phase of the test. In most cases, the automated test data will be built by the OTS Database Analyst or the OTS Automation Test Analyst.

1.14 Automation Recording Standards

Initial Automation Testing Rules for the Generic Project:


1. Ability to move through all paths within the applicable system
2. Ability to identify and record the GUI Maps for all associated test items in each path
3. Specific times for loading into automation test environment



4. Code frozen between loads into automation test environment
5. Minimum acceptable system stability

1.15 Winrunner Menu Settings

1. Default recording mode is CONTEXT SENSITIVE


2. Record owner-drawn buttons as OBJECT
3. Maximum length of list item to record is 253 characters
4. Delay for Window Synchronization is 1000 milliseconds (unless Loadrunner is operating in the same environment, in which case the delay must be increased appropriately)
5. Timeout for checkpoints and CS statements is 1000 milliseconds
6. Timeout for Text Recognition is 500 milliseconds
7. All scripts will stop and start on the main menu page
8. All recorded scripts will remain short, because debugging is easier. However, entire scripts, or portions of scripts, can be chained together for long runs once the environment has greater stability.

1.16 Winrunner Script Naming Conventions

1. All automated scripts will begin with GE abbreviation representing the Generic Project and be filed
under the Winrunner on LAB11 W Drive/Generic/Scripts Folder.
2. GE will be followed by the Product Path name in lower case: air, htl, car
3. After the automated scripts have been debugged, a date for the script will be attached: 0710 for
July 10. When significant improvements have been made to the same script, the date will be
changed.
4. As incremental improvements are made to an automated script, version numbers will be attached to signify the script with the latest improvements, e.g., GEsea0710.1, GEsea0710.2; the .2 version is the most up to date. (A naming helper sketch follows this list.)
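A small helper can make the convention above mechanical. This is an illustrative sketch only; the prefix, date format, and example path codes are taken from the rules above, and the function name and path set are assumptions.

    from datetime import date

    VALID_PATHS = {"air", "htl", "car", "sea", "pond"}  # example product path codes, lower case

    def script_name(path: str, run_date: date, version: int | None = None) -> str:
        """Build a Winrunner script name such as GEsea0710 or GEsea0710.2."""
        if path not in VALID_PATHS:
            raise ValueError(f"unknown product path: {path}")
        name = f"GE{path}{run_date:%m%d}"          # GE prefix + path + MMDD date
        if version is not None:
            name += f".{version}"                  # highest version is the most up to date
        return name

    # Example: script_name("sea", date(2001, 7, 10), version=2) -> "GEsea0710.2"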

1.17 Winrunner GUIMAP Naming Conventions



1. All Generic GUI Maps will begin with GE followed by the area of test, e.g., GEsea. The GEpond GUI Map represents all Pond paths, the GEmemmainmenu GUI Map represents all membership and main menu concerns, and the GElogin GUI Map represents all GE login concerns.
2. As there can be only one GUI Map entry for each object on the site, the GUI Maps are under constant revision when the site is undergoing frequent program loads.

1.18 Winrunner Result Naming Conventions

1. When beginning a script, allow the default res## name to be filed.
2. After a successful run of a script whose results will be used toward a report, move the file to Results and rename it: GE for the project name, res for test results, 0718 for the date the script was run, your initials, and the original default number for the script, e.g., GEres0718jr.1. (A naming sketch covering result and report files follows section 1.19.)

1.19 Winrunner Report Naming Conventions

1. When the test result files for the day have been compiled and the statistics confirmed, a report that is accessible to upper management will be filed. The daily report file will be named as follows: GEdaily0718, where GE is the project name, daily indicates a daily report, and 0718 is the date the report was issued.
2. When the test result files for the week have been compiled and the statistics confirmed, a report that is accessible to upper management will be filed. The weekly report file will be named as follows: GEweek0718, where GE is the project name, week indicates a weekly report, and 0718 is the date the report was issued.
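The result and report names in sections 1.18 and 1.19 follow the same pattern-building approach as the script names. The sketch below is illustrative only; the helper names are assumptions, and the examples are those given above.

    from datetime import date

    def result_name(run_date: date, initials: str, default_no: int) -> str:
        """e.g. GEres0718jr.1: project, 'res', MMDD run date, tester initials, default number."""
        return f"GEres{run_date:%m%d}{initials}.{default_no}"

    def report_name(run_date: date, period: str = "daily") -> str:
        """e.g. GEdaily0718 or GEweek0718: project, report period, MMDD issue date."""
        if period not in ("daily", "week"):
            raise ValueError("period must be 'daily' or 'week'")
        return f"GE{period}{run_date:%m%d}"

    # Examples: result_name(date(2001, 7, 18), "jr", 1) -> "GEres0718jr.1"
    #           report_name(date(2001, 7, 18), "week") -> "GEweek0718"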

1.20 Winrunner Script, Result and Report Repository

1. LAB 11, located within the GE Test Lab, will “house” the original Winrunner Script, Results and Report Repository for automated testing within the Generic Project. WRITE access is granted to Winrunner technicians, and READ ONLY access is granted to those who are authorized to run scripts but not to make changes. This is meant to maintain the purity of each script version.



2. Winrunner on LAB11 W Drive houses all Winrunner-related documents for GE automated testing.
3. Project file folders for the Generic Project represent the initial structure of project folders utilizing
automated testing. As our automation becomes more advanced, the structure will spread to other
appropriate areas.
4. Under each Project file folder, a folder for SCRIPT, RESULT and REPORT can be found.
5. All automated scripts generated for each project will be filed under the Winrunner on LAB11 W Drive/Generic/Scripts Folder and moved to the ARCHIVE SCRIPTS folder as necessary.
6. All GUI MAPS generated will be filed under
Winrunner on LAB11 W Drive/Generic/Scripts/gui_files Folder.
7. All automated test results are filed under the individual Script folder after each script run. Results will be referred to and reports generated using the applicable statistics. Automated test results referenced by reports sent to management will be kept under the Winrunner on LAB11 W Drive/Generic/Results Folder. Before work begins on evaluating a new set of test results, all prior results are placed into the Winrunner on LAB11 W Drive/Generic/Results/Archived Results Folder. This ensures all reported statistics are available for closer scrutiny when required.
8. All reports generated from automated scripts and sent to upper management will be filed under the Winrunner on LAB11 W Drive/Generic/Reports Folder. (A path-layout sketch follows this list.)
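For reference, the repository layout described above can be captured as a set of path constants. The sketch is illustrative; the drive letter and folder spellings are taken from the list above, and the exact location of the archive folder is an assumption.

    from pathlib import PureWindowsPath

    # Winrunner repository on LAB11, W: drive (placeholder paths per the list above).
    GENERIC_ROOT = PureWindowsPath("W:/Generic")
    SCRIPTS_DIR = GENERIC_ROOT / "Scripts"                    # active scripts
    ARCHIVE_DIR = SCRIPTS_DIR / "ARCHIVE SCRIPTS"             # retired scripts (location assumed)
    GUI_MAPS_DIR = SCRIPTS_DIR / "gui_files"                  # GUI MAP files
    RESULTS_DIR = GENERIC_ROOT / "Results"                    # results referenced by reports
    ARCHIVED_RESULTS_DIR = RESULTS_DIR / "Archived Results"   # prior results kept for scrutiny
    REPORTS_DIR = GENERIC_ROOT / "Reports"                    # reports sent to management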

Test Preparation Specifications 6

1.21 Test Environment

Table 2 Environment for Automated Test

The Automated Test environment is indicated below. Existing dependencies are entered in the Comments column.

Environment | Test System | Comments
System/Integration Test (SIT) | Test – Cert | Access via http://xxxxx/xxxxx
Production | Production | Access via http://www.xxxxxx.xxx
Other (specify) | Development | Individual Test Environments



Table 3 Hardware for Automated Test

The following is a list of the hardware needed to create a production-like environment:

Manufacturer | Device Type
Various | Personal Computer (486 or higher) with monitor and required peripherals; with connectivity to internet test/production environments. Must be enabled to ADDITIONAL REQUIREMENTS.

Table 4 Software

The following is a list of the software needed to create a production-like environment:

Software | Version (if applicable) | Programmer Support
Netscape Navigator | ZZZ or higher |
Internet Explorer | ZZZ or higher |



1.22 Test Team Roles and Responsibilities

Table 5 Test Team Roles and Responsibilities

Role | Responsibilities | Name
COMPANY NAME Sponsor | Approve project development, handle major issues related to project development, and approve development resources | Name, Phone
Abacus Sponsor | Signature approval of the project, handle major issues | Name, Phone
Abacus Project Manager | Ensures all aspects of the project are being addressed from the CUSTOMER's point of view | Name, Phone
COMPANY NAME Development Manager | Manage the overall development of the project, including obtaining resources, handling major issues, approving the technical design and overall timeline, and delivering the overall product according to the Partner Requirements | Name, Phone
COMPANY NAME Project Manager | Provide the PDD (Project Definition Document), project plan, and status reports; track project development status; manage changes and issues | Name, Phone
COMPANY NAME Technical Lead | Provide technical guidance to the Development Team and ensure that overall development is proceeding in the best technical direction | Name, Phone
COMPANY NAME Back End Services Manager | Develop and deliver the necessary Business Services to support the PROJECT NAME | Name, Phone
COMPANY NAME Infrastructure Manager | Provide PROJECT NAME development certification, production infrastructure, service level agreement, and testing resources | Name, Phone
COMPANY NAME Test Coordinator | Develops ADTP and Detail Test Plans, tests changes, logs incidents identified during testing, coordinates the testing effort of the test team for the project | Name, Phone
COMPANY NAME Tracker Coordinator/Tester | Tracks SCRs in the DEFECT TRACKING TOOL; reviews new SCRs for duplicates and completeness and assigns them to Module Tech Leads for fixes; produces status documents as needed; tests changes and logs incidents identified during testing | Name, Phone
COMPANY NAME Automation Engineer | Tests changes, logs incidents identified during testing | Name, Phone



1.23 Test Team Training Requirements

Table 6 Automation Training Requirements

Training Requirement | Training Approach | Target Date for Completion | Roles/Resources to be Trained

1.24 Automation Test Preparation

1. Write and receive approval of the ADTP from Generic Project management
2. Manually test the cases in the plan to make sure they actually work before recording repeatable
scripts
3. Record appropriate scripts and file them according to the naming conventions described within this
document
4. The initial order of automated script runs will be to load GUI Maps through a STARTUP script. After the successful run of this script, scripts testing all paths will be kicked off. Once an appropriate number of PNRs are generated, GenericCancel scripts will be used to automatically take the inventory out of the test profile and system environment. During the automation test period, requests for testing of certain functions can be accommodated as necessary, as long as those functions can be tested by automation. (A driver sketch of this run order follows this list.)
5. The ability to use Generic Automation will be READ ONLY for anyone outside of the test group. This is required to maintain the pristine condition of the master scripts in our data repository.
6. Generic Test Group will conduct automated tests under the rules specified in our agreement for use
of the Winrunner tool marketed by Mercury Interactive.
7. Results filed for each run will be analyzed as necessary, reports generated, and provided to upper
management.
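The run order in step 4 can be pictured as a simple driver loop. This is an illustrative Python sketch only; GUI Map loading and script invocation are represented by hypothetical placeholder functions, not Winrunner calls, and the path script names are examples built from the naming convention.

    # Hypothetical driver illustrating the run order in step 4:
    # STARTUP (loads GUI Maps) -> path scripts -> GenericCancel cleanup.

    PATH_SCRIPTS = ["GEair0718", "GEhtl0718", "GEcar0718"]   # example script names
    CANCEL_SCRIPTS = ["GenericCancel"]

    def run_script(name: str) -> bool:
        """Placeholder for launching a recorded script and returning pass/fail."""
        print(f"running {name}")
        return True

    def run_automation_cycle() -> None:
        if not run_script("STARTUP"):            # load GUI Maps first
            raise RuntimeError("STARTUP script failed; aborting the cycle")
        for script in PATH_SCRIPTS:              # exercise every path
            run_script(script)
        for script in CANCEL_SCRIPTS:            # remove generated PNRs from the test profile
            run_script(script)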



Test Issues and Risks 7
1.25 Issues

The table below lists the known project testing issues to date. Upon sign-off of the ADTP and Detail Test Plan, this table will not be maintained; these issues and all new issues will be tracked through the Issue Management System, as indicated in the project's approved Issue Management Process.

Table 7 Issues

Issue | Impact | Target Date for Resolution | Owner
COMPANY NAME test team is not in possession of market data regarding what browsers are most in use in the CUSTOMER target market | Testing may not cover some browsers used by CLIENT customers | Beginning of Automated Testing during the System and Integration Test Phase | CUSTOMER TO PROVIDE
OTHER | | |

1.26 Risks

The table below identifies any high-impact or highly probable risks that may affect the success of the Automated testing process.

Table 8 Risk Assessment Matrix

Risk Area | Potential Impact | Likelihood of Occurrence | Difficulty of Timely Detection | Overall Threat (H, M, L)
1. Unstable Environment | Delayed start | HISTORY OF PROJECT | Immediately | DEPENDENT ON LIKELIHOOD
2. Quality of Unit Testing | Greater delays taken by automated scripts | Dependent upon quality standards of development group | Immediately |
3. Browser Issues | Intermittent delays | Dependent upon browser version | Immediately |



Table 9 Risk Management Plan

Risk Area | Preventative Action | Contingency Plan Action | Trigger | Owner
1. | Meet with Environment Group | | |
2. | Meet with Development Group | | |
3. | | | |

Appendices 8

1.1 Traceability Matrix

The purpose of the Traceability Matrix is to identify all business requirements and to trace each
requirement through the project's completion.
Each business requirement must have an established priority as outlined in the Business Requirements Document. The priorities are:
Essential – Must satisfy the requirement to be accepted by the customer.
Useful – Value-added requirement influencing the customer's decision.
Nice-to-have – Cosmetic, non-essential condition that makes the product more appealing.
The Traceability Matrix will change and evolve throughout the entire project life cycle. The
requirement definitions, priority, functional requirements, and automated test cases are subject to
change and new requirements can be added. However, if new requirements are added or existing
requirements are modified after the Business Requirements document and this document have been
approved, the changes will be subject to the change management process.
The Traceability Matrix for this project will be developed and maintained by the test coordinator. At
the completion of the matrix definition and the project, a copy will be added to the project notebook.



Functional Areas of Traceability Matrix

# | Functional Area | Priority
B1 | Pond | E
B2 | River | E
B3 | Lake | U
B4 | Sea | E
B5 | Ocean | E
B6 | Misc | U
B7 | Modify | E
L1 | Language | E
EE1 | End-to-End Testing | EE

Legend: B = Order Engine, L = Language, EE = End-to-End; E = Essential, U = Useful, N = Nice to have
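A traceability matrix row can be represented as a small record that ties a functional area and priority to its requirement and automated test cases. The sketch below is illustrative; the field names are assumptions, and the sample rows reuse entries from the table above.

    from dataclasses import dataclass, field

    @dataclass
    class TraceabilityRow:
        area_id: str                 # e.g. "B1" (B = Order Engine, L = Language, EE = End-to-End)
        functional_area: str         # e.g. "Pond"
        priority: str                # "E" essential, "U" useful, "N" nice-to-have
        requirement: str = ""        # business requirement text
        test_cases: list[str] = field(default_factory=list)   # e.g. ["AB1.1.1"]

    matrix = [
        TraceabilityRow("B1", "Pond", "E", test_cases=["AB1.1.1"]),
        TraceabilityRow("B3", "Lake", "U"),
    ]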
1.2 Definitions for Use in Testing

1.2.1 Test Requirement

A test requirement, or scenario, is a prose statement of requirements for the test. Just as there are high-level and detailed requirements in application development, there is a need to provide detailed requirements in the test development area.

1.2.2 Test Case

A test case is a transaction or list of transactions that will satisfy the requirements statement in a test
scenario. The test case must contain the actual entries to be executed as well as the expected results,
i.e., what a user entering the commands would see as a system response.

1.2.3 Test Procedure

Test procedures define the activities necessary to execute a test case or set of cases. Test procedures
may contain information regarding the loading of data and executables into the test system, directions
regarding sign in procedures, instructions regarding the handling of test results, and anything else
required to successfully conduct the test.
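The relationship between the three definitions above can be sketched as nested records: a procedure executes one or more cases, and each case satisfies a requirement. This is an illustrative outline only; the type and field names are assumptions.

    from dataclasses import dataclass, field

    @dataclass
    class TestRequirement:
        scenario: str                       # prose statement of what must be tested

    @dataclass
    class TestCase:
        requirement: TestRequirement
        entries: list[str]                  # actual commands/transactions to execute
        expected_results: str               # what a user entering the commands should see

    @dataclass
    class TestProcedure:
        cases: list[TestCase] = field(default_factory=list)
        setup_notes: str = ""               # data loads, sign-in directions, results handling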
1.3 Automated Test Cases

1.3.1 NAME OF FUNCTION Test Case

Project Name/Number: Generic Project / Project Request #    Date:
Test Case Description: Check that all drop-down boxes, fill-in boxes, and pop-up windows operate according to requirements on the main “Pond” web page.    Build #:    Run #:
Function / Module Under Test: B1.1    Execution Retry #:
Test Requirement # / Case #: AB1.1.1 (A for Automated)
Written by:
Goals: Verify that the Pond module functions as required.
Setup for Test: Access browser, go to …..
Pre-conditions: Login with name and password. When arrive at the Generic Main Menu…..

Step | Action | Expected Results | Pass/Fail | Actual Results if Step Fails
1 | Go to Pond and ….. | From the Generic Main Menu, click on the Pond gif and go to the Pond web page. Once on the Pond web page, check all drop-down boxes for appropriate information (e.g., Time: 7a, 8a in 1-hour increments), fill-in boxes (Remarks allows alpha and numeric characters but no other special characters), and pop-up windows (e.g., Privacy: ensure it is retrieved, has the correct verbiage, and closes). | |

Project Glossary 9
1.4 Glossary Reference

Term | Definition | Pertains to
CAT | Customer Acceptance Testing |
CR | Change Request |
DOC | Document Only Change | Problem Log
ADTP | Automated Detail Test Plan |
DTP | Detail Test Plan |
DUP | Duplicate | Problem Log
FIX | Fix incident resolved/repaired | Problem Log
MTP | Master Test Plan |
PDD | Project Definition Document |
PMO | Project Management Office |
PR | Project Request |
SIT | System/Integration Test |
SME | Subject Matter Expert |
UTR | Developer unable to recreate | Problem Log
WAD | Working As Designed | Problem Log


1.5 Sample Addresses for Testing

United States

International
1.6 Test Credit Card Numbers

American Express/Optima (AX)

Visa (BA)

Diners Club (DC)

Discover Card/Novus (DS)

MasterCard (IK)
