Prepared for: Project Name
Prepared by: Company Name
Date: July 17, 2001
DOCUMENT REVISION INFORMATION
DOCUMENT APPROVAL
This signature page indicates approval from the COMPANY NAME sponsor and the Client sponsor for the attached Abacus
Detail Test Plan for the PROJECT NAME. All parties have reviewed the attached document and agree with its contents.
[Signature and Date lines for each approver]
1 Introduction
2 Test Description
3 Test Scope
4 Test Approach
5 Test Definition
7 Test Issues and Risks
8 Appendices
9 Project Glossary
This Automated Testing Detail Test Plan (ADTP) identifies the specific tests to be performed to
ensure the quality of the delivered product. System/Integration Test ensures that the product
functions as designed and that all parts work together. This ADTP covers information for Automated
testing during the System/Integration Phase of the project and maps to the specification or
requirements documentation for the project. This mapping is done in conjunction with the Traceability
Matrix document, which should be completed along with the ADTP and is referenced in this document.
This ADTP refers to the specific portion of the product known as PRODUCT NAME. It provides
clear entry and exit criteria, and it identifies the roles and responsibilities of the Automated Test Team
so that the team can execute the test.
1.2 Test Identification
This ADTP is intended to provide information for System/Integration Testing for the PRODUCT
NAME module of the PROJECT NAME. The test effort may be referred to by its PROJECT
REQUEST (PR) number and its project title for tracking and monitoring of the testing progress.
1.3 Test Purpose and Objectives
Automated testing during the System/Integration Phase as referenced in this document is intended to
ensure that the product functions as designed, directly from customer requirements. The testing goal is
to assess the quality of the structure, content, accuracy and consistency, some response times and
latency, and performance of the application as defined in the project documentation.
Factors that may affect the automated testing effort and increase the risk associated with the
success of the test include:
• Completion of development of front-end processes
The ADTP is complete, excluding actual test results. The ADTP has been signed off by the appropriate
sponsor representatives, indicating approval of the plan for testing.
The Problem Tracking and Reporting tool is ready for use. The Change Management and
Configuration Management rules are in place.
The environment for testing, including databases, application programs, and connectivity has been
defined, constructed, and verified.
In establishing the exit/acceptance criteria for Automated Testing during the System/Integration
Phase of the test, the Project Completion Criteria defined in the Project Definition Document (PDD)
should provide a starting point. All automated test cases have been executed as documented, and the
percentage of successfully executed test cases meets the defined criteria. Recommended criteria: no
Critical or High severity problem logs remain open, all Medium problem logs have agreed-upon action
plans, and the application has been executed successfully to validate the accuracy of data, interfaces, and connectivity.
The results of each test must be compared to the pre-defined expected results, as documented in
the ADTP (and DTP where applicable). The actual results are logged in the Test Case detail within the
Detail Test Plan only if they differ from the expected results. If the actual results match the
expected results, the Test Case can be marked as a passed item without logging the duplicated results.
A test case passes if it produces the expected results as documented in the ADTP or Detail Test Plan
(manual test plan). A test case fails if the actual results produced by its execution do not match the
expected results. The source of a failure may be the application under test, the test case itself, or the
expected results as documented.
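The pass/fail rule above lends itself to a small illustration. The following is a minimal sketch, in Python, of how a result could be recorded so that actual results are logged only on failure; the TestCaseResult structure and its field names are hypothetical, not part of this ADTP.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class TestCaseResult:
        case_id: str
        expected: str                        # expected results documented in the ADTP/DTP
        passed: bool = False
        logged_actual: Optional[str] = None  # populated only when the case fails

    def record_result(case_id: str, expected: str, actual: str) -> TestCaseResult:
        """Mark the case passed when actual matches expected; log the actual
        results in the test case detail only when they differ."""
        result = TestCaseResult(case_id, expected)
        if actual == expected:
            result.passed = True           # pass: duplicated results are not logged
        else:
            result.logged_actual = actual  # fail: actual results are logged
        return result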
Severity Codes are used to prioritize work in the test phase. They are assigned by the test group and
are not modifiable by any other group. The following standard Severity Codes are used to identify
defects:
4 (Low): All test cases and procedures passed as written, but there could be minor revisions,
cosmetic changes, etc. These defects do not impact the functional execution of the system.
The use of the standard Severity Codes produces four major benefits:
• Standard Severity Codes are objective and can be easily and accurately assigned by those executing
the test. Time spent in discussion about the appropriate priority of a problem is minimized.
• Standard Severity Code definitions allow an independent assessment of the risk to the on-schedule
delivery of a product that functions as documented in the requirements and design documents.
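As an illustration only, the severity scale can be represented in code. The sketch below assumes the Critical/High/Medium/Low names used in the exit criteria earlier in this plan; only code 4 (Low) is described in full in the surviving table.

    from enum import IntEnum

    class Severity(IntEnum):
        """Standard severity codes, assigned by the test group only."""
        CRITICAL = 1
        HIGH = 2
        MEDIUM = 3
        LOW = 4   # minor revisions or cosmetic changes; no functional impact

    def blocks_exit(open_defect_severities) -> bool:
        """Recommended exit criteria: no Critical or High problem logs remain open."""
        return any(s <= Severity.HIGH for s in open_defect_severities)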
The scope of testing identifies the items that will be tested and the items that will not be tested
within the System/Integration Phase of testing.
Items to be tested:
1. PRODUCT NAME
2. PRODUCT NAME
3. PRODUCT NAME
4. PRODUCT NAME
5. PRODUCT NAME
Items not to be tested:
1. PRODUCT NAME
2. PRODUCT NAME
The mission of Automated Testing is to identify recordable test cases through all
appropriate paths of a website, create repeatable scripts, interpret test results, and report to
project management. For the Generic Project, the automation test team will focus on positive testing
and will complement the manual testing performed on the system. Automated test results will be
generated, formatted into reports, and provided on a consistent basis to Generic project management.
System testing is the process of testing an integrated hardware and software system to verify that the
system meets its specified requirements. It verifies proper execution of the entire set of application
components including interfaces to other applications. Project teams of developers and test analysts
are responsible for ensuring that this level of testing is performed.
Integration testing is conducted to determine whether or not all components of the system are working
together properly. This testing focuses on how well all parts of the web site hold together, whether
links inside and outside the website are working, and whether all parts of the website are connected. Project
teams of developers and test analysts are responsible for ensuring that this level of testing is performed.
For this project, the System and Integration ADTP and Detail Test Plan complement each other.
Since the goal of the System and Integration phase testing is to identify the quality of the structure,
content, accuracy and consistency, response time and latency, and performance of the application, test
cases are included which focus on determining how well this quality goal is accomplished.
Content testing focuses on whether the content of the pages match what is supposed to be there,
whether key phrases exist continually in changeable pages, and whether the pages maintain quality
content from version to version.
Accuracy and consistency testing focuses on whether today’s copies of the pages download the same as
yesterday’s, and whether the data presented to the user is accurate enough.
Response time and latency testing focuses on whether the web site server responds to a browser request
within certain performance parameters, whether response time after a SUBMIT is acceptable, and
whether parts of a site are so slow that the user discontinues working. Although LoadRunner provides
the full measure of this test, various ad hoc time measurements will be taken within certain
WinRunner scripts as needed.
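The ad hoc measurements mentioned above live inside WinRunner scripts; as a language-neutral illustration, the sketch below shows the general shape of such a check in Python. The 10-second threshold is a hypothetical value, not a figure from this plan.

    import time

    ACCEPTABLE_SECONDS = 10.0  # hypothetical threshold, not taken from this plan

    def timed_submit(do_submit):
        """Time a SUBMIT-style action and report whether the response
        time falls within the acceptable window."""
        start = time.perf_counter()
        do_submit()
        elapsed = time.perf_counter() - start
        return elapsed, elapsed <= ACCEPTABLE_SECONDS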
Performance testing (LoadRunner) focuses on whether performance varies by time of day or by load
and usage, and whether performance is adequate for the application.
Completion of automated test cases is denoted in the test cases with indication of pass/fail and follow-
up action.
1.11 Test Functionality Definition (Requirements Testing)
The functionality to be tested by automation is listed in the Traceability Matrix, attached as an appendix.
For each function to undergo automated testing, a Test Case is identified. Automated Test Cases
are given unique identifiers to enable cross-referencing between related test documentation and to
facilitate tracking and monitoring of the test progress.
As much information as is available is entered into the Traceability Matrix in order to establish the
scope of automation during the System/Integration Phase of the test.
1.12 Test Case Definition (Test Design)
Each Automated Test Case is designed to validate the associated functionality of a stated requirement.
Automated Test Cases include unambiguous input and output specifications. This information is
documented within the Automated Test Cases in Appendix 8.5 of this ADTP.
1.13 Test Data Requirements
The automated test data required for the test is described below. The test data will be used to populate
the databases and/or files used by the application/system during the System/Integration Phase of the
test. In most cases, the automated test data will be built by the OTS Database Analyst or the OTS
Automation Test Analyst.
1. All automated scripts will begin with the GE abbreviation, representing the Generic Project, and will be filed
under Winrunner on the LAB11 W Drive/Generic/Scripts Folder.
2. GE will be followed by the Product Path name in lower case: air, htl, car.
3. After an automated script has been debugged, a date will be attached to the script: 0710 for
July 10. When significant improvements have been made to the same script, the date will be
changed.
4. As incremental improvements are made to an automated script, version numbers will be
attached to identify the script with the latest improvements, e.g., GEsea0710.1, GEsea0710.2;
the .2 version is the most up-to-date.
1. When the accumulated test result files for the day have been formulated and the statistics
confirmed, a report will be filed that is accessible by upper management. The daily report file will be
named as follows: GEdaily0718 (GE for the project name, daily for the daily report, and 0718 for the date the
report was issued).
2. When the accumulated test result files for the week have been formulated and the statistics
confirmed, a report will be filed that is accessible by upper management. The weekly report file will
be named as follows: GEweek0718 (GE for the project name, week for the weekly report, and 0718 for the
date the report was issued).
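To make these conventions concrete, here is a minimal sketch of helper functions that produce names following the rules above. The function names are hypothetical and the helpers are illustrative only.

    from datetime import date

    def script_name(product_path: str, debugged: date, version: int = 1) -> str:
        """GE prefix + lower-case product path + MMDD date + version suffix.
        Example: script_name("sea", date(2001, 7, 10), 2) -> "GEsea0710.2"."""
        return f"GE{product_path.lower()}{debugged:%m%d}.{version}"

    def report_name(period: str, issued: date) -> str:
        """period is "daily" or "week".
        Example: report_name("daily", date(2001, 7, 18)) -> "GEdaily0718"."""
        return f"GE{period}{issued:%m%d}"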
1. LAB 11, located within the GE Test Lab, will "house" the original WinRunner script, results, and
report repository for automated testing within the Generic Project. WRITE access is granted to
WinRunner technicians, and READ ONLY access is granted to those who are authorized to run scripts
but not to make any improvements. This is meant to maintain the purity of each script version.
1.21 Test Environment
Various personal computers (486 or higher) with monitor and required peripherals, with
connectivity to the internet test/production environments. Must be enabled to
ADDITIONAL REQUIREMENTS.
Table 4 Software
The following is a list of the software needed to create a production-like environment:
Role | Responsibilities | Contact
COMPANY NAME Sponsor | Approve project development, handle major issues related to project development, and approve development resources | Name, Phone
Abacus Sponsor | Signature approval of the project, handle major issues | Name, Phone
Abacus Project Manager | Ensure all aspects of the project are being addressed from the CUSTOMERS' point of view | Name, Phone
COMPANY NAME Development Manager | Manage the overall development of the project, including obtaining resources, handling major issues, approving the technical design and overall timeline, and delivering the overall product according to the Partner Requirements | Name, Phone
COMPANY NAME Project Manager | Provide the PDD (Project Definition Document), project plan, and status reports; track project development status; manage changes and issues | Name, Phone
COMPANY NAME Technical Lead | Provide technical guidance to the Development Team and ensure that overall development is proceeding in the best technical direction | Name, Phone
COMPANY NAME Back End Services Manager | Develop and deliver the necessary Business Services to support the PROJECT NAME | Name, Phone
COMPANY NAME Test Coordinator | Develop the ADTP and Detail Test Plans, test changes, log incidents identified during testing, and coordinate the testing effort of the test team for the project | Name, Phone
COMPANY NAME Tracker Coordinator/Tester | Track SCRs in DEFECT TRACKING TOOL; review new SCRs for duplicates and completeness and assign them to Module Tech Leads for fixes; produce status documents as needed; test changes and log incidents identified during testing | Name, Phone
COMPANY NAME Automation Engineer | Test changes, log incidents identified during testing | Name, Phone
1.24 Automation Test Preparation
1. Write and receive approval of the ADTP from Generic Project management
2. Manually test the cases in the plan to make sure they actually work before recording repeatable
scripts
3. Record appropriate scripts and file them according to the naming conventions described within this
document
4. The initial order of automated script runs will be to load GUI Maps through a STARTUP script. After
the successful run of this script, scripts testing all paths will be kicked off. Once an appropriate
number of PNRs have been generated, GenericCancel scripts will be used to automatically take the
inventory out of the test profile and system environment (see the sketch after this list). During the
automation test period, requests for testing of certain functions can be accommodated as necessary,
as long as those functions can be tested by automation.
5. The ability to use Generic Automation will be READ ONLY for anyone outside of the test group.
This is required to maintain the pristine condition of the master scripts in our data
repository.
6. The Generic Test Group will conduct automated tests under the rules specified in our agreement for use
of the WinRunner tool marketed by Mercury Interactive.
7. Results filed for each run will be analyzed as necessary, reports generated, and provided to upper
management.
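The run order in step 4 can be summarized as a sketch. The run_script callable below is a hypothetical wrapper around whatever invokes a WinRunner script; nothing here is an actual WinRunner API.

    def run_automation_cycle(path_scripts, cancel_scripts, run_script):
        """Sketch of the step 4 run order: STARTUP first, then all path
        scripts, then the GenericCancel scripts to back out test inventory."""
        # 1. Load GUI maps through the STARTUP script; stop if it fails.
        if not run_script("STARTUP"):
            raise RuntimeError("STARTUP script failed; GUI maps not loaded")
        # 2. Kick off the scripts testing all paths.
        results = {name: run_script(name) for name in path_scripts}
        # 3. Once PNRs have been generated, take the inventory back out
        #    of the test profile and system environment.
        for name in cancel_scripts:
            run_script(name)  # e.g. the GenericCancel scripts
        return results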
The table below lists known project testing issues to date. Upon sign-off of the ADTP and Detail Test
Plan, this table will no longer be maintained, and these issues and all new issues will be tracked through the
Issue Management System, as indicated in the project's approved Issue Management Process.
Table 7 Issues

Issue | Impact | Target Date for Resolution | Owner
COMPANY NAME test team is not in possession of market data regarding what browsers are most in use in CUSTOMER target market. | Testing may not cover some browsers used by CLIENT customers. | Beginning of Automated Testing during System and Integration Test Phase | CUSTOMER TO PROVIDE
OTHER | | |
1.26 Risks
The table below identifies any high-impact or highly probable risks that may affect the success of the
Automated testing process.
1. Meet with Environment Group
2. Meet with Development Group
The purpose of the Traceability Matrix is to identify all business requirements and to trace each
requirement through the project's completion.
Each business requirement must have an established priority as outlined in the Business Requirements
Document. They are:
Essential - Must satisfy the requirement to be accepted by the customer.
Useful - Value-added requirement influencing the customer's decision.
Nice-to-have - Cosmetic non-essential condition, makes product more appealing.
The Traceability Matrix will change and evolve throughout the entire project life cycle. The
requirement definitions, priority, functional requirements, and automated test cases are subject to
change and new requirements can be added. However, if new requirements are added or existing
requirements are modified after the Business Requirements document and this document have been
approved, the changes will be subject to the change management process.
The Traceability Matrix for this project will be developed and maintained by the test coordinator. At
the completion of the matrix definition and the project, a copy will be added to the project notebook.
B1  Pond                E
B2  River               E
B3  Lake                U
B4  Sea                 E
B5  Ocean               E
B6  Misc                U
B7  Modify              E
L1  Language            E
EE1 End-to-End Testing

Legend: B = Order Engine; L = Language; EE = End-to-End; E = Essential; U = Useful; N = Nice-to-have
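As a minimal sketch, the sample matrix above could be represented as a simple mapping from requirement ID to description and priority code. The structure below is illustrative only, not a prescribed format for the project's matrix.

    # Priorities use the legend codes: E = Essential, U = Useful, N = Nice-to-have.
    TRACEABILITY = {
        "B1": ("Pond", "E"),
        "B2": ("River", "E"),
        "B3": ("Lake", "U"),
        "B4": ("Sea", "E"),
        "B5": ("Ocean", "E"),
        "B6": ("Misc", "U"),
        "B7": ("Modify", "E"),
        "L1": ("Language", "E"),
    }

    def essential_requirements():
        """IDs the customer must see satisfied in order to accept the product."""
        return [rid for rid, (_desc, pri) in TRACEABILITY.items() if pri == "E"]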
1.2 Definitions for Use in Testing
1.2.1 Test Requirement
A scenario is a prose statement of requirements for the test. Just as there are high-level and detailed
requirements in application development, there is a need to provide detailed requirements in the test
development area.
1.2.2 Test Case
A test case is a transaction or list of transactions that will satisfy the requirements statement in a test
scenario. The test case must contain the actual entries to be executed as well as the expected results,
i.e., what a user entering the commands would see as a system response.
1.2.3 Test Procedure
Test procedures define the activities necessary to execute a test case or set of cases. Test procedures
may contain information regarding the loading of data and executables into the test system, directions
regarding sign in procedures, instructions regarding the handling of test results, and anything else
required to successfully conduct the test.
1.3 Automated Test Cases
1.3.1 NAME OF FUNCTION Test Case
Test Case Description: Check that all drop-down boxes, fill-in boxes, and pop-up windows operate according to requirements on the main "Pond" web page.
Build #:
Run #:

Step | Action | Expected Results | Pass/Fail | Actual Results if Step Fails
1 | Go to Pond and ….. | From the Generic Main Menu, click on the Pond gif and go to the Pond web page. Once on the Pond web page, check all drop-down boxes for appropriate information (e.g., Time: 7a, 8a in 1-hour increments), fill-in boxes (remarks allow alpha and numeric characters but no other special characters), and pop-up windows (e.g., Privacy: ensure it is retrieved, has the correct verbiage, and closes). | |
1.4 Glossary Reference
CR Change Request
PR Project Request
United States
International
1.6 Test Credit Card Numbers
Visa (BA)
MasterCard (IK)