1. Introduction
1.1 Purpose
1.2 Area of Appliance
1.3 Responsibilities
2. SDP Phases and Testing activities
2.1 Testing during the Analysis and Planning phases
2.2 Testing during the Design phase
2.3 Testing during the Implementation phase
2.4 Testing during the Testing phase
2.5 Testing during the Deployment phase
2.6 QA activities during the Transition and Support phases
3. Methods and Test documents
3.1 Test Plan
3.2 Test Types
3.2.1 Unit Testing
3.2.2 Component Testing
3.2.3 Integration Testing
3.2.4 Regression Testing
3.2.5 User Interface Testing
3.2.6 Load and Performance Testing
3.2.7 Solution Documentation review
3.2.8 Automated Testing
3.3 Test Cases
3.3.1 Create Test scenarios
3.3.2 Identify Test cases
3.3.3 Identify Data Values to Test
3.4 Test Reports
3.4.1 Categories of software errors
3.4.2 Reporting and Describing an Issue
3.4.3 Test Reports
4. Exclusion points
5. Links and References
Change Comments
1. Introduction
1.1 Purpose
The purpose of this procedure is to outline the QA Team's activities and responsibilities in Q-company. The procedure outlines the phases and procedures of quality assurance and quality control activities that maintain high quality during all phases of the Software Development Process, in accordance with the rules of the Quality Management System. The QA Team activities include all quality assurance (QA) and all quality control (QC) activities.
The procedure defines test types and techniques, test documentation, and the test cycle and its activities. It also describes specific techniques, rules, norms and directions for the testing process. As a very important part of the QMS, the cycle of testing documentation is described, and the basic rules for the compilation, contents and maintenance of testing documentation are indicated.
In this document the term "testing" is used as a summary of the quality assurance and quality control activities during all phases of the SDP.
The Quality Manager ensures the effective quality assurance (process-focused activities) of all processes during the project lifecycle;
The QA Team ensures the high quality control (product-oriented activities) during the project lifecycle;
Middle Management
1.3. Responsibilities
The Quality Manager monitors the application of this procedure within the QA Team during all phases of the Software Development Process.
The Quality Manager controls the application of this procedure by reviewing and approving test documentation.
Effective control should also be performed during internal audits.
The Middle Management must include the quality control activities in the project tasks and must estimate the QA Team's specific tasks in the schedule.
The QA Team must follow the rules in this procedure, prepare the relevant documents according to the phases of the Software Development Process, and execute its own activities and tasks with high responsibility and diligence.
The project Team should prepare a design of the solution with maximum testability: the solution should have a set of components with encapsulated functionality and clearly defined inputs and expected outputs.
Perform Test Plan
The Test Plan describes the test techniques that should be performed during the Testing phase.
Define test structure and strategy
Define the test types that will be executed and their succession. During the Implementation phase, notes for test cases should be written down.
Acceptance Criteria
The Test Plan should define the acceptance criteria: the set of criteria (tests) for the successful acceptance of the solution by the QA Team (during the Testing phase) and by the Client (during the Deployment phase).
Analyze stability of acquisitions
Perform an initial stability test to avoid a bad acquisition.
Analyze customer data
Identify errors that the QA Team missed. Assess the severity of the errors that were missed or deferred. Develop an empirical basis for evaluating deferred errors. Justify the expense of necessary further testing.
Review the UI for consistency
The QA Team should ensure conformity with the User Interface standards. It is especially important to verify that the UI covers all client requirements and includes no extra steps.
Negotiate early testing environment
Outputs:
Test Plan
Integration testing
Integration testing is conducted to inspect how the components work together. The component groups can consist of two or more components, as well as the whole system. In this case, the QA Team should assure that the group of components interacts well and the whole component cluster works properly. As in the previous phase, the QA Team tests the component cluster in a testing environment identical to the production environment.
Functional and System testing
These testing tasks are most commonly called Independent Verification and Validation. The functional and system testing will be performed against all specifications, design documents and all clarifications from the Meeting Minutes documents. During these tasks, the QA Team should run tests such as the following:
- Requirements verification - compare the program's behavior against the specifications;
- Usability;
- Boundary conditions - check the program's response to all boundary and exceptional input values;
- State transitions - inspect whether the program switches correctly from state to state;
- Mainstream usage - test the program the way the QA Team expects the client to use it;
- Load and Performance - for a performance test, identify tasks and measure how long it takes to do each. A load test studies the behavior of the program when it is working at its limits: volume, stress, storage;
- Error recovery - verify all error messages listed in the documentation. Error handling code is among the least tested, so these should be among the QA Team's most fruitful tests;
- Security - inspect security, for example: how easy would it be for an unauthorized user to gain access?
- Compatibility - depending on the technical requirements, check that the program works compatibly with another program, or check the conversion between two versions of one program;
- Configuration - ensure that the program works with the range of system configurations (computers) and with the environments listed in the requirements documents.
The system testing should be performed in the testing environment. The testing environment is the one referenced in the requirements.
Regression testing
The QA Team should assure that the components work properly after any changes in the solution. This task includes the need to execute the tests with new versions of components again. In this case, the QA Team should re-execute the testing scenarios whenever a new component version is released. A regression test suite should be the minimum possible set of tests that covers an area of the program. It should cover as many aspects (sub-functions, boundary conditions, etc.) of the area as possible, with correct input values for the components.
Performance testing
The objective of performance testing is performance enhancement. It should determine which modules are executed most often or use the largest part of the computer resources; inspect both time and storage.
Deliverable Sign Off
After executing the planned set of tests, verifying that problems are fixed, and being confident in the deliverable's quality (functionality and completeness), the Team Leader and QA Team prepare a formal document with all known issues and send it to the TM. After review, the TM or an authorized person gives the sign-off for the Deliverable in written form (in free format).
Outputs:
Test Report
Deliverable Sign off;
2.5 Testing during the Deployment phase
Deployment testing
Deployment testing contains verification of the configuration and installation procedures.
User Documentation verification
The User Documentation includes the User Manual, Installation and Configuration Manual, and Administration Manual. During the deployment, the QA Team's task is to verify these documents for consistency and conformity with the Client's Requirements and Internal Standards.
Technical Documentation verification
The QA Team should review the Technical Documentation for completeness and conformity with the Client's Requirements and Internal Standards.
Acceptance Testing
The acceptance test should be short. It should test mainstream functions with mainstream data. The test will be performed by the QA Team as a final test before delivering the project to the client. In addition, the client will execute an acceptance test after successful deployment of the solution. This test should be executed in the production environment. With the acceptance testing, the Client validates the product delivery.
Outputs:
Final Test Report
Deliverable package;
In the initial version of the Test Plan the QA team can describe the following:
Prepare a structure for the test plan. Don't list boundary conditions yet, but make a place for them;
Describe Test Types;
Define acceptance criteria;
Describe Test items;
List the main features, commands and menu options, i.e., start preparing the function list;
Describe the reporting and bug fixing schema;
Describe Test deliverables;
Describe Project Environment prerequisites;
Include the actors' responsibilities.
In some cases, the Test Plan can be revised during the Deployment phase, but the initial revision of the Test Plan should be performed during the Design phase. All referenced documents and deliverables (input and output documents) should be indicated in this document.
The Test Plan is part of the Design documents and should be approved by the Quality Manager.
3.2.1 Unit/Component Testing
In unit testing the developers fill in input data and observe output data, but do not know, or pretend not to know, how the program works. The QA team looks for uncommon input data and conditions that might lead to uncommon outputs. Input data are interesting representatives of a class of possible inputs if they are the ones most likely to make an error in the program visible. The unit/component testing is the first cycle of the testing process; writing code and debugging is an iterative cycle of programming.
3.2.2 Component Testing
QA Team performs component integration testing after unit testing. In this test, the program is examined
as a set of components and the focus should be on the interfaces and relationships. It does not inspect a
particular component for proper functionality and proper working. For each interface and relationship of
the components, the QA team executes a specific test scenario to check how the interfaces between
components react to typical and uncommon inputs or environment events.
3.2.3 System Testing
The system testing is a more thorough release test. It provides an opportunity to review the solution functionality in detail before the solution deployment. The QA team must carefully compare the program, the project documentation, and the client's requirements. The system testing can be performed immediately after the component integration testing to inspect how the tested components work together in one system. The QA team should ensure proper functionality, including navigation, data entry, processing and retrieval for all related parts of the system.
3.2.4 Regression Testing
The regression testing is used in two different ways. Common for both is the idea of reusing old tests:
Imagine finding an error, fixing it and then repeating the test that made the issue visible in the first
place. Under these circumstances, the regression testing is done to make sure that the fixed
code works properly;
Imagine making the same fix, and testing it, but then executing a standard series of tests to make
sure that the change did not disturb anything else.
Both types of the regression tests are executed whenever errors are fixed.
3.2.5 User Interface Testing
The User Interface Testing is a specific testing routine. When testing the UI the QA team should measure the following items:
Functionality - does the UI fit the client's requirements;
UI reaction time - is the UI reaction time within admissible bounds;
Availability - does the UI work adequately in all possible cases;
Usability - the UI should have intuitive navigation and no extra steps;
Look & feel - does the UI match the project Design standards (Client's and Internal).
3.2.6 Load and Performance Testing
The Performance tests identify tasks and measure how long it takes to do each. One objective of performance testing is performance enhancement. These tests can determine which modules are executed most often or use the largest part of the computer resources (time and storage). These modules should be examined in detail and rewritten to run more quickly. The solution's bad performance may reflect bugs, especially when the client expects high performance of some modules or components.
3.2.7 Solution Documentation Review
The Solution Documentation can include: User Documentation, Technical Documentation and Design Documents. During the Deployment phase, the QA team should review and verify all documents related to the project. The User and Technical documentation should be verified for correspondence with the internal standard IS.IML.001.01 - Guidelines for Solution Documentation.
3.2.8 Automated Testing
Most tests can be executed using specific test tools. For a particular project or test type, the QA team can use different tools to automate the testing process. Automated testing consists of writing specific commands and scripts, which are executed to simulate a specific test: system, load, performance, etc.
It is strongly recommended to prepare test scripts and use them for regression testing. Automated testing saves much manual testing time, reduces the risk in software development and reduces mistakes made by the QA team during the testing process.
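For illustration, a scripted regression check can be as small as a table of cases that is re-executed against every new build. The sketch below is a hypothetical Python example; `calculate_discount` is an invented stand-in for any component under test and is not part of this procedure:

```python
def calculate_discount(price, percent):
    """Hypothetical component under test: rejects out-of-range input."""
    if not 0 <= percent <= 100:
        raise ValueError("discount percent out of range")
    return round(price * (100 - percent) / 100, 2)

# Scripted regression cases, re-run whenever a new component version is released:
# ((inputs), expected output) - mainstream value plus both boundaries.
REGRESSION_CASES = [
    ((200.0, 10), 180.0),   # mainstream value
    ((200.0, 0), 200.0),    # lower boundary of the discount range
    ((200.0, 100), 0.0),    # upper boundary of the discount range
]

def run_regression_suite():
    """Run every case; collect (inputs, passed) instead of stopping at the first failure."""
    results = []
    for args, expected in REGRESSION_CASES:
        try:
            passed = calculate_discount(*args) == expected
        except Exception:
            passed = False
        results.append((args, passed))
    return results
```

Keeping the cases in a plain data table makes it cheap to append a new row each time a fixed bug needs a permanent regression check.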
3.3.1 Create Test Scenarios
The QA Team should review the use-case description in detail, identify each combination of main and alternate flows (the test scenarios) and create a scenario matrix. The table below shows an example of a partial scenario matrix.
Scenario Name | Starting Flow | Alternate
Scenario 1 - Successful registration | Basic Flow |
Scenario 2 - Unidentified user | Basic Flow | A1
Scenario 3 - User quits | Basic Flow | A2
Scenario 4 - Course catalog system unavailable | Basic Flow | A4
Scenario 5 - Registration closed | Basic Flow | A5
Scenario 6 - Cannot enroll | Basic Flow | A3

3.3.2 Identify Test Cases
The QA team can identify test cases by analyzing the test scenarios. For some scenarios a few cases
can exist; for others only one. The next step in fleshing out the test cases is to read the use-case
description and find the conditions or data elements required to execute the various scenarios.
The QA team creates a Test case matrix. Here is an example:
Test Case ID | Scenario/Condition | User ID | Password | Courses selected | Prerequisites fulfilled | Course Open | Schedule Open | Expected Result
RC1 | Scenario 1 - successful registration | V | V | V | V | V | V | Schedule and confirmation number displayed
RC2 | Scenario 2 - unidentified student | I | N/A | N/A | N/A | N/A | N/A | Error message; back to login screen
RC3 | Scenario 3 - valid user quits | V | V | N/A | N/A | N/A | N/A | Login screen appears
RC4 | Scenario 4 - course registration system unavailable | V | V | N/A | N/A | N/A | N/A | Error message; back to step 2
RC5 | Scenario 5 - registration closed | V | V | N/A | N/A | N/A | N/A | Error message; back to step 2
RC6 | Scenario 6 - cannot enroll, course full | V | V | V | V | I | V | Error message; back to step 3
RC7 | Scenario 6 - cannot enroll, prerequisite not fulfilled | V | V | V | I | V | V | Error message; back to step 4
RC8 | Scenario 6 - cannot enroll, schedule conflict | V | V | V | V | V | I | Error message; back to step 4

V indicates valid;
I indicates invalid;
N/A means that it is not necessary to supply a data value in this test case.
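A matrix like this maps directly onto a data-driven test. The following Python sketch is purely illustrative: `register` is a hypothetical stand-in for the real registration component, its messages are taken from the matrix, and only three rows are encoded:

```python
def register(user_id_valid, password_valid, course_open):
    """Hypothetical stand-in for the registration component under test."""
    if not user_id_valid or not password_valid:
        return "Error message; back to login screen"
    if not course_open:
        return "Error message; back to step 3"
    return "Schedule and confirmation number displayed"

# Each entry mirrors one row of the test case matrix:
# case ID -> ((user ID valid, password valid, course open), expected result).
TEST_CASE_MATRIX = {
    "RC1": ((True, True, True), "Schedule and confirmation number displayed"),
    "RC2": ((False, True, True), "Error message; back to login screen"),
    "RC6": ((True, True, False), "Error message; back to step 3"),
}

def execute_matrix():
    """Run every matrix row and report pass/fail per test case ID."""
    return {case_id: register(*inputs) == expected
            for case_id, (inputs, expected) in TEST_CASE_MATRIX.items()}
```

Adding a scenario then means adding a row to the data table, not writing a new test function.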
3.3.3 Identify Data Values to Test
The test cases cannot be implemented or executed without test data. In accordance with the above table, the QA team should substitute actual data values for the valid (V) and invalid (I) indicators.
For test data identification and test case design, it is most important to use the following techniques:
Equivalence class analysis;
Boundary analysis;
Testing state transitions;
Testing race conditions and other time equivalence testing;
Draw a line
The QA team does not need to try all possible values to find data errors during testing. The first task is to analyze the data types and to define their boundaries. Using boundary values can uncover many error cases. The following examples present an analysis of data entry with valid and invalid equivalence classes, boundaries and special cases:
Field | Valid Equivalence Classes | Invalid Equivalence Classes | Boundaries and Special Cases | Notes
Problem Report # | 0 - 999999 | 1. < 0; 2. > 999999; 3. Duplicate report #; 4. Any non-digits | 1. 0; 2. 999999; 3. Enter a record after 999999; 4. -00001 | Required field
Program | Up to 36 printable chars. <Return>, <Tab>, leading <Space> not OK. Must match a name in reference file PROGRAMS.DAT | 1. Contains non-printing chars (like control chars); 2. No matching name in reference file | |
Version | 12 printable characters; UNKNOWN OK | Any other | |
Report Type | 1 digit, range 1 - 6 | 1. < 1; 2. > 6; 3. Not a number | 1. 0; 2. 7; 3. /; 4. : |
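The Report Type row above translates directly into a validation check whose equivalence classes become test data. A minimal sketch in Python (the validator `is_valid_report_type` is a hypothetical illustration, not part of the procedure):

```python
def is_valid_report_type(value):
    """Hypothetical validator for the Report Type field: one digit, range 1-6."""
    return len(value) == 1 and value.isdigit() and 1 <= int(value) <= 6

# Valid equivalence class: one digit in range 1-6 (test both edges).
VALID_CASES = ["1", "6"]
# Invalid classes and boundary/special cases from the table:
# just below and above the range, the ASCII neighbors of the digits, non-digits.
INVALID_CASES = ["0", "7", "/", ":", "x", ""]
```

A handful of representative values per class is enough; every other member of a class should behave the same way.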
The valid input range is 1 to 99. Use 0 and 100 as tests of invalid input.
If a program writes checks from $1 to $99, can you make it write a check for a negative amount, for $0, or for $100? Maybe you can't, but try.
If the program expects an uppercase letter, give it A and Z. Test @ because its ASCII code is just below the code for A, and [, the character just beyond Z. Try a and z too.
If the program prints lines from one dot to four inches long, try one-dot lines and four-inch lines. Try to make it print a line that's four inches and one dot-width long. Try to make it attempt a zero-dot line.
If the inputs should sum to 180, feed the program values that sum to 179, 180 and 181.
If the program needs a specific number of inputs, give it that many, one more, and one fewer.
If the program gives you menu options B, C and D, test each of them and test A and E too.
Try sending your file to the printer just before and after someone else has sent theirs.
When reading from or writing to a disk file, check the first and last characters in the file. (Can you read past the file's end?)
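All of these examples follow one pattern: exercise both edges of the valid range and the first value on either side of each edge. That pattern can be captured in a small helper (an illustrative sketch, not a prescribed tool):

```python
def boundary_values(low, high):
    """Return the classic boundary test set for an integer range [low, high]:
    just below the range, both edges, and just above the range."""
    return [low - 1, low, high, high + 1]

# Valid input range 1-99: test 0, 1, 99 and 100.
# The program should reject 0 and 100 and accept 1 and 99.
range_1_to_99 = boundary_values(1, 99)
```

The same helper covers the other examples: a sum of 180 gives 179/180/181, menu options B-D give A and E, and so on.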
The state transitions can be much more complex than menu-to-menu. For testing interactions between paths, the QA team should select paths through the program as follows:
Test all paths that the actors are particularly likely to follow;
If the QA team has any reason to suspect that choices at one menu level or data entry screen may affect the presentation of choices elsewhere, the QA team can test the effects of those choices or entries;
Along with conducting the most urgent types of tests, as described above, the QA team can try a few random paths through the solution. The QA team selects different random paths in each test cycle.
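One way to exercise such paths is to encode the menu transitions as a table and walk random paths through it, failing as soon as an undefined state is reached. The sketch below is hypothetical; the menu names are invented for illustration:

```python
import random

# Hypothetical menu transition table: state -> {choice: next state}.
TRANSITIONS = {
    "main": {"open": "editor", "settings": "settings"},
    "editor": {"save": "main", "close": "main"},
    "settings": {"back": "main"},
}

def random_path(start="main", steps=10, seed=None):
    """Walk a random path through the transition table, recording each visited state.
    Reaching a state that is missing from the table signals a control flow error."""
    rng = random.Random(seed)  # seeding makes a failing path reproducible
    state, path = start, [start]
    for _ in range(steps):
        assert state in TRANSITIONS, f"undefined state reached: {state}"
        choice = rng.choice(sorted(TRANSITIONS[state]))
        state = TRANSITIONS[state][choice]
        path.append(state)
    return path
```

Using a fresh seed in each test cycle gives the "different random paths" the procedure calls for, while a recorded seed lets a failing path be replayed exactly.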
The QA team can try to guess all possible errors in order to inspect the error handling and to check for the presence and correctness of error messages. The QA team should include the expected outputs after errors appear in the test cases.
3.4.1 Categories of Software Errors
In this section, some rules and recommendations for the QA team are described.
Boundary-related errors
The simplest boundaries are numeric, but the largest and smallest amounts of memory that a program can cope with are boundaries too. This means that common boundary conditions can be used.
Calculation errors
Remember that simple arithmetic is difficult and error-prone in some languages.
Initial and later states
Bear in mind that a function might not fail only the first time the QA team uses it.
Control flow errors
A control flow error occurs when the program does the wrong thing next.
Errors in handling or interpreting data
A set of data might be passed back and forth many times. In the process, it might be corrupted or
misinterpreted.
Race conditions
The classic race is between two events. If A comes first, the program works. If B happens before A,
the program fails because it expected A to always occur before B.
Load conditions
The program may misbehave when overloaded.
Hardware
Programs send bad data to devices, ignore error codes coming back, and try to use devices that are
busy or arent there.
Source and version control
Old problems reappear if the developers use an old version of one subroutine with the latest version
of the rest of the program.
Documentation
Poor documentation can lead users to believe that the software is not working correctly.
Testing errors
Test errors are among the most common errors discovered during testing. If the program leads the
QA team to make mistakes, it has design problems, but test errors are test data too.
3.4.2 Reporting and Describing an Issue
This section presents instructions to the QA team for issues/errors reporting to cover:
Correct test documents;
Good communication and synchronization between the QA and Developers Teams;
Effective tracking of tasks;
Obligatory and optional items, included in issue description.
When describing and reporting issues the QA team should follow these instructions:
Before reporting any issue, be sure of its existence; if necessary, check several times. Don't describe mere suspicions of an error's appearance;
Every issue or error should be described in the plainest, most exact and comprehensible style possible;
Make the error description reproducible. This is most important, because if the Developers can't reproduce it, it is the same as if it does not exist. If an error is not reproducible, write that in your report by all means;
Describe step-by-step the sequence in which the error arises;
Describe in detail; don't be stingy in the description, but don't be wasteful either;
Use visual tools if necessary; this makes the description easier to understand.
The following items are obligatory for the issue/error description:
Area, screen, page or component in which the error appears;
Issue category - for example: error/bug, improvement, enhancement, change, etc.;
Error priority - for example: ASAP, high, medium, low;
Issue state - for example: identified, in progress, finished, verified, closed, etc.;
Solution build number, version or release;
Reported by - person name and project role;
Assigned to - person name and project role.
The following items are optional and recommendable for the QA team:
Attachments - doc files, images, etc.;
Issue/error severity - for example: critical, major, normal, minor, etc.;
Suggested fix;
Opinion or idea about where the mistake can be found and how to debug it;
Additional comments.
The following example presents how to describe an issue/error:
Project: ABCD
Version (Build): 1.0.4
Priority: High
Severity: Critical
Attachments:
1. Invalid_discount_error.jpg - a screenshot of the result after the error occurs;
2. Invalid_characters.txt - includes all invalid characters for the discount field, according to the design;
3. correct_messages.txt - includes the correct error messages that need to be shown if the error appears
Functional area: EstimateDetails.aspx, discount field
Error summary (name, title): Invalid value validation
Description: Add an invalid value (character) in the discount field. For example, add a negative number or a non-numeric character. Click on the Save button. The system crashes, cannot save the data and shows no error message.
Date: 12/01/2006
Reported by: QA Engineer name
Assigned to: Developer name
Comments: The discount field is described in the design as a float data type 2:2. All invalid characters should be caught.
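Where the ITS is scripted or reports are exchanged between tools, the obligatory items listed above map naturally onto a structured record that can be checked for completeness. A hypothetical Python sketch (the field names are illustrative, not the actual ITS schema):

```python
# Obligatory items from the procedure, as record fields (names are illustrative).
OBLIGATORY_FIELDS = {
    "functional_area", "category", "priority", "state",
    "build", "reported_by", "assigned_to",
}

def missing_fields(issue):
    """Return the set of missing obligatory fields; an empty set means complete."""
    return OBLIGATORY_FIELDS - set(issue)

# The example issue above, encoded as a record; "severity" is an optional item.
example_issue = {
    "functional_area": "EstimateDetails.aspx, discount field",
    "category": "error/bug",
    "priority": "high",
    "state": "identified",
    "build": "1.0.4",
    "reported_by": "QA Engineer",
    "assigned_to": "Developer",
    "severity": "critical",
}
```

Such a check can run when a report is submitted, so incomplete issues are bounced back to the reporter instead of the Developers.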
3.4.3 Test Reports
This section describes some rules and recommendations on how the QA team should compose Test Reports.
During the Implementation phase the QA Team describes Issues/Bugs in Q-company's internal Issue Tracking System (ITS) and prepares Issue Reports based on the information in the ITS. During the Testing phase the QA team may write many reports for different cases or test types. During the Deployment phase, the QA team should write Summary Reports, which may be Build or Release Reports.
The substance of the Issue reports is described in the previous chapter (Reporting and Describing an Issue).
The Summary Reports should contain the following items:
Project name;
Version number;
Date of build/release;
Reported by - person name and project role;
Summary (name, title);
Test type;
Description, tables, figures;
Test Results;
Comments.
Items for which there is no data or which are not a subject of the performed test may not be filled in.
For specific projects, during the Testing phase the QA team can use and fill in Test Cycle Complete
Reports, which can be used for detailed analyses. The following is a simple example of the contents of
these reports:
Program: ABCD
Version: 1.0

 | Unresolved Problems Before this Version | New Problems | Resolved | Remaining Problems
Fatal | 8 | 10 | 9 | 9
Serious | 48 | 12 | 16 | 44
Minor | 80 | 15 | 14 | 81
Total | 136 | 37 | 39 | 134

Resolution for this Version: Fixed: 22; Deferred: 6; Irreproducible: 5; Other: 6.
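The Remaining Problems figure in such a report is derived from the other three columns (remaining = unresolved before this version + new - resolved), so it can be computed and cross-checked rather than filled in by hand. A small illustrative sketch using the numbers from the example report:

```python
def remaining_problems(before, new, resolved):
    """Remaining = unresolved before this version + new problems - resolved."""
    return before + new - resolved

# Per-category figures from the example report: (before, new, resolved).
ROWS = {
    "Fatal":   (8, 10, 9),
    "Serious": (48, 12, 16),
    "Minor":   (80, 15, 14),
}

def report_totals(rows):
    """Compute the Remaining column per category plus the Total row."""
    remaining = {name: remaining_problems(*vals) for name, vals in rows.items()}
    remaining["Total"] = sum(remaining.values())
    return remaining
```

Recomputing the derived column catches transcription errors before the report is circulated.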
At the end of the Deployment phase, after the Code Freeze, the QA team can write a Final Summary Test Report (optional). This Test Report can contain a summary of the performed tests, the number of bugs found and fixed, etc.
4. Exclusion points
The client's requirements for testing methodology, standards and procedures have high priority. If both sides agree on a specific testing methodology different from the one described in this procedure, the QA Team must change the application of the testing methodology or perform another testing methodology in conformity with the client's requirements.