Incorrect calculations
o This is seen whenever mathematical functions and
operators are involved.
Inconsistent processing
o This refers to software that works correctly in only one
environment and cannot be ported to and used on
another platform.
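A minimal sketch of an "incorrect calculation" defect, using a hypothetical averaging function: integer division silently truncates the fractional part of the result.

```python
def average_buggy(scores):
    # Defect: "//" truncates, so the average of [1, 2] becomes 1 instead of 1.5
    return sum(scores) // len(scores)

def average_fixed(scores):
    # Correct: "/" keeps the fractional part
    return sum(scores) / len(scores)
```

The two functions differ by a single operator, which is typical of how calculation defects hide in otherwise plausible code.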
Introduction to DEFECTS
A defect is defined as
o A variance from the desired product quality.
o Any situation where the system does not behave as
indicated in the specification.
Error
o Mistake made by programmer.
o Human action that results in the software containing a
fault/defect
Defect/Fault
o Symptom of an error in a program
o Condition that causes the system to fail
o Synonymous with bug
Failure
o Incorrect program behavior due to fault in the program.
o Failure can be determined only with respect to a set of
requirement specifications.
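The error → fault → failure chain above can be illustrated with a hypothetical example: the programmer's mistake (error) leaves a wrong operator in the code (fault), and the fault shows up as wrong behavior only when the program runs against its specification (failure).

```python
# Error: the programmer mistakenly typed "-" instead of "+".
def add(a, b):
    return a - b   # fault (defect): wrong operator in the code

# Failure: observable only against the specification
# "add(a, b) returns the sum of a and b".
result = add(2, 3)         # the spec expects 5
failure = (result != 5)    # True: the fault has caused a failure
```

Note that the fault exists in the code whether or not it is ever executed; the failure only appears when the faulty path runs and its output is compared with the specification.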
Categories of defects
Defects generally fall into the following three categories:
Wrong
o The specification has been implemented incorrectly.
o This defect is a variance from customer/user
specification.
Missing
o A specification or a wanted requirement is missing from
the built product.
Extra
o A requirement included in the product that was not
specified.
o This is always a variance from specification.
Key Points
Testing Techniques
Testing Techniques can be classified into two categories:
Static testing
Dynamic testing
Static Testing
o Review
o Inspection
o Walkthrough
Dynamic Testing
It is not purely black-box testing, because the tester should
have knowledge of the internal design of the code
Test-case design techniques are used to build the test
data:
Error Guessing
Equivalence Partitioning
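Equivalence partitioning can be sketched with a hypothetical rule (ages 18–60 inclusive are valid): the input domain splits into three partitions, and one representative value per partition is enough to test each.

```python
# Hypothetical validity rule: ages 18-60 inclusive are valid.
def is_valid_age(age):
    return 18 <= age <= 60

# One representative value per equivalence partition:
partitions = {
    "below range (invalid)": 10,
    "within range (valid)": 35,
    "above range (invalid)": 75,
}
results = {name: is_valid_age(value) for name, value in partitions.items()}
```

Any other value from the same partition is assumed to behave the same way, which is what keeps the number of test cases small.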
Key Points
Functional Testing
Different types of input (test data) are given for testing the
functionality of the component.
Integration Testing
Top-down Testing
Bottom-up Testing
System Testing
Acceptance Testing
Alpha Testing
Beta Testing
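Top-down integration testing can be sketched as follows (all names are hypothetical): the high-level module is tested first, with a stub standing in for a lower-level module that is not yet finished.

```python
def tax_stub(amount):
    # Stub: returns a fixed, known value in place of real tax logic
    return 10.0

def total_price(amount, tax_fn):
    # High-level module under test; the tax module is injected
    return amount + tax_fn(amount)

# Integration test of the top module against the stub:
price = total_price(100.0, tax_stub)
```

Bottom-up testing is the mirror image: the low-level module is real and a small driver program calls it in place of the unfinished high-level module.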
Key Points
Smoke Testing
This test reveals any simple failures and ensures that the
system is ready for further testing
For Example :
o A smoke test is done to check whether a link or button
performs some action. If it does not, the test
condition fails
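A smoke test of this kind can be sketched as a loop over critical actions (the action here is a hypothetical stand-in for a real UI step): the run passes only if every action executes and produces some result.

```python
def login_button():
    # Hypothetical stand-in for a real UI action
    return "login page shown"

def smoke_test(actions):
    # Pass only if every action runs without raising and returns a result
    for action in actions:
        try:
            if action() is None:
                return "FAIL"
        except Exception:
            return "FAIL"
    return "PASS"

status = smoke_test([login_button])
```

If the smoke test fails, the build is rejected before any deeper testing is attempted.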
Compatibility Testing
For Example
o Test the application on different operating systems
such as Windows 2000 and XP, Macintosh, Linux and
UNIX.
o Test the web application on different browsers such as
Internet Explorer, Mozilla Firefox, Netscape Navigator
and Opera.
Usability Testing
Performance Testing
Exploratory Testing
Security Testing
Recovery Testing
Regression Testing
Key Points
Stages of STLC
Requirements Analysis
Illustration of RTM:
S.No   Requirements    Testcases
1      Login Page      Testcase001
2      Home Page       Testcase002
3      Inbox Page      Testcase003
4      Compose Page    Testcase004
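The RTM above can be held as a simple mapping from requirement to test cases, so that traceability gaps can be checked mechanically (a sketch, not a prescribed tool):

```python
# Requirements Traceability Matrix as a requirement -> test cases mapping
rtm = {
    "Login Page": ["Testcase001"],
    "Home Page": ["Testcase002"],
    "Inbox Page": ["Testcase003"],
    "Compose Page": ["Testcase004"],
}

# Requirements with no test case are coverage gaps:
uncovered = [req for req, cases in rtm.items() if not cases]
```

An empty `uncovered` list means every requirement is traced to at least one test case.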
Test Execution
Once the test cases are executed, the actual results are
obtained. Compare the expected result with the actual result
and then fill in the status of the test case.
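The execution step above can be sketched as a comparison loop (field names are illustrative): each test case's expected and actual results are compared, and the status column is filled in with Pass or Fail.

```python
def execute(test_cases):
    # Fill in the status of each test case from expected vs. actual
    for tc in test_cases:
        tc["status"] = "Pass" if tc["expected"] == tc["actual"] else "Fail"
    return test_cases

run = execute([
    {"id": "Testcase001", "expected": "Login OK", "actual": "Login OK"},
    {"id": "Testcase002", "expected": "Inbox shown", "actual": "Error 500"},
])
```

Failed test cases then feed into defect tracking, as described in the next stage.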
Defect Tracking
The defects are fixed and the tester re-tests the application to
ensure full resolution of the defects.
Key Points
Test Plan
For Example
o Features to be tested would be User required modules
with clear priority levels defined (H, M, and L).
o Features not to be tested can be features which are
not part of the current release
Approach
Testing Environment
Testing Methodologies
Deliverables
o Test Lead
Assumptions
Entry and Exit Criteria are the inputs and outputs for the
different stages of Testing.
For example: In Functional testing
o The entry criteria would be
Test environment should be ready.
White box testing should be completed
Test cases should be ready with test data.
o Exit criteria would be
Percentage of test execution, i.e., the proportion of
test cases that have passed.
Number of defects sorted by severity levels.
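The pass-percentage exit criterion can be sketched as a small check; the 95% threshold below is a hypothetical value, since the actual threshold is project-specific.

```python
def pass_percentage(passed, executed):
    # Percentage of executed test cases that passed
    return 100.0 * passed / executed

def exit_criteria_met(passed, executed, threshold=95.0):
    # Hypothetical exit criterion: pass rate must reach the threshold
    return pass_percentage(passed, executed) >= threshold

met = exit_criteria_met(passed=96, executed=100)
```

In practice the defect counts by severity would be checked alongside the pass rate before declaring the phase complete.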
Defect Tracking
Test Automation
Test automation is the part of the test plan which defines the
tools to be used for Testing, Defect tracking and other testing
activities in the test plan
It also describes the automation framework.
Templates
Key Points
S.No   Expected result   Actual result   Status   Comments
1
2
Author:
Reviewed by:
Last modified date:
The tester fixes all the review comments and gets the
test case approved by the test lead before execution
of the test.
Step no.   Comments   Severity   Author
           fixed      Major
Key Points
State      Changed by
New        Tester
Assigned   Dev. Lead
Open       Developer
Fixed      Developer
Retest     Test Lead
Closed     Tester
Reopen     Tester
States of Bugs
The different states of a bug can be New, Assigned, Open,
Fixed, Retest, Closed, Reopen, and Rejected.
New
o The bug is in the New state when the bug is detected
the first time.
o The tester logs the bug with the status as New in the
defect report
Assigned
o Here the bug is assigned to the developer to fix it.
o The development lead logs the status as Assigned in
the defect report
Open:
o The developer changes the status as Open when he
starts fixing the bug.
Fixed:
o Once the developer fixes the bug, he changes the
status as Fixed which is reviewed by the development
lead and it is forwarded to test lead.
Retest:
o The test lead changes the status as Retest and sends
it to tester to retest to check whether the bug is fixed.
Closed:
o The tester checks whether the defect is fixed or not. If
yes then the status is changed to Closed.
Reopen:
o If the defect is not fixed, the tester changes the status
to Reopen.
Rejected:
o The test lead reviews the bug and if the bug is not
valid then the state is changed to Rejected.
Defect id/name:
o Describes the name of the defect according to the
company standard.
o The naming convention varies from company to
company.
o Eg: cbo_hr_def_001.
Release name:
o Describes the name of the Release.
o Eg: Release_1, Release_2, Release_3
Project name:
o Describes the name of the project.
o Eg: CitiBankOnline.
Module name:
o Describes the name of the module.
o Eg: Sales.
Status:
o Describes the status of the bug.
o Eg: New, Open, Fixed, Closed.
Severity:
o Describes the impact of the defect on the business.
o Eg: critical/major/minor.
Priority:
o Describes the priority or urgency of fixing the defect.
o Eg: High/Medium/Low.
Environment:
o Describes the environment of where the defect was
found like operating system and browser name.
o Eg: Win XP, IE 6.0
Detailed description:
o Describes the steps on how the defect was detected.
Expected result:
o Describes the expected result based on client
requirements.
Actual result:
o Describes the actual result after the test condition was
executed.
Severity
It can be specified as
o Blocker or Show stopper
o Critical
o Major
o Minor
Priority
It is specified as
o P1 / P2 / P3 / P4
The duration between the bug being detected and the bug being
closed is called the software bug life cycle.
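The life cycle described above can be sketched as a set of allowed state transitions. The transition table is reconstructed from the state descriptions in this section (the Rejected branch from New is an assumption, since the text does not say which state it comes from).

```python
# Allowed bug-state transitions, per the descriptions above
transitions = {
    "New": ["Assigned", "Rejected"],
    "Assigned": ["Open"],
    "Open": ["Fixed"],
    "Fixed": ["Retest"],
    "Retest": ["Closed", "Reopen"],
    "Reopen": ["Assigned"],
}

def is_valid_transition(current, nxt):
    # A defect tracker would reject any move not listed above
    return nxt in transitions.get(current, [])
```

Modeling the life cycle this way prevents invalid moves such as jumping straight from Closed back to Open.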
Key Points
Verification Overview
Verification Methods
The following are some of the Verification methods:
o Inspection
o Walkthroughs
o Buddy Checks
The methods above are explained in the following slides
Inspection
This team finally produces an inspection report containing the
facts about the errors found.
Walkthroughs
Testing Environment
Buddy checks
Verification Activities
Verification activities include verifying the following documents
Functional Design
Internal Design
Code
Test Plans
Validation Overview
Validation Methods
The following are some of the Validation methods:
Equivalence partitioning
Boundary-value analysis
Error guessing
Syntax testing
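Boundary-value analysis complements equivalence partitioning by testing exactly at and immediately beside each boundary. Using the same hypothetical valid range of 18–60:

```python
# Hypothetical validity rule: ages 18-60 inclusive are valid.
def is_valid_age(age):
    return 18 <= age <= 60

# Values at and adjacent to each boundary of the valid range:
boundary_values = [17, 18, 19, 59, 60, 61]
results = [is_valid_age(v) for v in boundary_values]
```

Off-by-one defects (e.g. writing `<` instead of `<=`) are caught only by these boundary values, which is why they are tested in addition to one value per partition.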
Validation Activities
Unit testing
Integration testing
Usability testing
Function testing
Acceptance testing
etc.
Key Points
Use the following snap shots and write the test cases.
Specifications:
Agent Name:
Password:
o Should be mercury.
o Should not contain blank spaces
o Special characters not allowed
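The password specification above can be turned into checks like the following sketch. The fixed value "mercury" comes from the specification; the helper name and error messages are hypothetical.

```python
def password_errors(password):
    # Collect every specification rule the password violates
    errors = []
    if password != "mercury":
        errors.append("password must be 'mercury'")
    if " " in password:
        errors.append("blank spaces are not allowed")
    if not password.isalnum():
        errors.append("special characters are not allowed")
    return errors
```

Returning the full list of violations, rather than stopping at the first, makes each failed test case easier to report against the specification.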
Specifications:
Date of Flight:
o Date should be greater than the system date
o Date of the year should be less than 2038.
o Check the date with different possibilities.
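The date-of-flight rules above can be sketched as a validation function: the flight date must be later than the system date, and the year must be less than 2038. The function name is hypothetical, and a fixed "system date" is passed in so the example's results are stable.

```python
from datetime import date

def is_valid_flight_date(flight_date, today=None):
    # Flight date must be after the system date and before the year 2038
    today = today or date.today()
    return flight_date > today and flight_date.year < 2038

# Checked against a fixed system date for reproducibility:
system_date = date(2024, 1, 15)
ok = is_valid_flight_date(date(2024, 6, 1), today=system_date)
too_late = is_valid_flight_date(date(2040, 6, 1), today=system_date)
past = is_valid_flight_date(date(2023, 6, 1), today=system_date)
```

Test cases would then probe "different possibilities" as the specification says: past dates, the current date, far-future dates, and the 2038 boundary itself.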
Specifications:
Open order
Enter order no / customer name
Check whether you are getting the appropriate data.
Enter Valid Date of Flight, Fly from, Fly to and select the
Flight.