
SREE CORP

Testing Fundamentals
Testing Terms

Testing:
Testing is the process of trying to discover every conceivable fault or weakness in a work product
Quality Control:
A set of activities designed to evaluate a developed work product.
Quality Assurance:
A set of activities designed to ensure that the development and/or maintenance process is
adequate to ensure a system will meet its objectives.
In simple words
* QUALITY CONTROL measures the quality of a product
* QUALITY ASSURANCE measures the quality of processes used to create a quality product.
Tester’s Goal:
The goal of a software tester is to find bugs, find them as early as possible, and make sure they
get fixed
Defect/Software bug :
A software bug (or just "bug") is an error, flaw, mistake, failure, or fault in a computer program
that prevents it from behaving as intended

Cost of Defects

Cost of Defects :
 The cost of correcting an error in a software
product increases dramatically at each new
phase in the product’s development life
cycle
 The cost is maximized if the error is
detected after the product is shipped to the
customer
 The cost is minimized if the error is detected
in the phase where it is introduced
 It is known as the 1:10:100 Rule: the cost to
fix a defect increases exponentially the
later in the development lifecycle it is
identified.
 A defect caught in requirements phase costs
a factor of 1 (1x) to fix.
 A defect caught in construction costs 10
times as much as in requirements.
 A defect caught in production costs up to 100
times as much as in requirements.
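The 1:10:100 rule can be turned into a small worked example. A minimal sketch in Python, where the per-phase defect counts are invented purely for illustration:

```python
# Relative cost factors from the 1:10:100 rule
# (requirements : construction : production).
COST_FACTOR = {"requirements": 1, "construction": 10, "production": 100}

# Hypothetical defect counts caught in each phase (made-up numbers).
defects_found = {"requirements": 30, "construction": 15, "production": 5}

# Total relative cost: 30*1 + 15*10 + 5*100 = 680 cost units.
total = sum(COST_FACTOR[phase] * count for phase, count in defects_found.items())
print(f"Total relative cost: {total} units")
```

Catching the same 50 defects entirely in the requirements phase would have cost only 50 units, which is the point the rule makes.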

Distribution of Defects

Distribution of Defects
 Specifications are the largest defect
producer, and the next largest source of
defects is design
 Studies show that about 56% of defects can be
traced back to incorrect requirements, which
underscores the need for exhaustive
verification of requirements.

SDLC MODELS

Software Products go through several stages as they mature from Initial Concept to
finished Product
The sequence of Stages is called a Life Cycle
SDLC – Systems Development Life Cycle, also called the Software Development Life
Cycle
It is a framework that describes the activities performed at each stage of a software
development project
A few important SDLC models are
• Big Bang Model
• Code-and-Fix Model
• Waterfall Model
• Spiral Model
• Prototype Model
• Agile Methodologies*

Big Bang Model

The big-bang model for software development follows much the same principle as
shown in Figure B. A huge amount of matter (people and money) is put together, a lot
of energy is expended—often violently—and out comes the perfect software product…or
it doesn't.
The beauty of the big-bang method is that it's simple. There is little (if any) planning,
scheduling, or formal development process. All the effort is spent developing the
software and writing the code.
It's an ideal process if the product requirements aren't well understood and the final
release date is flexible. It's also important to have very flexible customers, because
they won't know what they're getting until the very end.

SDLC – Code-and-Fix

A team using this approach usually starts with a rough idea of what they want, does
some simple design, and then proceeds into a long repeating cycle of coding, testing,
and fixing bugs. At some point they decide that enough is enough and release the
product.
It is suitable only for project work, programming assignments, and short projects that
do not require maintenance
After several fixes the code becomes poorly structured, and eventually it no longer
fulfills the needs

SDLC – Waterfall Model

The waterfall model is the classical model of
development for both hardware and software. This
model is frequently called the conventional model. The
project is expected to progress down the (primary) path
through each of the phases of development
(requirements, design, coding and unit test,
integration, and maintenance), with deliverables at
each stage (software requirements specification, design
documents, actual code and test cases, final product,
product updates).
The waterfall model moves down a series of steps, starting
from an initial idea to a final product.
Waterfall performs well for products with clearly
understood requirements or when working with well
understood technical tools, architectures and
infrastructures.
The waterfall model is simple but often unworkable, as each stage
must be completed before the next one begins
(backing up to address mistakes is difficult).

SDLC – Waterfall Model (2)
Strengths :
• Easy to understand, easy to use
• Provides structure to inexperienced staff
• Milestones are well understood
• Sets requirements stability
• Good for management control (plan, staff, track)
• Works well when quality is more important than cost or schedule

Weaknesses :
• All requirements must be known upfront
• Deliverables created for each phase are considered frozen – inhibits flexibility
• Can give a false impression of progress
• Does not reflect the problem-solving nature of software development – iterations of phases
• Integration is one big bang at the end
• Little opportunity for the customer to preview the system (until it may be too late)

When to Use :
Requirements are very well known
Product definition is stable
Technology is understood
New version of an existing product
Porting an existing product to a new platform

SDLC – Spiral Model

(Figure: spiral model diagram)

SDLC – Spiral Model (2)
Spiral model consists of iterative cycles
Spiral model can be considered a generalization of other process models
Each cycle consists of four steps:
Step 1
 Identify the objectives (for example: performance, functionality,
ability to accommodate change)
 Identify the alternative means of implementing this portion of the
product (for example: different designs, reuse, buy)
 Identify the constraints imposed on the application of the alternatives (for example: cost, schedule)
Step 2
 Evaluate the alternatives relative to objectives and constraints.
 Evaluate the risks involved with each alternative
 Resolve the risks using prototyping, simulation, benchmarking, requirements analysis, etc.
Step 3
 Develop and verify the product
 Product could be the software requirements specification, the design specification, etc.
Step 4
 Plan the next phase
 Depending on the next-phase this could be a requirements plan, an integration and test plan, etc.

SDLC – Spiral Model (3)
Strengths :
• Provides early indication of insurmountable risks, without much cost
• Users see the system early because of rapid prototyping tools
• Critical high-risk functions are developed first
• The design does not have to be perfect
• Users can be closely tied to all lifecycle steps
• Early and frequent feedback from users
• Cumulative costs assessed frequently

Weaknesses :
• Time spent evaluating risks is too large for small or low-risk projects
• Time spent planning, resetting objectives, doing risk analysis and prototyping may be excessive
• The model is complex
• Risk assessment expertise is required
• Spiral may continue indefinitely
• Developers must be reassigned during non-development phase activities
• May be hard to define objective, verifiable milestones that indicate readiness to proceed through the next iteration

When to Use :
When creation of a prototype is appropriate
When costs and risk evaluation is important
For medium to high-risk projects
Long-term project commitment unwise because of potential changes to economic priorities
Users are unsure of their needs
Requirements are complex
New product line
Significant changes are expected (research and exploration)

V-Model

(Figure: V-model diagram)

V-Model (2)

Strengths :
• Emphasizes planning for verification and validation of the product in early stages of product development
• Each deliverable must be testable
• Project management can track progress by milestones
• Easy to use

Weaknesses :
• Does not easily handle concurrent events
• Does not handle iterations or phases
• Does not easily handle dynamic changes in requirements
• Does not contain risk analysis activities

When to Use :
Excellent choice for systems requiring high reliability – hospital patient control applications
All requirements are known up-front
When it can be modified to handle changing requirements beyond analysis phase
Solution and technology are known

Verification & Validation

Verification :
 Verification is the process of evaluating a system or component to determine whether the
products of a given development phase satisfy the conditions imposed at the start of that
phase
 Does not require execution of the code
 Verification ensures the product is designed to deliver all functionality to the customer;
 It typically involves reviews and meetings to evaluate documents, plans, code, requirements
and specifications; this can be done with checklists, issues lists, walk-through and inspection
meetings.
Validation :
 Validation is the process of evaluating a system or component during or at the end of the
development process to determine whether it satisfies specified requirements
 Involves executing the actual software or a simulated mockup
 Validation ensures that functionality, as defined in requirements, is the intended behavior of
the product; validation typically involves actual testing and takes place after verifications are
completed.

SDLC Phases - Tester Involvement
Planning : Defining the system to be developed and scope of the project
(Note : In general Tester is not involved in this phase)
Requirements (Analysis) : Developing and analyzing the requirements
 Requirements Verification
 Test Strategy & Testing Estimates
Design : High & Low Level design
 Design Verification
 Test Plan & Test Scenarios
 Environment Setup
Development : Coding
 Test Script Creation & Verification
 Requirements Mapping & Traceability
Testing
Deployment : Product Release
 Production Checkout
Maintenance : Post Production support
 Project Handover to Support
 Post release analysis Report

Requirements
Requirement :
 A requirement is a condition or capability needed by a user to solve a problem or achieve an
objective
 Sometimes referred to as a Spec or Product Spec
Software Requirements Specification (SRS):
 It defines the product being created, detailing what it will be, how it will act, what it
will do and what it won’t do.
Requirement Verification :
 Requirement Verification is the activity to determine whether the Requirements are ready to
move forward into design
Properties of Good Requirements
 Correct (match customer needs)
 Possible (feasible)
 Necessary (rather than nice-to-have)
 Prioritized (very important, important, optional)
 Unambiguous (user’s language )
 Concise
 Verifiable (testable, measurable)
 Complete – have all significant requirements
 Consistent – all documents internally consistent
 Changeable – changes are a fact of life
 Traceable – a requirement can be followed from its source to its fulfillment in design and code.

Requirements (2) - Test Strategy
Test Strategy :
 The Test Strategy is the first document QA should prepare for any project. This is a living
document that should be maintained/updated throughout the project.
 The Test Strategy is a high-level document that details how we are going to approach the testing in
terms of people, tools, procedures and support. This document can vary based on the project.
Components in the Test Strategy are as follows:
1. Scope and objective
2. Business issues
3. Roles and responsibilities
4. Communication and status reporting
5. Test deliverables
6. Test approach
7. Test automation and tools
8. Testing measurements and metrics
9. Risks and mitigation
10. Defect reporting and tracking
11. Change and configuration management
12. Training plan

Requirements (3) – Testing Estimates

Estimate : An estimate is the prediction of the effort associated with the project scope, given a set
of assumptions
- The answer to “how long will it take to …?” after thinking through all possible scenarios and
considering realistic assumptions
Estimation approaches
Metrics-Based Approach:
A useful approach is to track past experience of an organization's various projects and the
associated test effort that worked well for projects. Once there is a set of data covering
characteristics for a reasonable number of projects, then this 'past experience' information can be
used for future test project planning. For each particular new project, the 'expected' required test
time can be adjusted based on whatever metrics or other information is available, such as function
point count, number of external system interfaces, risk levels of the project, etc.
Test Work Breakdown Approach:
Another common approach is to decompose the expected testing tasks into a collection of small
tasks for which estimates can, at least in theory, be made with reasonable accuracy. This of course
assumes that an accurate and predictable breakdown of testing tasks and their estimated effort is
feasible. In many large projects, this is not the case. For example, if a large number of bugs are
being found in a project, this will add to the time required for testing, retesting, bug analysis and
reporting. It will also add to the time required for development, and if development schedules and
efforts do not go as planned, this will further impact testing.
Percentage-of-Development Approach:
A common rule of thumb: testing effort is estimated at 40–60% of the programming effort.

Design

Functional Design / External Design / High Level Design :
 Functional design is the process of translating user requirements into external interfaces;
the output of the process is the functional design specification
Technical Design/ Internal Design /Low Level Design
 Technical design is the process of translating the functional specification into a detailed set of
data structures, data flows, and algorithms. The output of the process is the technical design
specification
Design Verification :
 Design Verification is the process to verify the correctness and completeness of the Design
Components
Environment :
 Test Environments include hardware and software requirements including dependencies, stubs
and test drivers required for execution of testing cycles
 Ultimate goal is to replicate the testing configuration and settings as close as possible to the
Production Environment
 Types of Environments
1) Development Environment
2) Test Environment
a) Integration Test Environment
b) End to End or System test Environment
c) Load test Environment
d) Pre Production Environment
3) Production Environment

Design (2)

Test Plan :
 The Test Plan lays out execution details, with timelines, for the defined approach to be taken to
test the application under test
1. Project scope/objective
2. Introduction
3. Test items
4. Features to be tested
5. Features not to be tested
6. Approach
7. Testing tasks
8. Suspension criteria
9. Features pass or fail criteria
10. Test environment (Entry criteria, Exit criteria)
11. Test deliverables
12. Staff and training needs
13. Responsibilities
14. Detailed Schedule
Notes :
1) Test Strategy Defines “how” to test and Test plan Defines “when” to test
2) In small scale projects Test Strategy and Test Plan may be merged into one document. When
it is combined the approach/strategy will be a section in the test plan

Some Testing Definitions

Test Case (IEEE-610) - A set of test inputs, execution conditions, and expected results developed
for a particular objective
Test Plan: A detail of how the test will proceed, who will do the testing, what will be tested, in
how much time the test will take place, and to what quality level the test will be performed.
Test Design Specification: A detail of the test conditions and the expected outcome. This
document also includes details of how a successful test will be recognized.
Test Case Specification: A detail of the specific data that is necessary to run tests based on the
conditions identified in the previous stage.
Test Procedure Specification: A detail of how the tester will physically run the test, the physical
set-up required, and the procedure steps that need to be followed.
Test Item Transmittal Report: A detail of when specific tested items have been passed from one
stage of testing to another.
Test Log: A detail of which test cases were run, who ran the tests, in what order they were run,
and whether or not individual tests passed or failed.
Test Incident Report: A detail of the actual versus expected results of a test, when a test has
failed, and anything indicating why the test failed.

Reference : http://en.wikipedia.org/wiki/IEEE_829
Some Testing Definitions (2)

Test Scenario : A set of test cases that ensure that the business process flows are tested from end
to end. They may be independent tests or a series of tests that follow each other, each dependent
on the output of the previous one
Test Script: Commonly used to refer to the instructions for a particular test that will be carried
out by an automated test tool.
Test suite: A collection of test scenarios and/or test cases that are related or that may cooperate
with each other
In simple words
Test Case : A set of test data, test programs and expected results.
Test Scenario : A set of test cases.
Test Suite : A collection of test cases and/or test scenarios.
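The test case / scenario / suite hierarchy can be sketched with Python's unittest module; the login function and the credential values below are made-up placeholders, not from the slides:

```python
import unittest

def login(user, password):
    """Stand-in for the feature under test (hypothetical)."""
    return user == "alice" and password == "secret"

class LoginScenario(unittest.TestCase):
    """A scenario: related test cases covering the login flow."""
    def test_valid_credentials(self):      # test case 1
        self.assertTrue(login("alice", "secret"))
    def test_wrong_password(self):         # test case 2
        self.assertFalse(login("alice", "oops"))

# A suite is a collection of test cases and/or scenarios run together.
suite = unittest.TestLoader().loadTestsFromTestCase(LoginScenario)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```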

Development

Coding : Coding is the process of translating the detailed internal design specification into a
specific set of directions: code.
Test Case Creation : Requirements are the primary basis for test case creation; the functional
design and other documents are referred to for more clarity
Methods used for Test Case Creation :
- Equivalence Partitioning : A test case design technique for a component in which test
cases are designed to execute representatives from equivalence classes.
(Equivalence Class: A portion of a component's input or output domains for which the
component's behavior is assumed to be the same from the component's specification)
- Boundary Value analysis : In boundary value analysis, test cases are generated using the
extremes of the input domain, e.g. maximum, minimum, just inside/outside boundaries,
typical values, and error values. BVA is similar to Equivalence Partitioning but focuses on
"corner cases".
- Error Guessing : An approach , guided by Intuition and experience to identify tests which are
presumed likely to expose errors. The basic idea is to make a list of possible errors or error
prone situations
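To make equivalence partitioning and boundary value analysis concrete, here is a sketch; the validate_age function and its 18–60 rule are invented for illustration, not taken from any specification:

```python
def validate_age(age):
    """Hypothetical rule under test: accept ages 18..60 inclusive."""
    return 18 <= age <= 60

# Equivalence partitioning: one representative value per equivalence class.
assert validate_age(10) is False    # invalid class: below range
assert validate_age(35) is True     # valid class: inside range
assert validate_age(70) is False    # invalid class: above range

# Boundary value analysis: values at and just around each boundary.
for age, expected in [(17, False), (18, True), (19, True),
                      (59, True), (60, True), (61, False)]:
    assert validate_age(age) is expected, f"age {age}"
```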

Development (2)

Test Script or Test Case Verification : The purpose of the test script verification process is to
ensure the quality of test scripts and satisfactory requirements coverage
Techniques followed for Test case verification
Test Coverage : Various ways in which we can measure the completeness of testing with respect
to requirements & functionality , The primary method is Traceability matrix
(Traceability Matrix: A document showing the relationship between Test Requirements and test
Cases)
Peer Reviews: Each script is reviewed by one other Testing Team member
Walk-through : A walk-through is an informal meeting for evaluation or informational purposes. A
walk-through is also a process at an abstract level: the process of inspecting software code by
following paths through the code (as determined by input conditions and choices made along the
way). The purpose of a code walk-through is to ensure the code fits its purpose.
Inspection : An inspection is a formal meeting, more formalized than a walk-through and typically
consists of 3-10 people including a moderator, reader (the author of whatever is being reviewed)
and a recorder (to make notes in the document). The subject of the inspection is typically a
document, such as a requirements document or a test plan. The purpose of an inspection is to find
problems and see what is missing, not to fix anything. The result of the meeting should be
documented in a written report. Attendees should prepare for this type of meeting by reading
through the document, before the meeting starts; most problems are found during this
preparation. Preparation for inspections is difficult, but is one of the most cost-effective methods
of ensuring quality, since bug prevention is more cost effective than bug detection

Note : Peer Reviews , Walk Thru , Inspection meetings are the common techniques for any kind of
verifications
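One minimal way to represent a traceability matrix is a mapping from requirement IDs to the test cases that cover them; every identifier below is invented:

```python
# Traceability matrix: requirement ID -> test cases covering it.
traceability = {
    "REQ-001": ["TC-01", "TC-02"],
    "REQ-002": ["TC-03"],
    "REQ-003": [],                # gap: no coverage yet
}

# Coverage check: flag requirements that have no test cases.
uncovered = [req for req, cases in traceability.items() if not cases]
print("Requirements without coverage:", uncovered)   # ['REQ-003']
```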

Testing

Black Box Testing
 Black-box test design treats the system as a "black box", so it doesn't explicitly use knowledge
of the internal structure. Black-box test design is usually described as focusing on testing
functional requirements. Synonyms for black-box include: behavioral, functional, opaque-box,
and closed-box.
In simple words
Black box testing Not based on any knowledge of internal design or code. Tests are based on
requirements and functionality.
White Box Testing
 White-box test design allows one to peek inside the "box", and it focuses specifically on using
internal knowledge of the software to guide the selection of test data. Synonyms for white-box
include: structural, glass-box and clear-box.
In simple words
White box testing is based on knowledge of the internal logic of an application’s code. Tests
are based on coverage of code statements, branches, paths, conditions.
Gray Box Testing:
 A combination of Black Box and White Box testing methodologies: testing a piece of software
against its specification but using some knowledge of its internal workings.
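A toy illustration of the black-box/white-box distinction; the absolute function is a made-up unit under test:

```python
def absolute(x):
    """Unit under test: specified to return |x|."""
    return -x if x < 0 else x

# Black-box tests: derived only from the specification "returns |x|",
# with no knowledge of the implementation.
assert absolute(5) == 5
assert absolute(-5) == 5

# White-box tests: chosen by looking at the code's structure so that
# both branches of the `x < 0` decision are executed (branch coverage).
assert absolute(-1) == 1    # x < 0 branch
assert absolute(0) == 0     # else branch
```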

Testing (2)

Testing Phases
– These are the phases of the testing life cycle in which test execution happens
– These phases are executed in parallel to the development phases and relate to the level at
which the system/application under test is exercised
– The Testing Phases are
• Unit Testing
• Integration Testing
• System Testing
• Load Testing
• User Acceptance Testing
Testing Types
– These are techniques/methods of testing the application in a given phase of testing
– One type of testing can happen across different phases
– The testing types include Smoke Testing, Stability Testing, Break Testing, etc.

Testing Phases

Unit or Module or Component Testing :
 Module or Unit Testing is the process of testing individual components (sub-programs or
procedures of a program)
 Conducted by application developers
 Logic Coverage :
a) Statement Coverage
b) Decision (branch) Coverage
c) Condition Coverage
d) Path Coverage

(Figure: the Driver calls the Unit Under Test, which in turn calls a Stub; the driver also provides
access to non-local variables.)

Testing a functional unit under test :
 Driver : The driver module transmits test cases (in the form of input arguments) to the unit x
and either prints or interprets the result produced by x
 Stub : A stub module simulates the function of a module called by x
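The driver/stub arrangement can be sketched as follows; the order_total unit and the price table are hypothetical:

```python
# Unit under test: in production it would call a real price-lookup
# module, which is passed in here so a stub can replace it.
def order_total(items, fetch_price):
    return sum(fetch_price(item) for item in items)

# Stub: simulates the called module with fixed, predictable answers.
def fetch_price_stub(item):
    return {"pen": 2, "book": 10}[item]

# Driver: transmits test inputs to the unit and interprets the result.
def driver():
    result = order_total(["pen", "book"], fetch_price_stub)
    assert result == 12, f"expected 12, got {result}"
    return result

driver()
```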

Testing Phases (2)

Integration testing :
 Integration testing : Integration testing is the Process of Combining and Testing multiple
components together
 Non-Incremental (Big Bang) Integration : Unit test each program independently and
combine all the components (at once) to form the program and test the integrated result
 Incremental Integration : Unit test the next program component after combining it with the
set of previously-tested components
- Bottom Up – Integration : Integration begins with the terminal modules of the execution
hierarchy
- Top – Down Integration : Integration begins with the top module in the execution
hierarchy

System testing :
 A type of testing to confirm that all code modules work as specified, and that the system as a
whole performs adequately on the platform on which it will be deployed.
 Testing that attempts to discover defects that are properties of the entire system rather than
of its individual components.
 It covers all combined parts of a system.

Testing Phases (3)

Load Testing:
 Load Testing : Testing an application under heavy but expected loads is known as load
testing.
 Performance Testing: Testing the system performance for response time of transactions
 Stress Testing : Stress testing is subjecting a system to an unreasonable load while denying it
the resources (e.g., RAM, disc, mips, interrupts, etc.) needed to process that load. The idea
is to stress a system to the breaking point in order to find bugs that will make that break
potentially harmful. The system is not expected to process the overload without adequate
resources, but to behave (e.g., fail) in a decent manner (e.g., not corrupting or losing data).
Bugs and failure modes discovered under stress testing may or may not be repaired depending
on the application, the failure mode, consequences, etc. The load (incoming transaction
stream) in stress testing is often deliberately distorted so as to force the system into resource
depletion.
 Volume Testing: Volume testing is used to determine if the system under test can handle the
required amounts of data, user requests, etc
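A minimal load-test sketch using Python threads; handle_request and the 50-request/5-thread load are invented stand-ins, not a real load-testing tool:

```python
import threading
import time

def handle_request(i):
    """Stand-in for the operation under load (hypothetical)."""
    time.sleep(0.001)          # pretend each request takes ~1 ms

def load_test(n_requests=50, n_threads=5):
    """Drive the system with a heavy but expected concurrent load and time it."""
    def worker(start):
        for i in range(start, n_requests, n_threads):
            handle_request(i)
    t0 = time.perf_counter()
    threads = [threading.Thread(target=worker, args=(t,)) for t in range(n_threads)]
    for th in threads:
        th.start()
    for th in threads:
        th.join()
    return time.perf_counter() - t0

elapsed = load_test()
print(f"Processed 50 requests in {elapsed:.3f} s")
```

A real load test would use a dedicated tool and measure response-time percentiles rather than a single elapsed total.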
User Acceptance Testing : Acceptance testing is the process of comparing the end product to
the current needs of its end users. It mainly validates that the system supports the business or
operational needs, from the user perspective, agreed upon during the Definition Phase
Alpha Testing : Alpha test is usually performed by end users inside the developing company but
outside the development/testing organization
Beta Testing: Beta test is usually performed by a selected subset of actual customers outside the
company, before the software is made available to all customers

Testing Types

Functional testing:
– Testing the features and operational behavior of a product to ensure they correspond to its
specifications.
– Testing that ignores the internal mechanism of a system or component and focuses solely on
the outputs generated in response to selected inputs and execution conditions.
End-to-end testing:
Similar to system testing; the ‘macro’ end of the test scale; involves testing of a complete
application environment in a situation that mimics real-world use, such as interacting with a
database, using network communications, or interacting with other hardware, applications, or
systems if appropriate.
Sanity testing:
Typically an initial testing effort to determine if a new software version is performing well enough
to accept it for a major testing effort. For example, if the new software is crashing systems every
5 minutes, bogging down systems to a crawl, or destroying databases, the software may not be in a
’sane’ enough condition to warrant further testing in its current state.
Regression testing:
Re-testing after fixes or modifications of the software or its environment. It can be difficult to
determine how much re-testing is needed, especially near the end of the development cycle.
Automated testing tools can be especially useful for this type of testing.
Usability testing:
Testing for ‘user-friendliness’. Clearly this is subjective, and will depend on the targeted end-user
or customer. User interviews, surveys, video recording of user sessions, and other techniques can
be used. Programmers and testers are usually not appropriate as usability testers.

Testing Types (2)

Install/uninstall testing:
Testing of full, partial, or upgrade install/uninstall processes.
Recovery testing:
Testing how well a system recovers from crashes, hardware failures, or other catastrophic
problems.
Security testing:
Testing how well the system protects against unauthorized internal or external access, willful
damage, etc; may require sophisticated testing techniques.
Compatibility testing:
Testing how well software performs in a particular hardware/software/operating
system/network/etc. environment.
Exploratory testing:
Often taken to mean a creative, informal software test that is not based on formal test plans or
test cases; testers may be learning the software as they test it.
Ad-hoc testing:
Similar to exploratory testing, but often taken to mean that the testers have significant
understanding of the software before testing it.
Comparison testing:
Comparing software weaknesses and strengths to competing products.

Standards

Why standards? The use of standards simplifies communication, promotes consistency and
uniformity, and eliminates the need to invent yet another solution to the same problem
IEEE : 'Institute of Electrical and Electronics Engineers' – among other things, creates standards
such as 'IEEE Standard for Software Test Documentation' (IEEE/ANSI Standard 829), 'IEEE Standard
for Software Unit Testing' (IEEE/ANSI Standard 1008), 'IEEE Standard for Software Quality Assurance
Plans' (IEEE/ANSI Standard 730), and others.
• ANSI : 'American National Standards Institute', the primary industrial standards body in the U.S.;
publishes some software-related standards in conjunction with the IEEE and ASQ (American Society
for Quality).
• ISO 9000 : A set of international standards for both quality management and quality assurance
that has been adopted by over 90 countries worldwide. The ISO 9000 standards apply to all types of
organisations, large and small, and in many industries.
ISO 9001 covers design and development. ISO 9002 covers production, installation and
service, and ISO 9003 covers final testing and inspection. ISO 9000 certification does not
guarantee product quality. It ensures that the processes that develop the product are documented
and performed in a quality manner.

Standards (2)

• Capability Maturity Model : A process developed by SEI in 1986 to help improve, over time,
the application of an organization's supporting software technologies. The process is broken into
five levels of sequential development: Initial, Repeatable, Defined, Managed and Optimizing.
(SEI = 'Software Engineering Institute' at Carnegie-Mellon University; initiated by the U.S.
Defense Department to help improve software development processes.)
 Level 1 – Initial : Unpredictable and poorly controlled
Characterized by chaos, periodic panics, and heroic efforts required by individuals to successfully
complete projects. Few if any processes in place; successes may not be repeatable.
 Level 2 – Repeatable : Can repeat Previously mastered tasks
Software project tracking, requirements management, realistic planning, and configuration
management processes are in place; successful practices can be repeated.
 Level 3 – Defined : Process characterized , fairly well understood
Standard software development and maintenance processes are integrated throughout an
organization; Software Engineering Process Group is in place to oversee software processes, and
training programs are used to ensure understanding and compliance.
 Level 4 – Managed : Process measured and controlled
Metrics are used to track productivity, processes, and products. Project performance is
predictable, and quality is consistently high.
 Level 5 – Optimizing : Focus on process Improvement
The focus is on continuous process improvement. The impact of new processes and technologies
can be predicted and effectively implemented when required.

Testing Life Cycle

Test Requirements :
• Requirement Specification documents
• Functional Specification documents
• Design Specification documents (use cases, etc.)
• Use Case documents
• Test Traceability Matrix for identifying Test Coverage

Test Planning :
• Test Scope, Test Environment
• Different Test Phases and Test Methodologies
• Manual and Automation Testing
• Defect Mgmt, Configuration Mgmt, Risk Mgmt, etc.
• Evaluation & identification – Test and Defect tracking tools

Test Environment Setup :
• Test Bed installation and configuration
• Network connectivity
• All the software/tools installation and configuration
• Coordination with vendors and others

Test Design :
• Test Traceability Matrix and Test Coverage
• Test Scenario identification & Test Case preparation
• Test data and Test script preparation
• Test case reviews and approval
• Baselining under Configuration Management

Test Automation :
• Automation requirement identification
• Tool evaluation and identification
• Designing or identifying framework and scripting
• Script integration, review and approval
• Baselining under Configuration Management

Test Execution and Defect Tracking :
• Executing test cases
• Testing test scripts
• Capture, review and analyze test results
• Raise defects and track them to closure

Test Reports and Acceptance :
• Test summary reports
• Test metrics and process improvements made
• Build release
• Receiving acceptance