
Template Version 5.0/13-Nov-2003

System Testing

Version 1.2/11-Mar-2005

Prepared By

Shanthala .A.R




Table of Contents
1 Introduction
1.1 Purpose of the Document
1.2 Intended Audience
1.3 References
2 System Testing
2.1 System Testing Approach
2.1.1 Requirement Analysis & System Study
2.1.2 Test Plan
2.1.3 Roles and Responsibility
2.1.4 Test Case Preparation
2.1.5 QA Environment Setup
2.1.6 System Test Execution
2.1.7 Regression Testing
2.1.8 Defect Tracking
2.2 Functionality Testing
2.2.1 Testing Approach
2.3 Internationalization/Localization Testing
2.3.1 Testing Approach
2.4 Performance Testing
2.4.1 Load/Volume Testing
2.4.2 Stress Testing
2.4.3 Volume Testing
2.4.4 Endurance Testing
2.5 Configuration Testing
2.5.1 Testing Approach
2.6 Compatibility Testing
2.6.1 Testing Approach
2.7 Installation Testing
2.7.1 Testing Approach
2.8 User Interface Testing
2.8.1 Testing Approach
2.9 Usability Testing
2.9.1 Testing Approach
2.10 Documentation Testing
2.10.1 Testing Approach
2.11 Security Testing
2.11.1 Testing Approach
2.12 Reliability Testing
2.13 Exploratory Testing
2.13.1 Testing Approach
2.14 Failover and Recovery Testing
2.14.1 Testing Approach
2.15 Scalability Testing
2.15.1 Testing Approach
2.16 Metrics
2.17 Test Repository and Test Pack Categorization
2.18 Configuration Management
2.19 Risks That Impact Testing
3 Appendix
3.1 MindTree's Standard Templates
3.2 Usability Testing Tools
3.3 Security Testing Tools
3.4 I18N Checklists
4 Revision History

1 Introduction
1.1 Purpose of the document

This document describes the MindTree System Testing methodology that will be implemented for projects.

1.2 Intended Audience

The intended audience of this document includes:

1. Project Management - MindTree

2. Testing Practice – MindTree

3. Customer

1.3 References

SL#   Reference / Prior Reading   Location of Availability
1.
2.

2 System Testing
The purpose of System Testing is to deliver comprehensively tested code and to certify that the
functionality meets the business rules, regulations, and requirements of the system's stakeholders.

When the Requirement/Functional Specification is ready, System Testing activities can be planned, i.e.
requirement analysis & system study, requirement traceability matrix, test planning, test case
preparation, and testing.

This section describes the system testing activities and the types of testing that can be conducted
during system testing.

The following types of testing can be performed under system testing depending on the requirements.

• Functionality testing
• Internationalization testing
• Performance testing
• Compatibility testing
• Installation testing


• User Interface Testing


• Usability testing
• Documentation testing
• Security testing
• Reliability testing
• Exploratory testing
• Failover and Recovery testing
Note: The types of testing to be carried out need to be finalized at the test strategy definition level.

2.1 System Testing Approach

2.1.1 Requirement Analysis & System Study


• The testing team will be involved in requirement analysis and system study to understand the business
aspects of the system and its requirements.

• The system study involves reviewing the business rules and regulations documents, domain-specific
training, and face-to-face meetings with end users of the system.

• Testing minds will review the requirement specifications for testability and adequacy.

• The outcome of this phase is the Requirement Traceability Matrix developed by the testing team, sketched below.
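A Requirement Traceability Matrix can be kept as a simple mapping from requirement IDs to the test cases that cover them. The sketch below is a minimal illustration in Python; the requirement and test case IDs are hypothetical.

```python
# Minimal Requirement Traceability Matrix sketch; the requirement and
# test case IDs below are hypothetical placeholders.
rtm = {
    "REQ-001": ["TC-001", "TC-002"],  # e.g. login function
    "REQ-002": ["TC-003"],            # e.g. search function
    "REQ-003": [],                    # not yet covered by any test case
}

def uncovered_requirements(matrix):
    """Return requirement IDs that have no test case mapped to them."""
    return [req for req, cases in matrix.items() if not cases]

print("Requirements without test coverage:", uncovered_requirements(rtm))
```

Such a matrix makes the adequacy review above mechanical: any requirement with an empty test case list is flagged before test execution starts.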

2.1.2 Test Plan


• The test plan will be developed to cover the identified system testing activities, i.e. test
strategy, functions/features to be tested, test environment setup, types of testing, test case
preparation, test execution, exit criteria, and test reporting.

• The Test Lead prepares the test plan, which is signed off by the customer.

• The activities that are planned for System testing will be estimated and scheduled in the test plan.

• The Test Plan should contain the details on:

 The test strategy, Roles & responsibilities of the minds involved in the testing.
 Project and software components that should be tested.
 The types of testing that will be performed.
 The expected use of any automated testing tools and usage requirements for regression testing
 Deliverable elements of the test phases.
 Configuration Management & Test Management tools & procedures
 Defect Tracking Process
 Sign off criteria of testing artifacts.
 Risks and assumptions in the project, if any.
 Entry and exit criteria for each of the testing phases.
 Metrics that need to be collected in the testing phase.
 Timelines of the project milestones


• MindTree has a standard test plan template available for preparing the test plan. However, if the
customer asks for specific templates to be used, those will be used instead.

2.1.3 Roles and Responsibility


The typical project organization for system testing would consist of the following:

• Customer

• QA Manager

• Test Lead(s) in onsite/offshore depending on the project need.

• Configuration coordinator

• Test Engineer(s) at offshore

The roles and responsibilities of the team are detailed below:

Role Name: Customer

Primary Responsibilities • Sponsoring
• Sign-off of the deliverables
• Providing infrastructure

Role Name: QA Manager

Primary Responsibilities • Project management
• Responsible for overall project delivery management
• Publishing project metrics
• Identifying project risks and escalation
• Responsible for providing necessary information to the Client on projects
• Attending Steering Committee meetings
• Staffing

Deliverables and secondary responsibilities • Project plans
• Metrics to measure project effectiveness
• Participation in project management reviews

Role Name: Test Lead (Onsite)

Primary Responsibilities • Knowledge acquisition & transfer to offshore teams
• Responsible for preparing test plans and effort estimations, if required
• Interfacing with business analysts at the Client's site
• Providing functional and technical information to the offshore team
• Sanity checks on the build before sending it to offshore
• Responsible for training offshore resources on the product as well as the domain

<System Testing> <10-Mar-2005> Page 5 of 26


Template Version 5.0/13-Nov-2003

Deliverables and secondary responsibilities • Test plans
• Project QA metrics
• Automation frameworks (for automation projects)

Role Name: Test Lead (Offshore)

Primary Responsibilities • Responsible for test planning & coordination with the onsite test lead
• Providing project status to the QA Manager and the Client
• Participating in the weekly meetings with the Client's team
• Technical & functional oversight of the testers
• Responsible for reviewing offshore artifacts before sending them to the Client
• Team coordination

Deliverables and secondary responsibilities • Test plans & test strategy
• Daily/weekly status reports
• QA metrics
• Team coordination
• Test estimation
• Automation frameworks (for automation projects)

Role Name: Test Engineer

Primary Responsibilities • Test case preparation
• Automation testing
• Test data preparation
• Test environment setup
• Test execution
• Responsible for entering the defects in the defect tracker
• Test summary reports
• Developing functional scripts and load scripts
• Test bed preparation
• Conducting test cycles as per the test plan
• Implementing checklists and guidelines in the project
• Re-testing the system after functional testing certification

Deliverables and secondary responsibilities • Test cases
• Test summary reports
• Defect reports/logs
• Automation scripts & test reports (for automation projects)

2.1.4 Test Case Preparation


• Test cases will be developed to provide step-by-step details of each function, with expected results.

• Usually, system test case preparation starts once the requirements/use cases and test plans are baselined.


• The following aspects will be covered in the test cases

o Valid Data Test Cases

o Invalid Data Test Cases

o Negative Test Cases

o Alternative flows

o Value Testing & Calculations (If applicable)

• Test cases will reference requirements/use cases and be recorded in the Traceability Matrix.

• The test cases will be prepared by referring to the SRS, functional specifications, use cases, and
storyboards, whichever are available. In specific cases, depending on the requirements, we also refer to
domain-specific material, design documents, business regulatory documents, etc.

• While preparing the test cases, the functions will be identified from the requirement documents. The
functions will be further broken down into unit functions (the smallest functions that cannot be broken
down further into test conditions). Test cases will be prepared for all the functions/unit functions
using the following testing techniques (a small sketch of the first two follows this list):

 Equivalence partitioning: Based on the properties (integer/Boolean) of an input, its domain can be
partitioned into two or three classes. A representative test case is derived from each class.

 Boundary value analysis: The values for the test cases are selected from the boundaries (i.e. max,
max+1, max-1 and min, min+1, min-1) and are given as input to the system.

 Cause Effect Graphing Technique: Causes and Effects are listed for a module and an identifier is
assigned to each. A cause-effect graph is developed. The graph is converted into a decision table.
Decision table rules are converted to test cases.
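As a minimal sketch of the first two techniques, the snippet below derives test inputs for a hypothetical integer field with a min/max range; the bounds (18 and 60) are assumptions chosen purely for illustration.

```python
# Deriving test inputs for a hypothetical bounded integer field
# (the 18..60 range is an illustrative assumption).

def boundary_values(min_val, max_val):
    """Boundary value analysis: min-1, min, min+1 and max-1, max, max+1."""
    return sorted({min_val - 1, min_val, min_val + 1,
                   max_val - 1, max_val, max_val + 1})

def equivalence_classes(min_val, max_val):
    """Equivalence partitioning: one representative value per class
    (below range = invalid, inside range = valid, above range = invalid)."""
    return {"invalid_low": min_val - 10,
            "valid": (min_val + max_val) // 2,
            "invalid_high": max_val + 10}

print(boundary_values(18, 60))       # [17, 18, 19, 59, 60, 61]
print(equivalence_classes(18, 60))   # one input per partition
```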

• If the requirements are captured in use cases, the test cases will follow the use-case-to-test-case
methodology. The test scenarios will be identified, and each scenario will refer to a use case
name/number. The test scenarios will be broken down into test cases and test conditions. All test
cases/conditions follow the actor steps with valid and invalid data inputs, alternative flows, and
negative flows.

• Test cases are reviewed internally by peers and then by the customer. In case of a requirement change,
appropriate changes will be made to the test case document and the same review process will apply.

• Test case execution instructions/preconditions will be covered in the test case document to help the
tester execute smoothly.

2.1.5 QA Environment Setup


The test environment should be set up separately from the development and production environments. The
load testing environment should replicate the production environment. The setup should cover:

• System requirements (software/hardware)

• Tools used for defect tracking (Bugzilla/TestDirector/any other)


• Tools used for functional testing (WinRunner/QTP/any other)

• Tools for remote access & network monitoring

• Tools used for load testing (LoadRunner/e-Load/any other)

2.1.6 System Test Execution


• The configuration coordinator or build manager should deliver the fresh build to QA, along with the
release notes, and deploy it in the system test environment as directed by the QA Manager.

• The QA team starts with build acceptance testing (smoke or sanity testing of the build) once the build
is installed in the QA environment. The smoke test checklist contains tests selected by risk, critical
functionality, and business priority. If the build passes the smoke test, the system testing phase
starts. A smoke test report will be sent; if the build fails at the smoke test level, the build will be
rejected and the development team will be notified and asked for a new build. (A minimal automated
smoke-check sketch appears at the end of this section.)

• During the test execution, the following should be considered.

 No “dead” links/ inaccessible pages.


 Data Accuracy & Calculations
 Application security
 Content & usability
 No system crashing bugs.
 All interfaces verified/validated as per the system design specification.

• The defects found during the system test execution will be entered into the defect tracking system.
During critical testing phases, such as nearing production deployment, the identified defects shall be
discussed and communicated to the Client.

• The exit criteria for system testing are that any of the following conditions are met:

 All requirements are verified as per the requirements specification.
 No blocker/critical defects remain.
 Few major defects remain, with the acceptance of the project's stakeholders.
 The stakeholders have agreed upon all issues that exist in the system.
 All test scenarios/cases have been run at least once, and the last build undergoes a full cycle of
regression tests.

• At the end of each test cycle, the test lead will send a Test Summary Report and test metrics to the
customer and the QA manager.
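The build-acceptance (smoke) check described above can be automated as a small script that exercises only the critical paths before full system testing begins. The sketch below assumes a web application; the base URL and paths are hypothetical placeholders, and the `requests` library is one possible HTTP client choice.

```python
# Minimal smoke-test sketch for build acceptance of a web application.
# BASE_URL and CRITICAL_PATHS are hypothetical placeholders.
import requests

BASE_URL = "http://qa-server.example.com"            # assumed QA host
CRITICAL_PATHS = ["/login", "/search", "/reports"]   # assumed critical pages

def smoke_test():
    """Hit each business-critical page once; collect anything not HTTP 200."""
    failures = []
    for path in CRITICAL_PATHS:
        try:
            resp = requests.get(BASE_URL + path, timeout=10)
            if resp.status_code != 200:
                failures.append((path, resp.status_code))
        except requests.RequestException as exc:
            failures.append((path, str(exc)))
    return failures

if __name__ == "__main__":
    failed = smoke_test()
    # A non-empty list means the build is rejected and a new build is
    # requested from the development team, as described above.
    print("Smoke test failures:", failed or "none - build accepted")
```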

2.1.7 Regression Testing


Regression testing involves testing a system that has been modified, to verify that the modifications
have not caused unintended effects and that the system still complies with its specified requirements.

Testing Approach:

• An additional re-test plan, created by the test lead, provides a clear roadmap of the regression testing activities.

• The system test cases, impact analysis, release notes, and the resolved defects found during the
earlier test rounds will guide re-execution, and the defect tracking system will be updated as
appropriate. New test cases will be developed for defects that the existing test cases do not cover.


• The QA team should maintain a controlled test environment that prevents unknowns from creeping into
testing.

• The regression testing phase should exit after meeting the following criteria:

 No existing functionality is broken or has changed behavior.
 Bugs reported during the system testing phase are fixed.
 Execution of the regression test cases is complete.

2.1.8 Defect Tracking


• The defect life cycle and defect reports will be defined in conjunction with the customer.

• The defects will be tracked using MindTree's standard defect tracking tool (Bugzilla); this can be
changed according to the customer's needs.

• As soon as any bug is identified, it will be entered in Bugzilla and tracked to closure.

• The test engineer selects the appropriate version, component, platform, and operating system, and
assigns a severity to each bug. (An indicative sketch of such a defect record follows this list.)

• Priority will be decided by tech leads / PMs.


• If the test lead and the implementation team leader are not able to agree on a bug, it will be sent to
the test manager for a decision.
• Depending on the severity of the bug, MindTree will communicate with the developers/customer (if it is
an independent testing project) and provide more information if required.

• While testing the production candidate release, bug details will be provided to the Client immediately
as they are detected, through email or a telephone call.

• The number of bugs identified, closed and reopened for the day will be provided in the Daily Status
Report.

• The test lead should report the defect status every week to customers, onsite/offshore tech leads, and
project managers through the weekly status report.
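As noted above, each defect carries version, component, platform, operating system, severity, and priority fields. The sketch below merely models such a record in Python for illustration; it is not Bugzilla's actual API, and all field values are hypothetical.

```python
# Indicative model of the fields a Bugzilla-style tracker holds per defect;
# this is NOT Bugzilla's API, and the sample values are hypothetical.
from dataclasses import dataclass

@dataclass
class Defect:
    bug_id: int
    summary: str
    version: str
    component: str
    platform: str
    operating_system: str
    severity: str        # Critical/High/Medium/Low, set by the test engineer
    priority: str = ""   # decided later by tech leads / PMs
    status: str = "NEW"  # NEW -> ASSIGNED -> RESOLVED -> VERIFIED/REOPENED

bug = Defect(bug_id=101, summary="Search returns stale results",
             version="1.2", component="Search", platform="PC",
             operating_system="Windows XP", severity="High")
print(bug)
```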

Defect Reporting Structure:

[Figure: defect reporting flow — a test result is validated; a valid bug is registered in the Bugzilla
tool with a Bug ID and the test result updated; an invalid result is consolidated; bugs re-opened during
regression are re-tested and the test result or script updated; automated test results are filed; the
consolidated test results are sent to the Client and stored in the results DB.]

2.2 Functionality Testing

The goal of functionality testing is to verify the feature/functional behavior of the system: typically
calculations, data accuracy, and end-to-end business flows.

2.2.1 Testing Approach


• The functional data sets are prepared with the function validations in mind.

• We will follow the black box testing approach and perform the test execution using the prepared test
data.

• At the end of the functionality testing phase, the testing mind has to send:

 A functionality coverage tracking report for a release decision

 Test observations and recommendations

2.3 Internationalization/Localization Testing

Internationalization testing is performed to verify the functional behavior and message display in the
identified languages.

2.3.1 Testing Approach


• During the system study, the translatable as well as non-translatable components present in the
application are identified, and a customized checklist is prepared.

• The test cases will be developed based on

 Test cases related to the internationalization/localization standard process, i.e. I18N.

 Test cases related to the set of conventions affected or determined by human language and customs
within a particular geographical location, i.e. locales, which incorporate cultural information such as
time, date, font, and currency formats.


• Typically the tests cover Unicode and multi-byte related issues, message display in local languages,
and the impact of the message libraries on features; in the case of localization testing, we also test
the application on various locale OSs. (A locale-formatting sketch follows this list.)
• Based on the project's needs, MindTree sets up the infrastructure for the test environment. A
combination matrix of hardware and software platforms will be created to test the various possible
environments.
• The specified number of desktops will be installed with the required operating systems and service
packs for the different languages.
• The test plan will follow MindTree's standard test plan format.
• To verify messages in locale languages other than English, language converters or message dictionaries
should be developed.
• MindTree follows rapid testing with exploratory testing methods as well as specification-based testing.
• Release/build acceptance: the QA team will do a round of sanity testing to confirm the build is
testable. If showstoppers or critical bugs are found during sanity testing, the build will be rejected
and the Client engineering team needs to deliver a new build to QA.
• The application will be installed on the identified Windows operating systems with different service
packs and locale languages.
• We will follow the black box testing approach to test the functionality, internationalization, and
localization of the application on different locale languages with Windows service packs.
• The test cases are executed as per the iteration notes.
• The defects found during the test execution will be entered into the defect tracking system.
• The test results are entered in the test matrix and will be sent to the customer and QA manager on a
daily basis.
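Locale-dependent formatting of numbers, currency, and dates can be spot-checked with Python's standard `locale` module, as in the sketch below. The locale names are assumptions and must actually be installed on the test machine for `setlocale()` to succeed.

```python
# Spot-check of locale-dependent number/currency/date formatting using the
# standard locale module; the locale names are assumptions and must be
# installed on the test machine.
import locale
import time

for loc in ("en_US.UTF-8", "de_DE.UTF-8", "fr_FR.UTF-8"):
    try:
        locale.setlocale(locale.LC_ALL, loc)
    except locale.Error:
        print(loc, ": not installed on this machine, skipping")
        continue
    print(loc,
          locale.format_string("%.2f", 1234567.89, grouping=True),
          locale.currency(1234567.89, grouping=True),
          time.strftime("%x"))   # locale-formatted date
```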

2.4 Performance Testing

Performance testing is geared towards a successful software application launch with reliability and
speed. Its primary focus is to determine how well applications perform under real-world load and stress
on a near-deployment infrastructure, and to provide recommendations for eliminating bottlenecks at the
identified loads.

The deliverables of this testing include a detailed assessment report of the test results,
recommendations on the readiness to "go live", and areas which could be improved. Companies with a
functionally tested application/product will want to employ this service when they are confident of the
accurate transactional functionality of their software applications, but uncertain about their
performance and availability under projected user volumes.

The types of performance testing conducted by MindTree are:

• Load/Volume Testing

• Stress Testing

• Volume Testing

• Endurance Testing

2.4.1 Load/Volume Testing


Load testing is a performance test that subjects the target-of-test to varying workloads to measure and
evaluate its performance behavior and its ability to continue functioning properly under these
different workloads.


The goal of load testing is to determine and ensure that the system functions properly beyond the expected
maximum workload. Additionally, load testing evaluates the performance characteristics, such as response
times, transaction rates, and other time-sensitive issues.

2.4.1.1 Testing Approach


• System study

• Test case writing

• The performance testing environment holds the same configuration standard as the QA environment. It is
dedicated to automated load testing and includes software used to monitor the behavior of the web
server and database under heavy load.

• Load testing will consist of a select group of (N) scripts that accurately represent a cross-section of
functionality. Scripts will be executed to generate up to 1500-user peak load levels.

• Tests will be executed for 1500 concurrent users at load levels of 50, 100, 200, 500, 1000, and 1500.
The test execution would be complete when the target user load is ramped up or any failure condition
necessitates stopping the test. The team would monitor the test execution and record the timings and
errors for report preparation.

• Determine the serviceability of the system for a volume of 1500 concurrent users.
• Measure response times for users.

Test Steps:

• Virtual user estimation: arrive at the maximum number of concurrent users hitting the system for which
the system response time is within the response-time threshold and the system is stable. This number
would be the virtual user number and should be higher than the average load by a factor of x.

• Define virtual user profiles and their distribution across client operations.

• Load simulation schedule: a schedule for concurrent user testing with a mix of user scenarios and the
acceptable response times.

Statistics:

• A graph with the y-axis representing response times and the x-axis representing concurrent users will
depict the capability of the system to service concurrent users. (A simplified load-generation sketch
follows.)

• The response times for the slowest users will provide the worst-case response times.
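A heavily simplified load-generation sketch is shown below using Python threads, only to illustrate ramping concurrent users and recording response times; actual executions here use dedicated tools such as LoadRunner or e-Load. The target URL and the scaled-down ramp levels are placeholders.

```python
# Simplified ramped concurrent-user load with response-time capture.
# Real load tests use LoadRunner/e-Load; TARGET and RAMP_LEVELS are
# scaled-down placeholders.
import threading, time, urllib.request

TARGET = "http://qa-server.example.com/"   # hypothetical system under test
RAMP_LEVELS = [50, 100, 200]               # stand-ins for the 50..1500 ramp

def one_user(timings):
    start = time.time()
    try:
        urllib.request.urlopen(TARGET, timeout=30).read()
        timings.append(time.time() - start)
    except Exception:
        timings.append(None)               # record a failure/timeout

for users in RAMP_LEVELS:
    timings, threads = [], []
    for _ in range(users):
        t = threading.Thread(target=one_user, args=(timings,))
        threads.append(t)
        t.start()
    for t in threads:
        t.join()
    ok = [x for x in timings if x is not None]
    print(f"{users} users: {len(ok)} ok, {users - len(ok)} failed, "
          f"worst response {max(ok or [0]):.2f}s")
```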

2.4.2 Stress Testing


Stress testing is a type of performance test implemented and executed to understand how a system fails
due to conditions at the boundary of, or outside of, the expected tolerances. This typically involves
low resources or competition for resources. Low-resource conditions reveal ways in which the
target-of-test fails that are not apparent under normal conditions. Other defects might result from
competition for shared resources, like database locks or network bandwidth, although some of these tests
are usually addressed under functional and load testing.

In order to check whether the system is tolerant of extreme conditions, stress testing is performed.


Test Objective:

Exercise the target-of-test functions under the following stress conditions to observe and log target
behavior that identifies and documents the conditions under which the system fails to continue
functioning properly:

 Little or no memory available on the server (RAM and persistent storage space)

 Maximum actual or physically capable number of clients connected or simulated

 Multiple users performing the same transactions against the same data or accounts

 "Overload" transaction volume or mix

2.4.2.1 Testing Approach


• Use tests developed for Load Testing.

• To test limited resources, tests should be run on a single machine, and RAM and persistent storage
space on the server should be reduced or limited.

• For remaining stress tests, multiple clients should be used, either running the same tests or
complementary tests to produce the worst-case transaction volume or mix.

2.4.3 Volume Testing


Volume testing subjects the target-of-test to large amounts of data to determine if limits are reached
that cause the software to fail. Volume testing also identifies the continuous maximum load or volume the
target-of-test can handle for a given period. For example, if the target-of-test is processing a set of
database records to generate a report, a Volume Test would use a large test database, and would check
that the software behaved normally and produced the correct report.

In order to test whether the system is able to handle large quantities of data, volume testing is done.

Test Objective:

Exercise the target-of-test functions under the following high volume scenarios to observe and log target
behavior:

 Maximum (actual or physically-capable) number of clients connected, or simulated, all performing the
same, worst case (performance) business function for an extended period.

 Maximum database size has been reached (actual or scaled) and multiple queries or report
transactions are executed simultaneously.

2.4.3.1 Testing Approach


The test lead will also define the scope of work and will be involved in activities like:

• Identification of business transactions with the help of the business team.

• Finalizing the performance test process, test strategy, and communication & reporting.

• Preparation of the detailed test plan.


• Use tests developed for Load Testing.

• Multiple clients should be used, either running the same tests or complementary tests to produce the
worst-case transaction volume or mix (see Stress Testing) for an extended period.

• Maximum database size is created (actual, scaled, or filled with representative data), and multiple clients
are used to run queries and report transactions simultaneously for extended periods.

2.4.4 Endurance Testing


Endurance testing is aimed at validating system behavior over continuous hours of operation (CHO) under
projected load conditions.

2.4.4.1 Testing Approach


Endurance testing checks resource usage and release, namely CPU, memory, disk I/O, and network (TCP/IP
sockets) congestion, over continuous hours of operation.

Determine the robustness: check for breakages in the web server, application server, and database server
under CHO conditions.

Test Steps:

• Arrive at a baseline configuration of the web server and application server resources, i.e. CPU, RAM,
and hard disk, for the endurance and reliability test.

• The test would be stopped when one of the components breaks. A root cause analysis is to be carried out
based on the data collected as described under the server-side monitoring section.

Client side monitoring:

• Failure rate -- web server responses/timeouts/exceptions and incomplete page downloads

• Response time degradation under peak load numbers (concurrent users)

Server-side monitoring (a monitoring sketch follows this list):

• Collect CPU, disk, and memory usage for analysis

• Check for application server slowdown/freeze/crash

• Check for resource contention/deadlocks

• Database server load and slowdown

• Web server crashes

• Collect data for analysis to tune the performance of the web server, application server, and database server

• If there is alarm support in the tool through an agent, check for alerts when the activity level exceeds
preset limits.

• If there is a load-balancing configuration deployed, check if it is able to distribute the requests.
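Server-side resource sampling of this kind can be scripted; the sketch below uses the `psutil` library as one possible choice (an assumption, not a mandated tool) to log CPU, memory, and disk I/O at intervals during a continuous-hours run.

```python
# Periodic server-side resource sampler for endurance runs; psutil is one
# possible library choice, and the durations are illustrative placeholders.
import time
import psutil

def sample_resources(duration_s=60, interval_s=5):
    """Collect CPU, memory, and disk-read samples for later analysis."""
    samples = []
    end = time.time() + duration_s
    while time.time() < end:
        io = psutil.disk_io_counters()  # may be None on some platforms
        samples.append({
            "cpu_percent": psutil.cpu_percent(interval=1),
            "mem_percent": psutil.virtual_memory().percent,
            "disk_read_mb": io.read_bytes / 2**20 if io else None,
        })
        time.sleep(interval_s)
    return samples

for s in sample_resources(duration_s=30, interval_s=5):
    print(s)   # the collected data feeds the root-cause/tuning analysis above
```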


Result:

The result of this test will be proof of confidence for continuous hours of operation. The data
collected in this phase would give pointers to improve the reliability of the system and to fix
configuration and component parameters for reliable performance.

The reports shall contain the following:

1. Test case cycles vs. Response times

2. Virtual users vs. Response times

3. TPS vs. Response times

4. Transaction throughput

5. Bad transactions [CPU]

6. Bad transactions [Memory]

7. RAM usage

8. CPU usage

9. Breaking point [the limit at which the performance becomes unacceptable]

10. Conclude with a performance statement based on the statistics collected and analyzed

11. Analysis and recommendations against the results recorded.

2.5 Configuration Testing

(This section is yet to be completed.) Configuration testing is normally conducted by both the user and
a software maintenance test team.

2.5.1 Testing Approach


• Develop the tests to detect problems prior to placing the change into production.

• Correct problems prior to placing the change into production.

• Test the completeness of the needed training material.

• Involve the users in the testing of software changes.

2.6 Compatibility Testing

Compatibility testing ensures that the product works properly on the different platforms and operating
systems on the market, and also with the applications and devices in its environment.


The aim of compatibility tests is to locate application problems by running them in real environments,
thus ensuring you do not launch any product that could damage your corporate image or increase
maintenance costs.

2.6.1 Testing Approach


• As part of the system study, testing minds will prepare checklists to test every combination of the
following platforms/technologies, depending on the project's needs.

 Operating Systems:
Windows 95/98, Windows NT, Windows 2000, Windows Me, Windows XP Professional and
Home, Windows 2003 Server, Mac OS 7.5 to X, Linux or Unix in their localized versions.
 Browsers:
Internet Explorer, Netscape, Mozilla, AOL, Safari, etc.
 Network Operating Systems:
Novell NetWare 4.x, 5.x and 6.x, Windows NT and 2003 Servers, Unix and Linux, Mac OS X,
AppleTalk, etc.
 Connectivity:
USB, FireWire (IEEE 1394), Parallel (IEEE 1284), Bluetooth, Wireless (WiFi), VPNs, etc.
 Languages:
Western (EFIGSP), Asian, and Northern and Central European languages.
• The test lead finalizes a representative sample of combinations in terms of budget and time.

• Testing minds should identify the customer's requirements and historical compatibility issues; they should
also determine the possible test scenarios and coverage measurements. Testing minds can use the test
cases developed for functionality testing.

• Based on the project's needs, MindTree sets up the infrastructure for the test environment. A
combination matrix of hardware and software platforms will be created to test with various operating
systems/service packs/browsers.

• MindTree follows rapid testing with exploratory testing methods as well as specification-based testing.

• The testing minds run the test cases against the required hardware and software combinations created.

• The defects found during the test execution will be entered into the defect tracking system. The
defects and issues found, together with an analysis of possible causes and fixing suggestions, should
be communicated to the Client.

• At the end of each test cycle, the test lead should send a Test Summary Report and test metrics to the
customer and QA manager.

2.7 Installation Testing

Installation testing confirms that the application under test installs correctly under both normal and
abnormal conditions, and recovers from expected or unexpected events during installation without loss of
data or functionality. Events can include a shortage of disk space, unexpected loss of communication, or
power-out conditions.


2.7.1 Testing Approach


• The testing mind should develop installation test cases for the following scenarios:

 Installation under normal conditions (such as a new installation, an upgrade, a removal, and a complete
or custom installation):
1) New installation
2) Update:
o A machine previously installed with the same version
o A machine previously installed with an older version
3) Complete or custom installation
4) Removal
 Installation under abnormal conditions (such as insufficient disk space, lack of privileges to create
directories, and so on)
 Software operation after the installation.
• When the build is ready for testing, upload the build into the QA environment and start the test
execution using the installation test cases.

• Launch or perform the installation.

• If showstoppers or critical bugs are found during testing, the build will be rejected and the
development team needs to deliver a new build to QA.

• The defects found during the test execution will be entered into the defect tracking system, and the
status report will be sent to the client.

2.8 User Interface Testing

User Interface testing will be carried out to verify an actor’s interaction with the system. The goal of the
UI Testing will be to ensure that the User Interface provides the actor with the appropriate access and
navigation through the use cases. In addition, UI Testing will ensure that the elements of the UI function
as expected and conform to corporate or industry standards.

2.8.1 Testing Approach


• During the system study, the significant pages, such as the home page, reservation, and search pages,
will be identified by the testing minds.

• The test lead should develop a test plan for the whole activity.

• The testing mind should develop test cases which ensure that:

 Navigation through the application properly reflects business functions and requirements, including page
to page, field to field, frames, etc. (tab keys, mouse movements, accelerator keys).

 GUI objects and characteristics, such as menus, size, position, state, and focus, conform to standards.

 Each page has proper navigation, element states, and content.

 The structure of the site is complete and navigable.

 All hyperlinks reference a legitimate page/URL (a link-checker sketch follows this list).


 All images are present, hotspots are created and linked, and there are no orphaned pages.
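Dead-link checks of the kind listed above lend themselves to a small script. The sketch below inspects a single page and reports broken hyperlinks using only the Python standard library; the start URL is a hypothetical placeholder, and a real check would also recurse into the site to find orphaned pages.

```python
# Single-page dead-link checker (standard library only); the start URL is
# a placeholder for the application under test.
from html.parser import HTMLParser
from urllib.error import HTTPError, URLError
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkCollector(HTMLParser):
    """Collect the href targets of all anchor tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def broken_links(page_url):
    parser = LinkCollector()
    html = urlopen(page_url, timeout=10).read().decode("utf-8", "replace")
    parser.feed(html)
    broken = []
    for href in parser.links:
        url = urljoin(page_url, href)      # resolve relative links
        try:
            urlopen(url, timeout=10)
        except (HTTPError, URLError, ValueError):
            broken.append(url)
    return broken

print(broken_links("http://qa-server.example.com/"))  # hypothetical start page
```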

• When the build is ready for testing, upload the build into the QA environment and start the test execution.

• At the end of each test cycle, the test lead should send a Test Summary Report and test metrics to the
customer and QA manager.

2.9 Usability Testing

Usability testing verifies a user’s interaction with the software. The goal of Usability testing is to ensure
that the User Interface provides the user with the appropriate access and navigation through the
functions.

2.9.1 Testing Approach


• During the system study, the testing minds should prepare a checklist for usability testing.

• The test lead should develop a test plan for the whole activity.

• The testing mind should develop test cases which ensure that:

 The requirements on the user interfaces are fulfilled according to the descriptions in the use case
specifications and the supplementary specification.

 Friendliness, ease of use, usefulness, ease of learning, ease of relearning, consistency, error
prevention and correction, visual clarity, navigation, and the language of the application are verified
from the user's point of view.

 Tests are created or modified for each user interface to verify proper navigation and object states for
each application window and its objects.

 It is easy to navigate between the forms.

 The user interfaces are built according to existing standards.

• When the build is ready for testing, upload the build into the QA environment and start the test execution.

• Testing will be carried out manually as per the prepared checklist for usability testing.

• We can make use of usability testing tools like ErgoLight WebTester, WebSAT, Bobby, etc. to perform the
usability testing.

• At the end of each test cycle, the test lead should send a Test Summary Report and test metrics to the
customer and QA manager.

2.10 Documentation Testing

It is important to ensure that the right documentation has been prepared, is complete and current,
reflects the criticality of the system, and contains all necessary elements.

The testing of documentation should conform to other aspects of system testing.


2.10.1 Testing Approach
• Documentation testing is involved at two stages:

 The testing mind has to test the documents received from the developer before starting the actual test
execution. Here the testing mind will be involved in testing documents like the release notes, the
requirement specification document, etc.

 Once the testing cycle is over, the QA documents (i.e. test plan, test case document, test data, etc.)
should be tested by the testing mind before being sent to the customer.

• Before starting the documentation testing, the testing mind should prepare a note on:

 The measure of the project's documentation needs.

 What documents must be produced.

 The completeness of the individual documents.

 How current the project documents are.

• During the test execution cycle, the testing mind should test whether the documents:

 Assist in planning and managing resources.

 Help in planning and implementing security procedures.

 Help transfer knowledge of software testing throughout the life cycle.

 Define what is expected and verify that it is what is delivered.

 Provide a basis for training individuals in how to maintain the software.

 Provide managers with technical documents to determine that requirements have been met.

 Provide the customers with technical documents such as the installation procedure, user guide, etc.

• Once a document is completely tested, it is sent back to its owner, who should update it according to
the comments given by the reviewer. This cycle ends only when the documentation is right.

2.11 Security Testing

Security and access control testing focuses on two key areas of security:

− Application-level security, and

− System-level security

In order to check whether the database is secured, security testing is done.


2.11.1 Testing Approach
• The test lead and testing minds should start with a system study.

• The test lead should develop a test plan for the whole activity.

• During test case development, the testing mind should cover scenarios on:

 Application-level security, including access to the data or business functions, i.e. an actor can access
only those functions or data for which their user type has permissions. Application-level
security test cases should ensure that actors are restricted to specific functions or use cases, or are
limited in the data that is available to them.

 System-level security, including logging into or remotely accessing the system, i.e. only those actors
with access to the system and applications are permitted to access them. System-level security test
cases should ensure that only those users granted access to the system are capable of accessing the
applications, and only through the appropriate gateways.

• When the build is ready for testing, upload the build into the QA environment.

• The testing has to be done on two parts of the application:

Application-level Security Testing:

 Identify and list each user type and the functions or data for which each type has permissions.

 Create tests for each user type and verify each permission by creating transactions specific to each
user type.

 Modify the user type and rerun the test cases for the same users. In each case, verify that the
additional functions or data are correctly available or denied. (A hypothetical permission-matrix sketch
follows the system-level notes below.)

System-level Access Testing:

 Access to the system must be reviewed or discussed with the appropriate network or systems
administrator. This testing may not be required, as it may be a function of network or systems
administration.
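Application-level checks of this kind reduce to verifying a user-type/permission matrix. The sketch below is a hypothetical illustration: the user types, functions, and the `attempt()` stub are assumptions standing in for real logins and transactions, not the product's actual access model.

```python
# Hypothetical user-type/permission matrix check for application-level
# security; roles, functions, and the attempt() stub are assumptions.

EXPECTED = {
    "admin": {"view_reports": True,  "edit_accounts": True},
    "clerk": {"view_reports": True,  "edit_accounts": False},
    "guest": {"view_reports": False, "edit_accounts": False},
}

def attempt(user_type, function):
    """Stand-in for exercising the application as the given user type;
    a real test would log in and perform the actual transaction."""
    granted = {"admin": {"view_reports", "edit_accounts"},
               "clerk": {"view_reports"},
               "guest": set()}
    return function in granted[user_type]

for user_type, permissions in EXPECTED.items():
    for function, should_allow in permissions.items():
        actual = attempt(user_type, function)
        verdict = "OK" if actual == should_allow else "VIOLATION"
        print(f"{verdict}: {user_type} -> {function} (allowed={actual})")
```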

• We can make use of security testing tools like Cisco Secure Scanner, network monitoring tools, etc. to
perform the security testing.

• The defects found during the test execution will be entered into the defect tracking system and the
status report will be sent to the client.

2.12 Reliability Testing

2.13 Exploratory Testing

Exploratory testing is an interactive process of concurrent product exploration, test design and test
execution.

The outcome of the exploratory testing is a set of notes about the product, failures found, and a concise
record of how the product was tested.


Exploratory testers should continually learn about the software they are testing, the market for the
product and the best ways to test the software.

2.13.1 Testing Approach

• The test lead and testing minds should start with a system study.

• The test lead should develop a test plan for the whole activity.

• The test cases already created for function and business cycle testing can be used as a basis for
exploratory testing. Mostly functionality and stability test cases should be chosen for this testing.
• When the build is ready for testing, upload the build into the QA environment.

• The testing mind should make sure that the following conditions are covered during the execution
cycle:

Functionality Tests should make sure:

 Each primary function tested is observed to operate in a manner apparently consistent with its
purpose, regardless of the correctness of its output.

 Any incorrect behavior observed in the product does not seriously impair it for normal use.

Stability Tests should make sure:

 The product is not observed to disrupt Windows.

 The product is not observed to hang, crash, or lose data.

 No primary function is observed to become inoperable or obstructed in the course of testing.

• The defects found during the test execution will be entered into the defect tracking system. During
critical testing phases, such as nearing production deployment, the identified defects shall be
discussed and communicated to the Client.

2.14 Failover and Recovery Testing

Failover and recovery testing ensures that the target-of-test can successfully fail over and recover from
a variety of hardware, software, or network malfunctions without undue loss of data or data integrity.

For those systems that must be kept running, failover testing ensures that when a failover condition
occurs, the alternate or backup systems properly "take over" for the failed system without any loss of data
or transactions.

Recovery testing is an antagonistic test process in which the application or system is exposed to extreme
conditions, or simulated conditions, to cause a failure, such as device Input/Output (I/O) failures, or
invalid database pointers and keys. Recovery processes are invoked, and the application or system is
monitored and inspected to verify proper application, or system, and data recovery has been achieved.

Loss of data is very critical and should be avoided, so failover and recovery testing is performed.


2.14.1 Testing Approach
• The test lead and testing minds should start with a system study.

• The test lead should develop a test plan for the whole activity.

• During test case development, the testing mind should cover scenarios on:

 Power interruption to the client

 Power interruption to the server

 Communication interruption via network servers

 Interruption, communication, or power loss to DASD (Direct Access Storage Devices) and DASD
controllers

 Incomplete cycles (data filter processes interrupted, data synchronization processes interrupted)

 Invalid database pointers or keys

 Invalid or corrupted data elements in database

• The test cases already created for Function and Business Cycle testing can be used as a basis for
creating a series of transactions to support failover and recovery testing.

• When the build is ready to test, upload the build in a QA environment.

• The testing minds should make sure that the following conditions are tested completely during the
test execution cycle.

 Primarily, the following types of conditions have to be tested to say that recovery was successful;
they are included in the testing to observe and log behavior after recovery:

− Power interruption to the client: power down the PC.

− Power interruption to the server: simulate or initiate power down procedures for the server.

− Interruption via network servers: simulate or initiate communication loss with the network
(physically disconnect communication wires, or power down network servers or routers).

− Interruption, communication, or power loss to DASD and DASD controllers: simulate or physically
eliminate communication with one or more DASDs or DASD controllers.

 Once the above conditions or simulated conditions are achieved, additional transactions should be
executed; upon reaching this second test-point state, recovery procedures should be invoked.

 Testing for incomplete cycles utilizes the same technique as described above except that the
database processes themselves should be aborted or prematurely terminated.

 Testing for the following conditions requires that a known database state be achieved. Several
database fields, pointers, and keys should be corrupted manually and directly within the database


(via database tools). Additional transactions should be executed using the tests from Application
Function and Business Cycle Testing and full cycles executed.

• The defects found during the test execution will be entered into the defect tracking system, and the
status report will be sent to the client.

2.15 Scalability Testing

Scalability testing evaluates the increase in performance gained by expanding the configuration or the
number of nodes, i.e. by adding additional resources.

2.15.1 Testing Approach
• As part of the system study, testing minds will:

 Baseline the scalability with one node and vary its resources to determine the configuration that
yields good performance.

 Add more nodes of similar configuration and determine the number of nodes that yields the best
performance.

• Testing minds, with inputs from the client's business team, will finalize the frequently used business
transactions and generate the scripts.

• Testing minds will use the test data provided by the client, if available, or will identify and set up
the required test data for scalability testing with the Client's help.

• Scalability test execution (a sketch of organizing the measurements follows this list):

 The configuration under test for a single node could have three degrees of variance, namely:
i. Number of CPUs
ii. Memory
iii. Disk storage scheme for the database.
 The system response time is to be recorded for each permutation of the changing variable values,
measuring the relative effect against a baseline configuration.
 Simultaneously, system-monitoring tools need to be used to record resource utilization statistics
for root cause/bottleneck analysis.
 Determine the yield point for each configuration and iteratively take the best mix as the relative
configuration for peak performance.
• The result of this test will list the configuration and number of nodes for peak performance.
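Recording response times per configuration permutation can be organized as sketched below; the CPU/memory/storage values and the measurement stub are hypothetical placeholders for the real scripted business transactions.

```python
# Organizing scalability measurements across configuration permutations;
# the variance values and the measurement stub are hypothetical.
from itertools import product

CPUS = [1, 2, 4]                      # number of CPUs
MEMORY_GB = [2, 4]                    # memory sizes
STORAGE = ["single-disk", "striped"]  # disk storage schemes for the DB

def measure_response_time(cpus, mem_gb, storage):
    """Stub: a real run executes the scripted business transactions
    against a node with this configuration and times them."""
    return 1000.0 / (cpus * mem_gb)   # placeholder formula, not real data

baseline = measure_response_time(CPUS[0], MEMORY_GB[0], STORAGE[0])
for cpus, mem_gb, storage in product(CPUS, MEMORY_GB, STORAGE):
    rt = measure_response_time(cpus, mem_gb, storage)
    print(f"{cpus} CPU / {mem_gb} GB / {storage}: "
          f"{rt:.0f} ms ({rt / baseline:.2f}x of baseline)")
```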

2.16 Metrics

• Defect Location Metrics – defects raised against each module/operating system/language.

• Severity Metrics – each defect has an associated severity (Critical, High, Medium, or Low), which
reflects how much adverse impact the defect has or how important the affected functionality is. The
number of issues raised per severity shall be collected.
• Defect Closure Metrics – to indicate progress, the number of raised and closed defects over time shall
be collected. (A small computation sketch follows this list.)


• Defect Status Metrics – the number of defects in the various states, e.g. new, assigned, resolved,
verified, etc.
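Given defect records exported from the tracker, the metrics above reduce to a few counters; the sample records below are hypothetical.

```python
# Computing the defect metrics described above from exported records;
# the sample defects are hypothetical.
from collections import Counter

defects = [
    {"module": "Search", "severity": "High",     "status": "resolved"},
    {"module": "Login",  "severity": "Critical", "status": "new"},
    {"module": "Search", "severity": "Low",      "status": "verified"},
]

by_module   = Counter(d["module"] for d in defects)    # defect location
by_severity = Counter(d["severity"] for d in defects)  # severity metrics
by_status   = Counter(d["status"] for d in defects)    # defect status
closed = sum(1 for d in defects if d["status"] in ("resolved", "verified"))

print("By module:  ", dict(by_module))
print("By severity:", dict(by_severity))
print("By status:  ", dict(by_status))
print(f"Closure: {closed}/{len(defects)} closed")      # closure metrics
```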

2.17 Test Repository and Test Pack Categorization

A test repository shall be created to maintain the existing test packs and additional test packs created
in the future. A change management procedure shall be defined to ensure that the test packs are
maintained uniquely and that new test cases can be added to the existing test packs with the help of a
test management tool. The activity is illustrated as follows:

[Diagram: functional specs and business scenarios drive the test repository ramp-up process; additional
tests identified through the test identification process are prepared, reviewed, and approved into the
Test Repository, which is fed by the test preparation process and in turn feeds the test execution
process.]

2.18 Configuration Management

• The configuration management tool used for the project should be finalized before starting the
project. MindTree uses CVS or VSS as the configuration management tool.

• All test-related documents have to be checked into the CM tool, and the files must be checked out
before modification.

• All checked-out files must be checked back in. The CM tool administrator will do the CM
administration. Sufficient access rights should be provided to the QA resources to access the QA
region of the CM tool.

2.19 Risks That Impact Testing

Risk: Any of the deliverables by the development team not happening as per the project plan.
Probability: Medium. Impact: High.
Contingency plan: Testing time may be lost; to compensate for this, additional resources would be added
appropriately.

Risk: Non-availability of test scripts on time (if test scripts were given by the client).
Probability: Medium. Impact: Medium.
Contingency plan: The client should provide the information resources.

Risk: System failure and loss of data during the testing process.
Probability: Medium. Impact: Low.
Contingency plan: A database backup strategy should be in place so that loss of data can be prevented.

Risk: Test data not migrated in time.
Probability: Medium. Impact: Low.
Contingency plan: Test the functionality not involving data feeds until migration.

Risk: Lack of infrastructure in terms of computers and tools.
Probability: Medium. Impact: Low.
Contingency plan: The Project Manager must be informed and the necessary requirements must be met.

Risk: Lost connectivity during test execution from offshore/onsite.
Probability: Low. Impact: High.
Contingency plan: A local test setup should be in place; except for scenarios involving mainframes,
tests can be executed locally. The problem should be communicated to the client if connectivity is lost
from their end.

3 Appendix
3.1 MindTree's Standard Templates

1) Test Plan Template: S:\EBiz-Testing\Templates\Test_Plan_Template.doc

2) Test Case Document: S:\EBiz-Testing\Templates\Test Case Template.xls

3) Test Summary Report: S:\EBiz-Testing\Templates\Test Summary Report Template.doc

4) Traceability Matrix

5) Defect Metrics


6) Graphs

3.2 Usability Testing Tools

1) ErgoLight WebTester provides navigation analysis, site evaluation, and mechanisms for user feedback.

2) Web Static Analyzer Tool (WebSAT) analyzes a site to determine its level of adherence to a set of
usability guidelines.

3) Bobby is a free service that assists in validating the accessibility of a web site by identifying
site aspects that make it difficult to use for web users with disabilities.

3.3 Security Testing Tools

1) Port scanner: Nmap attempts to connect to a server on various ports and also uses various types of
packets and fragments to bypass firewalls and other filters.

2) Cisco Secure Scanner provides port scanning and vulnerability assessment in addition to security
management and documentation.

3) Network monitoring: Windows NT/2000 Network Monitor, part of the OS, allows the capture and
examination of packets destined to or transmitted from the local machine.

4) Tcpdump: a freeware UNIX tool for network monitoring (the Windows port is called WinDump).

5) Ethereal: a freeware network analyzer for UNIX and Windows.

3.4 I18N Checklists

D:\TestingProcess\I18N Checklists.doc

4 Revision History

Version   Date          Author(s)        Reviewer(s)           Change Description
1.0       30-Mar-2005   Shanthala A. R   Venkateswara Rao. V
