
AppLabs white box testing process

Process Document

White Box Testing

AppLabs Inc., 1700 Market Street, Suite 1406, Philadelphia, PA 19103


Phone: 215-569-9976 Fax: 215-569-9956 Web: www.applabs.com


Table of Contents

1.0 Introduction
  1.1 Goals
  1.2 Scope
  1.3 Staffing
  1.4 Roles and Responsibilities
  1.5 Methodology
  1.6 Deliverables
2.0 Requirements Analysis
  2.1 System Architecture
  2.2 Identification
  2.3 Criteria for Test Cases
3.0 Practices
  3.1 Code Coverage Analysis
    3.1.1 Basis Path Testing
    3.1.2 Control Structure Testing
  3.2 Design by Contract (DbC)
  3.3 Profiling
  3.4 Error Handling
  3.5 Transactions
4.0 Parameters
  4.1 Test Case Bundles
  4.2 Deployment Configurations
  4.3 User Roles
5.0 Test Areas
  5.1 Unit Testing
    5.1.1 Component
    5.1.2 Module
    5.1.3 Subsystem
    5.1.4 System
  5.2 Security
    5.2.1 Authentication
    5.2.2 Data Security
  5.3 Network Resources
    5.3.1 Bandwidth
    5.3.2 Applications
  5.4 Capacity
  5.5 Scalability
  5.6 Reliability
  5.7 Burn-in
  5.8 Clustering
6.0 Testing Tools
  6.1 QA Products
  6.2 Test Harnesses
  6.3 Stubs
  6.4 Probes
  6.5 Monitoring Tools
  6.6 Custom Tools
7.0 Infrastructure
  7.1 Test Case Collection System
  7.2 Problem Tracking System
    7.2.1 Roles
    7.2.2 Severity Levels
  7.3 Communication
    7.3.1 Internet Based
    7.3.2 Phone/Fax
  7.4 Risk Management
  7.5 Project Management
    7.5.1 Activity Tracking
    7.5.2 Task Tracking
    7.5.3 Issues Tracking
8.0 Documentation
  8.1 Project Documentation
  8.2 Reporting
Glossary
References


1.0 Introduction

AppLabs applies strong quality assurance processes to all the tasks it performs. We aim to incorporate these processes across all activities - from managing the Software Development Life Cycle, to tracking project status, through to communicating with the client.

Our end goal is to build in processes that will integrate our offshore team with
the client’s onsite team, and to minimize person-dependence.

A typical rollout of a product is shown in Figure 1 below. AppLabs offers white box testing during all the stages shown.

Figure 1. Generic Work Flow for Application Development (development workstations -> integration environment -> staging area -> production)

1.1 Goals

The purpose of white box testing as performed by AppLabs is to:

1.1.1 Establish a strategic initiative to build quality throughout the life cycle of a software product or service.
1.1.2 Provide a complementary function to black box testing.
1.1.3 Perform complete coverage at the component level.
1.1.4 Improve quality by optimizing performance.

1.2 Scope

AppLabs is capable of performing most aspects of white box testing as outlined in this document, ranging from system-level testing down to code coverage analysis.

Some clients may choose not to pursue certain aspects of this scope, e.g. the code coverage analysis described in section 3.1 below.


1.3 Staffing

AppLabs focuses on retaining the best talent available at its offshore facility in India and has staff dedicated to performing recruitment and Human Resources (HR) functions.

Suitable candidates with the necessary mindset are hired through a rigorous process of written exams, interviews and reference checks.

Since white box testing requires a deeper understanding of the software development process, software engineers with a development background are placed on white-box testing assignments for the duration of a project.

1.4 Roles and Responsibilities

AppLabs dedicates the following personnel to work on a typical white-box testing project:

Project Manager
• Formulation of the entire project plan
• All commercial communication with the client
• Quality assurance at project level

Project or Module Leader
• Scheduling and tracking of activities as per the project plan
• Managing the team assigned to the project from coding through integrated testing
• Interfacing with support services
• Quality assurance at programmer level

Project Member
• Coding and unit testing
• Support for integration testing

Configuration Control
• Configuration control of the hardware and software environment
• Assurance of proper working conditions
• Maintenance and secrecy control for software

Technical Consultant
• Provides technical consultancy to the project team
• Resolves technical issues in the project

Quality Assurance
• Assisting the Project Leader in quality assurance
• Periodic audit of the project to ensure compliance
• Ensuring good data collection
• Analyzing data and providing feedback
• Suggesting process improvements


The above Applabs personnel interface with one or more technical and
project management staff on the client side. For example, the following
might be the client staffing on an EJB based project:

Figure 2. Actors in an EJB Application Development Scenario

Typically, there is a single point of contact on the client side who acts as the project manager for interfacing with AppLabs. This person facilitates communications and resource management functions with AppLabs, as well as tracking the project.

1.5 Methodology

The methodology employed by AppLabs to accomplish white-box testing comprises the following key elements:

• Preparation of a Software Test Plan. This document lists all the functions and associated test plan items along with a project schedule.
• Coordinating offshore QA efforts with the client on a regular basis.
• Following any client-specific process or method for testing, test data collection, etc.

1.6 Deliverables

The deliverables of a white-box testing engagement are as follows:

• Software Test Plan.
• Collection of problem reports as filed in the online Problem Tracking System.
• Status reports on a periodic basis.
• Any specific deliverable as requested by the client.

2.0 Requirements Analysis

Understanding the requirements is key to performing white box testing. To this end, AppLabs works with the customer to gain a better understanding of the system under consideration. All relevant project documents relating to the system functionality and design are desirable.

2.1 System Architecture

Key to undertaking any white-box testing project is understanding the overall system architecture. To this end, AppLabs works with client-provided documentation relating to the system architecture. Alternatively, AppLabs follows Rational Unified Process standards, including UML, to represent system architectures in a concise manner.

The system architecture documentation forms the basis for the identification of systems and subsystems and the generation of test cases.

2.2 Identification

The identification of the test items is done primarily based on the specifications of the product.

These specifications would be related to:

• Functions (exhaustive list) of the system
• Response criteria (benchmarking and stress testing)
• Volume constraints (number of users, hits, stress testing)
• Stability criteria (24-hour testing with fast operations)
• Database responses (flushing, cleaning, updating rates, etc.)
• Network criteria (network traffic, choking, etc.)
• Compatibility (environments, browsers, etc.)
• User interface / friendliness criteria
• Modularity (ability to easily interface with other tools)
• Security

2.3 Criteria for Test Cases


Each Test Plan Item should have the following specific characteristics:

• It should be uniquely identifiable


• It should be unambiguous
• It should have well-defined test-data (or data-patterns)
• It should have well defined pass/fail criteria for each sub-item and
overall-criteria for the pass/fail of the entire test itself
• It should be easy to record
• It should be easy to demonstrate repeatedly

Many of the above criteria are related to actually identifying the test plan items and involve a good understanding of the specifications. Recognizing the need for a strong process, AppLabs has formulated a test plan structure that addresses the above aspects.

3.0 Practices

This section outlines some of the general practices comprising the Applabs
white-box testing process. In general, white-box testing practices have the
following considerations:
1. The allocation of resources to perform class and method analysis and to
document and review the same
2. Developing a test harness made up of stubs, drivers and test object
libraries
3. Development and use of standard procedures, naming conventions and
libraries
4. Establishment and maintenance of regression test suites and procedures
5. Allocation of resources to design, document and manage a test history
library
6. The means to develop or acquire tool support for automation of
capture/replay/compare, test suite execution, results verification and
documentation capabilities

3.1 Code Coverage Analysis

3.1.1 Basis Path Testing


Basis path testing is a testing mechanism proposed by McCabe. Its aim is to derive a logical complexity measure of a procedural design and use this as a guide for defining a basis set of execution paths.

Test cases derived to exercise the basis set are guaranteed to execute every statement in the program at least once.


3.1.1.1 Flow Graph Notation


A notation for representing control flow similar to flow
charts and UML activity diagrams.

3.1.1.2 Cyclomatic Complexity


The cyclomatic complexity gives a quantitative measure of the logical complexity. This value gives the number of independent paths in the basis set, and an upper bound for the number of tests required to ensure that each statement is executed at least once.

An independent path is any path through the program that introduces at least one new set of processing statements or a new condition (i.e., a new edge).

Cyclomatic complexity therefore provides an upper bound for the number of tests required to guarantee coverage of all program statements.
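
As an illustration, the sketch below (the OrderValidator class and its expected values are hypothetical) shows a small Java method, its cyclomatic complexity, and one test case per independent path in the basis set.

    // Illustrative sketch only: a small method with a known cyclomatic complexity.
    public class OrderValidator {

        /*
         * The method below has two predicate nodes (the two if statements), so its
         * cyclomatic complexity is V(G) = P + 1 = 3 (equivalently V(G) = E - N + 2
         * for its flow graph). The basis set therefore contains 3 independent paths:
         *   P1: amount <= 0                        -> return -1
         *   P2: amount > 0, amount > creditLimit   -> return  0
         *   P3: amount > 0, amount <= creditLimit  -> return  1
         */
        public static int validate(double amount, double creditLimit) {
            if (amount <= 0) {
                return -1;      // invalid order
            }
            if (amount > creditLimit) {
                return 0;       // rejected
            }
            return 1;           // accepted
        }

        // One test case per basis path executes every statement at least once.
        // Run with assertions enabled: java -ea OrderValidator
        public static void main(String[] args) {
            assert validate(-5, 100) == -1;   // P1
            assert validate(500, 100) == 0;   // P2
            assert validate(50, 100) == 1;    // P3
            System.out.println("basis path tests passed");
        }
    }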

3.1.2 Control Structure Testing

3.1.2.1 Conditions Testing


Condition testing aims to exercise all logical conditions in a program module. Conditions may take the following forms:
• Relational expression: (E1 op E2), where E1 and E2 are arithmetic expressions.
• Simple condition: a Boolean variable or relational expression, possibly preceded by a NOT operator.
• Compound condition: composed of two or more simple conditions, Boolean operators and parentheses.
• Boolean expression: a condition without relational expressions.
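
As a brief illustration (the canShip method is hypothetical), the sketch below drives each simple condition of a compound condition to both its true and false outcomes.

    // Illustrative sketch only: condition testing of a compound condition.
    public class ConditionCoverage {

        // Compound condition under test: (inStock && quantity > 0)
        static boolean canShip(boolean inStock, int quantity) {
            return inStock && quantity > 0;
        }

        public static void main(String[] args) {
            // Each simple condition is driven to true and false at least once,
            // including the combinations that determine the overall outcome.
            System.out.println(canShip(true, 5));    // T, T -> true
            System.out.println(canShip(true, 0));    // T, F -> false
            System.out.println(canShip(false, 5));   // F, T -> false
            System.out.println(canShip(false, 0));   // F, F -> false
        }
    }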

3.1.2.2 Data Flow Testing


Data flow testing selects test paths according to the locations of definitions and uses of variables.

3.1.2.3 Loop Testing


Loops are fundamental to many algorithms. Loops can be classified as simple, concatenated, nested, and unstructured.

Note that unstructured loops are not tested; rather, they are redesigned.
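
The following sketch (the sumFirst method and its data are hypothetical) shows the usual test vectors for a simple loop: skip the loop entirely, then one, two, a typical number, and the maximum number of passes.

    // Illustrative sketch only: simple-loop testing at boundary iteration counts.
    public class LoopTesting {

        // A simple loop that makes n passes over the data array.
        static int sumFirst(int[] data, int n) {
            int total = 0;
            for (int i = 0; i < n; i++) {
                total += data[i];
            }
            return total;
        }

        public static void main(String[] args) {
            int[] data = {1, 2, 3, 4, 5};
            System.out.println(sumFirst(data, 0));           // skip the loop  -> 0
            System.out.println(sumFirst(data, 1));           // one pass       -> 1
            System.out.println(sumFirst(data, 2));           // two passes     -> 3
            System.out.println(sumFirst(data, 3));           // typical        -> 6
            System.out.println(sumFirst(data, data.length)); // maximum passes -> 15
        }
    }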

3.2 Design by Contract (DbC)

DbC is a formal way of using comments to incorporate specification information into the code itself. Basically, the code specification is expressed unambiguously using a formal language that describes the code's implicit contracts. These contracts specify such requirements as:
• Conditions that the client must meet before a method is invoked.
• Conditions that a method must meet after it executes.
• Assertions that a method must satisfy at specific points of its execution.

Tools that check DbC contracts at runtime, such as JContract
[http://www.parasoft.com/products/jtract/index.htm], are used to perform this function.
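
A minimal sketch of the idea follows. The Account class is hypothetical, and the @pre/@post comment tags are shown as representative DbC notation rather than the exact syntax of any particular tool; plain Java assertions stand in for the runtime contract checks (run with java -ea).

    // Illustrative sketch only: contract information carried in comments, with
    // runtime assertions standing in for a DbC checking tool.
    public class Account {

        private double balance;

        /**
         * Withdraws money from the account.
         *
         * @pre  amount > 0 && amount <= getBalance()        (client's obligation)
         * @post getBalance() == old getBalance() - amount   (method's obligation)
         */
        public void withdraw(double amount) {
            assert amount > 0 && amount <= balance : "precondition violated";
            double before = balance;

            balance -= amount;

            assert balance == before - amount : "postcondition violated";
        }

        public void deposit(double amount) {
            assert amount > 0 : "precondition violated";
            balance += amount;
        }

        public double getBalance() {
            return balance;
        }
    }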

3.3 Profiling

Profiling provides a framework for analyzing Java code performance in terms of speed and heap memory use. It identifies the routines that are consuming the majority of the CPU time so that problems may be tracked down and performance improved.

Profiling tools include the Microsoft Java Profiler API and Sun's profiling tools that are bundled with the JDK. Third-party tools such as JaViz
[http://www.research.ibm.com/journal/sj/391/kazi.html] may also be used to perform this function.
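
Where a full profiler is not available, a hand-rolled probe can give a first approximation. The sketch below (the routineUnderTest workload is a placeholder) measures elapsed time and heap growth around a suspect routine.

    // Illustrative sketch only: a minimal hand-rolled timing and heap probe.
    public class ProfileProbe {

        static void routineUnderTest() {
            StringBuffer sb = new StringBuffer();   // placeholder workload
            for (int i = 0; i < 100000; i++) {
                sb.append(i);
            }
        }

        public static void main(String[] args) {
            Runtime rt = Runtime.getRuntime();
            long heapBefore = rt.totalMemory() - rt.freeMemory();
            long start = System.currentTimeMillis();

            routineUnderTest();

            long elapsed = System.currentTimeMillis() - start;
            long heapAfter = rt.totalMemory() - rt.freeMemory();
            System.out.println("elapsed ms: " + elapsed);
            System.out.println("heap delta bytes: " + (heapAfter - heapBefore));
        }
    }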

3.4 Error Handling

Exception and error handling is checked thoroughly by simulating partial and complete fail-over and by operating on error-causing test vectors. Proper error recovery, notification and logging are checked against references to validate the program design.
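
A small sketch of this style of check follows; the parseQuantity routine and its failure modes are hypothetical. Each error-causing test vector must produce the documented exception rather than an unhandled failure.

    // Illustrative sketch only: driving a routine with error-causing test vectors
    // and checking that the documented exception is raised for each of them.
    public class ErrorHandlingTest {

        // Routine under test: rejects malformed input with a documented exception.
        static int parseQuantity(String field) {
            if (field == null || field.trim().length() == 0) {
                throw new IllegalArgumentException("quantity missing");
            }
            return Integer.parseInt(field.trim());
        }

        public static void main(String[] args) {
            String[] badInputs = {null, "", "   ", "abc"};
            for (int i = 0; i < badInputs.length; i++) {
                try {
                    parseQuantity(badInputs[i]);
                    System.out.println("FAIL: no exception for [" + badInputs[i] + "]");
                } catch (IllegalArgumentException expected) {
                    // NumberFormatException is a subclass of IllegalArgumentException,
                    // so both documented failure modes are caught here.
                    System.out.println("PASS: " + expected);
                }
            }
        }
    }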

3.5 Transactions

Systems that employ transactions, local or distributed, may be validated to ensure the ACID properties (Atomicity, Consistency, Isolation, Durability). Each of the individual properties is tested individually against a reference data set.

Transactions are checked thoroughly for partial/complete commits and rollbacks encompassing databases and other XA-compliant transaction processors.
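
The sketch below illustrates an atomicity check on a local JDBC transaction; the connection URL, table and column names are hypothetical, and the second update is forced to fail so that the rollback path is exercised.

    // Illustrative sketch only: verifying that a failed transaction leaves no
    // partial commit behind. URL, table and column names are hypothetical.
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.sql.Statement;

    public class TransactionRollbackTest {

        public static void main(String[] args) throws SQLException {
            Connection con = DriverManager.getConnection("jdbc:sometestdb://localhost/orders");
            con.setAutoCommit(false);                    // begin an explicit transaction
            try {
                Statement st = con.createStatement();
                st.executeUpdate("UPDATE account SET balance = balance - 100 WHERE id = 1");
                int rows = st.executeUpdate("UPDATE account SET balance = balance + 100 WHERE id = 2");
                if (rows != 1) {
                    throw new SQLException("credit leg failed");   // simulated partial failure
                }
                con.commit();
            } catch (SQLException e) {
                con.rollback();                          // partial work must be undone
            }

            // Verify against the reference data set: the debit must not persist.
            Statement check = con.createStatement();
            ResultSet rs = check.executeQuery("SELECT balance FROM account WHERE id = 1");
            if (rs.next()) {
                System.out.println("balance after test: " + rs.getDouble(1));
            }
            con.close();
        }
    }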

4.0 Parameters

The white-box testing process applies the following artifacts to each of the test
areas outlined in section 5.0 below.

4.1 Test Case Bundles

These bundles, sometimes referred to as Workloads, are scripts that exercise application functionality together with some associated parameterization.

These bundles may be created as the following sets:

4.1.1 Interface Set: A complete set of interface tests that exercises all functionality of an interface, validating that the component interface has been developed as designed.

4.1.2 Standard Set: A subset of the Interface Set for a component that contains only normal cases. Unlike the Interface Set, which contains a single instance of each test case, the Standard Set contains a balanced mix of interface calls.

4.1.3 Installation Verification Set: A strict subset of the Standard Set that applies to a component and provides high-level verification that an installation or upgrade has been completed in a satisfactory fashion.

4.2 Deployment Configurations

In order to test partial functionality of a business process, the deployment may be segmented into functionally equivalent sets of servers that are able to support the entirety of a set of business applications for a specific service delivery channel or technology. This segmentation is sometimes referred to as 'zones'.

Identification of such zones helps to segment the system into smaller chunks that may be tested independently to check for errors and/or performance improvements.

4.3 User Roles

In a multi-user system, various classes of users may be identified and test cases run under the context of these users. This helps weed out potential functional and security problems, as well as providing test coverage of any Access Control Lists (ACLs).

5.0 Test Areas

5.1 Unit Testing

The various constituents of the system are tested in increasing degrees of granularity, starting from a component through to the full system. A component is the smallest unit for testing. A module or API consists of components, while a subsystem comprises multiple modules. The complete system is built from various subsystems.

Control Points are used to segment the life cycle of the development process into the Design, Development and Deployment phases.

5.1.1 Component


A component is an independent, isolated and reusable unit of a program that performs a well-defined function. It usually has public interfaces that allow it to be used to perform its functions.

Individual components are tested against their functional and design goals using the parameters outlined in section 4.0.
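
For example, a component-level test might look like the following sketch, assuming a JUnit-style test framework is available; the PriceCalculator component and its expected values are hypothetical.

    // Illustrative component test sketch (assumes a JUnit 4 style framework).
    import org.junit.Test;
    import static org.junit.Assert.assertEquals;

    public class PriceCalculatorTest {

        // Hypothetical component under test: a small, reusable unit with a
        // well-defined public interface.
        static class PriceCalculator {
            double totalWithTax(double net, double taxRate) {
                return net + (net * taxRate);
            }
        }

        @Test
        public void normalCase() {
            assertEquals(110.0, new PriceCalculator().totalWithTax(100.0, 0.10), 0.0001);
        }

        @Test
        public void zeroTax() {
            assertEquals(100.0, new PriceCalculator().totalWithTax(100.0, 0.0), 0.0001);
        }
    }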

5.1.2 Module

A module comprises one or more components to achieve a business function. Also known as an API, the module encapsulates and aggregates the functionality of its constituent components and appears as a 'black box' to its users.

Usually, a module is homogeneous in nature with respect to the application domain. For example, a database module would interface with and/or encapsulate database-specific functions.

Modules are tested against their functional and design goals using the parameters outlined in section 4.0.

5.1.3 Subsystem

Subsystems are defined as heterogeneous collections of modules that achieve a business function. For example, a credit card processing subsystem might interface with a credit card clearing house, a database component and an audit mechanism to perform complete credit card related operations.

Subsystems are also tested against their functional and design goals using the parameters outlined in section 4.0.

5.1.4 System

The full system uses multiple subsystems to implement the full functionality of the application. An example is an online shopping system that includes catalog, shopping cart and credit card processing subsystems.

The complete system is tested against its functional and design goals using the parameters outlined in section 4.0.

5.2 Security


To address security issues with the system under test, the following
aspects are tested:

5.2.1 Authentication

The authentication mechanism is tested based on class of user and password validation, if any. If 3rd party products such as LDAP servers are used in this mechanism, those are tested as well.

In the situation where digital certificates are used for authentication, the certificates as well as system behavior upon activation of the certificates are tested.

5.2.2 Data Security

Flow of information (channel security) as well as firewall access issues are tested for all aspects of the system under test.

If data encryption is being used, it is checked for compatibility with standards.

5.3 Network Resources

Systems that operate on networks are tested for dependencies on network performance and issues. These factors include:

5.3.1 Bandwidth

System behavior is tracked by varying the bandwidth available on the network from 1 Kbps all the way to 10 Mbps. Parameters such as timeouts and breakdown of operations are monitored.

Network latencies are also introduced to check for the same.

5.3.2 Applications

If the system under test has dependencies on network applications like SNMP, these are checked for graceful failover and recovery.

5.4 Capacity


Capacity testing looks at resource demand for each of the test vectors in
various parts of the system. These test vectors are designed to reflect
normal usage in terms of workflows and bandwidth requirements.

Extrapolation is sometimes used to approximate system behavior for larger capacities.

5.5 Scalability

This aspect of testing recognizes the potential for increases in scalability from a system architecture, design, development, and deployment standpoint. The application under test is tested for the ability to scale linearly through the balanced addition of computing resources (network, CPU, memory, disk, tape), resulting in a proportionate increase in capacity at the same performance level.

Extrapolation is sometimes used to approximate system behavior for larger capacities that are not feasible to simulate.

5.6 Reliability

The purpose of this test is to ensure that the application under test performs in a consistent fashion. It is exercised to expose any specific and gross node failures, checking for example:
• that no single point of failure compromises the entirety of the application
• whether the potential exists for in-flight activities to terminate abruptly
• whether, in some failure mode, application function is compromised for a period

Products such as Cisco LocalDirector are tested during this phase.

5.7 Burn-in

In this aspect of testing, the system or subsystem under test is run at full
and partial loads over an extended period of time (e.g. 24 hours or more)
to test for consistency.

5.8 Clustering

For systems that use clustering support, one or more systems are removed from an operational installation to check for system recovery and to verify fail-over to a backup system.

The application under test is checked for damage to, or compromise of, data and/or resources.

6.0 Testing Tools

6.1 QA Products

AppLabs has extensive experience with commercial testing tools such as WinRunner and SilkTest. These are tools that generate user interface events and are capable of comparing results as well as handling parameterized input.

These tools are used in cases where the user interface is the primary driver.

6.2 Test Harnesses

Test harnesses are generally a mechanism whereby sets of stored interface pairs (request and response) are imported or loaded. These mechanisms have the general advantage that they will run on the platform being tested.

Some products, such as WebLogic, have test harness generation support.

AppLabs is capable of using product-specific test harnesses as well as writing custom harnesses if needed.
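
A very small custom harness of this kind might look like the sketch below; the QuoteService interface, its stubbed implementation and the stored request/response pairs are all hypothetical.

    // Illustrative sketch only: a harness that replays stored request/response
    // pairs against an interface and compares the actual responses.
    import java.util.LinkedHashMap;
    import java.util.Map;

    public class ReplayHarness {

        interface QuoteService {
            String getQuote(String symbol);
        }

        public static void main(String[] args) {
            // Stored interface pairs: request -> expected response.
            Map<String, String> pairs = new LinkedHashMap<String, String>();
            pairs.put("ACME", "42.00");
            pairs.put("FOO", "N/A");

            // Implementation under test (stubbed here to keep the sketch self-contained).
            QuoteService service = new QuoteService() {
                public String getQuote(String symbol) {
                    return "ACME".equals(symbol) ? "42.00" : "N/A";
                }
            };

            // Replay each request and compare the actual response to the stored one.
            for (Map.Entry<String, String> pair : pairs.entrySet()) {
                String actual = service.getQuote(pair.getKey());
                String verdict = pair.getValue().equals(actual) ? "PASS" : "FAIL";
                System.out.println(verdict + " " + pair.getKey() + " -> " + actual);
            }
        }
    }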

6.3 Stubs

Stubs are programs or components that have deterministic behavior and are used to interface to a subsystem in order to take care of dependencies. For example, a complex user registration component may be replaced in a system with a stub that returns one of two fixed outputs regardless of input. This helps to reduce the overall complexity of the system into more manageable chunks.
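
A stub for the registration example above might be as simple as the following sketch; the RegistrationService interface and its return codes are hypothetical.

    // Illustrative sketch only: a deterministic stub replacing a complex
    // user registration component so dependent code can be tested in isolation.
    public class RegistrationStub {

        interface RegistrationService {
            int register(String userName, String email);   // 1 = accepted, 2 = rejected
        }

        // The stub ignores the real component's logic and returns one of two
        // fixed outputs based only on a trivial check.
        static class StubbedRegistrationService implements RegistrationService {
            public int register(String userName, String email) {
                return (userName != null && userName.length() > 0) ? 1 : 2;
            }
        }

        public static void main(String[] args) {
            RegistrationService service = new StubbedRegistrationService();
            System.out.println(service.register("alice", "alice@example.com"));  // 1
            System.out.println(service.register("", "nobody@example.com"));      // 2
        }
    }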

These stubs, whether off-the-shelf, automatically generated via tools or custom-built, are available for use in the white-box testing process.

6.4 Probes
During coverage analysis or execution verification, C1* is most often measured by planting subroutine calls - called software probes - along each segment of the program. The test object is then executed and the instrumentation gathered for analysis. These tools are also classified as instrumentation tools.

* See glossary for the definition of C1

Probes, whether off-the-shelf, automatically generated via tools or custom-built, are available for use in the white-box testing process.
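
Hand-planted probes can look like the sketch below; the classify method and its segments are hypothetical, and each probe call simply records that its segment was executed so that C1-style coverage can be reported.

    // Illustrative sketch only: probe calls planted along each program segment,
    // with the gathered counts reported after the test object has executed.
    import java.util.LinkedHashMap;
    import java.util.Map;

    public class CoverageProbe {

        static final Map<String, Integer> hits = new LinkedHashMap<String, Integer>();

        // The planted probe: records that a segment was executed.
        static void probe(String segment) {
            Integer count = hits.get(segment);
            hits.put(segment, count == null ? 1 : count + 1);
        }

        // Instrumented test object: one probe per logically separate segment.
        static String classify(int age) {
            probe("entry");
            if (age < 18) {
                probe("minor-branch");
                return "minor";
            }
            probe("adult-branch");
            return "adult";
        }

        public static void main(String[] args) {
            classify(12);
            classify(30);
            // Report segment coverage gathered by the probes (C1-style analysis).
            for (Map.Entry<String, Integer> e : hits.entrySet()) {
                System.out.println(e.getKey() + " executed " + e.getValue() + " time(s)");
            }
        }
    }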

6.5 Monitoring Tools

Monitors gather system performance information from the operating system kernel, including CPU, virtual memory, disk and network statistics. For example, the Sun Microsystems program RICHP may be used to gather system performance metrics on a running system and cross-reference them with test case information.

6.6 Custom Tools

Due to the strong technology background of the team employed by AppLabs on white-box testing projects and the availability of resources from the Development division of AppLabs, custom tools including exercisers, harnesses and stubs may be developed with relative ease.

7.0 Infrastructure

7.1 Test Case Collection System

Test cases, which may be authored by a variety of qualified personnel, are collected via email, FTP or through an online forms-based system. Standard templates are used for authoring test cases.

All suitable technical personnel and domain experts are encouraged to submit test cases through this system. These test cases are reviewed and approved for inclusion by the project managers of the client and AppLabs.

7.2 Problem Tracking System

AppLabs employs its own problem tracking system to track problems relating to a white-box testing project. This system tracks the entire life cycle of a problem from the time it is submitted until it is assigned, fixed, verified, and finally closed.

Alternatively, AppLabs personnel may use a similar system that is employed by the client to perform the same functions. In either case, the following are basic characteristics of the problem tracking system:

7.2.1 Roles

Users of the problem tracking system may broadly be categorized into managers, developers and testers.

7.2.2 Severity Levels

Each test item would have a severity level associated with it.
AppLabs broadly identifies three levels of severity for each test
item as follows:

Severity Level 1
Program aborts and the system cannot function further

Severity Level 2
Major functionality of the system (associated with that test item)
affected – however other functions still work well

Severity Level 3
All major functionality of the system works as per specifications –
however minor user irritation or minor data inconsistencies exist
(which could be easily handled by a system administrator)

These levels are based on the perceived severity of the affected functionality and may be modified as per the client's criteria.

7.3 Communication

AppLabs has built an advanced communications infrastructure for optimal internal and external communications. Clients have the option of using these facilities if desired.

7.3.1 Internet Based

These include email, newsgroup and instant messaging based communications. AppLabs maintains its own servers for the above on a secure network for optimal performance and security.

Other advanced forms of Internet communication, such as video conferencing, may be employed at the client's request, with due consideration given to network issues relating to connections to AppLabs' offshore facilities.

7.3.2 Phone/Fax

AppLabs maintains multiple phone and fax lines along with voicemail and cellular phone access for relevant project personnel.

7.4 Risk Management

Risk: Manpower attrition
Plan:
• Process-wise backups
• On-going training
• Very strong documentation
• Regular reviews and technology transfers
• All processes handled with shared responsibility

Risk: Communication failure
Plan:
• Backup communication infrastructure
• De-linking actual development from the communication infrastructure

Risk: Software environment corruption
Plan:
• Use of original software
• Proper configuration management and security

Risk: Machine downtime
Plan:
• Backup machines available
• Link-up with support vendors for quick service and low downtime

7.5 Project Management

A standard SEPM approach is taken for project management, and Microsoft tools are used. These include Microsoft Project, Microsoft Outlook and Microsoft Office products.

7.5.1 Activity Tracking

Activity: Project Monitoring
Process:
• The entire plan is formulated in MS Project and sent to the client at the beginning of the project itself.
• Details of the plan are worked out at the offshore site with intermediate milestones, completion dates and resource allocation.
• Every week, and at the end of every major milestone, the current status of the project is made known to the client with reference to the original plan.

7.5.2 Task Tracking

Activity: Task Scheduling and Tracking
Process:
• Identification of roles and assignment of tasks to persons
• MS Scheduler to track role-wise tasks
• Regular access to tasks and review by management
• Security levels for task deletions and modifications

7.5.3 Issues Tracking

During the course of the project there could be several issues that
would have to be followed up and resolved. These could be either
technical or commercial. Applabs has a tracking mechanism for this
by using the following template via email:

Date | Type | Details | Follow Up

Typical issue types are Customer Feedback and Technical Clarification. The types of issues could either be internal or specified by the customer. An end date is always specified as part of the Details column.

8.0 Documentation

When the entire project is executed by AppLabs, Rational Unified Process (RUP)
or IEEE standards of documentation are followed wherever applicable in order
to ensure that the right quality levels are maintained at each stage of the project.
The three basic documents that we rely on corresponding to different stages of
the Software Development Life Cycle are:

• Software Requirement Specifications Document
• Software Design Description Document
• Software Test Plan

8.1 Project Documentation


Depending on the nature and schedule of the project we use more detailed
documentation such as:

• Software Development Plan
• System Segment Specifications
• Interface Requirements and Interface Design Documents
• Software Test Plans and Descriptions
• Version Control and Product Specification Documents

These documents are applicable to the Structured Systems Analysis and Design methodology.

8.2 Reporting

The Applabs team regularly sends status reports to the client. A sample
report is given below:

Status Report
• Report Number
• Date and Time
• Subject / Topic
• Review / Feedback Required (time-based)
• To & From (if required)

The reports address the following issues and purposes:

• Tasks completed during the previous week with reference to the schedule
• Plan for tasks which are not on schedule
• Any change in the original schedule
• Technical clarifications
• Response to any queries, feedback
• Details of any activity, procedure or task if asked for by the client

Glossary

White Box Testing: A software testing technique whereby explicit knowledge of the internal workings of the item being tested is used to select the test data. Unlike black box testing, white box testing uses specific knowledge of the programming code to examine outputs. The test is accurate only if the tester knows what the program is supposed to do. He or she can then see if the program diverges from its intended goal. White box testing does not account for errors caused by omission, and all visible code must also be readable.

Design by Contract (DbC): A formal way of using comments to incorporate specification information into the code itself. Basically, the code specification is expressed unambiguously using a formal language that describes the code's implicit contracts.

C1: One of the more common strategies for testing involves declaring a minimum level of testing coverage, ordinarily expressed as a percentage of the elemental segments that are exercised in aggregate during the testing process. This is called C1 coverage, where C1 denotes the minimum level of testing coverage such that every logically separate part of the program has been exercised at least once during the set of tests. Unit testing usually requires at least 85-90% C1 coverage.

References

1. White-Box Testing, Dr. Dobb's Journal, March 2000.
   http://www.ddj.com/articles/2000/0003/0003a/0003a.htm
2. Code Coverage Analysis.
   http://www.bullseye.com/coverage.html
3. Design by Contract.
   http://www.eiffel.com/doc/manuals/technology/contract/page.html
4. MCG's White Paper: Software Quality Assurance in Smalltalk.
   http://www.mcgsoft.com/ftp/assurance.pdf
5. Design for Debug and Test.
   http://japan.amc.com/support/resources/white_papers/desdeb/desdeb.html
