
Software Quality Assurance

Software Quality
Measurement and Metrics

Outline

Basic Terminologies
Project Indicators
Classification of Software Metrics
Software Quality Metrics
Software Maintenance Metrics
Software Defects Classification

Software Metrics

"Software metrics let you know when
to laugh and when to cry."
(Tom Gilb)


Basic Terminologies

Measure

A quantitative indication of some attribute of a
product or process. For example, 5 errors, 1200 LOC.

Measurement

The act of determining a measure.

Metric

A quantitative measure of the degree to which a
product or process possesses a given attribute.
For example, 5 defects / KLOC. A metric is,
basically, a ratio of two related measures.

Why Measurement?

There are four reasons for measuring software
processes, products and resources:

1. To Characterize
2. To Evaluate
3. To Predict
4. To Improve

Types of Measurements

Direct Measures

Easily and directly measurable. For example,
LOC, effort, cost.

Indirect Measures

Not easy to measure directly; derived from
direct measures instead. For example,
complexity, quality.

Indicators

An indicator is a metric or combination of metrics
that provides insight into the software process,
product or project. For example, comparing two
teams using two different review approaches gives
an indicator of the better approach.

Indicators are of two types:

Process Indicators
Project Indicators

Indicators

Process Indicators

Used to gain insight into the efficacy of the
existing process.
Collected across all projects and over long
periods of time.
For example, PSP and TSP.

Project Indicators

Used to assess the status of the project:
Track potential risks.
Uncover problem areas before they become critical.
Adjust the workflow and tasks accordingly.
Evaluate the project team's ability to control
software quality.
For example, EVA (earned value analysis).

Classification of Metrics


Quality Metrics

Provide indicators to improve the quality of
the product.
There are many quality attributes, such as
maintainability, usability, integrity and
correctness (see McCall's Quality Model).


Availability

Availability is concerned with system failure
and its associated consequences.
The availability of a system is the probability
that it will be operational when it is needed.
This is typically defined as:

Availability = Mean Time To Failure /
(Mean Time To Failure + Mean Time To Repair)

From this come terms like 99.9% availability,
or a 0.1% probability that the system will not
be operational when needed.
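As a minimal sketch (function name and sample numbers are illustrative, not from the slides), the definition above computes directly:

```python
def availability(mttf, mttr):
    """Availability = MTTF / (MTTF + MTTR).

    mttf: mean time to failure, mttr: mean time to repair,
    both in the same time unit (e.g. hours)."""
    return mttf / (mttf + mttr)

# A system that runs 999 hours between failures and takes
# 1 hour to repair is available 99.9% of the time:
print(availability(999, 1))  # 0.999
```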

Maintainability

Maintainability is the ease with which a program
can be corrected if an error is encountered, or
changed if user requirements change.
Mean Time To Change (MTTC), the mean time to
fulfill a change request, is an indirect measure
of maintainability.
A lower MTTC means higher maintainability.

Correctness

Correctness is the degree to which the software
performs its required function.
A common way to measure correctness is Defect
Density (DD):

DD = No. of defects reported by the user /
Size of the program in KLOC
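A one-line sketch of this ratio (the numbers below are made up for illustration):

```python
def defect_density(user_reported_defects, size_kloc):
    """DD = defects reported by the user / program size in KLOC."""
    return user_reported_defects / size_kloc

# 30 user-reported defects in a 6 KLOC program:
print(defect_density(30, 6))  # 5.0 defects per KLOC
```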


Usability

Usability is concerned with how easy it is for
the user to accomplish a desired task and the
kind of user support the system provides.
It can be broken down into:

1. Learning system features
2. Using a system efficiently
3. Minimizing the impact of errors
4. Adapting the system to user needs
5. Increasing confidence and satisfaction

Integrity

The ability of the system to withstand attacks
on its security.
Integrity is measured in terms of threat and
security:

Threat = probability that an attack of a specific
type will occur within a given time.
Security = probability that an attack of a
specific type will be repelled.

Integrity = 1 - threat * (1 - security)
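For a single attack type the formula can be sketched as follows (the probabilities are invented for the example):

```python
def integrity(threat, security):
    """Integrity = 1 - threat * (1 - security) for one attack type.

    threat: probability the attack occurs in the given period;
    security: probability the attack is repelled."""
    return 1 - threat * (1 - security)

# 25% chance of an attack, 95% chance it is repelled:
print(integrity(0.25, 0.95))  # ~0.9875
```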

Performance

Performance is concerned with how long it takes
the system to respond when an event occurs.
The response of the system to an event can be
characterized by:

Latency = response time - event occurrence time
Throughput = the number of transactions the
system can process per second
Jitter of response = the variation in latency
The number of events not processed because the
system was too busy to respond
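A small sketch of the first and third characterizations (jitter is taken here as the max-min spread of observed latencies; variance would be an equally valid choice, and the timestamps are invented):

```python
def latency(response_time, event_time):
    """Latency = response time - event occurrence time."""
    return response_time - event_time

def jitter(latencies):
    """Jitter as the spread (max - min) of observed latencies."""
    return max(latencies) - min(latencies)

# Event at t = 10.0 s, response at t = 10.5 s:
print(latency(10.5, 10.0))      # 0.5
print(jitter([0.5, 0.7, 0.4]))  # ~0.3
```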

Occurrence of Events over Time

[Timeline figure: a fault occurs and causes an
error; MTTD spans from the error to its
detection, MTTR from detection to repair, and
MTBF from one fault occurrence to the next.]

Defect Rate

Defect Injection Rate (DIR)

DIR = [(No. of in-process defects) + (No. of
customer-reported defects)] /
Actual size of the product

Defect Removal Efficiency (DRE)

DRE = (Defects found by a removal operation /
Defects present at the removal operation) x 100%

Defects may be found by, or escape to, the following:
System testing
Acceptance testing
Warranty support
Customer-reported problems
Work product (technical) reviews

Defect Removal Effectiveness

Effectiveness = No. of defects removed at the step /
[(No. of defects existing at step entry) + (No. of
defects injected during the step)] x 100%
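The three rates above can be sketched as small functions (names and sample counts are mine, not the slides'):

```python
def defect_injection_rate(in_process, customer_reported, size_kloc):
    """DIR = (in-process + customer-reported defects) / product size."""
    return (in_process + customer_reported) / size_kloc

def defect_removal_efficiency(found, present):
    """DRE = defects found by a removal operation /
    defects present at that operation, as a percent."""
    return found / present * 100

def defect_removal_effectiveness(removed, existing_at_entry, injected):
    """Defects removed at a step / (defects existing at step
    entry + defects injected during the step), as a percent."""
    return removed / (existing_at_entry + injected) * 100

print(defect_injection_rate(80, 20, 50))         # 2.0 defects per KLOC
print(defect_removal_efficiency(25, 50))         # 50.0
print(defect_removal_effectiveness(30, 30, 10))  # 75.0
```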

Activities Associated with Defect Injection and Removal

Development Phase     Defect Injection           Defect Removal
Requirements          Requirements gathering     Requirement analysis
                      and development of the     and review
                      functional specification
High-level Design     Design work                Design inspections
Low-level Design      Design work                Design inspections
Code Implementation   Coding                     Code inspections
Integration/Build     Integration process        Build verification
                                                 testing
Unit Test             Bad fixes                  Testing itself
Component Test        Bad fixes                  Testing itself
System Test           Bad fixes                  Testing itself

Defect Injection and Removal During One Process Step

[Figure: defects existing on step entry, plus defects
injected during development, flow into defect
detection and defect repair; correct repairs become
defects removed, incorrect repairs re-inject defects,
and undetected defects remain as defects existing at
step exit.]

Phase-based Defect Effectiveness

High-Level Design Inspection Effectiveness =
Defects removed /
(Defects existing on entry (escaped from the
requirements phase) + Defects injected in
high-level design)

Phase               Defects    Defects    Defects Remained in each Phase
                    Injected   Removed    In-Process   Customer-Reported
Requirements           85         15          88              12
High-level Design      55         34          69              20
Low-level Design       23         12          30               -
Code                   44         40           -               -
Unit Testing           78         52         100              30
System Testing         67         23          40              50

Calculate:

Defect Density, if the size is 5000 KLOC
Defect Rate
Defect Removal Efficiency
Defect Removal Effectiveness of each phase

Software Maintenance
Metrics

Some Maintenance Metrics

Fix backlog and backlog management index
Fix response time and fix responsiveness
Percent delinquent fixes
Fix quality

Fix Backlog and Backlog Management Index

BMI = (Number of problems closed during the month /
Number of problem arrivals during the month) x 100%

As a ratio, a BMI larger than 100 means the backlog
is reduced; a BMI less than 100 means it has
increased.
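A minimal sketch of the BMI calculation (the monthly counts are invented):

```python
def backlog_management_index(closed, arrivals):
    """BMI = problems closed during the month /
    problem arrivals during the month, as a percent."""
    return closed * 100 / arrivals

# 120 problems closed against 100 new arrivals:
# BMI > 100, so the backlog shrank this month.
print(backlog_management_index(120, 100))  # 120.0
```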

Fix Response Time and Fix Responsiveness

Fix response time is the mean time of all problems
from open to closed:

Mean response time = (MT-pr1 + MT-pr2 + ... + MT-prn) / n

Fix responsiveness is operationalized as the
percentage of delivered fixes meeting committed
dates to customers.

Percent Delinquent Fixes

Percent delinquent fixes =
(Number of fixes that exceeded the response time
criteria by severity level /
Number of fixes delivered in a specified time) x 100%

This metric counts closed problems only.
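The two fix metrics above can be sketched as follows (function names and sample values are illustrative):

```python
def mean_fix_response_time(times):
    """Mean time from problem open to close, over all
    closed problems (times in any consistent unit)."""
    return sum(times) / len(times)

def percent_delinquent_fixes(exceeded, delivered):
    """Fixes that exceeded the response-time criterion, as a
    percent of fixes delivered in the period (closed only)."""
    return exceeded * 100 / delivered

print(mean_fix_response_time([2, 4, 6]))  # 4.0 (e.g. days)
print(percent_delinquent_fixes(5, 50))    # 10.0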

Some Other Measures

Efficiency

[Efficiency formula lost in extraction.]
Where:
Size = KLOC
Effort = man-years
Time = hours
B = a constant or scaling factor that is a
function of the project size

Requirement Volatility


Defects Classification

Defects Classification

1. Error

Errors are human mistakes. For example,
spelling mistakes or syntactic errors.

2. Defect

Defects are improper program conditions that
generally result from errors, but not always.
For example, documentation errors do not
result in defects, while incorrect packaging
or exception handling can produce defects.

Defects Classification

3. Bug / Fault

A bug is a program defect that is encountered in
operation (alpha testing, beta testing or software
operation).
Not all defects cause bugs; Y2K defects, for
example, lay dormant for years before being
encountered.

4. Failure

A failure is a malfunction at a user's
installation. It may result from a:

Bug
Incorrect installation
Communication line hit
Hardware failure

Defects Classification

5. Problem

Problems are user-encountered difficulties.
Problems are human events, whereas failures
are system events.
A problem may result from a:

Failure
Misuse
Misunderstanding


Feel free to ask!


Thanks!
