The following is a fictional example illustrating a simple application of the 8-Step Metrics Program that
will yield short-term feedback. Before committing a major investment in time, it is often valuable to
become familiar with a new program by applying it to a small project with a short time frame. The
experience gained should be applicable to the implementation of a larger program with long-term goals
and a broader scope. Most software organizations can replicate this example to yield valuable information
with a minimum of investment.
Background
Integrated Software is a small company specializing in integrated CASE tools. It has few
formal procedures, and its software development process is best described as traditional.
The company performs code inspections on an ad hoc basis, and a testing program is in
place. Management is considering a major upgrade to the main product that will
involve significant re-engineering. Time is short, budgets are thin and experience with
metrics is non-existent.
Integrated Software’s past experience shows that some projects have had serious cost
overruns. The instability of a recent release resulted in a loss of customers. Projects that
are underestimated, over budget, or that produce unstable products have the potential to
devastate the company. Accurate estimates, competitive productivity, and renewed
confidence in product quality are critical to the success of the company.
Hoping to solve these problems as quickly as possible, the company management
embarks on the 8-Step Metrics Program by Software Productivity Centre Inc.
Estimates
The development staff at Integrated Software considers past estimates to have been
unrealistic, as they were established using “finger in the wind” techniques. They suggest
that current plans could benefit from past experience as the present project is very similar
to past projects. The metrics coordinator narrows and restates Step 2, Goal 2 (Improve
software estimation):
Company Goal 1: Use previous project experience to improve estimations of
productivity.
Questions asked about the goal:
• What is the actual labor rate of past projects?
• How complicated is the software being developed?
• Does the labor rate vary for different types of software?
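The first of these questions reduces to simple arithmetic once past effort and size data are assembled. The sketch below computes a labor rate as person-hours per KSLOC; the project names and figures are invented for illustration and do not come from Integrated Software.

```python
# Hypothetical past-project data: effort in person-hours, size in KSLOC.
# All names and numbers are illustrative.
past_projects = [
    {"name": "Release 1.0", "effort_hours": 4800, "ksloc": 24.0},
    {"name": "Release 1.1", "effort_hours": 2100, "ksloc": 9.5},
    {"name": "Release 2.0", "effort_hours": 7600, "ksloc": 31.0},
]

def labor_rate(project):
    """Labor rate expressed as person-hours per KSLOC delivered."""
    return project["effort_hours"] / project["ksloc"]

rates = [labor_rate(p) for p in past_projects]
average_rate = sum(rates) / len(rates)

for p, r in zip(past_projects, rates):
    print(f"{p['name']}: {r:.1f} person-hours/KSLOC")
print(f"Average: {average_rate:.1f} person-hours/KSLOC")
```

Breaking the same figure out by software type (the third question) is a matter of grouping the projects before averaging.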
Productivity
Discussions about the significant effort spent in debugging center on a comment by one
of the developers that defects found early on in reviews have been faster to repair than
defects discovered by the test group. It seems that both reviews and testing are needed,
but the amount of effort to put into each is not clear. The metrics coordinator decides to
focus Step 2, Goal 8 (Improve staff productivity) around questions of rework.
Company Goal 2: Optimize defect detection and removal.
Questions asked about the goal:
• How much effort is spent in testing versus reviews?
• How many defects are discovered in testing versus reviews?
• How much effort is spent repairing defects discovered in reviews?
• How much effort is spent repairing defects discovered in testing?
• How efficient are reviews in removing defects?
• How efficient is testing in removing defects?
• What is the optimal defect detection efficiency to achieve in reviews prior to testing?
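Most of these questions reduce to a handful of ratios once effort and defect counts are tallied per phase. A minimal sketch follows, with all counts invented for illustration:

```python
# Illustrative phase totals; every number here is invented.
review = {"detection_hours": 120, "defects_found": 90, "repair_hours": 45}
test = {"detection_hours": 400, "defects_found": 60, "repair_hours": 300}

def effort_per_defect(phase):
    """Total effort (finding plus fixing) per defect removed in a phase."""
    return (phase["detection_hours"] + phase["repair_hours"]) / phase["defects_found"]

# Detection efficiency of reviews: share of all known defects caught before test.
total_found = review["defects_found"] + test["defects_found"]
review_efficiency = review["defects_found"] / total_found

print(f"Review: {effort_per_defect(review):.1f} hours/defect")
print(f"Test:   {effort_per_defect(test):.1f} hours/defect")
print(f"Review detection efficiency: {review_efficiency:.0%}")
```

Comparing the two hours-per-defect figures over several projects is one way to decide how much effort to shift from testing into reviews.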
Quality
The test group at the company argues for exhaustive testing. This, however, is
prohibitively expensive. As an alternative, they suggest looking at the trends of defects
discovered and repaired over time to better understand the probable number of defects
remaining.
The coordinator limits Step 2, Goal 6 (Improve software quality) to establishing a
quantitative indication of stability at product release.
Company Goal 3: Ensure that the defect detection rate during testing is converging
towards a level indicating that fewer than five defects per KSLOC will be
discovered in the next year.
Questions asked about the goal:
• How many defects have been discovered so far?
• How many defects have been repaired so far?
• What is the trend in number of defects discovered and repaired over time?
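One simple way to answer these questions is to difference the cumulative counts into weekly discovery rates and extrapolate the latest rate forward. In the sketch below, the weekly totals and the 30 KSLOC product size are assumptions invented for illustration; only the five-defects-per-KSLOC target comes from Company Goal 3.

```python
# Cumulative counts for weeks 1-7; all figures are invented.
discovered = [120, 250, 360, 440, 490, 520, 535]
repaired = [60, 180, 300, 400, 460, 500, 525]
product_ksloc = 30.0       # assumed product size
target_per_ksloc = 5.0     # from Company Goal 3

# Weekly discovery rate: difference of successive cumulative totals.
weekly = [discovered[0]] + [b - a for a, b in zip(discovered, discovered[1:])]

# Crude convergence check: project the latest weekly rate over 52 weeks
# and compare the implied defect density against the target.
projected_next_year = weekly[-1] * 52
projected_density = projected_next_year / product_ksloc

print(f"Weekly discoveries: {weekly}")
print(f"Projected defects/KSLOC over next year: {projected_density:.1f}")
print("Converging" if projected_density < target_per_ksloc else "Not yet converging")
```

A declining weekly rate combined with the discovered and repaired curves flattening toward each other is the quantitative indication of stability the coordinator is after.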
Figure A.1: Identifying data to collect. A blank worksheet with columns Goal 1,
Metrics, and Data to collect, and numbered rows (1, 2, 3, 4, etc.) for listing each
metric against the data needed to compute it.
Figure A.3: Data collection form - Effort.

    Person | Activity | Start date | Complete date | Person hours
    A      | Code     |            |               |
    B      | Code     |            |               |
    C      | Code     |            |               |
    A      | Review   |            |               |
    B      | Review   |            |               |
    C      | Review   |            |               |
    A      | Recode   |            |               |
    B      | Recode   |            |               |
    C      | Recode   |            |               |
    A      | Debug    |            |               |
    B      | Debug    |            |               |
    C      | Debug    |            |               |
    D      | Test     |            |               |
    E      | Test     |            |               |
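Once the Figure A.3 form is filled in, the records can be tallied by activity to feed the effort questions under Company Goal 2. A minimal sketch, assuming the person/activity layout of the form and inventing the hour figures:

```python
# (person, activity, person-hours) records, mirroring the Figure A.3 form.
# The hour figures are invented; only the layout comes from the form.
records = [
    ("A", "Code", 40), ("B", "Code", 35), ("C", "Code", 50),
    ("A", "Review", 6), ("B", "Review", 5), ("C", "Review", 8),
    ("A", "Recode", 10), ("B", "Recode", 12), ("C", "Recode", 9),
    ("A", "Debug", 20), ("B", "Debug", 25), ("C", "Debug", 18),
    ("D", "Test", 30), ("E", "Test", 28),
]

# Sum person-hours per activity.
effort_by_activity = {}
for person, activity, hours in records:
    effort_by_activity[activity] = effort_by_activity.get(activity, 0) + hours

for activity, hours in effort_by_activity.items():
    print(f"{activity}: {hours} person-hours")
```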
Figure A.7: Effort/Defects example - graph. Effort per defect (scale 0 to 20) for
Review and Test, plotted over Weeks 1 to 7.
Figure A.9: Total defects example - graph. Cumulative defects discovered and
cumulative defects repaired (scale 0 to 600), plotted over Weeks 1 to 7 against a
target maximum.