CMMI Process
Performance Models
Software Engineering Institute
Carnegie Mellon University
Pittsburgh, PA 15213
Robert Stoddard, Kevin Schaaff,
Rusty Young, and Dave Zubrow
April 28, 2009
Speaker Biographies
Robert W. Stoddard currently serves as a Senior Member of the Technical Staff at
the Software Engineering Institute. Robert architected and designed several leading
measurement and CMMI High Maturity courses including: Understanding CMMI
High Maturity Practices, Improving Process Performance using Six Sigma, and
Designing Products and Processes using Six Sigma.
Rusty Young currently serves as the Appraisal Manager at the SEI. Rusty has 35+
years of systems and software (S&SW) development, management, and consulting
experience. He has worked in large and small organizations, in both government
and private industry. His recent focus has been CMMI High Maturity.
NO WARRANTY
THIS CARNEGIE MELLON UNIVERSITY AND SOFTWARE ENGINEERING INSTITUTE
MATERIAL IS FURNISHED ON AN "AS-IS" BASIS. CARNEGIE MELLON UNIVERSITY
MAKES NO WARRANTIES OF ANY KIND, EITHER EXPRESSED OR IMPLIED, AS TO
ANY MATTER INCLUDING, BUT NOT LIMITED TO, WARRANTY OF FITNESS FOR
PURPOSE OR MERCHANTABILITY, EXCLUSIVITY, OR RESULTS OBTAINED FROM
USE OF THE MATERIAL. CARNEGIE MELLON UNIVERSITY DOES NOT MAKE ANY
WARRANTY OF ANY KIND WITH RESPECT TO FREEDOM FROM PATENT,
TRADEMARK, OR COPYRIGHT INFRINGEMENT.
Use of any trademarks in this presentation is not intended in any way to infringe on the
rights of the trademark holder.
This Presentation may be reproduced in its entirety, without modification, and freely
distributed in written or electronic form without requesting formal permission. Permission is
required for any other use. Requests for permission should be directed to the Software
Engineering Institute at permission@sei.cmu.edu.
This work was created in the performance of Federal Government Contract Number
FA8721-05-C-0003 with Carnegie Mellon University for the operation of the Software
Engineering Institute, a federally funded research and development center. The
Government of the United States has a royalty-free government-purpose license to use,
duplicate, or disclose the work, in whole or in part and in any manner, and to have or
permit others to do so, for government purposes pursuant to the copyright license under
the clause at 252.227-7013.
Topics
Introduction
Review of Process Performance Models (PPMs)
Technical Process of Building PPMs
Questions
Review of CMMI Process Performance Models (PPMs)
What is a PPM?
OPP SP 1.5
PPMs are used to estimate or predict the value of a process-performance
measure from the values of other process, product, and service measurements.
PPMs typically use process and product measurements collected
throughout the life of the project to estimate progress toward
achieving objectives that cannot be measured until later in the
project's life.
Glossary
A description of the relationships among attributes of a process and
its work products that is developed from historical process-performance
data, calibrated using process and product measures collected from the
project, and used to predict results to be achieved by following a process.
Kevin Schaaff, Robert Stoddard
Rusty Young, Dave Zubrow
2009 Carnegie Mellon University
[Diagram: project lifecycle from Proposal and Project Start through Project
Planning, Requirements Elicitation, Requirements Management, Software Design,
Integration Testing, Systems Testing, and Customer Acceptance Testing to
Project Finish, with Project Forecasting across the lifecycle.]
Interactive Question #1
Do you now feel comfortable with your knowledge of the
healthy ingredients of a CMMI Process Performance Model
that were just presented?
1) Yes
2) No
Technical Process of
Building PPMs
Topics
Review of CMMI Process Performance Models (PPMs)
Technical Process of Building PPMs
Questions
[Diagram contrasting Process-Oriented and Process-Agnostic models: each
relates x factors to y outcomes, with Process-Agnostic examples such as
$ buckets, % performance, and the Balanced Scorecard, and Process-Oriented
models tied to subordinate processes.]
Peer Review
Test
Requirements Development
Inspection
Integration Test
Requirements Analysis
[Diagram: defect injection and removal across the lifecycle. Defects are
injected during development phases including Design and Implementation,
and removed at Req. Review, Design Review, Code Review, Integration,
Transition to Customer, and after the product is Fielded.]
Outcomes related to key handoffs of work during the project
Outcomes related to interim milestones and mgt reviews
Outcomes related to project completion
Outcomes related to risks during the project execution
Examples of Outcomes
Injected defect volume by type
Escaped defects by phase*
Availability of resources*
Task schedule variance
Task delays
Task effort variance
Cost variance (CPI, SPI)
Earned Value metrics
Latent defect content of artifact*
Difficulty*
Rework*
Productivity*
Reqts Volatility*
Cost of Poor Quality
Satisfaction of Customer
Time to Market
Cycle time
Progress*
Warranty
"Ilities" such as Reliability
People attributes
Environmental factors
Technology factors
Tools (physical or software)
Process factors
Customers
Suppliers
Other Stakeholders
Can also be factors over which the project team truly has no
direct or indirect influence
Interactive Question #2
Of the steps presented so far, which step do you believe
would be the most challenging for your organization?
1) Business Goal formulation to drive PPMs
2) Identification of critical subprocesses supporting goals
3) Identify outcomes to predict with PPMs
4) Identify controllable factors to make predictions with
5) Identify uncontrollable factors to make predictions with
[Diagram: measurement lifecycle — (1) data collection: measurements
collected; (2) data analyzed, interpreted, & stored in a prototype data
repository; (3) decision-making supported by measurement reports.
(David Zubrow, October 2008)]
Documenting Measurement Objectives, Indicators, and Measures

Process steps: Establish Measurement Objectives — Specify Measures —
Specify Data Collection Procedures — Specify Analysis Procedures —
Collect Data — Analyze Data — Communicate Results

Indicator template fields:
Date
Indicator Name/Title
Objective
Questions
Visual Display
Perspective
Data Elements, Definitions
Data Collection: How, When/How Often, By Whom, Form(s)
Data Reporting: Responsibility for Reporting, By/To Whom, How Often
Data Storage: Where, How, Security
Algorithm, Input(s)
Assumptions, Interpretation
Analysis, Probing Questions
Evolution, Feedback Guidelines, X-reference
Typical Impacts (Operational, Tactical, Strategic)
More difficult to set strategy
More difficult to execute strategy
Contribute to issues of data ownership
Compromise ability to align the organization
Divert management attention
Double Duty
Effort data collection serves Accounting, not Project Management
Overtime is not tracked
Effort is tracked only to the highest level of the WBS
8) Types of Data
Increasing information content: Nominal < Ordinal < Interval < Ratio

Nominal — attribute (aka categorized or discrete) data.
Examples: defect types, labor types, languages

Ordinal — attribute data.
Examples: severity levels, survey choices 1-5, experience categories

Interval — (aka variables data) continuous.

Ratio — continuous.
Examples: defect densities, labor rates, productivity, variance %s,
code size SLOC
Choosing an analysis technique by data type (outcome Y vs. factor X):
Y continuous, X continuous: Correlation & Linear Regression
Y continuous, X discrete: ANOVA and Dummy Variable Regression
Y discrete, X continuous: Logistic Regression
Y discrete, X discrete: Chi-Square, Logit & Logistic Regression
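For the case where both the outcome and the factor are discrete, a chi-square test of independence applies. A minimal sketch in Python (the 2x2 counts below are hypothetical, not from the tutorial):

```python
# Hypothetical 2x2 table: rows = inspection used (yes/no),
# columns = escaped-defect outcome (low/high)
observed = [[30, 10],
            [15, 25]]

row_totals = [sum(r) for r in observed]
col_totals = [sum(c) for c in zip(*observed)]
grand = sum(row_totals)

# Chi-square statistic: sum over cells of (O - E)^2 / E,
# where E is the count expected under independence
chi_sq = 0.0
for i, row in enumerate(observed):
    for j, o in enumerate(row):
        e = row_totals[i] * col_totals[j] / grand
        chi_sq += (o - e) ** 2 / e

# chi_sq is about 11.43, well above the 3.84 critical value
# at alpha = 0.05 with 1 degree of freedom, so inspection use
# and escaped-defect outcome appear related
```

A statistics package would report the p-value directly; the arithmetic above is all the test itself requires.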
[Diagram: input distributions for cells A and B feed a Monte Carlo
simulation of C. Crystal Ball causes Excel to recalculate all cells, and
then it saves off the different results for C! Crystal Ball then allows
the user to analyze and interpret the final distribution of C!]
Durations

Step  Best  Expected  Worst
1     27    30        75
2     45    50        125
3     72    80        200
4     45    50        125
5     81    90        225
6     23    25        63
7     32    35        88
8     41    45        113
9     63    70        175
10    23    25        63

Total expected duration: 500
The project is almost guaranteed to miss the 500-day duration.
With 90% confidence, the project will finish in under 817 days.
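A sketch of this kind of schedule simulation in Python, drawing each step from a triangular (best/expected/worst) distribution. The step values come from the durations table; the triangular-distribution choice and trial count are assumptions, so the exact percentiles will differ from a Crystal Ball run, but the shape of the conclusion is the same:

```python
import random

random.seed(1)  # fixed seed so the run is repeatable

# (best, expected, worst) durations for the 10 steps
steps = [(27, 30, 75), (45, 50, 125), (72, 80, 200), (45, 50, 125),
         (81, 90, 225), (23, 25, 63), (32, 35, 88), (41, 45, 113),
         (63, 70, 175), (23, 25, 63)]

# One trial = sum of one draw per step; random.triangular(low, high, mode)
trials = sorted(
    sum(random.triangular(best, worst, expected)
        for best, expected, worst in steps)
    for _ in range(10_000)
)

# Chance of finishing within the 500-day expected total (nearly zero)
prob_under_500 = sum(t <= 500 for t in trials) / len(trials)

# Duration the project stays under with ~90% confidence
p90 = trials[int(0.9 * len(trials))]
```

Because every step's worst case is far above its expected case, the sum of expected values (500) sits deep in the left tail of the total-duration distribution, which is why the 500-day plan is almost never met.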
Interactive Question #3
Based on what you now know, which analytical modeling
technique do you believe would be most practical and useful
in your organization?
1) Statistical regression equations
2) Monte Carlo simulation
3) Probabilistic Modeling
4) Discrete Event Simulation
5) Other
[Flowchart — Note: this is not meant to be an implementation flowchart.
Until the end of the project, repeatedly check whether the project is on
track, whether suppliers are on track, and whether interim objectives are
met; when objectives cannot be met, renegotiate objectives or, failing
that, terminate the project.]
Interactive Question #4
Based on the information you saw in this mini-tutorial, would
you be interested in attending a full day tutorial on this
subject?
1) Yes
2) No
Questions?
Contact Information

Kevin Schaaff
Email: kschaaff@sei.cmu.edu

Robert W. Stoddard
Email: rws@sei.cmu.edu

Rawdon Young
Email: rry@sei.cmu.edu

Dave Zubrow
Email: dz@sei.cmu.edu

Customer Relations
Email: customerrelations@sei.cmu.edu
U.S. mail:
Software Engineering Institute
Customer Relations
4500 Fifth Avenue
Pittsburgh, PA 15213-2612
USA
Telephone: +1 412-268-5800
SEI Fax: +1 412-268-6257
Backup Slides
Only phases or lifecycles are modeled
Only uncontrollable factors are modeled
No uncertainty or variation modeled
Only final outcomes are modeled
SP 2.2
When a subprocess is initially executed, suitable data for
establishing trial natural bounds are sometimes available from prior
instances of the subprocess or comparable subprocesses, process-performance
baselines, or PPMs
SP 1.2
PPMs can provide a basis for analyzing possible effects of changes
to process elements
Environmental factor examples:
Degree of noise or distractions
External interferences including other organizations
Temperature
Ergonomics
Access or proximity to management and other stakeholders
Degree of Security Classification
Other Visual or Audio Distractions
Technology factor examples:
Newness of Technology
Mature tools
Degree technology is proven
Process factor examples:
Quality of artifacts (input to or output from a work task)
Timeliness of artifacts
Complexity of artifacts
Readability of artifacts
Task interdependence
Resource contention between tasks
Difficulty of a work task
Number of people involved with a work task
Degree of job aids, templates, instructions
Measures of bureaucracy
Any of the criteria for good reqts statements
Any of the criteria for good designs
Choices of subprocesses
Modifications to how work tasks are performed
Peer Review measures
Code measures
Test Coverage measures (static and dynamic)
Identifying Outliers
The interquartile range provides a quantitative method for
identifying possible outliers in a data set.
Procedure
1. Determine the 1st and 3rd quartiles of the data set: Q1, Q3
2. Calculate the difference: the interquartile range, IQR = Q3 - Q1
3. Lower outlier boundary = Q1 - 1.5*IQR
4. Upper outlier boundary = Q3 + 1.5*IQR
Worked example (data sorted in descending order):
333, 50, 40, 30, 27, 25, 22, 20, 18, 16, 16, 13, 4
Q3 = 30, Q1 = 16, so IQR = 14
Upper outlier boundary: 30 + 1.5*14 = 51
Lower outlier boundary: 16 - 1.5*14 = -5
The value 333 exceeds the upper boundary and is a possible outlier.
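The same procedure in Python, using the "inclusive" quartile method, which reproduces the Q1 = 16 and Q3 = 30 values shown:

```python
import statistics

data = [333, 50, 40, 30, 27, 25, 22, 20, 18, 16, 16, 13, 4]

# Quartiles of the data set (the 'inclusive' method interpolates
# over the sorted data, matching the hand calculation above)
q1, _median, q3 = statistics.quantiles(data, n=4, method='inclusive')
iqr = q3 - q1               # 30 - 16 = 14

lower = q1 - 1.5 * iqr      # 16 - 21 = -5
upper = q3 + 1.5 * iqr      # 30 + 21 = 51

# Values outside the boundaries are possible outliers
outliers = [x for x in data if x < lower or x > upper]  # [333]
```

Note that different quartile conventions (the `method` argument) can shift the boundaries slightly for small data sets.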
Programmatic Aspects
of Building PPMs
Topics
Review of CMMI Process Performance Models (PPMs)
Technical Process of Building PPMs
Programmatic Aspects of Building PPMs
Questions
Process knowledge
Build team needs to understand the process
Build team needs to understand the context in which the PPMs will
be used
Process simulation
Modeling process activities with a computer
Discrete event process simulation
Probabilistic networks
Using the laws of probability instead of statistics to predict outcomes
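A minimal sketch of the probabilistic-network idea (all probabilities below are hypothetical, and a real network would have many linked nodes rather than this single parent-child pair): one conditional-probability link, combined with the law of total probability and Bayes' rule.

```python
# Hypothetical conditional-probability table (not from the tutorial):
# parent node = review quality, child node = escaped-defect level
p_good_review = 0.7              # P(review quality = good)
p_high_defects = {               # P(high escaped defects | review quality)
    'good': 0.1,
    'poor': 0.5,
}

# Marginalize over the parent node: law of total probability
p_high = (p_good_review * p_high_defects['good']
          + (1 - p_good_review) * p_high_defects['poor'])

# Bayes' rule: given that high escaped defects were observed,
# the probability that review quality was poor
p_poor_given_high = (1 - p_good_review) * p_high_defects['poor'] / p_high
```

This is prediction by propagating probabilities through known relationships, rather than by fitting a statistical model to sampled data.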
Topics
Review of CMMI Process Performance Models (PPMs)
Technical Process of Building PPMs
Programmatic Aspects of Building PPMs
Questions
Topics
Review of CMMI Process Performance Models (PPMs)
Technical Process of Building PPMs
Programmatic Aspects of Building PPMs
Questions
Topics
Review of CMMI Process Performance Models (PPMs)
Technical Process of Building PPMs
Programmatic Aspects of Building PPMs
Questions
[Diagram: PPM prediction interval (----) compared against spec limits]
Flags
Apparent p-value chasing
Inordinately high R-squared
Awareness
When unstable data are used
Source of data
Process changes
Process drift
New platforms, domains, etc.
Voice of the Customer
Changing business needs