
SOFTWARE

PROJECT
MANAGEMENT
OBJECTIVES
Metrics in the Process and Project Domains
• Process Metrics and Software Process Improvement
• Project Metrics
Software Measurement
• Size-Oriented Metrics
• Function-Oriented Metrics
• Object-Oriented Metrics
• Use-Case–Oriented Metrics
Metrics for Software Quality
• Measuring Quality
• Defect Removal Efficiency
Integrating Metrics within the Software Process
• Arguments for Software Metrics
• Establishing a Baseline
• Metrics Collection, Computation, and Evaluation
Metrics for Small Organizations


MEASUREMENT
• Measurement can be applied to the software process
with the intent of improving it on a continuous basis
• Measurement can be used throughout a software
project to assist in estimation, quality control,
productivity assessment, and strategic decision making
as a project proceeds



METRICS IN THE PROCESS AND PROJECT
DOMAINS



• Process Metrics: collected across all projects over long
periods of time to provide a set of process indicators that
lead to long-term process improvement

• Project Metrics: enable a software project manager to:
• assess the status of an ongoing project and adjust
work flow or tasks
• track potential risks & uncover problem areas
• evaluate the project team’s ability to control quality



PROCESS METRICS & SOFTWARE PROCESS IMPROVEMENT

• Process is at the center of a triangle connecting three
factors that have a deep influence on software quality
and organizational performance:
• People: skill & motivation, ease of communication
and collaboration, etc.
• Product: complexity, deadlines, business rules, etc.
• Technology (methods & tools): packages/suites of
software tools, etc.
PROCESS METRICS & SOFTWARE PROCESS IMPROVEMENT

• Process metrics can be derived from:


• Process Outcomes e.g. errors uncovered before
release of the software, defects delivered to and
reported by end users, work products delivered
(productivity), human effort and calendar time spent
etc.
• Measuring characteristics of software engineering
tasks e.g. measure the effort and time spent
performing the umbrella activities and the generic
software engineering activities



PROCESS METRICS & SOFTWARE PROCESS IMPROVEMENT

Software Metrics Etiquette


1. Use common sense when interpreting metrics data
2. Provide regular feedback to the individuals and teams
who collect measures and metrics
3. Don’t use metrics to appraise individuals
4. Work with practitioners and teams to set clear goals
and metrics that will be used to achieve them
5. Never use metrics to threaten individuals or teams
6. Metrics data that indicate a problem area should not
be considered “negative.” These data are merely an
indicator for process improvement
7. Don’t obsess on a single metric to the exclusion of other
important metrics



PROJECT METRICS
• Project metrics and the indicators derived from them are
used by a project manager and a software team to
adapt project workflow and technical activities
• Metrics of past projects are used to make effort and time
estimates for current software work
• As a project proceeds:
• Effort and calendar time expended are compared to
original estimates
• Production rates (models created, review hours,
function points, LOC delivered) are measured
• Errors are tracked



PROJECT METRICS
• The intent of project metrics is twofold:
1. to minimize the development schedule by making
adjustments that avoid delays and mitigate problems
and risks
2. to assess product quality on an ongoing basis and
modify the technical approach to improve quality

Quality improves → defects are minimized → amount of
rework is reduced → overall project cost is reduced



SOFTWARE MEASUREMENT



• Measurements in the physical world can be categorized
in two ways:
1. direct measures (e.g., the length of a bolt)
2. indirect measures (e.g., the “quality” of bolts
produced, measured by counting rejects)

• Software metrics can be categorized similarly


1. Direct measures:
• of the software process (cost and effort applied)
• of the product (LOC produced, execution speed,
memory size, and defects reported over some set
period of time)
2. Indirect measures: quality, complexity, efficiency,
reliability, maintainability etc.



SIZE-ORIENTED METRICS

• Derived by normalizing quality and/or productivity
measures by the size of the software produced, typically
lines of code (e.g., errors per KLOC, defects per KLOC,
cost per KLOC)
• Two projects can be compared based on such size-
based attributes
• Issue: most of the controversy swirls around the use of
lines of code as a key measure, because LOC counts are
programming-language dependent


FUNCTION-ORIENTED METRICS
• Use a measure of the functionality delivered by the
application
• Most widely used function-oriented metric is the function
point (FP) - based on characteristics of information
domain and complexity
• Issue: reliability of FP counts i.e. whether two individuals
performing a FP count for the same system would
generate the same result



OBJECT-ORIENTED METRICS
Number of scenario scripts
• Scenario script: a detailed sequence of steps that
describe the interaction between the user and the
application
• Each script is organized into triplets of the form
{initiator, action, participant}
initiator → the object that requests some service
action → the result of the request
participant → the server object that satisfies the request
• The number of scenario scripts is directly correlated to
the size of the application and to the number of test
cases



OBJECT-ORIENTED METRICS
Number of key classes
• Key classes: highly independent components that are
defined early in object-oriented analysis
• Key classes are central to the problem domain
• Number of key classes is an indication of the amount of
effort required to develop the software and also an
indication of the potential amount of reuse to be
applied



OBJECT-ORIENTED METRICS
Number of support classes
• Support classes: required to implement the system but
are not immediately related to the problem domain e.g.
GUI classes



OBJECT-ORIENTED METRICS
Number of subsystems
• Subsystem: a combination of classes that support a
function that is visible to the end user of a system
• Once subsystems are identified, a schedule can be
formulated and work can be partitioned among project
staff



USE-CASE–ORIENTED METRICS
• Use case: a method for describing customer-level or
business domain requirements of software features and
functions
• Defined early in the software process
• Number of use cases is directly proportional to the size of
the application and to the number of test cases to be
designed
• No standard size for a use case: use cases can be
created at different levels of abstraction
• Use-case points (UCPs): a mechanism for estimating
project effort and other characteristics (analogous to FP);
computed as a function of the number of actors and
transactions in the use cases
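A rough sketch of an unadjusted use-case-point count, assuming the commonly cited Karner-style complexity weights (the slide itself only ties UCPs to actors and transactions; the example system is hypothetical):

```python
# Unadjusted use-case points (UUCP) = weighted actors + weighted use cases.
# Weights follow the commonly cited Karner scheme (an assumption here);
# use-case complexity is typically judged by its number of transactions.
ACTOR_WEIGHT = {"simple": 1, "average": 2, "complex": 3}
USE_CASE_WEIGHT = {"simple": 5, "average": 10, "complex": 15}

def uucp(actors, use_cases):
    """actors, use_cases: lists of complexity labels."""
    return (sum(ACTOR_WEIGHT[a] for a in actors)
            + sum(USE_CASE_WEIGHT[u] for u in use_cases))

# Hypothetical system: 2 simple actors, 1 complex actor,
# 3 average use cases, and 1 complex use case.
print(uucp(["simple", "simple", "complex"],
           ["average", "average", "average", "complex"]))  # → 50
```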



METRICS FOR SOFTWARE QUALITY



Quality of a system, application, or product is only as good
as:
1. the requirements that describe the problem
2. the design that models the solution
3. the code that leads to an executable program
4. the tests that exercise the software to uncover errors



MEASURING QUALITY
1. Correctness: the degree to which the software performs
its required function
– defect: a verified lack of conformance to requirements
– the most common measure for correctness is defects
per KLOC
2. Maintainability: the ease with which a program can be
corrected if an error is encountered, adapted if its
environment changes, or enhanced if the customer
desires a change in requirements
– there is no way to measure maintainability directly
– mean-time-to-change (MTTC) metric: the time it takes
to analyze the change request, design, implement, and
test the change, and distribute it to all users
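A minimal sketch of MTTC as the average elapsed time per change (the per-change durations are hypothetical):

```python
# Mean-time-to-change (MTTC): average elapsed time from analyzing a
# change request through distributing the tested change to all users.
# The per-change durations below (in days) are hypothetical.
change_durations_days = [4.0, 7.5, 3.0, 9.5, 6.0]
mttc = sum(change_durations_days) / len(change_durations_days)
print(mttc)  # → 6.0
```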



MEASURING QUALITY
3. Integrity: a system’s ability to withstand attacks (both
accidental and intentional) on its security
– attacks can be made on all three components of
software: programs, data, and documentation
– to measure integrity:
integrity = Σ [1 − (threat × (1 − security))]
summed over the relevant attack types, where
threat = the probability that an attack of a specific
type will occur within a given time
security = the probability that an attack of a specific
type will be repelled
4. Usability: an attempt to quantify ease of use
– if a program is not easy to use, it often fails, even if
the functions that it performs are valuable
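The integrity formula can be evaluated directly; the threat and security probabilities below are hypothetical:

```python
# integrity = Σ [1 - (threat × (1 - security))], summed over attack types.
# threat   = probability an attack of that type occurs in a given time
# security = probability an attack of that type is repelled

def integrity(threat_profile):
    """threat_profile: iterable of (threat, security) probability pairs."""
    return sum(1 - t * (1 - s) for t, s in threat_profile)

# Single hypothetical attack type: threat 0.25, security 0.95,
# so integrity = 1 - 0.25 × 0.05 = 0.9875 (high resistance).
print(integrity([(0.25, 0.95)]))  # → 0.9875
```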



DEFECT REMOVAL EFFICIENCY (DRE)
• A measure of the filtering ability of quality assurance and
control actions as they are applied throughout all
process framework activities
• Provides benefit at both the project and process level
DRE = E/(E + D)
E = number of errors found before delivery of the
software to the end user
D = number of defects found after delivery
• The ideal value for DRE is 1, i.e., no defects are found in
the software after delivery
• DRE encourages a software team to establish techniques
for finding as many errors as possible before delivery
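A one-function sketch of DRE; the error and defect counts are hypothetical:

```python
def dre(errors_before_delivery: int, defects_after_delivery: int) -> float:
    """Defect removal efficiency: DRE = E / (E + D)."""
    e, d = errors_before_delivery, defects_after_delivery
    return e / (e + d)

# Hypothetical project: 380 errors found before delivery,
# 20 defects reported by end users afterward.
print(dre(380, 20))  # → 0.95 (95% of defects filtered out before delivery)
```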



DEFECT REMOVAL EFFICIENCY (DRE)
• DRE can also be used within the project to assess a
team’s ability to find errors before they are passed to the
next framework activity
• Example: requirements analysis produces a requirements
model that is reviewed to find and correct errors; errors
not found during the review are passed on to design

DRE_i = E_i / (E_i + E_(i+1))

E_i = number of errors found during software engineering
action i
E_(i+1) = number of defects found during action i + 1 that
are traceable to errors that were not discovered in
software engineering action i
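The per-activity variant can be sketched the same way (counts are hypothetical):

```python
def dre_step(e_i: int, e_next: int) -> float:
    """Per-activity DRE: E_i / (E_i + E_(i+1)), where e_next counts
    defects that leaked from activity i into activity i + 1."""
    return e_i / (e_i + e_next)

# Hypothetical leakage between framework activities: the requirements
# review found 24 errors; design later uncovered 6 defects traceable
# to requirements errors missed in the review.
print(dre_step(24, 6))  # → 0.8 (the review filtered 80% of those errors)
```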



INTEGRATING METRICS WITHIN
THE SOFTWARE PROCESS



ARGUMENTS FOR SOFTWARE METRICS
Why is it so important to measure the process of software
engineering and the product (software) that it produces?
• If you do not measure, there is no real way of determining
whether you are improving; and if you are not improving,
you are lost
• Collection of quality metrics enables an organization to
tune its software process to remove the causes of
defects that have the greatest impact on software
development



ESTABLISHING A BASELINE
Attributes of a baseline:
1. data must be reasonably accurate—estimates about
past projects are to be avoided
2. data should be collected for as many projects as
possible
3. measures must be consistent
4. applications should be similar to work that is to be
estimated



METRICS COLLECTION, COMPUTATION, & EVALUATION
1. Collection: historical investigation of past projects
2. Computation: metrics must be evaluated and applied
during estimation, technical work, project control, and
process improvement
3. Evaluation: focuses on the underlying reasons for the
results obtained and produces a set of indicators that
guide the project or process


METRICS FOR SMALL ORGANIZATIONS



• Time (hours or days) elapsed from the time a request is
made until evaluation is complete, t_queue
• Effort (person-hours) to perform the evaluation, W_eval
• Time (hours or days) elapsed from completion of the
evaluation to assignment of a change order to personnel,
t_eval
• Effort (person-hours) required to make the change,
W_change
• Time (hours or days) required to make the change,
t_change
• Errors uncovered during work to make the change,
E_change
• Defects uncovered after the change is released to the
customer base, D_change



Once these measures have been collected for a number
of change requests, it is possible to compute:
1. total elapsed time from change request to
implementation of the change
2. percentage of elapsed time absorbed by initial
queuing, evaluation and change assignment, and
change implementation
3. defect removal efficiency:
DRE = E_change / (E_change + D_change)
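These three computations can be sketched in Python; the change-request records below are hypothetical (times in hours, effort measures as counts):

```python
# Derived metrics from change-request measures (names follow the slides;
# all record values are hypothetical).
requests = [
    {"t_queue": 6, "t_eval": 4, "t_change": 20, "E_change": 3, "D_change": 1},
    {"t_queue": 2, "t_eval": 3, "t_change": 10, "E_change": 2, "D_change": 0},
]

# 1. total elapsed time from change request to implemented change
totals = [r["t_queue"] + r["t_eval"] + r["t_change"] for r in requests]

# 2. percentage of elapsed time absorbed by initial queuing (first request)
first = requests[0]
pct_queue = 100 * first["t_queue"] / totals[0]

# 3. defect removal efficiency across all change requests
e = sum(r["E_change"] for r in requests)
d = sum(r["D_change"] for r in requests)
dre = e / (e + d)

print(totals, round(pct_queue), round(dre, 2))  # → [30, 15] 20 0.83
```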



ACTIVITY
• Explain the following quote in your own words:
“Software metrics let you know when to laugh and when
to cry.”

• Explain the following quote in your own words & clarify
how it is linked to software metrics:
“Not everything that can be counted counts, and not
everything that counts can be counted.”

