Software Engineering

Part I
Introduction

1 Software Engineering
2 Software Engineering Activities
3 Software Project Artifacts
4 Software Project Quality

Software Project Artifacts
Thus, a software project produces:
- Requirements specification.
- Software architecture documentation.
- Design documentation.
- Source code.
- Test procedures, cases, ....
All under configuration management.

Software Project Quality
- Inspections.
- Formal methods.
- Testing.
Project control techniques:
- Predict cost.
- Manage risks.
- Control artifacts (configuration management).

Part II
The Software Engineering Process
A typical roadmap
1. Understand nature and scope of the product.
2. Select process and create plan(s).
3. Gather requirements.
4. Design and build the product.
5. Test the product.
6. Deliver and maintain the product.
Perspectives on software engineering
1. Structured programming
2. Object-oriented programming
3. Reuse and components
4. Formal methods

Object-oriented programming
- Encapsulates data in ADTs.
- Correspondence with "real" application objects.
- Easier to understand, evolve.
- Design patterns can be used to describe reusable design solutions.

Reuse and components
- Compare with other engineering disciplines (e.g. car models).
- Reuse should be aimed for from the start:
  - design modular systems with future reuse in mind;
  - knowledge of what is available for reuse.
- See also: components (JavaBeans, COM: reuse binaries) and frameworks.
Formal methods
- Compare with other engineering disciplines that have a solid supporting base in mathematics.
- Formal specifications: use (first-order) logic ⇒ unambiguous, can be formally studied (e.g. for consistency).
- Formal transformations: from specifications over design to code ⇒ the code is guaranteed to be equivalent with the specifications.
Process alternatives
1. The waterfall process model
2. The spiral model
3. The incremental process model
4. Trade-offs

The waterfall process model
1. Requirements analysis:
   - Concept analysis: overall definition of the application philosophy.
2. Design produces diagrams & text:
   - Architectural design.
   - Object-oriented analysis: determine key classes.
   - Detailed design.
3. Implementation produces code & comments.
4. Test produces test reports and defect descriptions:
   - Unit testing.
   - Integration testing.
   - Acceptance testing.
Part II outline:
5 A typical roadmap
6 Perspectives on software engineering
7 Key expectations (Humphrey)
8 Process alternatives
9 Documentation and Configuration Management
10 Quality
11 Capability assessment

Documentation and Configuration Management
1. Introduction
2. Documentation Standards
3. An Approach
4. Document management
5. Configuration Management
Introduction
- Usual rules about documenting code apply.
- In addition, context should be documented:
  - Relationship of code/class design to requirements (e.g. "Implements requirement 1.2").
- A project is the whole set of coordinated, well-engineered artifacts, including the documentation suite, the test results and the code.
Documentation Standards
- Standards improve communication among engineers.
- To be effective, standards must be perceived by engineers as being helpful to them
  ⇒ let the development team decide on the standards to apply.
  + Motivation.
  − Groups with different standards in the organization make process comparison & improvement (CMM) more difficult.
- ⇒ Standards should be simple and clear.
An Approach
Document management requires:
- Completeness (e.g. the IEEE document set).
- Consistency:
  - Single-source documentation: specify each entity in only one place (as much as possible, e.g. the user manual).
  - Use hyperlinks, if possible, to refer to entities.
- Configuration (coordination of versions).
Configuration Management
- The SCMP specifies how to deal with changes to documents, e.g. "to change the API of a module, all clients must be asked for approval by email".
- Use a system to keep track of configuration items and valid combinations thereof.
- See CVS: check-in, check-out, tagging procedures.
- Example SCMP: page 63 ... in the book.
Quality
1. Quality attributes
2. Quality metrics
3. Quality assurance
4. Inspections
5. Verification and Validation

Quality assurance
- Reviews: SCMP, process, SPMP.
- Inspections: requirements, design, code, ....
- Testing:
  - Black box.
  - White (glass) box.
  - Grey box.
- Ideally, performed by an external organization.
Inspections
Inspecting requirements. Example:
  faulty: "If the temperature is within 5.02% of the maximum allowable limit, as defined by standard 67892, then the motor is to be shut down."
  correct: "If the temperature is within 5.02% of the maximum allowable limit, as defined by standard 67892, then the motor is to be powered down."
  ! "shut down" ≠ "power down"
  ! Very expensive to find and fix after implementation.

Organizing inspections:
- Build inspections into the schedule (time for preparation, meeting).
- Prepare for collection of inspection data:
  - Number of defects/KLOC, time spent.
  - Form, e.g. with description, severity.
  - Who keeps inspection data, usage of ....
- Assign roles, e.g. author, moderator/recorder, reader or, minimally, author/inspector.
- Ensure that each participant prepares: bring filled defect forms to the meeting.
Verification and Validation
- Validation: are we building the right product? Test the product.
- Verification: are we building the product right (process)?
  - Do the requirements express what the customer wants? (inspect requirements, ...)
  - Does the code implement the requirements? (inspection)
  - Do the tests cover the application? (inspect the STD)
Capability assessment
- Team Software Process (TSP).
- Capability Maturity Model (CMM).
(Associated metrics include e.g. cost and defect density.)
Teams
1. Select a team leader: ensures all project aspects are active, fills any gaps.
2. Designate leader roles and responsibilities:
   - team leader (SPMP)
   - configuration management leader (SCMP)
   - quality assurance leader (SQAP, STD)
   - requirements management leader (SRS)
   - design leader (SDD)
   - implementation leader (code base)
3. Leader responsibilities:
   - propose strawman artifact (e.g. SRS, design)
   - seek team enhancement and acceptance
   - ensure the designated artifact is maintained and observed
   - maintain corresponding metrics, if any
4. Designate a backup for each leader, e.g. the team leader backs up the implementation leader, the CM leader backs up the team leader, etc.
13 Teams
14 Risk Management
15 Choosing tools and support
16 Planning
17 Feasibility Analysis
18 Cost Estimation

Risk Management
A risk is something which may occur in the course of a project and which, under the worst outcome, would affect it negatively and significantly.
There are two types of risks:
- Risks that can be avoided or worked around (retired), e.g. "project leader leaves"; retire by designating a backup person.
- Risks that cannot be avoided.
Planning
1. External milestones (e.g. delivery).
2. Internal milestones to back up external ones (e.g. start testing).
3. Show the first iteration, establishing a minimal capability (exercising the process itself).
4. Show tasks for risk identification & retirement.
5. Leave some unassigned time.
6. Assign manpower (book p. 94).
The schedule will become more detailed as the project progresses and the schedule is revisited.
(Example schedule chart omitted: milestones REQUIREMENTS FROZEN, BEGIN TESTING, DELIVER; tasks for the SPMP, SCMP, SQAP and risk management; assumes 2 iterations.)
Feasibility Analysis
Analysis based on the means (costs) and benefits of a proposed project. These are computed over the lifetime (incl. the development) of the system.
Estimate:
- Lifetime (e.g. 5 yrs).
- Costs: development (see below), maintenance, operation.
- Benefits: savings, productivity gains, increased production, etc.
Example (development cost 5000 in year 0, total benefits 12500):

j (year)   benefit F_j   F_j / (1+i)^j
1          2500          2234
2          2500          1993
3          2500          1779
4          2500          1589
5          2500          1419

The present value of the benefits is

    P = Σ_{j=1}^{n} F_j · 1/(1+i)^j

where F_j is the benefit in year j and i is the discount rate.
⇒ To find the break-even discount rate, compute the solution of

    Σ_{j=0}^{n} F_j X^j = 0

where X = 1/(1+i).
Cost Estimation
- An estimate is needed even before architecture and design! (Compare: price per m³ in the building industry.)
- Cost estimates can be refined later in the project.
- Based on a KLOC estimate.
- The KLOC estimate is based on experience, an outline architecture, ..., or on function points estimation (see book p. 97-104).
- KLOC → cost using COCOMO.
Intermediate COCOMO
Based on PM and TDEV from the basic model; in the intermediate model the basic PM estimate is multiplied by factors depending on:
- Product attributes: reliability, database size, complexity.
- Computer attributes: resource constraints, stability of the hard-/software environment.
- Personnel attributes: experience with the application, programming language, etc.
- Project attributes: use of software tools, project schedule.
The model can be calibrated, based on experience and local factors.
Work packages, schedule and budget
1. Work packages (sketchy before the architecture is established).
2. Dependencies.
3. Resource requirements (estimates).
4. Budget and resource allocation (person-days, money for S&HW).
5. Schedule.
See the example SPMP on p. 123-134.
A table per phase (example on p. 113) contains actual data and company norms (or goals). The example below concerns requirements, 200 of which have been established in 22 hrs., a productivity of 9.1 requirements/hr.
Since we found 6/100 defects by inspection, vs. the norm of 4/100, we can predict that the defect rate will follow the same pattern, and thus there will be 6/4 × r defects in the requirements, where r is the historic defect rate.

Defect injection/detection matrix (numbers between parentheses are planned, others are actual results):

Detection phase \ Injected in:  requirements | detailed design | implementation
detailed requirements           2 (5)        |                 |
design                          0.5 (1.5)    | 3 (1)           |
implementation                  0.1 (0.3)    | 1 (3)           | 2 (2)
21 Introduction
22 Expressing Requirements
23 Rapid Prototyping
24 Detailed Requirements
25 Desired Properties of D-requirements
26 Organizing D-requirements
27 Metrics for Requirements

Expressing Requirements
- Conceptual model.
- Use cases.
- Data Flow Diagrams.
- State transition diagrams.
- Class diagram or EAR diagram (for data).
- GUI mock screen shots (p.m.)
Choosing a notation:
- If the requirement is simple and stands alone, express it in clear sentences within an appropriate section of the SRS.
- If the requirement is an interaction involving the application, express it via a use case.
- If the requirement involves process elements taking input and producing output, use a DFD.
- If the requirement involves states that (part of) the application can be in, use state transition diagrams.
Detailed Requirements
- Functional requirements.
- Nonfunctional requirements:
  - Performance: time, space (RAM, disk), traffic rate.
  - Reliability and availability.
  - Error handling.
  - Interface requirements.
  - Constraints (tool, language, precision, design, standards, HW).
- Inverse requirements.

Example functional requirements:
2.3 Each submission has a status consisting of a set of reports submitted by PC members (see section 1) and a summarizing value which is one of ....
2.4 An author is able to view the status of her submissions via the website, after proper authorization, see section 3.
Example performance requirements:
4.1 Excluding network delays, the public web site component of the system will generate an answer to each request within a second, provided the overall load on the system does not exceed 1.4.
4.2 The size of the executable for the main application program (middle tier) will not exceed 6MB.
4.3 The size of the database will not exceed n × 2000 bytes, excluding the size of the submissions and PC reports, where n is the number of submissions.

Example reliability requirement:
7.1 The system shall experience no more than 1 level one fault per month.

Example availability requirement:
7.2 The system shall be available at all times, on either the primary or the backup computer.
Example error handling requirements:
7.3 The cgi program will always return a page. If the middle tier application does not return any data within the time frame mentioned in section 1, the cgi program will generate a page containing a brief description of the fault and the email address of the support organization.
7.3 There will be a maximum size, selectable by the administrator, of any page sent by the cgi program. If the system attempts to exceed this size, an appropriate error message will be sent instead.

Example interface requirement:
5.1 The protocol of the application server has the following syntax:
      request : N \n [ key = value \n\n ]*
      reply   : N \n [ key = value \n\n ]*
      name    : any string not containing \n or "="
      value   : any string not containing the sequence "\n\n"
    where N indicates the number of (key,value) pairs in the message.
Example constraint requirements:
9.1 The system will use the mysql database management system.
9.2 The target system is any linux system with a kernel v.2.4 or higher.
9.3 The cgi program will generate only html according to the W3C standard v.2. No frames, style sheets, or images will be used, making it usable for text-only browsers.

Inverse requirements: what the system will not do.
10.1 The system will not provide facilities for backup. This is the responsibility of other programs.
Desired Properties of D-requirements
- Traceability.
- Testability and nonambiguity.
- Priority.
- Completeness.
- Consistency.
Traceability
- Backward to C-requirements.
- Forward to design (module, class), code ((member) function) and tests.
  Example: Req. 2.3 → class SubmissionStatus → Test 35.
- Also for nonfunctional requirements: e.g. a performance requirement probably maps to only a few modules (90/10 rule).
  Example: req. 7.3 may map to a special subclass of ostream which limits output, etc.

Testability and nonambiguity
- "The system will generate html pages" is ambiguous ⇒ specify the exact standard or html-subset.
- "The system will have a user-friendly interface" is not testable ⇒ specify e.g. the time to learn for a certain category of users.
Organizing D-requirements: track each requirement, with its priority and status, in a table.

RID  Priority  Responsible  Inspection  Status  Test
1.2  E         DV           OK          50%     -
...  ...       ...          ...         ...     ...

Updating the project after detailed requirements:
- More risks, some risks retired.
- More detailed cost estimate.
- More detailed schedule, milestones.
- Designate architects.
Part V
Design

28 Design Steps
29 UML
30 The Domain Model
31 Architectural Design
32 Design Patterns
33 Detailed Design

Design Steps
1. Build the domain model.
2. Select an architecture.
3. Detailed design.
4. Inspect and update the project.
UML
Sequence diagram: often an elaboration of a use case.
(Diagram omitted: phone-call example with messages lift_receiver, start_dial_tone, dial_number, stop_dial_tone, dial_numbers, ring_phone, start_ring_tone, answer, stop_ring_tone, connect, hangup, disconnect.)
The Domain Model
Verify w.r.t. the requirements:
- Are all concepts represented?
- Are the use cases supported?
- Are the dynamic models correct?
Design Patterns
See the book, and E. Gamma, R. Helm, R. Johnson, J. Vlissides, "Design Patterns: Elements of Reusable Object-Oriented Software", Addison-Wesley, 1995.
Detailed Design
- Add support classes and member functions for:
  - Requirements.
  - Data storage.
  - Control.
  - Architecture.
- Determine algorithms.
- Add an invariant description to each class, where appropriate.
- Add pre-/postconditions to each non-trivial member function.
Part VI
Implementation

34 Preparation and Execution
35 Hints
36 Quality
37 Personal Software Documentation
38 Updating The Project

Preparation and Execution
A unit is the smallest part of the implementation that is separately maintained (and tested): typically a class or (member) function.
For each unit:
- Plan structure and residual design; fill in pre- and postconditions.
- Self-inspect the residual design.
- Write code and unit test programs/functions.
- Inspect.
- Compile & link.
- Apply unit tests (autotools: make check)
- ... and collect process metrics and update the SQAP and SCMP.

Metrics to collect:
- Time spent: residual detailed design (extra members, ...), coding, self-inspection, unit testing, review, repair.
- Defects:
  - Severity: major (requirements unsatisfied), trivial, other.
  - Type (see quality).
  - Source: requirements, design, implementation.
Hints
- Try reuse first, e.g. use the STL instead of your own container implementation.
- Enforce intentions: prevent unintended use (better: use that can lead to invariant violation).
  - Strongly typed parameters: e.g. use const, or a reference parameter instead of a pointer if null is not allowed.
  - Define things as locally as possible.
  - Make members as private as possible.
- Use patterns such as singleton if appropriate.
- Include example usage in documentation (e.g. \example in doxygen).
- Always initialize data members and variables.
Quality: inspection checklists.

Classes overall:
C1 Appropriate name? Consistent with requirements, design? Sufficiently general/specialized?
C2 Could it be an abstract class?
C3 Header comment describing purpose?
C4 Header references requirements or design element(s)?
C5 As private as can be? (e.g. nested)
C6 Operators allowed? (gang of three)
C7 Documentation standards applied?

Class data members:
A1 Necessary?
A2 Could it be static?
A3 Could it be const?
A4 Naming conventions applied?
A5 As private as possible?
A6 Attributes are orthogonal?
A7 Initialized?

Class constructors:
O1 Necessary?
O2 Would a factory method be better?
O3 Maximal use of the initialization list?
O4 As private as possible?
O5 Complete? (all data members)

Function declarations:
F1 Appropriate name? Consistent with requirements, design? Sufficiently general/specialized?
F2 As private as possible?
F3 Should it be static?
F4 Maximal use of const?
F5 Purpose described?
F6 Header references requirements or design element(s)?
F7 Pre-/postconditions, invariants stated?
F8 Documentation standards applied?
F9 Parameter types as tight as possible for correct functioning?
Personal Software Documentation
- Source code.
- Defect log:
  - Defect type.
  - Personal phase (residual design, personal inspection, personal unit test) during which the defect was injected/removed.
- Time log: time spent on residual design, coding, testing.
- Engineering notebook: status, notes, ....
Bring it to the exam!
Updating The Project
- SQAP: coding standards; process metrics data, e.g. from inspections and personal software documentation.
- SCMP: location of the implementation CI's.
Part VII
Integration and Testing

39 Introduction
40 Unit Testing
41 Integration and System Testing
Introduction
- Black box testing: based on requirements/specifications only, without considering the design.
- White (glass) box testing: based on the detailed design; attempts code coverage and looks at weak spots in the design.

Unit Testing
The space of test data can be divided into classes of data that should be processed in an equivalent way: select test cases from each of the classes.
Example: search for a value in an array. Input classes:

Array          Element
single value   present
single value   not present
> 1 value      first in array
> 1 value      last in array
> 1 value      middle in array
> 1 value      not in array
White box testing:
- Use knowledge of the code to derive test data (e.g. further classes): path testing ensures that the test cases cover each branch in the flow graph.
- Insert assertions to verify (at run time) predicates that should hold at that point (e.g. the assert macro in C, C++).

Planning unit tests:
1. Policy: responsibility of the author? By the project team or an external QA team? Reviewed by whom?
2. Documentation (see next slide): incorporate in the STD? How to incorporate in other types of testing? Tools?
3. Determine the extent of the tests. Prioritize: run the tests that are likely to uncover errors first.
4. Decide how and where to get test input.
5. Estimate the required resources (e.g. based on historic data).
6. Arrange to track metrics: time, defect count & type & source.
Integration and System Testing
1. Decide how and where to store, reuse, and code the integration tests (show this in the project schedule).
2. Execute unit tests in the context of the build.
3. Execute regression tests.
4. Ensure that build requirements and (partial) use cases are known.
5. Test against these requirements and use cases.
6. Execute the system tests supported by this build.

When testing integrated components or modules, look for errors that misuse or misunderstand the interface of a component:
- Passing parameters that do not conform to the interface specification, e.g. an unsorted array where a sorted array is expected.
- Misunderstanding of error behavior, e.g. no check on overflow or misinterpretation of a return value.
D. Vermeir, September 2008