
Software Development Best Practices

A “quote” first

“Software developers often estimate that they are 90% done, but it takes another 90% to get the rest of the way to 100% done.”

In short, “90% done, 90% still remains.”

Why This Session?
• To provide a modern view of the SD process (approach, architecture, modeling, etc.).
• To investigate problems (symptoms), root causes (diagnosis) and solutions (best practices) in the team SD process.
• To address issues related to risks, quality and testing of the software product.
• To give a prelude to the Rational Unified Process (RUP).

Objectives of a Modern SD Process
• Apply an iterative, use-case-driven, architecture-centric process to the development of a robust design model.
• Apply UML to represent the design model.
• Apply the concepts of object orientation: abstraction, encapsulation, inheritance and polymorphism at the design and implementation level (see the sketch below).
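The object-orientation bullet above stays at the conceptual level; as a minimal illustrative sketch (not part of the original slides, all class names invented), the following Java fragment shows abstraction, encapsulation, inheritance and polymorphism together:

    abstract class PaymentMethod {
        private final double amount;              // encapsulation: state is hidden behind the class
        protected PaymentMethod(double amount) { this.amount = amount; }
        public double getAmount() { return amount; }
        public abstract void pay();               // abstraction: an operation without an implementation
    }

    class CardPayment extends PaymentMethod {     // inheritance
        CardPayment(double amount) { super(amount); }
        @Override public void pay() { System.out.println("Card payment: " + getAmount()); }
    }

    class CashPayment extends PaymentMethod {     // inheritance
        CashPayment(double amount) { super(amount); }
        @Override public void pay() { System.out.println("Cash payment: " + getAmount()); }
    }

    public class Checkout {
        public static void main(String[] args) {
            // Polymorphism: callers work with the abstraction; behavior is chosen at run time.
            PaymentMethod[] methods = { new CardPayment(10.0), new CashPayment(5.0) };
            for (PaymentMethod m : methods) {
                m.pay();
            }
        }
    }

At design level this corresponds to one abstract class with two specializations in the class model; at implementation level the caller depends only on the abstraction.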

Objectives (Contd.)
• Understand the different views of a software architecture, the key mechanisms that are defined in support of that architecture, and the effect of the architecture and the mechanisms on the product design.
• Describe some basic design considerations, including the use of patterns.

Audience and Prerequisites
Audience
Software practitioners, software engineers, developers, analysts, senior programmers, designers, architects.
Prerequisites
• Some exposure to object technology.
• Some software development experience.

A Broad Perspective Ahead
• Explore the symptoms and root causes of software development problems.
• Examine SIX best practices for software development.
• Apply these practices to address the root causes of the problems.

Situation Analysis
• World economies are becoming software dependent (should we put a big question mark on this today?).
• Applications are expanding in size, complexity, distribution and importance.
• Business is demanding increased productivity and quality in less time and within budget.
• Not enough qualified manpower is available.

A Challenging Scenario
• Team sizes are increasing.
• Specialization is growing (new specific roles such as Performance Engineer, QA, Integrator, Tester, Release Engineer, etc.).
• Rapid advancement in technology.
• Distribution at large.

So, the Scenario Is
• There are many software products in use.
• There are many software products under development too.
• There are too many projects underway: some of them are at the concept level, others are on the verge of completion, and some others are in a middle state.

During the Development
There are many successes, and there are too many failures too.

Symptoms of SD Problems
• Inaccurate understanding of end-user needs.
• Inability to deal with changing requirements.
• Modules that don’t fit together.
• Software that’s hard to maintain and/or extend.
• Late discovery of serious project flaws.

Symptoms of SD Problems (Contd.)
• Poor software quality.
• Unacceptable software performance.
• Team members getting in each other’s way, unable to reconstruct who changed what, when, where, and why.
• An untrustworthy build-and-release process.

Unfortunately, What Happens Is
Treating symptoms does not treat the disease. For example, late discovery of serious project flaws is only a symptom of larger problems; we have to diagnose the root cause behind the symptom, i.e. subjective project status assessment in this case.

Root Causes of SD Problems
• Insufficient requirements management.
• Ambiguous and imprecise communications.
• Brittle architecture.
• Overwhelming complexity.
• Undetected inconsistencies among requirements, designs, and implementations.

Root Causes of SD Problems (Contd.)
• Insufficient testing.
• Subjective project status assessment.
• Delayed risk reduction due to the waterfall approach.
• Uncontrolled change propagation.
• Insufficient automation.

Actually, What We Should Do
We should attack these root causes (the diagnosis), so that we are not only in a position to get rid of the symptoms, but also in a better position to build quality into the software in a repeatable and productive manner.

SIX Best Practices
• Develop iteratively.
• Manage requirements.
• Use component architectures.
• Model the software visually.
• Verify quality.
• Control changes.

Best Practices of SE (Schematic)
[Schematic: Develop Iteratively on top; Manage Requirements, Use Component Architectures, Model Visually and Verify Quality in the middle; Control Changes underneath.]

Expected Outcome
Best practices enable high-performance teams, resulting in more successful projects in terms of quality, timeliness and profits.

Practice 1: Develop Iteratively
[Schematic: the six best practices, with Develop Iteratively highlighted.]

Practice 1: Develop Iteratively
• An initial design is likely to be flawed with respect to its key requirements.
• Late-phase discovery of design defects results in costly overruns and/or project cancellation.
• The time and money spent implementing a faulty design are not recoverable.

Traditional Waterfall Software Development
[Diagram: a single pass over time through Requirements Analysis, Design, Code & Unit Testing, Subsystem Testing and System Testing.]

Waterfall Development Delays Reduction of Risk
[Chart: risk versus time; in a waterfall project the risk stays high until late in the schedule.]

Apply the Waterfall Iteratively to System Increments
[Diagram: Iterations 1, 2 and 3, each a mini-waterfall of Requirements, Design, Code and Test over time.]
- Earliest iterations address the greatest risks.
- Each iteration produces an executable release, an additional increment of the system.
- Each iteration includes integration and test.

Waterfall Development Delays Reduction of Risk
[Chart: risk versus time; the iterative curve drops at each iteration, while the waterfall curve stays high until late in the project.]

Iterative Development Characteristics
• Critical risks are resolved before making large investments.
• Initial iterations enable early user feedback.
• Testing and integration are continuous.
• Objective milestones provide short-term focus.
• Progress is measured by assessing implementations.
• Partial implementations can be deployed.

Problems Addressed by Iterative Development
Root Causes → Solutions
• Insufficient requirements → Enables and encourages user feedback.
• Ambiguous communications → Serious misunderstandings become evident early in the life cycle.
• Overwhelming complexity → Development focuses on critical issues.
• Subjective assessment → Objective assessment through testing.
• Undetected inconsistencies → Inconsistencies are detected early.
• Poor testing → Testing starts earlier.
• Waterfall development → Risks are identified and addressed early.

Practice 2: Manage Requirements
[Schematic: the six best practices, with Manage Requirements highlighted.]

Practice 2: Manage Requirements
• Elicit, organize and document required functionality and constraints.
• Evaluate changes and determine their impact.
• Track and document tradeoffs and decisions.
Requirements are dynamic: expect them to change during software development.

Definitions: Requirements and Their Management
• A requirement is a condition or capability to which the system must conform.
• Requirements management (RM) is a systematic approach to
- eliciting, organizing and documenting the requirements of the system, and
- ensuring agreement between the client and the development team on the changing requirements.

Terms of Agreement
[Diagram: the client and the system to be built share a goal; the requirements stand in for that goal as a surrogate, i.e. a proxy for the client.]

Requirements Trace to Many Project Elements
[Diagram: requirements trace to design & implementation, project management, verification, the use-case model, change management, and QA & test.]

Catching Requirements Errors Early
• Effective analysis of the problem and elicitation of user needs.
• Agreement with the clients on the requirements, and vice versa.
• Model the interaction between the user and the system.
• Establish a baseline and a change control process.
• Record forward and backward traces of the requirements.
• Use of an iterative process.

Problems Addressed by Requirements Management
Root Causes → Solutions
• Insufficient requirements → Disciplined approach to requirements gathering.
• Ambiguous communications → Communications based on the specified requirements.
• Overwhelming complexity → Prioritizing, filtering and tracing the requirements.
• Subjective assessment / Poor testing → Objective assessment of functionality and performance.
• Undetected inconsistencies → Inconsistencies detected early.
• Insufficient automation → A repository tool for requirements, attributes and tracing, with automatic links to the documents.

Practice 3: Use Component-Based Architectures
[Schematic: the six best practices, with Use Component Architectures highlighted.]

Definition of Software Architecture
Software architecture (SA) is the organization of the significant components of a software system, and it involves decisions such as:
• Selection of the structural elements and their interfaces by which the system is composed.
• Behavior, as specified in collaborations among those elements.
• Composition of these structural and behavioral elements into progressively larger subsystems.
• The architectural style that guides this organization of components.

Issues Concerning Architecture
The following issues influence software architecture structurally and behaviorally:
• Usage
• Functionality
• Performance
• Resilience

Issues Concerning Architecture (Contd.)
• Comprehensibility
• Economic and technological constraints and tradeoffs
• Aesthetics
• Reuse

Resilient & Component-Based Architectures
• Good architectures, those that meet their requirements, are resilient and component based.
• A resilient architecture enables:
- Improved maintainability and extensibility
- Clean division of work among teams of developers
- Encapsulation of hardware and system dependencies

Resilient & Component-Based Architectures (Contd.)
• A component-based architecture permits (see the sketch below):
- Reuse or customization of existing components
- Choice among thousands of commercially available components
- Incremental evolution of existing software
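The slides stay at the architecture level; as a minimal sketch of what "component based" can mean in code (assumed Java, all names invented), a component hidden behind an interface can be reused, customized or replaced without touching its clients:

    // A component is exposed to its clients only through an interface.
    interface MessageSender {
        void send(String recipient, String message);
    }

    // One interchangeable implementation; an SMS- or queue-based sender could replace it.
    class EmailSender implements MessageSender {
        @Override public void send(String recipient, String message) {
            System.out.println("Emailing " + recipient + ": " + message);
        }
    }

    // Client code depends only on the interface, so the component can evolve
    // incrementally or be swapped without changing this class.
    class Notifier {
        private final MessageSender sender;
        Notifier(MessageSender sender) { this.sender = sender; }
        void buildCompleted(String user) { sender.send(user, "Build completed"); }
    }

    public class ComponentDemo {
        public static void main(String[] args) {
            new Notifier(new EmailSender()).buildCompleted("team@example.com");
        }
    }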

Problems Addressed by Component Architectures
Root Causes → Solutions
• Brittle architectures → Components facilitate resilient architectures; reuse of commercially available components and frameworks is facilitated.
• Overwhelming complexity → Modularity enables separation of concerns.
• Uncontrolled change → Components provide a natural basis for configuration management.
• Insufficient automation → Visual modeling tools provide automation for component-based design.

Practice 4: Visually Model Software
[Schematic: the six best practices, with Model Visually highlighted.]

Practice 4: Visually Model Software
• Capture the structure and behavior of architectures.
• Show how the elements of the system fit together.
• Hide or expose details as appropriate for the task.
• Maintain consistency between a design and its implementation (see the mapping sketch after the next slide).
• Promote unambiguous communication.
Visual modeling improves our ability to manage software complexity.

UML: A Tool for Visual Modeling
The Unified Modeling Language (UML) is a language for
• specifying,
• visualizing,
• constructing, and
• documenting
the artifacts of a software-intensive system.
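To illustrate "maintain consistency between a design and its implementation", here is a hypothetical mapping from a small UML class diagram to Java (names invented; only a sketch of the usual conventions, not a rule from the slides):

    import java.util.ArrayList;
    import java.util.List;

    // A class in the diagram maps to a class in code.
    class LineItem {
        final String product;
        final int quantity;
        LineItem(String product, int quantity) { this.product = product; this.quantity = quantity; }
    }

    class Order {
        // The association "Order 1 --> * LineItem" maps to a collection-valued field.
        private final List<LineItem> items = new ArrayList<>();
        void add(LineItem item) { items.add(item); }
        int itemCount() { return items.size(); }
    }

    // Generalization in the diagram (RushOrder is-a Order) maps to inheritance.
    class RushOrder extends Order { }

    public class ModelToCode {
        public static void main(String[] args) {
            Order order = new RushOrder();
            order.add(new LineItem("Widget", 2));
            System.out.println("Items: " + order.itemCount());
        }
    }

Keeping this mapping visible is what lets design-level changes and code-level changes stay in step during an iteration.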

Diagrams are views of a model
[Diagram: a model and its views: use-case, class, activity, object, sequence, collaboration, state, component and deployment diagrams.]

Diagrams are views of a model
A model is a complete description of a system from a particular perspective.
• Use-case diagrams are used to illustrate user interaction with the system.
• Class diagrams are used to illustrate logical structure.
• Object diagrams are used to illustrate objects and links.
• State diagrams are used to illustrate behavior.
• Component diagrams are used to illustrate the physical structure of the system.

Diagrams are views of a model (Contd.)
• Deployment diagrams are used to show the mapping of the software to the hardware.
• Interaction diagrams (collaboration and sequence diagrams) are used to illustrate behavior.
• Activity diagrams are used to illustrate the flow of events in a use case.

Visual Modeling and Iterative Development (Changes to Design)
[Diagram: within an iteration, initial requirements feed risk assessment and requirements targeting, then analysis & design, implementation & testing, and deployment.]

Visual Modeling and Iterative Development
During implementation and testing within an iteration, it is likely that architectural or design changes are made via changes to the source code. Some changes are intentional and others are inadvertent.

Visual Modeling and Iterative Development (Change Assessment)
[Diagram: the same iteration cycle (requirements, risk assessment, analysis & design, implementation & testing, deployment), with implementation & testing highlighted as the point where changes are assessed.]

Problems Addressed by Visual Modeling
Root Causes → Solutions
• Insufficient requirements → Use cases and scenarios unambiguously specify behavior.
• Ambiguous communications → Models capture software designs unambiguously.
• Brittle architectures → Non-modular or inflexible architectures are exposed.
• Overwhelming complexity → Unnecessary detail is hidden when appropriate.
• Undetected inconsistencies → Unambiguous designs reveal inconsistencies more readily.
• Poor testing → Application quality starts with good design.
• Insufficient automation → Visual modeling tools provide support for UML modeling.

Practice 5: Verify Software Quality
[Schematic: the six best practices, with Verify Quality highlighted.]

Practice 5: Verify Software Quality
Software problems are 100 to 1000 times more costly to find and repair after deployment.
[Chart: the cost of fixing a defect rises steeply across the development life cycle toward deployment.]

Iterative Development Permits Continuous Testing
[Diagram: Iterations 1, 2 and 3, each containing Requirements, Design and Code plus a full test cycle (plan, design, implement, execute, evaluate) over time.]

Testing in an Iterative Environment
[Diagram: requirements and tests evolve together; Test Suites 1, 2 and 3 accompany Iterations 1, 2 and 3.]

Automation Reduces Testing Time and Effort
• One manual test cycle: 13,000 tests, 2 weeks, 6 people.
• With test automation: 13,000 tests, 6 hours, 1 person.
Run more tests more often (a minimal automated-test sketch follows).
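As a minimal sketch of the kind of automated test that makes such numbers possible, assuming JUnit 4 is on the classpath (the Calculator class is invented purely for illustration):

    import org.junit.Test;
    import static org.junit.Assert.assertEquals;

    // Hypothetical class under test.
    class Calculator {
        int add(int a, int b) { return a + b; }
    }

    public class CalculatorTest {
        @Test
        public void addsTwoNumbers() {
            // An automated check like this runs in milliseconds and can be
            // repeated in every iteration at essentially no extra cost.
            assertEquals(7, new Calculator().add(3, 4));
        }
    }

Once a suite of such checks exists, continuous testing across iterations becomes a matter of rerunning it, not re-staffing it.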

Dimensions of Software Quality

Type | Why? | How?
Functionality | Does my application do what is required? | Test cases for each scenario implemented.
Reliability | Does my application leak memory? | Analysis tools and code instrumentation.
Application Performance | Does my application respond acceptably? | Check performance for each use case/scenario implemented.
System Performance | Does my system perform under production load? | Test performance for all use cases under authentic and worst-case load.

Problems Addressed by Verifying Quality
Root Causes → Solutions
• Subjective assessment → Testing provides an objective assessment of project status.
• Undetected inconsistencies → Objective assessment exposes inconsistencies early.
• Poor testing → Testing and verification are focused on high-risk areas; defects are found earlier and are less expensive to fix.
• Insufficient automation → Automated testing tools provide testing for reliability, functionality and performance.

Practice 6: Control Changes to Software
[Schematic: the six best practices, with Control Changes highlighted.]

Practice 6: Control Changes to Software (Contd.)
Factors of multiplicity:
• Multiple developers
• Multiple teams
• Multiple sites
• Multiple iterations
• Multiple releases
• Multiple projects
• Multiple platforms
Without explicit control, parallel development degrades to chaos.

Three Common Problems
• Simultaneous update: a later update unintentionally overwrites an earlier developer’s work.
• Limited notification: not all team members are notified when a problem is fixed in shared artifacts.
• Multiple versions: it is usual to have multiple versions of an artifact at different stages at the same time, e.g. one release is in customer use, one is in test, and one is still under development.

Three Major Aspects of a Change Management System
• Change Request Management (CRM): assessing the cost and schedule impacts of requested changes to the existing product.
• Configuration Management (CM): defining, building, labeling and collecting versioned artifacts; maintaining traceability between versions.
• Measurement: describing the state of the product in terms of the type, number, rate and severity of the defects found and/or fixed during the course of development.

Concepts of Configuration and Change Management
• Decompose the architecture into subsystems and assign responsibility for each subsystem to a team.
• Establish secure workspaces for each developer:
- Provide isolation from changes made in other workspaces.
- Control all software artifacts: models, code, documents, etc.
• Establish an integration workspace.
• Establish an enforceable change control mechanism.
• Know which changes appear in which releases.
• Release a tested baseline at the completion of each iteration.

Control of Changes Supports All Other Best Practices
• Develop Iteratively → Progress is incremental only if changes to artifacts are controlled.
• Manage Requirements → To avoid scope creep, assess the impact of proposed changes before approval.
• Use Component Architectures → Components must be reliable, i.e. the correct versions of all constituent parts can be found.
• Model Visually → To assure convergence, incrementally control models as designs stabilize.
• Verify Quality → Tests are only meaningful if the versions of the items under test are known and the items are protected from changes.

Problems Addressed by Controlling Changes
Root Causes → Solutions
• Insufficient requirements → The requirements change workflow is defined and repeatable.
• Ambiguous communications → Change requests facilitate clear communications.
• Overwhelming complexity → Isolated workspaces reduce interference from parallel work.
• Subjective assessment → Change rate statistics are good metrics for objectively assessing project status.
• Undetected inconsistencies → Workspaces contain all artifacts, facilitating consistency.
• Uncontrolled change → Change propagation is controlled.
• Insufficient automation → Changes are maintained in a robust, customizable system.

Best Practices Reinforce Each Other
[Diagram: Develop Iteratively reinforces the other practices.]
• Manage Requirements: ensures users are involved as requirements evolve.
• Use Component Architectures: validates architectural components early.
• Model Visually: addresses the complexity of design/implementation incrementally.
• Verify Quality: measures quality early and often.
• Control Changes: evolves the baseline incrementally.

The RUP
• The RUP (Rational Unified Process) brings the six best practices together in a form that is suitable for a wide range of projects and organizations.
• It has four major phases (Inception, Elaboration, Construction, and Transition) that organize work along the disciplines of business modeling, requirements, analysis and design, implementation, test, deployment, configuration and change management, project management, and environment.

Summary: Best Practices of Software Development
The result is software that
- is on time,
- is on budget, and
- meets users’ needs.

