
DHS Office of Systems and Technology

Guideline Document for:


Agile System Implementation

Department of Human Services

Office of Systems and Technology

Agile System Implementation Guidelines

SECTION 1: PLANNING AGILE PROJECTS ........................................................ 5
1.1 Establish and Utilize a Single Product Backlog ........................................ 5
1.2 Recommended Unique Requirement Priorities ............................................. 5
1.3 Size Estimation ....................................................................... 5
1.4 Function Point Estimate ............................................................... 7
1.5 System Roadmap: Initial Plan of Work .................................................. 7

SECTION 2: SYSTEM MANAGEMENT AND ENGINEERING APPROACH ..................................... 8
2.1 Defined Project Engineering Strategy .................................................. 9
2.2 Scrum Management Approach ............................................................. 9
2.2.1 Planning and Executing the Four Week Engineering Sprints ............................ 9
2.2.1.1 Sprint Planning Part 1 ............................................................ 9
2.2.1.2 Sprint Planning Part 2 ........................................................... 10
2.2.1.2.1 FEATURE TEAM SPRINT BACKLOGS & BURN-DOWN CHARTS ................................ 10
2.2.1.3 Sprint Charters .................................................................. 10
2.2.1.4 Daily Stand-up Scrum Meetings .................................................... 11
2.2.1.5 Sprint Review Meetings ........................................................... 11
2.2.1.6 Sprint User Acceptance ........................................................... 11
2.2.1.7 Retrospectives ................................................................... 12
2.2.1.7.1 FEATURE TEAM SPRINT RETROSPECTIVES ............................................. 12
2.2.1.7.2 SOLUTION RETROSPECTIVES ........................................................ 12
2.2.1.8 Pre-Sprint Preparation ........................................................... 12
2.2.2 Graphical Depiction of Planned Sprint Cycle ........................................ 12
2.3 Sprint 1: Project Startup Activities ................................................. 12
2.4 Solution Engineering Approach ........................................................ 13
2.4.1 Collective Ownership of the Solution ............................................... 13
2.4.2 Sprint Execution User Validation ................................................... 13
2.4.3 Training Material Production ....................................................... 14
2.4.4 Continuous Integration ............................................................. 14
2.4.4.1 Software Builds: Code Compilation ................................................ 14
2.4.4.1.1 SOFTWARE VERSION CONTROL ....................................................... 15
2.4.4.1.2 DEDICATED BUILD MACHINE/SERVER & CONTINUOUS INTEGRATION SERVER ................. 15
2.4.4.1.3 AUTOMATED BUILDS ............................................................... 15
2.4.4.1.4 PRIVATE DEVELOPER BUILDS ....................................................... 15
2.4.4.1.5 FAST/QUICK BUILD CYCLES ........................................................ 15
2.4.4.2 Continuous Database Integration .................................................. 15
2.4.4.2.1 VERSION CONTROLLED DATA DEFINITION LANGUAGE (DDL) SCRIPTS ...................... 15
2.4.4.2.2 AUTOMATED DATABASE INTEGRATION ................................................. 16
2.4.4.2.3 DATABASE ADMINISTRATION AND TUNING ............................................. 16
2.4.4.3 Test Driven Development & Continuous Testing ..................................... 16

AR DHS Agile System Implementation Guidelines v0_13.docx    Page 2


2.4.4.3.1 AUTOMATED UNIT TESTING ......................................................... 16
2.4.4.3.2 MAINTAINING ROBUST UNIT TESTS .................................................. 16
2.4.4.3.3 CREATION AND MAINTENANCE OF TEST DATA .......................................... 16
2.4.4.3.4 UNIT TESTING FOR IDENTIFIED DEFECTS ............................................ 17
2.4.4.3.5 AUTOMATED DATA CONVERSION TESTING .............................................. 17
2.4.4.4 Code Inspection and Review ....................................................... 17
2.4.4.4.1 AUTOMATED CODE VERIFICATION: CODING STANDARDS .................................. 17
2.4.4.4.2 MANAGING CODE COVERAGE ......................................................... 18
2.4.4.4.3 ELIMINATION OF DUPLICATE CODE .................................................. 18
2.4.4.5 Documentation Compilation ........................................................ 18
2.4.4.5.1 AUTOMATED API COMPILATION: CODE SPECIFICATIONS ................................. 18
2.4.4.5.2 AUTOMATIC CREATION OF A SOLUTION DATA DICTIONARY ............................... 18
2.4.4.5.3 PRODUCT DASHBOARD PUBLISHING ................................................... 18
2.4.5 Periodic Automated Environmental Refresh ........................................... 19
2.4.5.1 Automated Environmental Refresh Approach and Strategy ............................ 19
2.4.5.2 Temporary Environmental Archive .................................................. 19
2.4.5.3 Reestablish Base System Configuration ............................................ 19
2.4.5.4 Code Migration ................................................................... 19
2.4.5.5 Structural Database Modifications ................................................ 20
2.4.5.6 Configuration Data Load .......................................................... 20
2.4.5.7 Loading of Test Data ............................................................. 20
2.4.5.8 Shake-down Testing ............................................................... 20
2.4.6 Integration Test and Regression Test ............................................... 20
2.4.6.1 Verification ..................................................................... 21
2.4.7 Pair Programming ................................................................... 21
2.4.8 Design Approach .................................................................... 21
2.4.8.1 Configuration vs. Customization .................................................. 21
2.4.9 Database Customization ............................................................. 21
2.4.10 Solution Refactoring .............................................................. 22
2.4.10.1 Load and Performance Testing .................................................... 22
2.5 Executing a Production Release Sprint ................................................ 22
2.5.1 Release Planning ................................................................... 23
2.5.1.1 Release Sprint Staffing .......................................................... 23
2.5.2 Planning End-User Training ......................................................... 23
2.5.3 Stakeholder Involvement and Communication .......................................... 24
2.5.4 Technical Implementation ........................................................... 24
2.5.5 Load and Performance Testing ....................................................... 24
2.5.6 Formal User Revalidation & Acceptance .............................................. 24
2.5.7 Execution of Data Conversion in Production ......................................... 24

SECTION 3: SUPPORT PROCESSES ............................................................. 25
3.1 Continuous Process Improvement ....................................................... 25
3.2 Project Configuration Management ..................................................... 25
3.3 Risk and Issues Management ........................................................... 25
3.3.1 Contingency Planning ............................................................... 26


3.4 Assumption and Constraint Management ................................................. 26
3.5 Performance Metrics .................................................................. 27
3.5.1 Scope & Schedule Management ........................................................ 27
3.5.1.1 Function Point Earn Rate ......................................................... 28
3.5.1.2 Hours per Function Point ......................................................... 29
3.5.2 Team Performance Metrics ........................................................... 30
3.5.2.1 Planned vs. Actual Feature Team Velocity by Iteration ............................ 30
3.5.2.2 Actual Feature Team Member Velocity by Iteration ................................. 31
3.5.2.3 Remaining Product Backlog ........................................................ 32
3.5.3 Quality Metrics .................................................................... 33
3.5.3.1 Iteration Unit Test Density ...................................................... 33
3.5.3.2 Solution Unit Test Density ....................................................... 34
3.5.3.3 UAT Defect Density ............................................................... 35
3.5.3.4 Failed Builds .................................................................... 35
3.5.3.5 Failed Environmental Refreshes ................................................... 36
3.6 End User Training .................................................................... 37
3.7 Stakeholder Involvement and Communications Management ................................ 37

SECTION 4: PERSONNEL ..................................................................... 38
4.1 AGILE PROJECT ORGANIZATION ........................................................... 38
4.1.1 SUBJECT MATTER EXPERTS INVOLVEMENT ................................................. 39
4.1.2 Project OWNER ...................................................................... 40
4.1.3 AREA PRODUCT OWNERS ................................................................ 40
4.1.4 MASTER SCRUM MASTER ................................................................ 41
4.1.5 SCRUM MASTERS ...................................................................... 41
4.2 FEATURE TEAMS ........................................................................ 42
4.2.1 FEATURE TEAM RAMPUP ................................................................ 42

SECTION 5: PROJECT FACILITIES AND RESOURCES .............................................. 43


Section 1: Planning Agile Projects


Planning an Agile software development project is similar in many ways to planning projects that use
alternative system implementation lifecycles. The main difference is that Agile projects treat initial plans
as estimates of the work to be accomplished, not final commitments that fully describe every activity,
the exact effort each activity will take, and the exact dates on which it will happen. In other words,
Agile approaches are more flexible, and that flexibility begins during the planning process. Agile software
development approaches also focus on addressing the most important and most challenging components of the
project as early as possible to reduce risk; this is accomplished through the use of unique priorities for
each story/requirement.

1.1 Establish and Utilize a Single Product Backlog


The most important asset developed and regularly evolved on an Agile project is the Product Backlog.
The backlog, as the name suggests, is the body of work that must be completed to deliver the intended
system and functionality to end users. Each project should have one, and only one, product backlog file.
Project teams should use the established backlog template, DHS_TP001_Requirements_Product_Backlog, as the
starting point for their specific project/product backlog (see
https://ardhs.sharepointsite.net/guidelines/Shared%20Documents/TP001_Requirements_Product_Backlog.xlsx).

The assigned Product Owner should meet with business users to establish the initial version of the
project/product backlog and an overall vision for the solution. Note that the backlog will change
and grow with the project as additional tasks are identified and added; the intent of the initial backlog is
to establish a good enough understanding to estimate the amount of time and resources needed to
complete the project successfully.

1.2 Recommended Unique Requirement Priorities

Teams must indicate the priority/importance of each requirement in the backlog, and each assigned
priority must be a unique number. Teams should determine what scale will be used for priorities
(e.g., from 1 to N, where N is ten times the total number of requirements in the Product Backlog).
In the example provided, priorities would be assigned in increments of 10, which readily enables the
team to insert new requirements between existing ones as the project progresses. If the team believes
that alternative approaches would be more effective (based on past experience), they are free to adopt
those practices once the project is underway and opportunities for improvement are identified during
the Sprint retrospective meetings.
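The increments-of-10 scheme above can be sketched in a few lines. This is an illustrative sketch only; the function names and requirement names are hypothetical, not part of the DHS backlog template.

```python
# Sketch: unique priorities in increments of 10, leaving gaps so a new
# requirement can be inserted between two existing ones without renumbering
# the whole backlog. All names here are illustrative.

def initial_priorities(requirements):
    """Assign unique priorities 10, 20, 30, ... in backlog order."""
    return {req: (i + 1) * 10 for i, req in enumerate(requirements)}

def insert_between(priorities, new_req, above, below):
    """Give new_req a unique priority between two existing requirements."""
    gap = priorities[below] - priorities[above]
    if gap < 2:
        raise ValueError("No room left; renumber the backlog first")
    priorities[new_req] = priorities[above] + gap // 2
    return priorities

backlog = initial_priorities(["login", "eligibility check", "case notes"])
insert_between(backlog, "audit trail", "login", "eligibility check")
# "audit trail" lands at priority 15, between 10 and 20
```

Once the gap between two neighbors closes (e.g. priorities 14 and 15), the team renumbers, which is exactly the situation the increments of 10 are designed to postpone.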

The team's initial priorities serve as the starting point for the Product Owner to establish overall
priorities for the solution during project startup (see section 4.1.2 Project OWNER for information about the
Product Owner's role). The assigned priorities are also expected to heavily influence the proposed
development roadmap defined in section 1.5 System Roadmap: Initial Plan of Work and the themes of
each resulting Sprint.

1.3 Size Estimation


During the initial planning stages, the team is also required to assign each requirement a size estimate.
DHS uses a generic size unit, such as Story Points, drawn from the Fibonacci sequence included in the
table below and in the product backlog template. Teams should establish definitions or guidance for each
numeric size so that the Feature Teams on the project share a common understanding of size expectations;
this should produce more uniform estimation across teams (on multi-team projects). The definitions must
include information from all of the software engineering disciplines relevant to the project, including
data conversion, interface, database, user interface, and architectural elements, and any other relevant
components/competencies.


Note that DHS does not intend to use the generic size estimates to control projects; the size estimates
are used for planning Sprints, specifically for judging how much scope can be addressed within each
Sprint, and are NOT considered commitments.

Teams should also define which sizing metric would be most beneficial on a given project: Ideal Days,
an arbitrary generic size unit, or perhaps Function Points directly. The chosen approach should be a
simple mechanism that all project staff can use to quickly and effectively track software engineering
tasks.

Size Component           Project Definition and Guidance About Size
0   - Zero
1   - Very Tiny
2   - Tiny
3   - Almost Tiny
5   - Very Small
8   - Small
13  - Medium
21  - Large
34  - Very Large
55  - Huge
89  - Massive
144 - Need more          <Note that DHS expects Requirements/Stories of this size to be broken down into
      information        smaller units of work before being allocated to a particular Sprint.>
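The sizing scale can be enforced mechanically when estimates are entered. The helper below is a hypothetical sketch, not part of the DHS template; it only encodes the two rules stated above: estimates must come from the Fibonacci scale, and 144-point stories must be split before Sprint allocation.

```python
# Hypothetical validation helper for the sizing scale in the table above.
FIBONACCI_SCALE = [0, 1, 2, 3, 5, 8, 13, 21, 34, 55, 89, 144]

def check_size(points):
    """Reject off-scale estimates; flag 144 ("need more information")."""
    if points not in FIBONACCI_SCALE:
        raise ValueError(f"{points} is not on the agreed sizing scale")
    # Per the table, 144 means the story must be decomposed first.
    return "split before sprint" if points == 144 else "ok"

print(check_size(13))   # ok
print(check_size(144))  # split before sprint
```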


1.4 Function Point Estimate


Before being approved by DHS, all candidate Agile system implementation projects are required to
estimate the size of the project in Function Points. The Function Point estimate does not have to be tied to
individual requirements in the initial Product Backlog, though the planning team is expected to use the
requirements during estimation. Because many DHS projects are required to track and cost-allocate funds
across various federally funded programs, teams may be required to provide Function Point estimates for
the sets of requirements in each included program area (e.g., Medicaid and SNAP). DHS will use the total
supplied Function Point estimate as an obligation/commitment governing the amount of functionality
delivered on the project.

Overall Function Point Estimate for Solution:

Area 1 Function Point Estimate:

Area 2 Function Point Estimate:
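Where per-area estimates are supplied, a simple reconciliation check helps keep the cost allocation consistent. This sketch assumes (an assumption, not stated above) that the program-area estimates should sum to no more than the overall solution estimate; the numbers and area names are illustrative.

```python
# Sketch (assumption: area estimates should reconcile with the overall
# Function Point estimate so federal cost allocation adds up).

def reconcile(overall_fp, area_fps):
    """Return unallocated function points; a negative result means the
    program areas claim more than the overall estimate."""
    return overall_fp - sum(area_fps.values())

remaining = reconcile(1200, {"Medicaid": 700, "SNAP": 450})
# 50 function points not yet attributed to a program area
```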

1.5 System Roadmap: Initial Plan of Work


During project planning, project teams should create a detailed roadmap that includes a high-level list of
activities expected to be completed during the project. Project teams should use the DHS standard
roadmap template, TP002_Agile_Project_Roadmap_Template.pptx (see
https://ardhs.sharepointsite.net/guidelines/Shared%20Documents/TP002_Agile_Project_Roadmap_Template.pptx)
when constructing their roadmap. The roadmap should include the initially identified/desired software
releases, as well as the intended theme of each Sprint in the project.

Sprint themes on the roadmap will likely correlate with the eventual Sprint goals and identify the
prioritized features aligned with each Sprint in a way that addresses both business priorities and the
greatest risks early in the project.


Section 2: System Management and Engineering Approach

DHS's preferred approach to new system development projects is an Agile approach based on Scrum and
Extreme Programming practices. With this approach and industry-standard practices, teams are able to
iteratively evolve software solutions while addressing the highest-risk components early. (See the following
diagram for an illustration of the DHS approach.) Project teams are free to explore alternative Agile
approaches (e.g., Crystal Clear and Unified Process) and to enhance the Agile approach shown below;
however, project teams must clearly demonstrate the justification for, and benefit of, an alternative
approach before it is categorically approved for use.


2.1 Defined Project Engineering Strategy


Before starting a system development project, project teams should define a project engineering strategy
and describe how that strategy will be used to deliver the intended solution and meet the stated goals and
needs of the project. A key component of the engineering strategy is the definition of done at three (3)
different levels:

The definition of done at the solution level determines when the system is finished and meets DHS's
intended need(s);

The definition of done at the Sprint level indicates when a Sprint goal is achieved (see section 2.2.1
Planning and Executing the Four Week Engineering Sprints);

The definition of done for the various coding-related activities, such as development, data conversion,
training material creation, and interface development.
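The three levels of done might be captured as simple checklists. The guideline requires the levels but does not prescribe a format, so the structure and checklist items below are purely illustrative assumptions.

```python
# Illustrative structure only: level names follow the guideline's three
# levels of done; the checklist items are example placeholders.

DEFINITIONS_OF_DONE = {
    "solution": [
        "system meets DHS intended needs",
        "formal user acceptance complete",
    ],
    "sprint": [
        "sprint goal achieved",
        "product increment demonstrated at sprint review",
    ],
    "activity": [
        "unit tests pass for development work",
        "data conversion verified",
        "training material drafted",
        "interfaces exercised end to end",
    ],
}

def is_done(level, completed):
    """A level is done only when every checklist item has been completed."""
    return all(item in completed for item in DEFINITIONS_OF_DONE[level])

is_done("sprint", {"sprint goal achieved"})  # False: review demo still missing
```

Keeping the checklists explicit makes the Sprint Review conversation concrete: the team checks items off rather than debating what "done" meant.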

2.2 Scrum Management Approach


As a general practice, DHS understands and respects the Agile and Scrum principles of self-organizing
teams and shared ownership of project processes and results. During project planning, the project team
provides details of the project management and project control methods it intends to use. The intent of
these details is not to identify fixed processes and approaches for the project, but rather to identify
practices that the project team believes are well suited to it. In other words, the details are initial
conventions that teams may use over the first few iterations; once the project is underway, teams should
identify and implement process improvements throughout the project.

2.2.1 Planning and Executing the Four Week Engineering Sprints

In using Scrum, a good practice is the use of four-week engineering Sprints/iterations. During project
planning, the project team should define the initial Sprint duration and configuration that will be used
to manage the development activities. This guideline document provides an initial standard and structure
for how a Sprint cycle should be executed.

As a general guideline, a good approach is one that efficiently and effectively integrates the system in a
timely fashion. Project teams are encouraged to work collaboratively with DHS management and involved
stakeholders throughout the project to establish the project approach best suited to meeting the
specified needs of the project champion/sponsor.

2.2.1.1 Sprint Planning Part 1

In Part 1 of Sprint planning, the Product Owner (and Area Product Owners on large projects), executive
management, and users meet with representatives from each of the Feature Teams and the Scrum Masters
to establish a goal for the Sprint. These meetings are also used to identify what functionality will be
built in the coming Sprint.

The Part 1 Sprint planning meeting lasts no more than 4 hours. During the meeting, business priorities
are discussed and contrasted against the priorities in the Product Backlog; if necessary, the backlog
priorities may be changed to reflect new information or shifts in business priorities. As the top
priorities are selected for inclusion in the scope of the Feature Teams' Sprints, careful attention is
paid to the size of the work taken by each team in light of that team's corresponding velocity (for more
information about Feature Team velocity, see section 2.2.1.3 Sprint Charters).
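The velocity check described above amounts to taking backlog items in priority order until the next item would exceed the team's capacity. This is a minimal sketch of that selection; the backlog structure and names are hypothetical (the real backlog lives in the DHS template).

```python
# Sketch of the Part 1 velocity check: select backlog items in priority
# order while the cumulative size stays within the team's velocity.
# (Hypothetical data layout; illustrative only.)

def select_for_sprint(backlog, velocity):
    """backlog: list of (priority, name, size); lower priority number = first."""
    selected, used = [], 0
    for priority, name, size in sorted(backlog):
        if used + size <= velocity:
            selected.append(name)
            used += size
    return selected, used

items = [(10, "login", 8), (20, "eligibility check", 13), (30, "case notes", 5)]
chosen, points = select_for_sprint(items, velocity=21)
# chosen == ["login", "eligibility check"], points == 21
```

A real team would negotiate rather than apply a greedy rule blindly (e.g. swapping in a smaller lower-priority item, as happens to "case notes" here), but the sketch shows why accurate velocity numbers matter in Part 1.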

It is important to note that, for large projects with multiple Feature Teams, the representatives from each
Feature Team are fully authorized and responsible for the assignment of work to their associated team. At
the end of Part 1 planning, the Feature Team is fully committed to delivering the work it has agreed
to take on. Additionally, as a good practice, DHS uses an approach that openly allows all Feature
Team members the opportunity to participate as the team representative in the Part 1 planning meeting
over a period of time; establishing a rotating schedule of team representative(s) is a good way to
accomplish this.

During project planning, the project team defines the accepted convention for how Feature Team members
are identified to attend the Part 1 planning meeting and how the teams will maintain shared commitment
to the selected scope. Other conventions about how the Part 1 planning meeting is conducted (such as
when in the Sprint cycle it occurs: beginning or end) should also be noted. Teams should also consider
how the Part 1 planning meeting can be structured and managed to prevent overrunning the allocated
4-hour duration.

2.2.1.2 Sprint Planning Part 2

The second part of Sprint planning involves each Feature Team meeting to discuss the work it committed
to complete and how it intends to build that functionality into a product increment during the Sprint.
For large multi-team projects, Area Product Owners should be available to assist the Feature Teams in
the detailed planning of their Sprint activities; however, an Area Product Owner may be required to
split her/his time across multiple Feature Teams and hence may not be available to a specific Feature
Team for the full duration of the Part 2 planning meeting. As with the first Sprint planning meeting,
the second planning meeting lasts no more than four hours.

During project planning, the team decides how it will conduct the second part of the Sprint planning
meeting, while recognizing that self-organizing teams may alter those practices at a future time.

2.2.1.2.1 FEATURE TEAM SPRINT BACKLOGS & BURN-DOWN CHARTS

The main output of the Part 2 planning meeting is the Sprint Backlog, which includes the tasks the Feature
Team will complete in building the required features, estimates of the amount of time those tasks will
take (preferably in Ideal Days), and a Sprint burn-down chart template. The specific medium used for the
Sprint backlog items is chosen by each self-organizing Feature Team; teams are encouraged to use physical
Sprint backlog media to track progress unless other specific practices are justified.

During project planning, the project team should define how the burn-down chart will be updated each day
and how daily progress will be recorded (e.g., pictures of the team backlog are taken and electronically
filed). Additionally, the project team should define how tasks are identified and estimated and how
unexpected complexity is to be handled.

As a general guideline, each requirement should be broken down into smaller tasks during the Part 2
planning meeting; during this decomposition, each task should be assigned an Ideal Day estimate, which
the Feature Team maintains throughout the Sprint.
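The daily burn-down update described above is simple arithmetic: sum the remaining Ideal Day estimates each day and compare against a straight ideal line. A minimal sketch, assuming a four-week sprint of 20 working days (the task numbers are illustrative):

```python
# Minimal burn-down sketch: remaining Ideal Days re-estimated each day,
# compared against the ideal straight line for a 20-working-day sprint.

def ideal_line(total_ideal_days, sprint_length=20):
    """The straight line from the starting estimate down to zero."""
    step = total_ideal_days / sprint_length
    return [total_ideal_days - step * d for d in range(sprint_length + 1)]

def burn_down(task_estimates_by_day):
    """Each entry is that day's list of remaining per-task estimates."""
    return [sum(day) for day in task_estimates_by_day]

actual = burn_down([[5, 8, 3], [5, 6, 3], [4, 6, 2]])
# actual == [16, 14, 12]; plotting it against ideal_line(16) shows slippage
```

Whether the chart is drawn on a whiteboard or generated from a spreadsheet, the same two series (actual remaining vs. ideal) are what the daily photo-and-file convention needs to capture.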

Sprint Charters

One of the main outputs of both of the Sprint planning meetings is a one-page charter that indicates the
goal of the Sprint; the dates for the beginning and ending of the Sprint; the time and location of the project
daily Scrum meeting; and the dates, times and locations for the Sprint Review meeting and the solution
retrospective meeting. For large, multi-team projects, the solution Sprint Charter is made available to all
stakeholders interested in the project. Additionally, the Feature Teams create auxiliary charters that include
relevant details about the scope of work they have selected along with relevant details about their teams'
plans for the Sprint.

As a general guideline, the Feature Team representatives bring a partially completed team charter to the
part 1 planning meeting. This partially completed charter includes the team's targeted velocity (in generic
size count) and the Feature Team's expected availability over the Sprint (e.g. accounting for known
vacations and holidays). While the content of the charters may evolve over time, project teams are
required to use the established template, TP003_Sprint_Charters_Template (see
https://ardhs.sharepointsite.net/guidelines/Shared%20Documents/TP003_Sprint_Charters_Template.docx),
at the beginning of the project.

AR DHS Agile System Implementation Guidelines v0_13.docx Page 10

Daily Stand-up Scrum Meetings

For large multi-team projects, Daily Scrum meetings are conducted at two levels each day: at the Feature
Team level for each team, and at the Enterprise level, where representatives from each Feature Team
attend with the Product Owner and Area Product Owners; small single-team projects only require one Daily
Scrum meeting. Regardless of its level, the daily Scrum meeting never lasts more than 15
minutes and requires that all team members attend, stand, and answer the following three questions:

What was completed since the last daily Scrum Meeting?

What will be finished by the next daily Scrum Meeting?

What is preventing work from being completed?

During project planning, the project team defines how it will keep the daily meetings under the 15 minute
limit, how it will handle non-Feature Team (uninvited) participation (not attendance), and how it will
schedule the Feature Team level meetings to facilitate a consolidated and accurate solution level meeting.

Sprint Review Meetings

Simply put, the Sprint review meeting is a 4 hour demonstration of the new products and features that were
built during the latest 4 week Sprint. During the meeting, representatives from the Feature Team(s) present
the results of their work to the Product Owner, Area Product Owners, management, users and other
stakeholders. The big-picture intent of the review meetings is to validate the solution's progress against
DHS needs and priorities and to mitigate project risk.

Presenters at the review meeting should do as little as possible in preparation for the meeting; the meeting
is not intended to be a polished presentation requiring additional project resources to prepare, but rather a
functional view of the solution and its progress. During project planning, the project team should define
how the meeting will be organized, how Feature Teams might identify presenters, and how feedback will be
collected.

As a general guideline, DHS uses the review meetings as a stakeholder involvement and communication
forum for many of the project stakeholders. During project planning, project teams should define how DHS
can involve the broadest audience possible (video conference, webinar, etc.) while completing the meeting
within the four hour limit.

Sprint User Acceptance

Traditional User Acceptance Testing (UAT) is not readily possible when using Agile project approaches. As
a result, DHS has adopted a three-pronged approach to addressing the need for formal acceptance of the
developed solution:

Sprint Execution User Validation: see section 2.4.2 Sprint Execution User Validation.

Sprint Review User Acceptance (this section): at the end of each Sprint, during the Sprint Review
Meeting, DHS reviews and accepts the developed components presented in the review.

Release Sprint User Revalidation: while executing a Release Sprint (see section 2.5.6 Formal User
Revalidation & Acceptance requirements), DHS requires that users revalidate the functionality that is
being released.

As a general guideline, DHS provides acceptance of the developed features and components
demonstrated in the Sprint Review Meetings. All feedback that requires a system change is incorporated
into the Product Backlog for future Sprints. During project planning, the project team should define how features
will be demonstrated, how feedback will be collected and how acceptance will be documented.

Retrospectives

Retrospection is a key activity that drives continuous improvement during the project. During project
planning, the project team should provide teams with a brief summary of techniques that they will use
during retrospective meetings. Additionally, project teams should define when retrospective meetings are
conducted within the Sprint cycle. A recommended guideline is to conduct retrospective meetings at the
end of a Sprint because any opportunities for improvement will be fresh in the teams' minds. Note that for
large, multi-Feature-Team projects, retrospective meetings are conducted at two levels.

FEATURE TEAM SPRINT RETROSPECTIVES

At the end of each iteration, each Feature Team must conduct a retrospective meeting to inspect their
processes, identify potential improvements, assess any limitations and challenges faced by the team, and
determine solutions that potentially eliminate the challenges. During project planning, the project team
should define how the Feature Teams will conduct retrospectives, when they should be conducted and any
particular approaches that might be helpful (e.g. Kaizen, 5-Whys).

SOLUTION RETROSPECTIVES

In addition to the Feature Team retrospectives, the DHS Project Owner or Master Scrum Master (for very
large projects) conducts a solution level retrospective focused on improving operations and
efficiencies at the overall project level. A key input to the solution level retrospective is the challenges and
ideas from the Feature Team retrospectives. The enterprise retrospective meetings are also conducted
after the end of each Sprint and are led by the Product Owner or designee and attended by the Area
Product Owners, DHS management and other interested stakeholders.

During project planning, the project team must provide input and recommendations about how the solution
retrospective meetings should be scheduled to enable input from the Feature Team retrospectives as well
as more general recommendations about how DHS can implement enterprise improvement activities.

Pre-Sprint Preparation

As a regular and on-going activity before each Sprint, in anticipation of the Sprint planning meetings, the
DHS Project Owner assesses and updates the Product Backlog to reflect current priorities, recent
legislative changes or other external influences. These updates also include the closing out of
requirements that were addressed in the previous Sprint as well as updates to the Product Burn-down
Chart.

Graphical Depiction of Planned Sprint Cycle

Within the Defined Project Strategy, project teams should create a graphical depiction of the Sprint
cycle/timeline that will be used to meet DHS requirements. This graphic depiction should show how
activities conducted within each Sprint are laid out; it is not considered to be a final, rigid plan for each
Sprint, but rather a visual aid to help project team members to understand the timing of events within the
Sprint.

2.3 Sprint 1: Project Startup Activities


As part of their project planning efforts, project teams should define the approach that will be used to
complete the installation, configuration, and start-up activities that are required by the project, but not
currently in place. These activities usually include:

Establishment of the proposed technical environments

Installation and base configuration of any necessary software packages

Configuration of the development toolsets (e.g. IDE, testing tools; see section 2.4.4 Continuous Integration)

Establishment of the project performance metric utilities (section 3.5 Performance Metrics)

Executing Sprint Planning (parts 1 and 2) for Sprint 2 (2.2 Scrum Management Approach)

Establishing project facilities and workspaces (if required)

The plan for the project startup activities within Sprint 1 should include a Sprint Charter (and Solution
Charter for large projects) and a Sprint Backlog. These materials accelerate the execution of the
activities for the first Sprint. The project team should also provide details about the tasks that must be
executed, any constraints that are known, dependencies between tasks, and the level of resourcing
required to complete the work.

For all identified iteration tasks, the project team should insert requirements into the Product Backlog at the
end of the list and assign unique priorities that effectively convey that these tasks must be completed in the
first Sprint. For example, the first Sprint may be used to elicit User Stories/Requirements from business
users for a given set of functions; the elicitation of requirements in each area may be a suitable task for the
Sprint product backlog.

2.4 Solution Engineering Approach


In order to maximize efficiency, minimize risk, and implement the desired solution in the shortest possible
time, DHS makes heavy use of Extreme Programming (XP) concepts. During project planning, project
teams are encouraged to use additional/alternative techniques and methods that have been proven to yield
effective results for software engineering in the past.

In Agile terms, the four week development iterations must produce a potentially shippable product. In
general, potentially shippable is defined to mean a version of the software products and any and all
associated materials required to implement that software. It is important to note that the software may
require additional components to function in the production environment (e.g. a user interface) but the unit
component (e.g. database stored procedure) is of production quality.

Collective Ownership of the Solution

It is imperative that everyone on the project, from the most junior team member to the Project Owner and
management, have a strong sense of collective ownership of the successful implementation of the
desired solution within the defined schedule. This is reflected in the horizontal nature of the organization
chart shown in section 4.1 AGILE PROJECT ORGANIZATION. During project planning, the project team
should decide how they will actively foster and maintain a culture of shared ownership across the team and
with other project stakeholders.

Sprint Execution User Validation

DHS uses the concept of validation to ensure that the systems meet the intended needs, or goodness of
fit. Hence validation cannot be completed with automated testing, at least initially. Validation must involve
subject matter experts (SMEs) who understand the nature of the intended solution. Through close
collaboration during each Sprint, the Project Owner, Area Product Owners, SMEs, and other participants
validate the evolving solution during each Sprint.

Traditionally, validation is accomplished during User Acceptance Testing (UAT), which is not readily
possible when using Agile concepts. As a result DHS has adopted a three pronged approach to addressing
the need for formal acceptance of the developed solution:

Sprint Execution User Validation (this section). During the execution of the Sprint, the Project Owner and
Area Product Owners regularly provide input, use and review the work of the Feature Teams.

Sprint Review User Acceptance see section 2.2.1.6 Sprint User Acceptance

Release Sprint User Revalidation see 2.5.6 Formal User Revalidation & Acceptance

For an Agile approach to be successful, frequent and regular interaction must take place between the
Feature Teams and Subject Matter Experts (SMEs). As a result of this interaction, the Feature Teams gain
heightened insight into the required solution, which lowers the risk associated with delivering the right
solution. Given the nature of the feedback and SME interactions, the Feature Teams use their judgment
based on the scope of the feedback: for feedback resulting in small changes, the team assesses how to
incorporate the change within the existing Sprint; for larger changes that cannot be accommodated in the
Sprint, the Feature Team works with the Project Owner and Area Product Owner to communicate that an
additional task (or tasks) needs to be added to the Product Backlog.

During project planning, the project team should discuss how features will be demonstrated, how feedback
will be collected, and how validation will be tracked to make sure that all new components receive user
scrutiny.

As a general guideline, the Project Owners and Area Product Owners regularly engage stakeholders
external to the project to use features and components developed in the current and past Sprints to
validate that the solution fits its intended need.

A best practice for user validation involves recording user interactions with the system in an automated
manner that enables playback of the user actions. During project planning, the project team should
discuss how this information will be captured, managed throughout the project, and used (played back) to
facilitate User Revalidation during Release Sprints (see 2.5.6 Formal User Revalidation & Acceptance for
more details).

Training Material Production

Project teams should use engineering approaches that incorporate the development of training materials
seamlessly into each Sprint. In other words, Feature Teams should be staffed with people who have the
skills necessary for the development of training materials. Project teams should also give thought to the
types of training material that will be developed and how those materials are integrated into the software
solution (e.g. on-screen help, training manuals, and video tutorials such as YouTube).

Continuous Integration

DHS mandates and verifies that all project teams make full use of Continuous Integration (CI) from the
beginning of the project through the end of the project. During project planning, project teams should
define an approach to implementing continuous integration practices on the project including the
frequency and proposed times when integration cycles will be executed (e.g. 10 AM and 2 PM each
business day; full regression each Saturday at 1 PM). The project team should also include a discussion
of the tools (e.g. CI tool, build tool, version control and others) that are required to implement the proposed
approach.

Where possible, DHS uses freely available, open-source tools for executing a continuous integration
strategy. Whenever a project team identifies a software tool that is not available under a free license, the
project team must clearly justify why the tool is needed and why a similarly-featured free-of-cost tool
cannot be used, and request the purchase of the tool through the DHS Office of Systems and Technology
(OST).

Software Builds: Code Compilation

Ideally, system code should be compiled at least three times daily (twice during normal business hours
and once overnight). As a general good practice, DHS also encourages more frequent compilations. Doing
so enables the rapid detection of defects and issues with the code base, which can be resolved quickly
because the events that triggered the corresponding error are still fresh in the project team's
memory. During project planning, the project team should define their approach to code compilation,
including any tools that will be used.

SOFTWARE VERSION CONTROL

During project planning, project teams should define the approach and toolset that will be used to
maintain control of software versions. Because a single program/file may be modified by multiple people
simultaneously (shared solution responsibility), it is important that the proposed tool be capable of
reconciling multiple changes as the solution progresses. It is required that ALL code be checked in every
day before the end of the day.

2.4.4.1.2 DEDICATED BUILD MACHINE/SERVER & CONTINUOUS INTEGRATION SERVER

During project planning, project teams should define how builds will be created, specifically focusing on
hardware and network resources. As a general good practice, build cycles must be completed as rapidly as
possible, and in most cases executed in a few minutes. Project teams should also determine the
configuration of the build server and any specific requirements of such a machine that enable rapid build
cycles.

As a general guideline, a build server will, under normal circumstances, not require any unusual hardware
or configurations; in most cases a developer workstation can satisfy the requirements of a build server.

AUTOMATED BUILDS

During project planning, project teams should establish their build cycle, the events that are triggered, the
order of the events, dependencies and exceptions handling.

PRIVATE DEVELOPER BUILDS

As a general guideline, the continuous integration process and, more specifically, the regular solution builds
should be uneventful processes on the project. One effective means of accomplishing this goal is to mandate
that all developers conduct a local build with their modified code prior to checking their modified solutions into
the common repository.

During project planning, project teams define how their project approach addresses this requirement as
well as what expectations are included for private developer builds.
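A minimal sketch of such a private-build gate, assuming a simple callable-per-step interface (the step names and the `private_build_gate` helper are illustrative assumptions, not a prescribed DHS tool):

```python
# Hypothetical sketch of a private-build gate: every step (compile, unit
# tests, style check) must succeed locally before code is checked in to
# the common repository.

def private_build_gate(steps):
    """Run each (name, callable) step in order; return (ok, log), where
    ok is True only if every step succeeded, mirroring the rule that
    developers build locally before committing."""
    log = []
    for name, step in steps:
        try:
            passed = bool(step())
        except Exception as exc:  # a crashing step counts as a failure
            passed = False
            message = f"{name}: error ({exc})"
        else:
            message = f"{name}: {'ok' if passed else 'FAILED'}"
        log.append(message)
        if not passed:
            return False, log  # stop at the first failure; do not check in
    return True, log

ok, log = private_build_gate([
    ("compile", lambda: True),
    ("unit tests", lambda: True),
])
print(ok)  # True
```

In practice each callable would invoke the project's real build and test commands; the gate simply refuses the check-in when any step fails.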

FAST/QUICK BUILD CYCLES

Build cycles should execute rapidly in order to provide timely feedback. Project teams should describe the
conventions that will be established on the project to balance the speed of the build cycle and its
feedback against the need to provide adequate test coverage during the cycle. Should the project team
experience longer than acceptable build times, they should seek ways to refine the balance between
useful inspection and rapid builds.

Continuous Database Integration

As with the continuous integration of software code based products, DHS mandates that database
technologies be continually integrated with software solutions. Continuous database integration enables
project teams to establish a solid foundational domain model on which to build application layers. Doing
so reduces the risk that defects spanning multiple layers of the system will require heightened effort to
troubleshoot and resolve. Additionally, continuous database integration accelerates testing because the
application and associated data model are established in a well-defined and known state.

VERSION CONTROLLED DATA DEFINITION LANGUAGE (DDL) SCRIPTS

Any development resource should be able to create or modify database components within the system.
The modifications and definition of database components must be accomplished with DDL scripts that are
to be included in the version control repository. Establishing this practice on the project enables the
Continuous Integration process to wipe and establish required database structures from scratch in a known
stable state.
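A minimal sketch of this rebuild-from-scratch practice, assuming SQLite purely for illustration (the table definitions and the `rebuild_database` helper are hypothetical, not DHS artifacts):

```python
import sqlite3

# Minimal sketch: rebuild the database from scratch by applying the
# version-controlled DDL scripts in order, as the CI process would.

DDL_SCRIPTS = [  # ordered as they would be stored in the version control repository
    "CREATE TABLE client (client_id INTEGER PRIMARY KEY, name TEXT NOT NULL);",
    """CREATE TABLE case_file (
           case_id INTEGER PRIMARY KEY,
           client_id INTEGER NOT NULL REFERENCES client(client_id)
       );""",
]

def rebuild_database(ddl_scripts):
    """Create a fresh database and apply every DDL script in order,
    returning the connection and the list of tables that now exist."""
    conn = sqlite3.connect(":memory:")  # a wiped, known-clean state
    for script in ddl_scripts:
        conn.executescript(script)
    tables = [row[0] for row in conn.execute(
        "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name")]
    return conn, tables

conn, tables = rebuild_database(DDL_SCRIPTS)
print(tables)  # ['case_file', 'client']
```

Because every structure comes from the scripts, the CI process can wipe and recreate the database at will, guaranteeing a known stable state.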

AUTOMATED DATABASE INTEGRATION

The Continuous Integration process used by the project team must have the capability to automatically create
and refresh databases. Because database solutions are integral components of the overall solution, it is critical
to establish practices that mitigate the risk of establishing a fragile database platform.

DATABASE ADMINISTRATION AND TUNING

DHS requires that DDL and DML scripts be created and managed along with the solution code. As a
result, it is possible for DBA-type resources to periodically review and tune the database construction
scripts. This activity can therefore be conducted with little or no impact on the Feature Teams who are
executing work on the solution.

During project planning, project teams should define how they will plan for and enable knowledgeable
individuals with database tuning experience to tune the system to improve performance and the solution's
capabilities.

Test Driven Development & Continuous Testing

The software engineering approach used by DHS is heavily dependent on continuous integration and test
driven development as risk mitigation mechanisms. Project teams must define how they integrate test
driven development concepts and how tests are conceptualized and associated with Sprint backlog items.
The project teams should also determine what tools would help to automate the unit testing process and
should also define the strategy/architecture of the unit testing approach.

AUTOMATED UNIT TESTING

Because the regular, multi-daily build-and-test cycle must execute quickly, the project team must establish
sets of tests that vary in scope and complexity and that exercise the system differently depending on the
available time. In other words, the automated tests run during business hours are relatively small, while
those that run each night are more comprehensive in nature. Project teams should use an approach that
enables unit tests of the system to be performed at various levels. During project planning, project teams
should describe how such tiers of tests will be established, configured and maintained over time as the
System evolves.

MAINTAINING ROBUST UNIT TESTS

Systems often require thousands of unit tests to be run by the completion of the project. With this in mind, it
is important to establish an architecture for the unit tests in aggregate to prevent them from becoming
fragile over time. For example, hard coding a future date into a unit test makes the test sensitive to the
current date; at some point in the future, the test results will change when the current date passes the
hard coded date. Project teams should define how the unit testing approach will maintain a robust testing
system that avoids fragile tests. Project teams should also define how the unit tests will be combined in
sequence to form testing scenarios.
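The hard-coded-date example above can be made robust by injecting the current date into the code under test rather than reading the system clock inside the test; a sketch, using a hypothetical certification-expiry rule:

```python
import datetime

# Sketch of the anti-fragility guideline above: instead of hard coding a
# future date into a test, the code under test accepts "today" as a
# parameter, so the test supplies a fixed date and never decays as the
# calendar advances. The eligibility rule shown here is invented.

def certification_expired(cert_end, today):
    """True if a client's certification period has ended as of 'today'."""
    return today > cert_end

# Fragile version (avoid): comparing cert_end against datetime.date.today()
# inside the test, which will start failing once the real date passes it.
# Robust version: both dates are fixed, so the result never changes.
fixed_today = datetime.date(2014, 6, 1)
print(certification_expired(datetime.date(2014, 5, 31), fixed_today))  # True
print(certification_expired(datetime.date(2014, 6, 30), fixed_today))  # False
```

The same injection pattern applies to any environmental dependency (clocks, sequence numbers, random seeds) that would otherwise make a test fragile.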

CREATION AND MAINTENANCE OF TEST DATA

As the system evolves, it becomes increasingly important to establish test data that maintains integrity
across the application (e.g. establishment of client demographics associated with a client ID that is
referenced throughout the system). One way to establish this data is the use of Structured Query
Language (SQL) scripts to populate data directly into the database on the back end. During project
planning, the project team defines how referentially consistent, high-integrity test data will be created and
maintained in the application in a manner that enables quick, frequent, and preferably automated,
environmental refreshes.
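A sketch of seeding referentially consistent test data with back-end SQL scripts, assuming SQLite and invented table and column names purely for illustration:

```python
import sqlite3

# Hypothetical sketch: seed test data with SQL scripts run directly
# against the back end, so a client_id created once can be referenced
# throughout the refreshed environment, then verify integrity.

SEED_SCRIPTS = [
    "INSERT INTO client (client_id, name) VALUES (1001, 'Test Client A');",
    "INSERT INTO case_file (case_id, client_id) VALUES (1, 1001);",
]

def seed_test_data(conn, scripts):
    """Apply the seed scripts in order, then verify referential integrity:
    every case must point at an existing client. Returns the orphan count
    (0 means the data set is internally consistent)."""
    for script in scripts:
        conn.executescript(script)
    orphans = conn.execute(
        "SELECT COUNT(*) FROM case_file c "
        "LEFT JOIN client cl ON cl.client_id = c.client_id "
        "WHERE cl.client_id IS NULL").fetchone()[0]
    return orphans

conn = sqlite3.connect(":memory:")
conn.executescript(
    "CREATE TABLE client (client_id INTEGER PRIMARY KEY, name TEXT);"
    "CREATE TABLE case_file (case_id INTEGER PRIMARY KEY, client_id INTEGER);")
print(seed_test_data(conn, SEED_SCRIPTS))  # 0
```

Because the seed scripts live alongside the DDL in version control, each environmental refresh can reload the same consistent data set automatically.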

UNIT TESTING FOR IDENTIFIED DEFECTS

It is inevitable that defects will be injected into the system as part of the engineering cycle. While
unavoidable during initial injection, it is possible to prevent future occurrences of the defect by establishing
tests that verify the defect condition does not reoccur. As a general guideline, DHS requires that all
detected defects be covered by unit tests that detect any future instance of the defect.
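A sketch of pinning a fixed defect with a permanent regression test; the proration bug and defect number shown here are invented for illustration:

```python
# Illustrative sketch of the guideline above: once a defect is fixed, a
# permanent unit test pins the corrected behavior so the defect cannot
# silently reoccur. The benefit-proration bug is hypothetical.

def prorate_benefit(monthly_amount, days_remaining, days_in_month):
    # Defect #1234 (hypothetical): integer division truncated the
    # prorated amount for partial months. Fix: use float division and
    # round to whole cents.
    return round(monthly_amount * days_remaining / days_in_month, 2)

def test_defect_1234_does_not_reoccur():
    """Regression test kept in the suite for the life of the system."""
    assert prorate_benefit(300.00, 10, 30) == 100.00
    assert prorate_benefit(300.00, 1, 31) != 0  # the original failure mode

test_defect_1234_does_not_reoccur()
print("defect 1234 regression test passed")
```

Tests like this accumulate over the project and run in the nightly comprehensive suite, so any future reintroduction of the defect fails the build immediately.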

AUTOMATED DATA CONVERSION TESTING

The conversion of data from existing applications into the system is included in the scope of Sprints as
necessary, based on the selected Sprint scope. With this in mind, during project planning, project teams
should define an approach to converting data from existing systems into the target/new system, what tools
will be used to facilitate the conversion (e.g. an Extract-Transform-Load (ETL) tool), how errors and
discrepancies will be resolved, and how the process will be integrated into the continuous integration cycle
and environmental refreshes.

DHS makes production data available to the team to test data conversion routines and processes on an
as-needed basis. During project planning, project teams should define, in detail, how they will identify the need
for production data, how they will work with other DHS staff to obtain the data, how the data will be stored
to protect confidentiality, how the data will be used, and how often the data is expected to be refreshed
(e.g. new data pulled at the end of each Sprint and made available before the beginning of the next). As a
general practice, the project team should identify the specific individuals who are authorized to access
production systems, thereby enabling them to extract the sensitive production data that is made available
to teams to construct and test data conversion programs.

DHS is interested in innovative approaches to the integration of data conversion activities into the software
engineering approach. Project teams are encouraged to identify practices and recommendations that are
believed to be relevant and potentially beneficial for DHS to use. Additionally, due to the sensitive nature of
production data, project teams must also identify ways that mandatory HIPAA requirements can be met and
how the project team will help keep the data secure from unauthorized access.

While structuring the team, project planners must not identify a single Conversion Feature Team for the
project. In the spirit of Agile software development practices each Feature Team should have skills and
competencies available to address all required conversion activities within a Sprint.

Code Inspection and Review

Project teams regularly and aggressively review and inspect the code work products being created on the
project to maintain a high degree of technical quality. DHS appreciates highly efficient Agile engineering
teams and understands that the result of this high performance is a large volume of solution
code. Therefore, it is necessary to establish automated and continuous practices to ensure that
established standards are being met, in order to avoid unnecessary technical complexity and reduced efficiency.

AUTOMATED CODE VERIFICATION: CODING STANDARDS

During project planning, the project team must identify the coding conventions that will be used on the
project as well as the toolset that will be used to verify the code against established standards. If the
project team needs to modify the system's existing coding standards, the team must explain the rationale
for the change and how the new coding standards will be enforced in tandem with pre-existing system
standards. The team should also explain how deviations from the established standard will be resolved by
project teams and how the standards are modified and maintained within the selected toolset.

The process of automated code verification against the established standard must be fully integrated into
the continuous integration cycle.
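A minimal sketch of automated standards verification as a CI step; real projects would normally adopt an established style-checking tool, and the two rules below (an assumed 100-character line limit and a no-tab-indentation rule) are illustrative only:

```python
# Minimal sketch of automated code verification: check source lines
# against two sample conventions and report violations, as a CI step
# would. The rules are hypothetical stand-ins for a project's real
# coding standard.

MAX_LINE_LENGTH = 100  # assumed project convention

def check_coding_standards(source_text):
    """Return a list of (line_number, message) violations for one file."""
    violations = []
    for number, line in enumerate(source_text.splitlines(), start=1):
        if len(line) > MAX_LINE_LENGTH:
            violations.append((number, "line exceeds 100 characters"))
        if line.startswith("\t"):
            violations.append((number, "tab used for indentation"))
    return violations

sample = "def f():\n\treturn 1\n"
print(check_coding_standards(sample))  # [(2, 'tab used for indentation')]
```

In the CI cycle, a non-empty violation list for any changed file would fail the build, making deviations from the standard visible immediately.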

MANAGING CODE COVERAGE

During project planning, the project team defines ways to use automated unit testing to ensure adequate
testing coverage of the software solution. The project team should also describe the tools that will be used
to monitor testing coverage. It is important for the project team to provide a thorough description of the
overall philosophy behind managing testing code coverage, what percentage of code coverage is targeted,
and how deviations from that target will be addressed.
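A sketch of enforcing such a target in the CI cycle, assuming a hypothetical 80% statement-coverage threshold (the `coverage_gate` helper and the numbers are illustrative, not a DHS mandate):

```python
# Sketch of a coverage gate: the CI cycle fails the build when measured
# statement coverage falls below the agreed threshold, making deviations
# from the target visible immediately.

COVERAGE_TARGET = 0.80  # assumed target; each project sets its own

def coverage_gate(covered_statements, total_statements, target=COVERAGE_TARGET):
    """Return (passed, ratio): passed is True when measured coverage
    meets or exceeds the target."""
    if total_statements == 0:
        return True, 1.0  # nothing to cover
    ratio = covered_statements / total_statements
    return ratio >= target, round(ratio, 3)

print(coverage_gate(850, 1000))  # (True, 0.85)
print(coverage_gate(700, 1000))  # (False, 0.7)
```

The covered/total counts would come from whatever coverage-monitoring tool the project selects; the gate itself only encodes the project's policy on the target and on deviations.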

ELIMINATION OF DUPLICATE CODE

During project planning, the project team should define how the code will be assessed and reviewed to
identify duplicate or redundant code or code components. DHS understands that large-scale code sets
inevitably contain duplication of code structures and that this duplication must be managed to preserve the
overall solution quality. The project team should define what tools will be used to review code for
duplication and how the review will be integrated into the regular engineering activities, including the CI
cycle.

Documentation Compilation

DHS believes that a self-documenting solution is the best mechanism through which to create solution
documentation.

During project planning, the project team should define how solution documentation will be compiled,
stored, and published within the continuous integration cycle. Specifically, the team should describe how
often the documentation will be compiled and what specifically will be compiled. The team should define
the toolset that will be used to create and publish the documentation.

At a minimum, DHS requires the automated generation of:

A system Application Programming Interface (API) specification

A data dictionary that fully describes the data models implemented within the system
AUTOMATED API COMPILATION: CODE SPECIFICATIONS

In the Test Driven Development process, test specification takes place shortly before the activities
dedicated to specifying and coding the software. In other words, the specifications are developed at the
same time as the code, which is written to satisfy the established automated unit tests.

During project planning, the project teams define the proposed approach for automatically generating high
quality API code specifications. The project teams should also discuss any toolsets that will be used to
generate the specifications and how these specifications will be integrated into the systems integrated
library.

AUTOMATIC CREATION OF A SOLUTION DATA DICTIONARY

During project planning, the project team should define how a data dictionary will be automatically created by the
system. As a general practice, DHS prefers a dictionary that makes use of HTML technologies.
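A sketch of generating an HTML data dictionary directly from the live schema, using SQLite purely for illustration (a real project would query its own platform's system catalog; the helper and table names are hypothetical):

```python
import sqlite3

# Hypothetical sketch: generate a simple HTML data dictionary straight
# from the live schema, so the dictionary can be recompiled automatically
# in every CI cycle.

def build_data_dictionary(conn):
    """Emit an HTML table listing every table and column in the schema."""
    rows = []
    tables = [r[0] for r in conn.execute(
        "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name")]
    for table in tables:
        for col in conn.execute(f"PRAGMA table_info({table})"):
            # col layout: (cid, name, type, notnull, default, pk)
            rows.append(f"<tr><td>{table}</td><td>{col[1]}</td>"
                        f"<td>{col[2]}</td></tr>")
    return ("<table><tr><th>Table</th><th>Column</th><th>Type</th></tr>"
            + "".join(rows) + "</table>")

conn = sqlite3.connect(":memory:")
conn.executescript(
    "CREATE TABLE client (client_id INTEGER PRIMARY KEY, name TEXT);")
html = build_data_dictionary(conn)
print("client_id" in html and "<table>" in html)  # True
```

Because the dictionary is derived from the schema itself, it can never drift out of date: every CI cycle republishes a dictionary that matches the deployed data model.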

PRODUCT DASHBOARD PUBLISHING

At the completion of each CI cycle, a dashboard summarizing the results of the various activities is
created. The dashboard should be visually appealing and easily navigable to help DHS executive
management understand the technical status of the solution. During project planning, the project team
should identify the dashboards that will be used. In addition, the project team should indicate desired and
recommended approaches for communicating the status of the technical solution to interested
stakeholders.

Periodic Automated Environmental Refresh

DHS continually seeks to establish development and test related environments that are regularly and
automatically reconstructed with fresh versions of the solution, configuration data, and test data. The intent
of the regular refresh is to avoid unknown or undocumented, but required, configurations that may lead to
issues and slow development progress. For this reason, DHS does not use a backup or
disaster recovery mechanism to satisfy this requirement, as that approach still leads to unknown and
misunderstood configurations.

Ideally, an environment refresh would involve the automated establishment of a fully functional version of
the system from a clean, base system image. During project planning, project teams should describe how
to accomplish a fully automatic refresh of all proposed environments from the base configuration to a
specified version/baseline of the system. The automated refresh approach makes heavy use of the CI
solution described in section 2.4.4 Continuous Integration. Project Teams should describe in particular
detail:

 How the existing environment will be temporarily backed up and when the temporary backup is deleted
 How the base system installation is reestablished
 How the system version is migrated to the specified environment
 How any structural database modifications are applied
 How configuration data is applied/loaded into the base installation
 How test data is loaded
 How shake-down testing is conducted on the newly established environment

Note that DHS uses scripts that can be executed both manually and programmatically, and that accept parameters (target environment, DHS system version, etc.) to trigger the refresh using the proposed Continuous Integration tools.
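A sketch of such a parameterized trigger script, with the refresh steps stubbed out as prints (the environment names, version string, and step wiring are assumptions, not DHS conventions):

```python
import argparse

# Each stubbed step mirrors the refresh sequence described above; a real
# implementation would invoke the CI toolset instead of printing.
def refresh(environment, version):
    steps = [
        f"backing up {environment}",
        f"reinstalling base image in {environment}",
        f"migrating code version {version}",
        "applying structural database modifications",
        "loading configuration data",
        "loading test data",
        "running shake-down tests",
    ]
    for step in steps:
        print(f"[refresh] {step}")
    return steps

def main(argv=None):
    parser = argparse.ArgumentParser(
        description="Trigger an automated environment refresh")
    parser.add_argument("--environment", required=True,
                        choices=["dev", "system-test", "uat"])
    parser.add_argument("--version", required=True,
                        help="system version/baseline to deploy")
    args = parser.parse_args(argv)
    return refresh(args.environment, args.version)

# Programmatic invocation; a CI scheduler would pass real arguments.
steps = main(["--environment", "dev", "--version", "1.4.0"])
```

Accepting an explicit argument list (rather than only reading `sys.argv`) is what lets the same script be driven both manually from a shell and programmatically from the CI toolset.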

Automated Environmental Refresh Approach and Strategy

During project planning, the project team should define in detail their approach to automatically refreshing
each proposed environment to a base-state with a known version of software, data model, configuration
data, and test data.

The continuous integration toolset is used to accomplish the automated environment refreshes. For each environment, the project team should specify the frequency of the refresh (e.g., development refreshed nightly, system test weekly, and UAT before each validation cycle).
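One way to express such a schedule, assuming cron-style entries consumed by the CI toolset (the environment names and times are illustrative):

```python
# Illustrative refresh schedule expressed as cron entries consumed by the
# CI toolset; environment names and times are assumptions, and a None
# entry means the refresh is triggered manually rather than scheduled.
REFRESH_SCHEDULE = {
    "dev": "0 2 * * *",          # nightly at 02:00
    "system-test": "0 3 * * 0",  # weekly, Sunday at 03:00
    "uat": None,                 # manual, before each validation cycle
}

def is_automated(environment):
    """True when the environment has a scheduled (unattended) refresh."""
    return REFRESH_SCHEDULE.get(environment) is not None
```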

Temporary Environmental Archive

During project planning, the project team should explain how a temporary backup of a given environment is taken before the refresh begins. Because the refresh is fully automated, the project team should explain how success or failure of the backup affects the refresh cycle. The project team should explain how the backup will be stored, accessed, and, if necessary, restored, and should also briefly discuss how long the backup will be maintained before being deleted.

Reestablish Base System Configuration

During project planning, the project team should explain how the environment will be reset to a base
version of the system once the environment has been backed up. If the approach involves restoring server
images, the project team should explain how the images will be established, refreshed, stored, and
maintained.

Code Migration

During project planning, the project team should briefly explain how the code for the system will be
migrated to the specified environment after the establishment of the base system.
