
Testing in an Agile Environment

Robert Walsh - President - EnvisionWare, Inc.


rwalsh@envisionware.com

Copyright©2008-2009 Robert Walsh and Megan Sumrell - All Rights Reserved
Agenda
• Myths and Misperceptions

• Agile’s Guiding Principles

• Structure of an Agile Environment

• Testing Activities in Agile

• Putting it all Together

• Handling Defects and Test Failures

• Conclusion

Myths and Misperceptions

Myths and Misperceptions
 Agile doesn’t test

 Agile doesn’t need testers

 There’s no place in Agile for manual testing

 Agile doesn’t write documentation

 Agile doesn’t plan

 Agile has a public release at the end of each iteration

Agile’s Guiding Principles

Agile’s Guiding Principles
 Don’t repeat yourself (DRY)

 Just enough, but no more

 You ain’t gonna need it! (YAGNI)

 Do only what is valuable

 Test early, test often

Structure of an Agile Environment

Structure of an Agile Environment
 Product features are written as stories and stored in a product backlog
 A customer (or customer team) prioritizes the backlog according to business value
 Development is divided into iterations (sometimes called sprints)
 The Whole Team works in parallel to deliver new functionality each iteration
Structure of an Agile Environment
 Principal roles within the Whole Team
 Customer / Customer Team
 Real
 Proxy / Surrogate
 Business Analyst / Product Owner
 Developers
 Testers / QA
 Documentation / Technical Writers

Structure of an Agile Environment
 Each iteration
 Starts with an Iteration Planning Meeting (IPM)
 Showcases functionality from last iteration
 Acceptance Tests are run
 Opportunity to describe and discuss stories for the upcoming iteration

Structure of an Agile Environment
 Iteration 0
 Project kickoff
 Team Formation
 Sharing the Common Vision
 Develop initial stories
 Create a target release plan
 Prepare development and test infrastructure
 Build system
 Automated test framework
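A minimal sketch of wiring the automated test framework into the build during Iteration 0, assuming a Python project with a tests/ directory (both assumptions, not from the slides): a small driver discovers the unit tests and fails the build when any test fails.

```python
import sys
import unittest

def run_build_tests(start_dir="tests"):
    """Discover and run the automated tests; return True only if every test passed."""
    suite = unittest.defaultTestLoader.discover(start_dir)
    result = unittest.TextTestRunner(verbosity=1).run(suite)
    return result.wasSuccessful()

if __name__ == "__main__":
    # A non-zero exit code makes the surrounding build (CI job, make target, etc.) fail.
    sys.exit(0 if run_build_tests() else 1)
```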

Structure of an Agile Environment
 Groups work in parallel within the iteration
[Diagram: Development, QA, Documentation, and the Customer Team work in parallel between the IPMs that open and close the iteration.]
Structure of an Agile Environment
 Iterations 1 - n
 System evolves over time in cross-section
[Diagram: each iteration develops a vertical slice through all layers (GUI, Business Logic, Network, Database), so complete functionality is developed end-to-end.]
Structure of an Agile Environment
 Release
 Finalize installation programs, documentation
 Prepare distribution package
 Publish

Testing Activities in Agile

Testing Activities in Agile
 Define Acceptance Criteria

 Test-Driven Development (TDD)

 Automated Acceptance Testing

 Exploratory Testing

 Usability Testing

 Bug Fix Verification

Testing Activities in Agile
 Define Acceptance Criteria
 Purpose
 To avoid misinterpretation of the requirements
 Properly set expectations
 Provide transparency into the test cases
 Done by
 Customer team
 Assisted by
 QA, Development

Testing Activities in Agile
 Define Acceptance Criteria
 Done when
 At the start of the project
 At the start of each iteration
 Done how
 In the language of the Customer
 Preferably executable
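As a hedged illustration of "preferably executable", the acceptance criterion below is written as a test whose Given/When/Then steps mirror the Customer's wording. The late-fee story, function name, and amounts are invented for the example.

```python
import unittest

# Hypothetical criterion: "Given an item 3 days overdue at $0.25 per day,
# when it is returned, then the patron owes $0.75."

def late_fee(days_overdue, fee_per_day):
    """Illustrative stand-in for whatever the team actually builds."""
    return days_overdue * fee_per_day

class LateFeeAcceptance(unittest.TestCase):
    def test_three_days_overdue_at_25_cents_per_day(self):
        # Given
        days_overdue, fee_per_day = 3, 0.25
        # When
        owed = late_fee(days_overdue, fee_per_day)
        # Then
        self.assertEqual(owed, 0.75)

if __name__ == "__main__":
    unittest.main()
```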

Testing Activities in Agile
 Test-Driven Development (TDD)
 Purpose
 To verify that each discrete part of the system works as expected
 Done by
 Developers
 Assisted by
 QA

Testing Activities in Agile
 Test-Driven Development (TDD)
 Done when
 Continuously
 With each compilation
 As part of the build
 Done how
 In the language of the code
 xUnit frameworks
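A minimal red-green sketch with Python's unittest, one of the xUnit-family frameworks: the tests are written first and the class grows just enough to make them pass. The shopping-cart example is hypothetical.

```python
import unittest

# TDD rhythm: write a failing test, write just enough code to pass it, refactor, repeat.

class Cart:
    def __init__(self):
        self._prices = []

    def add(self, price):
        self._prices.append(price)

    def total(self):
        return sum(self._prices)

class CartTest(unittest.TestCase):
    def test_total_of_empty_cart_is_zero(self):
        self.assertEqual(Cart().total(), 0)

    def test_total_sums_item_prices(self):
        cart = Cart()
        cart.add(2)
        cart.add(3)
        self.assertEqual(cart.total(), 5)

if __name__ == "__main__":
    unittest.main()
```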

Testing Activities in Agile
 Automated Acceptance Testing
 Purpose
 To verify that the system satisfies the Customer’s expectations
 To help document the system requirements
 To provide assurances that the system continues to work
 Done by
 Customer
 Assisted by
 QA
Testing Activities in Agile
 Automated Acceptance Testing
 Done when
 Continuously (as part of the build)
 Whenever the Customer wants to see the state of the project
 Done how
 In the language of the Customer
 FIT
 FitNesse
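FIT and FitNesse let the Customer express expectations as tables that are wired to fixture code. The sketch below is only an analogue of that idea in plain Python (not the FIT API): a table of Customer examples is driven through the system and checked row by row. The discount rule and figures are invented.

```python
import unittest

# Customer-readable examples, in the spirit of a FIT column-fixture table:
# order total | member? | expected discount
DISCOUNT_EXAMPLES = [
    (100.00, False, 0.00),
    (100.00, True, 5.00),
    (500.00, True, 25.00),
]

def discount_for(order_total, is_member):
    """Illustrative stand-in for the system under test."""
    return round(order_total * 0.05, 2) if is_member else 0.00

class DiscountAcceptance(unittest.TestCase):
    def test_customer_examples(self):
        for order_total, is_member, expected in DISCOUNT_EXAMPLES:
            with self.subTest(order_total=order_total, is_member=is_member):
                self.assertAlmostEqual(discount_for(order_total, is_member), expected)

if __name__ == "__main__":
    unittest.main()
```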

Testing Activities in Agile
 Exploratory Testing
 Purpose
 To expose unusual or unexpected behaviors in response to typical, combinatorial, and atypical user activity
 Done by
 QA
 Assisted by
 Customer, Developers
 Paired testing

Testing Activities in Agile
 Exploratory Testing
 Done when
 Infrequently in early iterations
 More frequently in later iterations
 Almost continuously near the release
 Done how
 In the language of the tester
 By manually running the application
 By exercising parts of the application through automation
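One hedged sketch of "exercising parts of the application through automation": a throwaway script that feeds typical, boundary, and junk inputs to one piece of the system while the tester watches for anything surprising. The parse_quantity function is a hypothetical stand-in.

```python
import random
import string

def parse_quantity(text):
    """Hypothetical stand-in for the piece of the application being explored."""
    value = int(text.strip())
    if value < 0:
        raise ValueError("quantity cannot be negative")
    return value

def explore(trials=100):
    """Feed varied inputs to the function and log how it responds."""
    candidates = ["0", "1", "-1", " 42 ", "", "   ", "2147483648", "3.5", "ten"]
    for _ in range(trials):
        junk = "".join(random.choices(string.ascii_letters + string.digits, k=5))
        text = random.choice(candidates + [junk])
        try:
            print(f"OK       {text!r} -> {parse_quantity(text)}")
        except ValueError as exc:
            print(f"REJECTED {text!r} -> {exc}")
        except Exception as exc:  # anything unexpected is a lead worth digging into
            print(f"SURPRISE {text!r} -> {type(exc).__name__}: {exc}")

if __name__ == "__main__":
    explore()
```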

Testing Activities in Agile
 Usability Testing
 Purpose
 To determine and/or verify that the system can be used effectively by the users
 Done by
 Customer
 Assisted by
 Business Analyst / Product Owner

Testing Activities in Agile
 Usability Testing
 Done when
 At the beginning of the project
 Within each Iteration
 At or after Release
 Done how
 In the language of the Customer
 Structured and unstructured observations of users as they attempt to perform tasks in the system (or a mock up of the system)

Testing Activities in Agile
 Bug Fix Verification
 Purpose
 To ensure that defects are corrected
 To augment existing automation to ensure bugs don’t come back
 Done by
 QA
 Assisted by
 Developers

Testing Activities in Agile
 Bug Fix Verification
 Done when
 When development says bugs have been fixed
 Done how
 By exercising the code where the bug was found
 By creating additional scripts or other automation to expose the bug
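A hedged sketch of turning a bug into automation: the first test reproduces a hypothetical defect (an empty search term used to crash), proves the fix, and stays in the suite so the bug cannot quietly return. The function, catalog, and defect number are all illustrative.

```python
import unittest

def search_titles(catalog, term):
    """Fixed version of a hypothetical search that used to crash on an empty term."""
    if not term:
        return []  # the broken version indexed term[0] and raised IndexError
    return [title for title in catalog if term.lower() in title.lower()]

class BugFixVerification(unittest.TestCase):
    def test_empty_search_term_returns_no_results(self):
        # Reproduces hypothetical defect #123; it failed before the fix and now guards it.
        self.assertEqual(search_titles(["Agile Testing", "Refactoring"], ""), [])

    def test_existing_search_behavior_still_works(self):
        self.assertEqual(
            search_titles(["Agile Testing", "Refactoring"], "agile"),
            ["Agile Testing"],
        )

if __name__ == "__main__":
    unittest.main()
```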

Testing Activities in Agile
 But what about ...
 Regression Testing
 Integration Testing
 Performance Testing
 System Testing

Testing Activities in Agile
 To the extent possible, these should be addressed via automated acceptance testing
 Growing body of acceptance tests provides regression coverage
 Acceptance tests should test the behavior of multiple modules working together
 Where performance is important, acceptance tests should exist to verify that performance meets expectations (see the sketch below)
 System Tests likely will overlap with Exploratory Testing
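Where performance matters, the expectation can live in an acceptance test. A minimal sketch, assuming an invented 0.5-second budget for a hypothetical load_report operation; the real budget comes from the Customer.

```python
import time
import unittest

def load_report(rows=10_000):
    """Hypothetical stand-in for the operation whose speed the Customer cares about."""
    return [i * i for i in range(rows)]

class PerformanceAcceptance(unittest.TestCase):
    def test_report_loads_within_half_a_second(self):
        start = time.perf_counter()
        load_report()
        elapsed = time.perf_counter() - start
        # 0.5 s is an illustrative budget, not one taken from the slides.
        self.assertLess(elapsed, 0.5)

if __name__ == "__main__":
    unittest.main()
```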

Putting it all Together

Putting it all Together
 Groups work in parallel within the iteration
[Diagram of activities by group:]
 Development: Test-Driven Development
 QA: Test Authoring, Test Execution
 Documentation: Document Structure, Document Content
 Customer Team: Acceptance and Usability Testing; Support Role for Development & QA
Putting it all Together
 Development
 Steady, constant focus on implementing new functionality according to Customer priorities
 Incremental progress guided by Acceptance Tests and verified through Test-Driven Development
 QA helps Development
 Identify edge cases
 Provide representative and appropriate sample data for unit tests

Putting it all Together
 QA
 Early iterations focus on
 Creating test cases
 Creating and fine-tuning the automated testing infrastructure
 However, tests are run as soon as functionality exists!
 Later iterations focus on
 Executing tests (automated and manual)
 Exploring, digging, and probing to understand how the product responds in a variety of situations
 Collaborates with Customer Team and Development

Putting it all Together
 Documentation
 Early iterations focus on
 Creating the structure for the documentation
 Boilerplate
 Chapter outlines
 As stories are completed, content can be “roughed-in”
 Later iterations focus on
 Enhancing content to match functionality

Putting it all Together
 Customer Team
 Emphasis is on “being available”
 To clarify
 To answer questions
 Usability Testing
 Experiment with early versions
 Explore ideas for upcoming features and functionality
 Acceptance Testing
 As functionality evolves, more Acceptance Tests pass
 Helps gauge progress

Putting it all Together
 How do we get all of this done each iteration?

Putting it all Together
 Strategy One: Work One Iteration Behind (NOT RECOMMENDED)
[Diagram, sprint start to sprint end: Define Acceptance Tests, Automate Acceptance Tests, TDD (Test-Driven Development), Manual Testing, and Verify Bug Fixes run within the sprint, while Exploratory, Full Regression, and Performance testing trail one iteration behind.]

Putting it all Together
 Strategy Two: The Hardening Sprint
[Diagram: every regular sprint includes Automate Acceptance Tests, TDD, Manual Testing, Verify Bug Fixes, and Exploratory testing; a final Sprint N (the Hardening Sprint) covers Full Regression, Exploratory, and Performance testing.]

Handling Defects and Test Failures

Handling Defects and Test Failures
 Even in environments with lots of testing, defects still happen
 Misunderstandings and miscommunications
 Undiscovered requirements, both functional and non-functional
 Implicit and unstated assumptions and expectations
 Errors in tests

Handling Defects and Test Failures
 When defects are discovered, both QA and Development need to ensure a test exists that exposes the defect
 These tests will also prove the defect has been corrected
 These tests will help to ensure the defect does not come back

Handling Defects and Test Failures
 Defect discovered in the same iteration as the story in which it occurs
 QA should work with Development to fix the defect as part of the current iteration
 The effort to fix the defect should be incorporated into the effort to implement the feature (for metrics and tracking purposes)
 The story should not be considered complete until the acceptance tests associated with the story all pass

Handling Defects and Test Failures
 Defect discovered in a later iteration
 A new story to fix the defect may need to be created, prioritized, and scheduled
 Customer ultimately must choose whether to allocate resources to additional functionality or to fixing the defect
 QA and Development should ensure both have tests that will catch the defect in the future

Conclusion

Conclusion
 Myth: Agile doesn’t test
 Testing is an integral part of Agile development
 Testing happens in every iteration
 Testing is done by almost every person involved in the development of the product

Conclusion
 Myth: Agile doesn’t need testers
 Developers cannot test the product sufficiently
 Different mindset
 Different goals and objectives
 TDD and Automated Acceptance Tests are supposed to augment traditional QA
 Testers should be free to do higher-level exploratory and system testing

Conclusion
 Myth: Agile doesn’t write documentation
 Project teams in Agile environments write as much documentation as is needed
 Automated Acceptance Tests and Unit Tests serve as documentation
 They describe in unambiguous terms how the product is supposed to behave
 Since they are executable, they describe exactly how the product does behave
 This documentation can never be out of sync with the code

Conclusion
 Myth: There’s no place in Agile for manual testing
 Automated Tests are supposed to allow skilled and experienced testers to have more opportunity to perform exploratory testing
 There is no real substitute for running the product to see how it all works together
 The goal, though, is for that to NOT be the only way (or even the primary way) the product is tested
 Automate as much as is practical and use manual efforts to fill in the gaps

Conclusion
 Myth: Agile doesn’t plan
 Iteration 0 is focused almost exclusively on “the plan”
 Each iteration begins and ends with the Iteration Planning Meeting (IPM)
 The goal is to do just enough planning to accomplish the scope for the iteration
 Without closing off future options
 By leaving enough flexibility to adapt to new information

Conclusion
 Myth: Agile has a public release at the end of each iteration
 Products in Agile environments are released publicly when the Customer decides there is enough functionality to provide sufficient business value
 The goal is for the product to be as close as possible to a releasable state at the end of each iteration

Questions?

Thank you!

For more information:

Robert Walsh
President
EnvisionWare, Inc.
678-584-5911
rwalsh@envisionware.com
http://www.envisionware.com
