
Agile & High Speed

Software Testing Techniques

Bob Galen

President & Principal Consultant,


RGCG, LLC – Leading you down the path of agility…
www.rgalen.com bob@rgalen.com
Instructor Bio

Bob Galen

‰ 25 years of experience developing and testing software systems


‰ 15 years of leadership experience at manager & director levels
‰ RGCG, LLC formed in 2000 and located in Cary, NC
‰ Regular speaker at Development, PM, and SQA conferences
‰ Author of Software Endgames and articles in Better Software,
Software Testing & Performance, and QAI journal
‰ Experienced XP Coach, Certified ScrumMaster, and Agile
methods enthusiast
‰ More info at www.rgalen.com/about.html

Basic Protocol

„ Ask questions, stay engaged, challenge the ideas


„ It’s our class
„ Class runs from 8:30am – 4:30pm
‰ 10 minute breaks about every hour, lunch around 12 for an hour
‰ We may start early / leave early on the 2nd day
‰ Please arrive & return from breaks on time
„ Turn off misc. devices
„ Please fill out course evaluation and send me personal
feedback as well – bob@rgalen.com

Primary Course Goal

„ Provide a survey of the latest thinking on fast (time


constrained) testing methods
‰ Including an emphasis on Risk-Based Testing (focus) and
stakeholder management (connection & inclusion)

Introduction

„ I once interviewed for a SQA manager position. The


interview team was composed of 5 development
managers and 1 director. Oddly enough, there were no
testers. Each of them asked me the same hypothetical
question…

What would you do if you’d planned on having 6 weeks for application


testing. Then development needed more time to complete its work,
so your time was reduced to 2 weeks. How would you assure the
same level of quality for the application?

„ This class is intended to address this (perhaps all too


common) situation.

Introduction

„ What would you say in this situation?

„ More importantly, if you indeed had only 2 weeks, what


would you test?

„ How would you define and achieve excellence in your


testing strategy?

„ How would you communicate and engage stakeholders


in the risk?

Introduction
Disclaimer
„ You need to keep in mind that there are no Silver Bullets
in software development projects AND this course
promises none

„ Sometimes lemons are only lemons – no matter how


hard you squeeze

„ What I do promise is real world examples of how to


approach time & resource constrained testing projects
‰ However, you’ll need to bring your courage, integrity, creativity,
and hard work to the process

Course Outline

Introduction

1. Context Based Testing


2. Just-in-Time Testing
3. Lean in Testing
4. Exploratory Testing
5. Risk-Based Testing
6. Pareto-Based Testing
7. All-Pairs Technique & Tool

Section 1
Context–Based Testing
Outline:

„ Introduction to the 4 “Schools” of testing


„ The “context driven” mindset, learning how to adjust
„ Exploring the dimensions of your context

Context–Based
“Roots”
„ Thoughts originated by Cem Kaner,
James Bach, and Bret Pettichord
„ Michael Bolton, Mike Kelly, and Rob
Sabourin are other recent
contributors
„ Pettichord characterized 4 “schools”
in a 2003 presentation
„ Lessons Learned in Software
Testing: A Context-Driven Approach
– published in 2001

Context–Based
Core Philosophy
There are NO

“Testing Best Practices”

There are only

Good Practices that are Applied in a Specific


Context

In other words – One Size Does Not Fit All Situations

Context–Based
7 Basic Principles of the Context–Driven School
1. The value of any practice depends on its context.
2. There are good practices in context, but there are no best
practices.
3. People, working together, are the most important part of any
project's context.
4. Projects unfold over time in ways that are often not predictable.
5. The product is a solution. If the problem isn't solved, the product
doesn't work.
6. Good software testing is a challenging intellectual process.
7. Only through judgment and skill, exercised cooperatively
throughout the entire project, are we able to do the right things at
the right times to effectively test our products.

http://www.context-driven-testing.com/

Context–Based
4 Schools
Analytic School – Sees testing as rigorous and technical, with many
proponents in academia. Exemplar: Code Coverage

Quality School – Emphasizes process, policing developers and acting as a
gatekeeper. Exemplar: The Gatekeeper

Standard (Factory) School – Sees testing as a way to measure progress, with
emphasis on cost and repeatable standards. Exemplar: Requirements Traceability

Context-Driven School – Emphasizes people, setting out to find the bugs that
will be most important to stakeholders. Exemplar: Exploratory Testing

Context–Based
4 Schools: Risk-Based Focus
Analytic School – Operational profiles; calculate reliability

Quality School – Uncover project risks; prove the project is not adhering to
process, not repeatable

Standard School – Feature risk assessments; specific schedule risk

Context-Driven School – Develop an understanding of risks; develop tests
targeting identified risks

Context–Based
Context Explored…
„ Dimensions of context – let's explore some of the factors that influence
testing project context…
‰ Project schedule and status
‰ Methodology
‰ Test team skill set & capabilities
‰ Progress to-date
‰ Product technologies
‰ Cross-team capabilities
‰ Distributed team
‰ Overall team size
‰ Development team capabilities
‰ Requirement clarity
‰ Level of change
‰ Expectations
‰ History
‰ Politics
‰ External customer pressure
‰ Business conditions

Context–Based
Context Explored…
„ Plans are roadmaps, not straitjackets
„ Context is dynamic, changing daily
„ Part of establishing context is goal setting & focus
„ It’s about the people – testers and their skills!

„ Top priority contexts


‰ Testing school connections
‰ Time (within the schedule, spent & remaining)
‰ What’s already been done & not done
‰ Setting and resetting expectations

„ Remember: one size never fits all!


One More School – March 2007
Agile School
„ In March of 2007, Pettichord revisited the 4 Schools and
added another…

„ Key question: Based on customer conversation &


acceptance, is the story complete?
„ Exemplar: TDD, Unit testing, automated tests
„ Authors: Bob Martin, Kent Beck, Brian Marick

5 Schools
Evolution

[Diagram: the evolution of the schools – Analytic, Quality, Standard,
Context-Driven, Agile]

Context–Based
Wrap-up
„ What school(s) do your test team(s) align with?

„ If you were to adopt a context–based approach, what


would be some of the benefits?

„ Some of the issues or impediments?

„ How would you communicate, explain and defend your


context decisions back home –
‰ Within your teams?
‰ Within your projects?
‰ To your stakeholders?
Section 2
Just-in-Time Testing
Module Outline:

„ Introduction to Rob Sabourin & JIT


„ Exploring test ideas – mine, define, prioritize, plan,
chunk, track.
„ JIT Triage & Workflow
„ Scenario testing
„ Working with Development

JIT Testing
Foundations
„ Rob Sabourin (www.amibug.com) has defined a set of related practices for
Just-in-Time (JIT) testing. None are novel or unique; it's the combination
that's important.

„ Central JIT Themes


‰ Ruthless Triage
‰ Collaborate
‰ Test Early and Always, 7x24
‰ Adapt – Daily
‰ Chunk Work
‰ Work in Parallel

JIT Testing
Foundations

Begin with the End in Mind (Covey’s 7 Habits)

„ Declare goals
‰ How do we know when we’re finished?
‰ Phased exit criteria, release criteria, hand-off criteria – be clear &
be concise
„ What is the overriding purpose of the product?
‰ How do we support it – well?
„ Define testing practices
‰ Not so much process, as rules of the road for this engagement
‰ Bug flows

JIT Testing
Test Ideas
„ Tests are designed as part of a collaborative, note-based brainstorming
process
„ The idea is to get all test ideas on the table as a team or
group
„ Think of a test idea as:
‰ A succinct thought
‰ A singular test
‰ Connecting to central (project) goals & (application) themes
‰ Small, low fidelity artifacts; written on a card or post-it note
‰ Requiring high thought, but low (documentation) effort and overall
investment
„ Brainstorm, list, sort, organize, shuffle

JIT Testing
Test Idea Groupings
„ From testing ideas build a set of Test Objectives (TO)
‰ Each can be assigned to a tester as work
‰ Each can include all, part of, or multiple testing ideas
„ Decide how to organize the ideas (work)
„ Assign testers to TO’s (considering experience & skill,
availability & impact)
„ Work the test ideas / TO’s into sessions or “chunks”
‰ 90 – 120 minutes
‰ Related sets of test ideas
‰ Finest granularity of work planning, scheduling & reporting
‰ Chunk execution planned “in the moment” and not well in
advance

JIT Testing
Test Ideas
„ Where to find them?

‰ Creative approaches: lateral thinking, mind maps, and action


verbs
‰ Investigation: historical reviews, oracle interviews, metrics
‰ Bug taxonomies
‰ Requirements: functional, non-functional, use cases, implicit
‰ Usage Scenarios
‰ Functional analysis: CRUD, boundary analysis, prototypes
‰ Failure Modes
‰ Domain or Product Quality Factors
‰ In the moment – experience, exploration & discovery
JIT Testing
Test Idea Brainstorming
„ Gather group of testers and stakeholders
„ Walk in with “product awareness”
„ Kick-off the meeting
„ Everyone build their own lists of ideas
„ Converge ideas – remove overlaps, fill gaps
„ Prioritize
„ Estimate size of the ideas
‰ Look to balance granularity of the ideas – roughly same size.
Combine / breakdown as necessary
‰ Fit into 90-120 minute sessions or chunks
„ Discuss workflow – optimum scheduling and work
assignments
JIT Testing
Lifecycle of a Test Idea
Lifecycle:
¾ Pops into existence
¾ Discussed, clarified, tuned
¾ Prioritized
¾ Integrated into a testing objective (TO)
¾ Run
¾ Feedback, change, re-clarified, re-tuned

Example priority scale:
1. Test now
2. Test before (time)
3. Test before (release)
4. Nice to have
5. May be of interest in future releases
6. Not of interest in current form
7. Will never be of interest

JIT Testing
Triage
„ Consider it the “daily meeting” for JIT targeting
‰ Review the current business & technical context
‰ Review testing context: testing progress, bugs, discovery and
new test ideas

„ Work sign-up or assignments


‰ Best tester (skill), most appropriate (context SME), most available
(time)
‰ Rework the chunking and workflow

„ Test idea reprioritization


‰ New ideas, new chunks, highest priority focused
‰ Should we skip tests (ideas, whole chunks or scenarios)
JIT Testing
Scenario Based Testing
„ Develop a series of typical usage scenarios for each
type of user
‰ Based on functional descriptions – use cases, requirements,
stories, or observation. Could use story boarding
‰ In parallel with other activity
„ Use the chunking methods to elaborate scenarios
„ Selection of execution team based on experience
„ Scenarios cover operational threads across sets of
features
„ Usually only a few critical scenarios, 5-10-20
‰ Pareto applies here, seeking the critical 20%

JIT Testing
Status Reporting
„ Lightweight, reactive, daily in real-time. Synonymous
with the Agile notions of information radiators and daily
stand-up meetings
„ Report test status at the “chunk” level
‰ Elaboration
‰ Planned / executed
‰ Passed / failed; confidence levels
„ Bug trending
‰ Open, closed
‰ Fixed, Remaining to verify, Regressions
„ Provide broad & deep visibility to ALL
‰ Invite ALL interested parties, stakeholders, ultra visibility

JIT Testing
Working with Development
„ Influence the development team in the design phases
‰ Can you control the application via API; below GUI layer?
‰ Error & status logging completeness and availability?
‰ Configuration flexibility (setups, queried)?
‰ Influence any specific testing “hooks” for data gathering &
measurement?
„ Configuration Management
‰ Integration w/smoke testing
‰ Baseline hand-offs for testing; revert easily to previous release
‰ Entry criteria; loose, agile, not as an impediment
‰ Development and testing are on separate servers &
environments
JIT Testing
Wrap-up
„ What are the key JIT practices?

„ Do you do any of this now?


‰ What works – well?
‰ What doesn’t?

„ How does JIT relate to your understanding of agile


teams?

„ Is JIT a test team only function?


‰ Where are the collaborations?
JIT Testing
Morning Workshop (30 minutes to 1 Hour)
„ Gather together into teams of 3-6, pick a leader who will facilitate the
brainstorming
„ Go to www.google.com
„ Your task is to create a set of interesting test ideas for the Google
search engine. Pick your primary feature set or area and start
brainstorming. Your deliverables should include:
‰ A set of at least 50 varied test ideas

‰ Prioritize them according to impact to the feature area you


selected (High, Medium, Low)
‰ Try and estimate how long each idea would take to execute (in
10 minute increments)
„ De-brief the workshop as a group

Section 3
Lean Software Development – Testing
Implications
Module Outline:

„ Overview of Lean Software Development principles


„ Connect LSD (Lean Software Development) to testing activity
„ Acceptance is important – release criteria & smoke
tests
„ Adopting a Just in Time and Just Enough mindset
towards application delivery
„ Testing only when it's ready

Lean Software Development – The Why…

„ Bringing Lean principles into play for


software – Mary & Tom Poppendieck

„ Lessons learned from Lean Manufacturing


at 3M, Toyota & elsewhere

„ Strong connection to the “roots” of Scrum


as well

„ Consider them the "why" underscoring all Agile methodologies, plus a
toolbox of techniques

7 Core Principles of
Lean Software Development

1. Eliminate Waste – don’t do things that don’t add value


to the system – features, architecture, documentation,
bugs. Also process steps – hand-offs, sign-offs,
decision delays, long prioritization steps, etc.

Implications for Testing


„ Only document relevant tests; depth based on impact
„ Keep bug list short, relevant and focused
„ Create verbal, but well understood hand-offs
„ All testing artifacts need to be actively used; just enough content

7 Core Principles of
Lean Software Development

2. Build Quality In – Amplify Learning: put processes in


place that enable learning leading to adaptation and
improvements. Build it right the first time.

Implications to Testing
„ Become active early and often; collaboration with development
„ Observe, learn, and adjust with the evolution of the application –
don’t assume static models
„ Become a team quality champion, serving as an example by
providing data, insights and techniques
„ Don’t get stuck in approaches; always add value

7 Core Principles of
Lean Software Development

3. Create Knowledge – Decide as Late as Possible:


illustrates the notion that waiting until you know more is
better than the typical delivery, change and rework
cycle. Decrease speculation; replace with knowing.

Implications to Testing
„ Become active early and often; collaboration with development
„ Continuously observe, learn, and adjust with the evolution of the
application – don’t become too dependent (stuck) in your plans
„ Become intimate with the customer; understand the true value
drivers for the release
7 Core Principles of
Lean Software Development

4. Defer Commitment – Deliver as Fast As Possible:


delivery in this context includes not only the product, but
all related artifacts. Deliver with high quality too! Speed
can totally disrupt your competition.

Implications to Testing
„ Test whenever possible; take incremental releases
„ Test what works; skip what doesn’t
„ Provide real-time (daily) constructive feedback for development
„ Apply risk-based testing techniques
„ Automate as much as possible; as soon as possible; to reduce
cycle-time
7 Core Principles of
Lean Software Development

5. Deliver Fast – Empower the Team: self-directing teams


are a central force within agile and lean practices and
thinking. If you want to go fast, you need engaged,
thinking people who can be trusted to make good
decisions and help each other out.

Implications to Testing
„ Become active early and often; collaboration with development;
truly become an active partner in “paired” development & testing
„ Apply risk-based testing techniques
„ Automate as much as possible; as soon as possible; to reduce
cycle-time
7 Core Principles of
Lean Software Development

6. Respect People – Build in Integrity: the resulting


system is sensible, useful, easy to maintain and adjust,
being put together in a professional manner. The key is
getting the right people.

Implications to Testing
„ Be proud of your profession; continuously learn new ways and
approaches for your domain
„ Remain Open Minded!
„ Become a business partner; become intimate with how customers
USE your applications
„ Apply scenario based testing
7 Core Principles of
Lean Software Development

7. Optimize the Whole – See The Whole: don’t allow


strength in particular areas to guide the entire system's
evolution. Think at a “systems” level. Don’t fall into the
sub-optimization trap.

Implications for Testing


„ Apply scenario based testing; carefully consider the broad
requirements for the application
„ Maintain a broad view to overall product quality; factor in your
business dynamics; become a champion for the customer
„ Continuously increase the capabilities of your team across
technology, testing, and business domains

Lean Software Development
7 Wastes in Software
Manufacturing → Testing → Software Development
1. In-Process Inventory → Evolving large test plans & scripts → Partially Done Work
2. Over Production → 100% coverage goals → Extra Features (Gold Plating)
3. Extra Processing → Loss of context or SME → Relearning
4. Transportation → SDLC-driven test cycles → Handoffs
5. Motion → Parallel testing projects → Task Switching
6. Waiting → Development schedules → Delays
7. Defects → Defects → Defects

Lean Testing Practices
Wrap-up, Let’s Brainstorm
(rate each practice's impact: High, Med, Low)
1. Smoke Testing
2. Release Criteria
3. Small releases
4. Collaboration w/customer on Value!
5. Pair w/Development
6. Influence Quality – Design, TDD
7. Limit Waste – Just enough, Just-in-Time
8. Risk Based Testing
9. Limit multitasking
10. Impact driven work assignments
11. Automate wherever possible
12. Test constantly
13. Inspect & Adapt

„ What practices have I missed?


„ How do we practically use Lean?
Section 4
Exploratory Testing or ET
Module Outline:

„ Overview & history of Exploratory Testing


„ Role of heuristics (guidelines, hints) in the process
„ Developing overall goals and session themes
„ The exploratory session – boundaries & examples
„ When & where to use ET – come up with some
common scenarios that illustrate the “sweet spots” of
ET

Exploratory Testing
History
„ The term was coined by Cem Kaner in his book –
Testing Computer Software
„ James Bach (www.satisfice.com) has been expanding
the definition & practice of the technique and is
considered the father of Exploratory Testing in practice
„ His brother Jon Bach has focused on managing ET
focused projects, so has expertise on the management
side
„ Wrongly confused with ad-hoc testing. Contrasting
characteristics include:
‰ Trainability, repeatability, focused or tasked, agile uses, and
experience centered

Exploratory Testing
History
„ The technique is the primary method used for testing within the
context-based school.
„ Key proponents who are forwarding the craft include:
‰ James & Jon Bach
‰ Scott Barber
‰ Michael Bolton
‰ Elisabeth Hendrickson
‰ Mike Kelly
‰ Jonathan Kohl
‰ James Lyndsay
‰ Rob Sabourin
„ Variations include JIT, Lean, Rapid Software Testing, etc.
„ In fact, James Bach’s ET course is entitled Rapid Software Testing

Exploratory Testing
Scripted vs. Exploratory Continuum

pure scripted → vague scripts → fragmentary test cases (scenarios) →
charters → roles → freestyle exploratory

To know where a test falls on this scale, ask yourself: “to what extent am I
in control of the test, and from where did the idea originate?”

Exploratory Testing
Heuristics as a testing device
Some Exploration Skills and Tactics – “MR.Q COMC GOARABC R&R?”

‰ MR.Q – Modeling, Resourcing, Questioning
‰ COMC – Chartering, Observing, Manipulating, Collaboration
‰ GOARABC – Generating/Elaborating, Overproduction/Abandonment,
Abandonment/Recovery, Refocusing, Alternating, Branching/Backtracking,
Conjecturing
‰ R&R – Recording, Reporting

Exploratory testing is a mindset using this skillset.


Exploratory Testing
Skills & Tactics
„ Modeling
‰ Composing, describing, and working with mental models of the
things you are exploring. Identifying relevant dimensions,
variables, and dynamics. A good mental model may manifest
itself as having a “feel” for the product; intuitively grasping how it
works.
„ Resourcing
‰ Obtaining tools and information to support your effort. Exploring
sources of such tools and information. Getting people to help you.
„ Questioning
‰ Identifying missing information, conceiving of questions, and
asking questions in a way that elicits the information that you
seek.

Exploratory Testing
Skills & Tactics
„ Chartering
‰ Making your own decisions about what you will work on and how
you will work. Understanding your client’s needs, the problems
you must solve, and assuring that your work is on target.
„ Observing
‰ Gathering empirical data about the object of your study; collecting
different kinds of data, or data about different aspects of the
object. Designing experiments and establishing lab procedures.
„ Manipulating
‰ Making and managing contact with the object of your study;
configuring and interacting with it.

Exploratory Testing
Skills & Tactics
„ Collaboration
‰ Working and thinking with another person on the same problem;
group problem-solving.
„ Generating / Elaborating
‰ Working quickly in a manner good enough for the circumstances.
Revisiting the solution later to extend, refine, refactor, or correct
it.
„ Overproduction / Abandonment
‰ Producing many different speculative ideas and making
speculative experiments, more than you probably need, then
abandoning what doesn’t work. Examples are brainstorming, trial
and error, genetic algorithms, free market dynamics.

Exploratory Testing
Skills & Tactics
„ Abandonment / Recovery
‰ Abandoning ideas and materials in such a way as to facilitate
their recovery, should they need to be revisited. Maintaining a
repository of old ideas.
„ Refocusing
‰ Managing the scope and depth of your attention. Looking at
different things, looking for different things, in different ways.
„ Alternating
‰ Switching among or contrasting different activities or perspectives
so as to create or relieve productive tension and make faster
progress.

Exploratory Testing
Skills & Tactics
„ Branching / Backtracking
‰ Allowing yourself to be productively distracted from one course of
action in order to explore an unanticipated new idea. Identifying
opportunities and pursuing them without losing track of the
process.
„ Conjecturing
‰ Considering possibilities and probabilities. Considering multiple,
incompatible explanations that account for the same facts.

Exploratory Testing
Skills & Tactics
„ Recording
‰ Preserving information about your process, progress, and
findings. Taking notes.
„ Reporting
‰ Making a credible, professional report of your work to your clients
in oral and written form.

Exploratory Testing
Analogies
„ Job Interview
„ Bounty Hunter
„ Warfare Operations
„ 20 Questions
„ Newspaper Reporter
„ Psychologist
„ Detective
„ Going to a conference
„ Lewis & Clark
„ Others?
Exploratory Testing
Session Strategy
„ Exploratory Testing proceeds in a series of
interconnected sessions that are focused on a specific
testing project (application)
„ Planning the project encompasses establishing a set of
time boxed session charters
„ Establishing roles and focus areas for the sessions or
groups of sessions
„ Establishing the session execution dynamics
‰ Starting, Stopping, Re-Chartering, Reporting
„ Reporting progress to stakeholders & re-establishing the
overall test strategy / charter
Exploratory Testing
Session Dynamics
„ Sessions are focused ET events
‰ They are limited in duration (60-120 minutes)
‰ They have a session charter, goal, or focus
‰ The results of the session are captured in a log
„ Testing path logged, findings & bugs reported, repeatable steps
‰ Sessions are de-briefed (retrospective) with re-chartering as
required for subsequent sessions
‰ A day of testing is composed of multiple sessions
„ Often there is a sense of collaboration in the sessions
‰ Paired testers; Paired w/developers
‰ Co-located in the same room; lab area
‰ Shared / common data & environment
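A minimal sketch of what a session record might capture if you keep the log in code rather than on paper; the field names below are illustrative assumptions, not a prescribed format:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class SessionLog:
    """One time-boxed exploratory session (illustrative structure only)."""
    charter: str                     # goal or focus of the session
    tester: str
    duration_minutes: int = 90       # typically 60-120 minutes
    notes: List[str] = field(default_factory=list)   # path taken, observations
    bugs: List[str] = field(default_factory=list)    # findings with repeatable steps
    follow_up_charters: List[str] = field(default_factory=list)  # re-chartering output

session = SessionLog(charter="Explore advanced search filters as a power user",
                     tester="Pat")
session.notes.append("Filter combinations persist across logins")
session.bugs.append("Date-range filter ignores time zone")
```

The debrief then becomes a walk through notes, bugs, and follow_up_charters for the next session.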
Exploratory Testing
Roles & Feature Areas
„ Roles, as a…
‰ Power user
‰ Specific clients / configurations
‰ User communities
‰ Administrator
‰ Tester, testability
‰ Compliance user
‰ Process owner
‰ Business user

„ Feature Areas of focus…
‰ Installation
‰ Compatibility
‰ Database integrity
‰ 3rd-party add-ins
‰ Configuration & setup
‰ Usability
‰ Performance
‰ Load
‰ Online help and docs
‰ Security
‰ Interoperability
‰ Beta

Exploratory Testing
What to Write While Exploring…

Observations (to the degree you think they are relevant to stakeholders):
• feature model
• text from log files
• text from dialogs

Conjectures (inferences based on experiences):
• test ideas
• questions
• product and project issues
• concerns
• risks

Project information (independent of observer):
• charter
• test actions
• config info
• build details
• tools used

Exploratory Testing
Sweet Spots
„ Any extremely time constrained testing situation
„ Anytime you have a lot of ambiguity (stability, feature
operation, etc.)
„ When you’re blessed with lots of solid domain
experience, SME breadth
„ Smoke testing; Does it work?
„ Acceptance testing; Did I get what I expected?
„ Beta testing; Will we embarrass ourselves?
„ Agile testing – daily explorations: What works? What
doesn’t? Progress? Feedback for development…

Exploratory Testing
Supporting Tools
„ Philosophically, Exploratory Testing is not a tool-based activity; it’s a
human, experience-based one. So tool requirements are minimal. That being
said…

„ The following can be useful –


‰ Any tools that allow you to capture screen state information – ex:
Spector
‰ Quick, UI interaction tools – ex: Perlclip
‰ Fast logging / scripting tools; ex: Log-Watch
‰ Web based DTS
‰ Wiki’s
‰ TestExplorer

Exploratory Testing
Wrap-up
„ Let’s compare & contrast JIT and ET
‰ Where are the similarities?
‰ Do you see differences?

„ How would an entire ET test project work?


‰ Planning? Execution?
‰ Reporting? Adjustment?

„ Do any of you use the technique now? What are the


sweet spots?
‰ Where are the challenges? Suggestions on how to use it?

„ Read - Context-Driven Yahoo Group discussion!


Exploratory Testing
Afternoon Workshop (30 minutes to 1 Hour)
„ Gather together into the same teams from the morning exercise
„ Using the same testing focus and test ideas, your task is to
strategize session based exploratory testing and execute a few short
(10 minute) sessions
„ Using your test ideas from the morning session, create a set of
interrelated session charters to plan & focus your testing efforts.
Your deliverables should include:
‰ The test idea estimates from the morning should be reevaluated
and adjusted as necessary
‰ Combine related test ideas into a set of at least 5-10 session
charters that should cover 10 minute time boxed sessions.
‰ Every team member should execute 1-3 sessions and log their
“travels”
„ De-brief the workshop as a group

Section 5
Risk–Based Testing
Module Outline:

„ Realizing you can’t cover it all!


„ Product decomposition – making a feature map
„ Risk planning – collaborative analysis with stakeholders
of types of testing, timing of testing, pervasiveness of
testing
„ Testing phases and activities – selecting the right
places to “focus”
„ Other management considerations

Risk–Based Testing Background

„ It starts with the realization that you can’t test everything – ever!
100% coverage is a long-held myth in software development

„ There are essentially 5 steps in most of the models


1. Decompose the application under test into areas of focus
2. Analyze the risk associated with individual areas – technical,
quality, business, schedule
3. Assign a risk level to each component
4. Plan test execution, based on your SDLC, to maximize risk
coverage
5. Reassess risk at the end of each testing cycle

Risk–Based Testing Background

„ Risk–Based Testing is effectively a risk mitigation


technique
‰ Not a prevention technique

„ It’s about trade-offs


‰ Human and physical resources
‰ Ratios between Producers (Developers) and Consumers
(Testers)
‰ Time
‰ Rework (retesting & verification)
‰ Quality – Coverage vs. Delivery
‰ Visibility into the trade-offs
Risk–Based Testing
Decomposition
„ Application-to-testing decomposition is the first action
you face. Breaking down the application and its
associated testing artifacts. I prefer a 3 tiered model for
test decomposition:

1. Suites: A collection of test cases, focused on testing a related


set of functions, features or requirements.
2. Cases: A collection of steps, focused towards testing a specific
function, feature or requirement; It should have a succinct goal
3. Steps: Individual test steps executing the test case

„ Suites should reflect not only functional, but non-


functional requirements.
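A minimal sketch of that three-tiered decomposition as a data structure; the names and example content are illustrative assumptions, not part of the method itself:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Step:
    action: str        # individual test step executing the case
    expected: str

@dataclass
class Case:
    goal: str          # succinct goal: one function, feature, or requirement
    steps: List[Step] = field(default_factory=list)

@dataclass
class Suite:
    name: str          # a related set of functions, features, or requirements
    cases: List[Case] = field(default_factory=list)

login = Suite("Func - Login & session handling", [
    Case("Valid user can log in",
         [Step("Submit valid credentials", "Dashboard is shown")]),
])
```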
Risk–Based Testing
Decomposition
„ For risk analysis, we’ll operate at the Suite level

„ For each Suite, you should determine:


‰ Approximate # of cases per suite
‰ Complexity of the Suite (Very, Average, Simple)
‰ Time to design tests for the suite
‰ Time to setup & teardown the environment for the suite
‰ Time to run tests for the suite & handle “typical” defect counts

„ These are high level estimates, based on history


whenever possible, and will serve as a sizing indicator
when we’re prioritizing and scheduling suites

Risk–Based Testing
Risk Prioritization
„ In traditional prioritization:
‰ (Probability: 0-3) x (Impact: 0-3) = Risk Level
‰ Calculate a risk level for each suite
‰ Rank order them
‰ Decide how to cover each via test plans & schedules

„ A testing version, with more dimensions:
‰ Business Importance
‰ Probability of Instability
‰ Overall Complexity
‰ Coverage & Timing
„ First release (cycle)
„ Middle cycles
„ Last release(s) (2 cycles)
„ Pre-Production Release
‰ Instead of a static view, a continuum view across different stakeholder
perspectives
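As a worked illustration of the traditional formula, a small sketch; the suite names and scores are hypothetical, not from any real project:

```python
# Traditional prioritization: (Probability: 0-3) x (Impact: 0-3) = Risk Level.
suites = {
    "Accept - Smoke":        {"probability": 1, "impact": 3},
    "Func - Business rules": {"probability": 3, "impact": 3},
    "Func - Reporting":      {"probability": 2, "impact": 1},
    "Comp - Browsers":       {"probability": 2, "impact": 2},
}

for scores in suites.values():
    scores["risk"] = scores["probability"] * scores["impact"]

# Rank order the suites; the highest risk gets the earliest, deepest coverage.
for name, scores in sorted(suites.items(), key=lambda kv: kv[1]["risk"], reverse=True):
    print(f"{name:24} risk level = {scores['risk']}")
```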
Risk–Based Testing
Risk Prioritization
„ Traditional - Collaborative Risk Planning Workshop
‰ Include stakeholders and interested parties – BA, Architects,
Developers, Testers, PM’s, Stakeholders, Management, etc.
‰ Prepare by reading product requirement and other artifacts
‰ Test team leads discussion for each suite – default intentions,
concerns or issues, questions
‰ Q&A
‰ Constituents feedback their views to –
„ Business Importance, Probability of Instability, Overall Complexity,
Coverage & Timing
‰ Test team constructs a working view towards
„ Individual suite risk handling
„ Overall project risk handling
Risk–Based Testing
Risk Prioritization
„ Agile – Planning Poker, Risk Planning Workshop
‰ Include stakeholders and interested parties – BA, Architects,
Developers, Testers, PM’s, Stakeholders, Management, etc.
‰ Have the nearest SME / feature owner / BA overview a Test
Suite (feature set or requirement set)
‰ Ask questions
‰ Constituents vote via cards on their views to –
„ Business Importance, Probability of Instability, Overall Complexity,
Coverage & Timing
‰ High & Low participants discuss the “why” behind their views
‰ Re-vote until you converge as much as possible
‰ Repeat…

Risk–Based Testing
Risk Prioritization
„ Bottom & Top 20% exercise
‰ A typical outcome of all prioritization efforts is that everything (or
+80%) is rated as High
‰ While this may be true from individual perspectives, it doesn’t
help the testing team make good choices
„ At the end of the workshop, have everyone “vote” on
their Top 20% and Bottom 20% using dot voting.
‰ Give each of your primary constituents (Development, Business,
Testing) different colors
‰ This will help normalize the feedback while providing good
insight across perspectives

Remember: all “ties” are broken by the testing team!

Risk–Based Testing
Risk Scheduling & Tracking
„ Once you have your overall risk assessment and
cyclical feedback, you need to create a plan & schedule
that reflects the tempo and cyclical testing requirements
of your SDLC

„ Iterative or agile methodologies require more testing


cycles
‰ They also increase the complexity of your planning to sensibly
handle rework (re-testing, regression, integration, and repair
verifications)
‰ Ensure you don’t over-test, by testing too soon, then too often

Risk–Based Testing
Worksheet Example
„ Download sample worksheet from
‰ http://www.rgalen.com/t_files/AgileAndHSpeed-1dayMaterialsV2.zip
is a zip file link for class supporting documents…
„ This is simply a sample worksheet that I’ve used for risk-based
testing.
„ It tries to capture:
‰ Complexity
‰ Design, Setup, and Execution time at a suite level
‰ Handle different iterations in a risk–based manner:
„ High – person days as is
„ Med – person days / 2; Low – person days / 4
„ Consider it a high-level, risk-based planning tool for testing projects
(time, focus, & resources – to achieve a balance)
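A minimal sketch of that scaling rule; the suite names and base estimates below are placeholders, not the worksheet’s actual numbers:

```python
# Risk-based effort scaling per iteration: High keeps the full estimate,
# Med is halved, Low is quartered.
SCALE = {"High": 1.0, "Med": 0.5, "Low": 0.25}

suites = [
    # (suite, base person-days, risk rating for this iteration)
    ("Accept - Smoke",         4, "High"),
    ("Func - Business rules", 10, "Med"),
    ("Online help & docs",     6, "Low"),
]

planned = {name: days * SCALE[rating] for name, days, rating in suites}
print(planned)                                       # {'Accept - Smoke': 4.0, ...}
print("total person-days:", sum(planned.values()))   # 4.0 + 5.0 + 1.5 = 10.5
```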

Risk–Based Testing
Test Suite – Execution Plan

[Worksheet excerpt – the spreadsheet layout did not survive extraction. Each
row is a test suite (Accept – Smoke; Func – Database Meta-data Integrity;
Func – Mware, Business rules; Func – Real-time data; Func – Intelligent
Searching; Func – Areas 3, 4, 5; Func – Common UI Features; Comp – Operating
systems; Comp – Browsers & databases; Perf – 5 sources, 5 user scenarios;
Defect Verifications; Regression; Automation) with columns for the number of
test cases, average case size, complexity, High/Med/Low risk ratings, and
design, setup & execution time across the testing cycles. Bottom-line
totals: 665 test cases, 0.30 average time per test case, a test team of 3.5,
and team/person-day totals per cycle.]
Risk–Based Testing
Your Context
„ A big part of your risk–based approach has to be
grounded in your context as it relates to
‰ Team ratios
‰ Equipment investments & capacity
‰ Methodology
‰ Product domain & any regulatory or compliance requirements
‰ Testing process and activities
‰ Automation strategies
‰ No hard lines; always trade-offs

Risk–Based Testing
Methodology Implications for Testing
Waterfall
‰ Test Setup: Large scale, early on, dedicated equipment
‰ Test Planning: Traditional System Test view
‰ Test Execution: Single pass w/ limited rework
‰ Test Automation: Executed, but rarely developed till the next release

RUP
‰ Test Setup: Enterprise scale, early on, often shared equipment
‰ Test Planning: Incremental test view
‰ Test Execution: Iterative passes, moderate - heavy rework
‰ Test Automation: Executed, but rarely developed till the next release

Agile
‰ Test Setup: Small scale, often shared environments until later iterations
‰ Test Planning: TDD model, planned within development iterations
‰ Test Execution: Within the development iteration – unit & acceptance focused
‰ Test Automation: Automated unit, smoke, and acceptance tests; minimal regression

Risk–Based Testing
Wrap-up
„ What are some of the other factors that influence your
projects’ risk?

„ Is automation a risk mitigation technique?

„ Stakeholders need to be connected. So far in the


course…
‰ Any lessons in how to effectively do that? Silver Bullets?
‰ Lessons from your own experience?

„ Risk planning is all about risk collaboration and shared


responsibility. Agree?
‰ Do you have experiences or techniques to share?

Risk–Based Testing
Methods Comparison
From the management & tracking level down through team collaboration &
execution, each type of context-based testing organizes its work as:

„ Risk-Based Testing: Test Suites → Test Cases → Test Steps
„ Exploratory Testing: Charter-driven sessions → Test Ideas, heuristics-driven testing
„ JIT: Test Usage Scenarios → Ideas “chunked” into Test Objectives → Test Ideas

Section 6
Pareto-Based Testing
Module Outline:

„ History of Pareto
„ Charting
„ Using Pareto as a risk-based localization technique
„ Monitoring risk migration

Pareto
Principle
While analyzing personal wealth distribution in Italy, the Italian economist
Vilfredo Pareto observed that, for many phenomena, 80% of the consequences
stem from 20% of the causes.

„ Also known as the 80-20 rule, the law of the vital few, and the
principle of factor sparsity
„ Joseph Juran brought the principle forward as a potential quality
management technique
„ In probability theory referenced as a Pareto distribution

Pareto Principle
“Thinking” Examples
„ In a Toyota Prius warehouse –
‰ 20% of the component boxes take up 80% of the space
‰ 20% of the components make up 80% of the overall vehicle cost

„ In software applications –
‰ 20% of the application code produces 80% of the defects
‰ 20% of the developers produce 80% of the defects
‰ 20% of the test cases (ideas) find 80% of the defects
‰ 20% of the test cases (ideas) take 80% of your time to design &
test
‰ 20% of the product will be used by 80% of the customers
‰ 20% of the requirements will meet 80% of the need
Pareto Principle
“Thinking” Examples
„ Leads to the notion of defect clustering. Many have
observed that software bugs will cluster in specific
modules, classes, components, etc.

„ Think in terms of stable or well made components versus


error-prone, unstable, and fragile components. Which
ones should receive most of your attention? Do the
areas remain constant?

„ Often, complexity plays a large part in the clustering.


Either solution (true) complexity OR gold-plating
(favored) complexity.

Open Defects per Functional Area
Trending – Pareto (80:20 Rule) Chart

Sample Pareto Chart
[Chart: defect counts per functional area (UI, Mware, Parsing, SOAP, Reports,
Help) shown as descending bars, with the cumulative percentage plotted as a
line on a secondary axis.]

Open Defects per Functional Area
“Rolling” Pareto Chart
Open Defects per Functional Area
[Chart: open defects per functional area (Install & Config, Internal files,
Dbase, Reporting, R-time analysis, Off-line analysis, GUI, Help & docs)
trended across two-week project periods from Jan 1-15 through Mar 16-30.]

Pareto Principle
Step 1 – Application Partitioning
„ The first major challenge to Pareto-Based risk analysis is
meaningfully partitioning your application. Here are some
guidelines –
‰ Along architectural boundaries – horizontally and/or vertically
‰ Along design boundaries
‰ At interface points – (API, SOA points, 3rd-party product
integrations, external data acquisition points)

„ Always do this in conjunction with the development team


„ The partitioned areas need to be balanced – in
approximate size & complexity
„ Shoot for 5-12 meaningful areas for tracking

Pareto Principle
Step 2 – Defect Tracking Setup
„ Modify your DTS to support specific application
component areas

„ During triage, effectively identify and assign defect


repairs and enhancements to component areas
‰ Early on, testers will need development help to clearly identify
root component areas (about 20% of the time)

„ If you have historical defect data (w/o partitioning), you


can run an application analysis workshop to partition
data (post release) for future predictions

It does require discipline and a little extra effort…


Pareto Principle
Application Analysis Workshop
„ Sometimes you don’t have the time to start Pareto
tracking before starting a project, so reflectively analyze
Pareto for future planning –

‰ Decompose your application or a sub-component of it if pressed


for time
‰ Gather defects surfaced
‰ Gather your team (developers, testers)
‰ Discuss locale for each bug and create distribution
‰ Off-line create your curves and publish insights for the “next”
release
‰ Can also help fine-tune decomposition areas and train the test
team in defect localization

Pareto Principle
Step 3 – Observations & Adjustments
„ Project trending at a component level
‰ Look for migration of risk and make adjustments
‰ Look for stabilization or regressions (risk)
‰ Identify high risk & low risk component areas at a project level
‰ Map component rates to overall project goals
‰ Trend open & high priority defects at a component level
‰ Track or predict project “done”ness at a component level

„ Weekly samples of 20% component focus areas –


looking for risk migration
‰ Sample weekly, then adjust focus across your testing cycles or
iterations

Pareto Principle
Tools
„ Excel can be used to display Pareto-like charts, with the cumulative
percent trend needing to be simulated

„ There are other packages available that will properly calculate & display
Pareto charts for you. Since the Pareto chart is a Six Sigma tool, many
packages that support Six Sigma include it.
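If Excel isn’t handy, the same chart is a few lines of Python; a hedged sketch using matplotlib, with made-up defect counts that only mirror the earlier sample chart:

```python
import matplotlib.pyplot as plt

# Hypothetical open-defect counts per functional area (illustrative only).
defects = {"UI": 30, "Mware": 25, "Parsing": 15, "SOAP": 10, "Reports": 10, "Help": 5}

areas = sorted(defects, key=defects.get, reverse=True)
counts = [defects[a] for a in areas]
total = sum(counts)
cum_pct = [100 * sum(counts[:i + 1]) / total for i in range(len(counts))]

fig, ax1 = plt.subplots()
ax1.bar(areas, counts)                         # defect counts per area
ax1.set_ylabel("# of defects")

ax2 = ax1.twinx()                              # cumulative % on a secondary axis
ax2.plot(areas, cum_pct, marker="o", color="darkred")
ax2.set_ylabel("Cumulative %")
ax2.set_ylim(0, 110)

ax1.set_title("Open defects per functional area (Pareto)")
plt.show()
```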

Section 7
All-Pairs Technique & Tool
Module Outline:

„ Introduce the All-Pairs testing technique


„ Explore scenarios where All-Pairs technique is
particularly helpful
„ Explore a hands-on session where the class takes an
example project and produces All-Pairs test case
definitions

All-Pairs Testing

„ All-Pairs testing is a method of handling large scale


combinatorial testing problems
‰ Also referred to as Pairwise, Orthogonal Arrays, and
Combinatorial Method
„ Instead of attempting to test all combinations, often a
very intimidating figure, it identifies all pairs of variables
that need to be tested in tandem – to achieve reasonably
high coverage.
„ Three primary references include –
‰ Lee Copeland – A Practitioner’s Guide to Software Test Design
‰ James Bach – Open Source, AllPairs implementation
‰ Bernie Berger – Efficient Testing with All-Pairs 2003 StarEast
paper
All-Pairs Testing
Interoperability Testing
Client OS Browser App Server Server OS
Win NT IE 5.5 WebSphere Win NT
Win 98 IE 6.0 WebLogic Linux
Win 2000 IE 6.5 Apache
Win XP IE 7.0 IIS
FireFox 1.0
FireFox 2.0
Opera 9.1

„ One sweet spot area for All-Pairs testing is interoperability.


Something that faces web application testers every day.
„ In this example, we want to examine browser compatibility across
this specific set of system software levels – focusing on the browser
„ Considering all combinations, there are (4 x 7 x 4 x 2) or 224
possible test cases for the example.
All-Pairs Testing
Example
„ In All-Pairs test design we are concerned with
‰ Variables of a system
‰ Possible values that variables could take

„ Then we generate a list of test cases that represent the


pairing of variables (all pairs) as the most interesting set
of test cases to approach in your design
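To make the mechanics concrete, here is a small, self-contained greedy sketch of pairwise selection. It is not Bach’s ALLPAIRS tool and won’t reproduce its exact 28-case output, but it illustrates covering every value pair with far fewer cases than the full 224:

```python
from itertools import combinations, product

def pairwise_cases(params):
    """Greedy all-pairs selection: every value pair from any two parameters
    appears in at least one case. Suitable for small models only - it scans
    the full product space on each pass."""
    names = list(params)
    uncovered = {
        (a, b): set(product(params[a], params[b]))
        for a, b in combinations(names, 2)
    }
    cases = []
    while any(uncovered.values()):
        best, best_gain = None, -1
        for values in product(*(params[n] for n in names)):
            case = dict(zip(names, values))
            gain = sum((case[a], case[b]) in pairs
                       for (a, b), pairs in uncovered.items())
            if gain > best_gain:
                best, best_gain = case, gain
        for (a, b), pairs in uncovered.items():
            pairs.discard((best[a], best[b]))
        cases.append(best)
    return cases

params = {  # the interoperability example from the earlier slide
    "Client OS": ["Win NT", "Win 98", "Win 2000", "Win XP"],
    "Browser": ["IE 5.5", "IE 6.0", "IE 6.5", "IE 7.0",
                "FireFox 1.0", "FireFox 2.0", "Opera 9.1"],
    "App Server": ["WebSphere", "WebLogic", "Apache", "IIS"],
    "Server OS": ["Win NT", "Linux"],
}
suite = pairwise_cases(params)
print(len(suite), "cases instead of", 4 * 7 * 4 * 2)   # roughly 28-30 vs. 224
```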

All-Pairs Testing
Example

„ Using ALLPAIRS on the previous example, we would identify 28 test cases as
an alternative to the 224 for absolute coverage.
„ We’d then use this output as guidance when designing our test cases.
„ Note: the ‘~’ indicates a “don’t care” value for that variable.

TEST CASES
case  Client OS   Browser      App Server   Server OS   pairings
1     Win NT      IE 5.5       WebSphere    Win NT      6
2     Win 98      IE 5.5       WebLogic     Linux       6
3     Win NT      IE 6.0       WebLogic     Win NT      5
4     Win 98      IE 6.0       WebSphere    Linux       5
5     Win NT      IE 6.5       Apache       Linux       6
6     Win 98      IE 6.5       IIS          Win NT      6
7     Win 2000    IE 7.0       Apache       Win NT      6
8     Win XP      IE 7.0       IIS          Linux       6
9     Win 2000    FireFox 1.0  WebSphere    Linux       5
10    Win XP      FireFox 1.0  WebLogic     Win NT      5
11    Win NT      FireFox 2.0  IIS          Linux       4
12    Win 98      FireFox 2.0  Apache       Win NT      4
13    Win 2000    Opera 9.1    WebLogic     Linux       4
14    Win XP      Opera 9.1    WebSphere    Win NT      4
15    Win 2000    IE 5.5       IIS          ~Win NT     3
16    Win XP      IE 5.5       Apache       ~Linux      3
17    Win 2000    IE 6.0       Apache       ~Win NT     2
18    Win XP      IE 6.0       IIS          ~Linux      2
19    Win 2000    IE 6.5       WebSphere    ~Linux      2
20    Win XP      IE 6.5       WebLogic     ~Win NT     2
21    Win NT      IE 7.0       WebSphere    ~Win NT     2
22    Win 98      IE 7.0       WebLogic     ~Linux      2
23    Win NT      FireFox 1.0  Apache       ~Linux      2
24    Win 98      FireFox 1.0  IIS          ~Win NT     2
25    Win 2000    FireFox 2.0  WebLogic     ~Win NT     2
26    Win XP      FireFox 2.0  WebSphere    ~Linux      2
27    Win NT      Opera 9.1    IIS          ~Win NT     2
28    Win 98      Opera 9.1    Apache       ~Linux      2

All-Pairs Testing
Intent
„ Defects
‰ The hope of All-Pairs testing is that by running from 1-20% of
your test cases you’ll find 70% - 85% of your overall defects

„ Coverage
‰ By way of example (Cohen) a set of 300 randomly selected test
cases provided 67% statement coverage and 58% decision
coverage for an application. While 200 All-Pairs derived test
cases provided 92% statement and 85% decision coverage.

„ Important tests can be missed. Use sound judgment


when creating tests and add as required
All-Pairs Testing
Intent
„ All-Pairs is simply a tool in your test design arsenal.
Don’t use it alone or blindly!

„ You won’t find all of your bugs exclusively using this tool!

„ Often the strategy is to use All-Pairs to establish your


baseline set of test cases
‰ Then analyze other business critical combinations and add risk-
based tests as appropriate

All-Pairs Testing
Brainstorming Value Proposition
„ What are some testing area opportunities for All-Pairs?
‰ UI-type input / output variation testing (functional)
‰ Cross-platform (interoperability) testing
‰ Anything with high numbers of variables
‰ Scenario based testing, with path (variable) variation

„ What are not?
‰ Performance testing, and most other non-functional testing
‰ Exploration
‰ Using it solely to derive your test cases

All-Pairs Testing
Fails when…

A few cautions from James Bach & Patrick J. Schroeder in their paper –
Pairwise Testing: A Best Practice That Isn’t

„ You don’t select the right values to test with


„ When you don’t have a good enough oracle
„ When highly probable combinations get too little
attention
„ When you don’t know how the variables interact

All-Pairs Testing
Microsoft - PICT
„ Pairwise Independent Combinatorial Testing tool

„ Features
‰ Model Driven w/allowance for sub-models of variable pairings
‰ Allows for specification of constraints and conditional constraints
‰ Allows for parameters and aliasing
‰ Supports variable weighting and seeding for reuse of previous
testing cases and “fixed” tests

„ Source not available, simply a command line driven


utility
‰ Link may be found at http://pairwise.org

All-Pairs Testing
PICT Example w/Sub Groups
PICT Model

PLATFORM: x86, ia64, amd64
CPUS: Single, Dual, Quad
RAM: 128MB, 1GB, 4GB, 64GB
HDD: SCSI, IDE
OS: NT4, Win2K, WinXP, Win2K3
IE: 4.0, 5.0, 5.5, 6.0
APP: SQLServer, Exchange, Office

{ PLATFORM, CPUS, RAM, HDD } @ 3
{ OS, IE } @ 2

„ The sub-model { PLATFORM, CPUS, RAM, HDD } is combined at order = 3 and
{ OS, IE } at order = 2, while the overall combination order = 2 (defined by /o)
„ Subgroups are useful for grouping sets of variables with strong relationships
All-Pairs Testing
“Extra Credit” Workshop
„ Go to www.satisfice.com
„ Download ALLPAIRS to your laptop & unzip the tool
„ Gather together into “pairs” of 2-3 – no pun intended
„ Read the included ALLPAIRS manual (it’s a short read)
„ Using all the information you’ve been exposed to so far:
1. Setup a spreadsheet and run ALLPAIRS for the example in the
slides
2. Setup a spreadsheet and run ALLPAIRS for Bernie’s Mortgage
example
3. Within your pair, discuss the tool and modify one of the examples
in a significant way (or create your own). Re-run ALLPAIRS.
„ De-brief the workshop as a group

Class Wrap-up

Questions?

Thank you!

References & Backup

Risk–Based Testing
Test Workflow

[Workflow diagram: Test Preparation produces the initial risk-based tests &
coverage, which drive the initial release and initial testing cycle; test
execution then continues through subsequent iterations or releases
(iterative cycles), with risk-based tests & coverage re-planned each cycle,
through to the final release.]

Core Testing References

„ Beck, Kent, “Test-Driven Development: By Example”, Addison Wesley, (2003)
„ Black, Rex, “Managing the Testing Process – 2nd Edition”, Wiley, (2002)
„ Black, Rex, “Critical Testing Processes: Plan, Prepare, Perform, Perfect”, Addison Wesley, (2004)
„ Copeland, Lee, “A Practitioner’s Guide to Software Test Design”, Artech House, (2004)
„ Crispin, Lisa and House, Tip, “Testing Extreme Programming”, Addison Wesley, (2002)
„ Dustin, Elfriede, “Effective Software Testing: 50 Specific Ways to Improve Your Testing”, Addison Wesley, (2003)
„ Galen, Bob, “Software Endgames: Eliminating Defects, Controlling Change, and the Countdown to On-Time Delivery”, Dorset House Publishing, (2005)
„ Kaner, Cem, Bach, James, and Pettichord, Bret, “Lessons Learned in Software Testing – A Context-Driven Approach”, Wiley, (2002)
„ Kaner, Cem, Falk, Jack, and Nguyen, Hung Quoc, “Testing Computer Software”, Wiley, (1999)
„ Larman, Craig, “Agile & Iterative Development – A Manager’s Guide”, Addison Wesley, (2004)
„ Mugridge, Rick and Cunningham, Ward, “FIT for Developing Software: Framework for Integrated Tests”, Prentice Hall, (2005)
„ Petschenik, Nathan, “System Testing with an Attitude: An Approach That Nurtures Front-Loaded Software Quality”, Dorset House, (2005)
„ Poppendieck, Mary & Tom, “Lean Software Development – An Agile Toolkit”, Addison Wesley, (2003)
„ Poppendieck, Mary & Tom, “Implementing Lean Software Development – From Concept to Cash”, Addison Wesley, (2006)

Core Web References

„ Context-Based Testing
‰ 4 Schools paper – http://www.io.com/~wazmo/papers/four_schools.pdf (Updated in
March 2007 for Factory School name change and to add the Agile School)
‰ Portal - http://www.compendiumdev.co.uk/page.php?title=context_driven_testing
‰ Bret Pettichord’s site – http://www.pettichord.com/
‰ http://www.context-driven-testing.com/

„ Just-in-Time (JIT) Testing


‰ Rob Sabourin has established the practice set called JIT. More on Rob at
www.amibug.com
‰ He’s presented the techniques at Star (www.sqe.com) and ST&P (www.stpcon.com)
conference. Look up information on past programs
‰ JIT techniques in ST&P article - http://www.stpmag.com/issues/stp-2005-06.pdf

Core Web References

„ Lean Software
‰ http://en.wikipedia.org/wiki/Lean_software_development
‰ Mary & Tom Poppendieck’s site – www.poppendieck.com
‰ Lean Software - http://www.leansoftwareinstitute.com/
‰ Rational Edge (good) article - http://www-
128.ibm.com/developerworks/rational/library/oct06/nelson/index.html

„ Exploratory Testing References


‰ http://en.wikipedia.org/wiki/Exploratory_test
‰ James Bach site – www.satisfice.com
‰ ET Explained - http://www.satisfice.com/articles/et-article.pdf
‰ ET in Pairs - http://www.kaner.com/pdfs/exptest.pdf
‰ Interesting blog - http://www.kohl.ca/blog/archives/000104.html
‰ ET Examples - http://www.testingeducation.org/k04/ExploratoryExamples.htm

Core Web References

‰ Jon Bach & Quardev whitepapers - http://www.quardev.com/whitepapers.html


‰ http://www.informit.com/articles/article.asp?p=405514&seqNum=5&rl=1
‰ James makes available his RST slides and reference materials at –
„ http://www.satisfice.com/rst.pdf

„ http://www.satisfice.com/rst-appendices.pdf

„ Risk-Based Testing References


‰ Florida Institute of Technology On-line course -
http://www.testingeducation.org/BBST/BBSTRisk-BasedTesting.html
‰ Solid research paper - http://www.amland.no/WordDocuments/EuroSTAR99Paper.doc
‰ http://www.csr.ncl.ac.uk/FELIX_Web/1A.R-BT%201.pdf
‰ http://home.c2i.net/schaefer/testing/risktest.doc
‰ Johanna Rothman’s Dev-to-Tester Ratio article –
http://www.jrothman.com/Papers/ItDepends.html
‰ Planning Poker - http://www.planningpoker.com/

Core Web References

„ Pareto Principle
‰ http://en.wikipedia.org/wiki/Pareto_principle
‰ Research paper - http://cmriindia.nic.in/PROCEEDINGS%20OF%20ACIT-
%202006/SESSION%201A/S%20Samaddar%20&%20S%20Bhattacharya.doc

„ All-Pairs
‰ Wikipedia - http://en.wikipedia.org/wiki/All-pairs_testing
‰ James Bach tool download - http://www.satisfice.com/tools.shtml
‰ Bernie Berger’s 2003 StarEast paper source -
http://www.stickyminds.com/getfile.asp?ot=XML&id=6488&fn=XDD6488filelistfilename
1.pdf
‰ James discussing the Microsoft PICT (free) All-Pairs generation tool –
„ http://www.satisfice.com/blog/archives/53
„ http://download.microsoft.com/download/f/5/5/f55484df-8494-48fa-8dbd-
8c6f76cc014b/pict33.msi
‰ http://pairwise.org
‰ PICT Overview - http://ripper.eeginc.com/mercury/1134/wed/37633_Merrell_330.pdf
‰ Constraints? http://www.testingeducation.org/wtst5/PairwisePNSQC2004.pdf
‰ Pairwise.org - http://www.pairwise.org/default.html
‰ Michael Bolton paper - http://www.developsense.com/testing/PairwiseTesting.html

Contact Info

Software Endgames: Eliminating Defects,


Controlling Change, and the Countdown to
On-Time Delivery published by Dorset House
in Spring 2005. www.rgalen.com for order
info, misc. related presentations, and papers.
Robert Galen
RGalen Consulting Group, L.L.C.
PO Box 865, Cary, NC 27512
919-272-0719
www.rgalen.com
bob@rgalen.com

Copyright © 2009 RGalen Consulting Group, LLC
