
AFFINITY DIAGRAM

What is it?

An affinity diagram is a technique for organizing verbal information into a visual pattern. An affinity
diagram starts with specific ideas and helps you work toward broad categories. This is the opposite of a
cause and effect diagram, which starts with the broad causes and works toward specifics. You can use
either technique to explore all aspects of an issue. Affinity diagrams can help you:

* Organize and give structure to a list of factors that contribute to a problem.

* Identify key areas where improvement is most needed.

How to use it

Identify the problem. Write the problem or issue on a blackboard or flipchart.

Generate ideas. Use an idea-generation technique to identify all facets of the problem. Use index cards or
sticky-back notes to record the ideas.

Cluster your ideas (on cards or paper) into related groups. Use questions like "Which other ideas are
similar?" and "Is this idea somehow connected to any others?" to help you group the ideas together.

Create affinity cards. For each group, create an affinity card, a card that has a short statement describing
the entire group of ideas.

Cluster related affinity cards. Put all of the individual ideas in a group under their affinity card. Now try to
group the affinity cards under even broader groups. You can continue to group the cards until your
definition of "group" becomes too broad to have any meaning.

Create an affinity diagram. Lay out all of the ideas and affinity cards on a single piece of paper or a
blackboard. Draw outlines of the groups with the affinity cards at the top of each group. The resulting
hierarchical structure will give you valuable insight into the problem.
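The clustering steps above can be sketched with plain Python data structures. The ideas and group names below are invented purely for illustration; a real session would use the team's own cards.

```python
# Hypothetical ideas, chosen only to illustrate the grouping steps.
ideas = ["Lighting", "Noise", "Chair Height", "Typing Skill",
         "Proofreading Skill", "Spelling", "Grammar"]

# Cluster related ideas and label each cluster with an affinity card
# (a short statement describing the whole group of ideas).
affinity_cards = {
    "Environment": ["Lighting", "Noise", "Chair Height"],
    "Skills": ["Typing Skill", "Proofreading Skill"],
    "Language": ["Spelling", "Grammar"],
}

# Cluster related affinity cards under even broader headings.
diagram = {
    "Workplace": ["Environment"],
    "People": ["Skills", "Language"],
}

# Sanity check: every original idea appears in exactly one group.
grouped = sorted(i for group in affinity_cards.values() for i in group)
assert grouped == sorted(ideas)
```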

Affinity Diagram Example

A publication team wanted to reduce the number of typographical errors in their program's
documentation. As part of a first step, they conducted a brainstorming session that produced the following
list of factors that influenced errors.

Computers, No Feedback, Noise
Printers, Typing Skill, Proofreading Skill
Lighting, Typewriters, Chair Height
Comfort, Desk Height, Time of Day
Font, Interruptions, Handwriting
Grammar, Slang, Spelling
Draft Copy, Punctuation, Distribution
Technical Jargon, Final Copy, Editing Skill
Computer Skill, Unreasonable Deadlines, No Measurement

The resulting affinity diagram, grouping these factors, helped them to focus on areas for further analysis.
BAR CHART (HISTOGRAM):

In statistics, a histogram is a graphical representation showing a visual impression of the distribution of data. It is an estimate of the probability distribution of a continuous variable and was first introduced by Karl Pearson [1]. A histogram consists of tabular frequencies, shown as adjacent rectangles erected over discrete intervals (bins), with an area equal to the frequency of the observations in the interval. The height of a rectangle is also equal to the frequency density of the interval, i.e., the frequency divided by the width of the interval. The total area of the histogram is equal to the total number of data points. A histogram may also be normalized to display relative frequencies; it then shows the proportion of cases that fall into each of several categories, with the total area equaling 1. The categories are usually specified as consecutive, non-overlapping intervals of a variable. The categories (intervals) must be adjacent and are often chosen to be of the same size.

Histograms are used to plot the density of data, and often for density estimation: estimating the probability density function of the underlying variable. The total area of a histogram used for probability density is always normalized to 1. If the lengths of the intervals on the x-axis are all 1, then a histogram is identical to a relative frequency plot.
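The two area properties above can be checked numerically. A minimal sketch using `numpy.histogram` (invented data; assumes NumPy is available):

```python
import numpy as np

# Hypothetical sample of a continuous variable.
data = [1.2, 1.9, 2.1, 2.4, 3.3, 3.7, 4.0, 4.8, 5.5, 5.9]

counts, edges = np.histogram(data, bins=5)               # raw frequencies
density, edges = np.histogram(data, bins=5, density=True)  # normalized

widths = np.diff(edges)  # bin widths

# Total frequency equals the number of data points.
print(counts.sum())                        # 10
# With density=True, the total area of the histogram is 1.
print(round((density * widths).sum(), 6))  # 1.0
```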

An alternative to the histogram is kernel density estimation, which uses a kernel to smooth
samples. This will construct a smooth probability density function, which will in general more accurately
reflect the underlying variable.

BRAINSTORMING:

Brainstorming is a group creativity technique designed to generate a large number of ideas for the solution of a problem. The method was popularized in 1953 by Alex Faickney Osborn in his book Applied Imagination. Osborn proposed that groups could double their creative output with brainstorming.

Although brainstorming has become a popular group technique, when applied in a traditional
group setting, researchers have not found evidence of its effectiveness for enhancing either quantity or
quality of ideas generated. Because of such problems as distraction, social loafing, evaluation
apprehension, and production blocking, conventional brainstorming groups are little more effective than
other types of groups, and they are actually less effective than individuals working independently.[2][3][4]
In the Encyclopedia of Creativity, Tudor Rickards, in his entry on brainstorming, summarizes its
controversies and indicates the dangers of conflating productivity in group work with quantity of ideas.

Although traditional brainstorming does not increase the productivity of groups (as measured by the
number of ideas generated), it may still provide benefits, such as boosting morale, enhancing work
enjoyment, and improving team work. Thus, numerous attempts have been made to improve
brainstorming or use more effective variations of the basic technique.

Professor Olivier Toubia of Columbia University has conducted extensive research in the field of
idea generation and has concluded that incentives are extremely valuable within the brainstorming
context.

From these attempts to improve brainstorming, electronic brainstorming stands out. Mainly through anonymization and parallelization of input, electronic brainstorming enforces the ground rules of effective brainstorming and thereby eliminates most of the deleterious or inhibitive effects of group work. The positive effects of electronic brainstorming become more pronounced with group size.

Ground Rules:

There are four basic rules in brainstorming.[1] These are intended to reduce social inhibitions among
group members, stimulate idea generation, and increase overall creativity of the group.

1. Focus on quantity: This rule is a means of enhancing divergent production, aiming to facilitate
problem solving through the maxim quantity breeds quality. The assumption is that the greater the
number of ideas generated, the greater the chance of producing a radical and effective solution.
2. Withhold criticism: In brainstorming, criticism of ideas generated should be put 'on hold'. Instead,
participants should focus on extending or adding to ideas, reserving criticism for a later 'critical stage' of
the process. By suspending judgment, participants will feel free to generate unusual ideas.
3. Welcome unusual ideas: To get a good and long list of ideas, unusual ideas are welcomed. They can
be generated by looking from new perspectives and suspending assumptions. These new ways of
thinking may provide better solutions.
4. Combine and improve ideas: Good ideas may be combined to form a single better idea, as suggested by the slogan "1+1=3". This is believed to stimulate the building of ideas by a process of association.

CAUSE AND EFFECT ANALYSIS:

The cause-and-effect diagram is a method for analysing process dispersion. The diagram's purpose is to relate causes and effects. There are three basic types: dispersion analysis, process classification, and cause enumeration. Effect = problem to be resolved, opportunity to be grasped, result to be achieved. Excellent for capturing team brainstorming output and for filling in from the 'wide picture'. Helps organise and relate factors, providing a sequential view. Deals with time direction but not quantity. Can become very complex. Can be difficult to identify or demonstrate interrelationships.

CUSTOMER RELATIONSHIP CHECKLIST:

A Checklist contains items that are important or relevant to a specific issue or situation.
Checklists are used under operational conditions to ensure that all important steps or actions have been
taken. Their primary purpose is for guiding operations, not for collecting data. Generally used to check
that all aspects of a situation have been taken into account before action or decision making. Simple,
effective.

DECISION ANALYSIS:

Decision Analysis (DA) is the discipline comprising the philosophy, theory, methodology, and
professional practice necessary to address important decisions in a formal manner. Decision analysis
includes many procedures, methods, and tools for identifying, clearly representing, and formally
assessing important aspects of a decision, for prescribing a recommended course of action by applying
the maximum expected utility action axiom to a well-formed representation of the decision, and for
translating the formal representation of a decision and its corresponding recommendation into insight for
the decision maker and other stakeholders.
Graphical representations of decision analysis problems commonly use influence diagrams and decision trees. Both of these tools represent the alternatives available to the decision maker, the uncertainties they face, and evaluation measures representing how well they achieve their objectives in the final outcome. Uncertainties are represented through probabilities and probability distributions. The decision maker's attitude to risk is represented by utility functions, and their attitude to trade-offs between conflicting objectives can be modelled using multi-attribute value functions or multi-attribute utility functions (if there is risk involved). In some cases, utility functions can be replaced by the probability of achieving uncertain aspiration levels. Decision analysis advocates choosing the decision whose consequences have the maximum expected utility (or which maximizes the probability of achieving the uncertain aspiration level). Such decision analytic methods are used in a wide variety of fields, including business (planning, marketing, and negotiation), environmental remediation, health care research and management, energy exploration, and litigation and dispute resolution.
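The maximum-expected-utility rule can be sketched in a few lines of Python. The alternatives, probabilities, and utilities below are invented for illustration:

```python
# Each alternative maps to a list of (probability, utility) outcomes,
# as at a chance node in a decision tree. Values are hypothetical.
alternatives = {
    "launch product": [(0.6, 100), (0.4, -50)],  # expected utility 40
    "delay launch":   [(1.0, 20)],               # expected utility 20
}

def expected_utility(outcomes):
    """Probability-weighted sum of utilities over the outcomes."""
    return sum(p * u for p, u in outcomes)

# Choose the alternative with the maximum expected utility.
best = max(alternatives, key=lambda a: expected_utility(alternatives[a]))
print(best)  # "launch product", since 40 > 20
```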

FLOW CHARTS:
Pictures, symbols or text are coupled with lines; arrows on the lines show the direction of flow. Enables modelling of processes, problems/opportunities, decision points etc. Develops a common understanding of a process by those involved. There is no particular standardisation of symbology, so communication to a different audience may require considerable time and explanation.

FORCE FIELD ANALYSIS:


Force field analysis is an influential development in the field of social science. It provides a
framework for looking at the factors (forces) that influence a situation, originally social situations. It looks
at forces that are either driving movement toward a goal (helping forces) or blocking movement toward a
goal (hindering forces). The principle, developed by Kurt Lewin, is a significant contribution to the fields of
social science, psychology, social psychology, organizational development, process management, and
change management.

Lewin, a social psychologist, believed the "field" to be a Gestalt psychological environment existing in an
individual's (or in the collective group) mind at a certain point in time that can be mathematically described
in a topological constellation of constructs. The "field" is very dynamic, changing with time and
experience. When fully constructed, an individual's "field" (Lewin used the term "life space") describes
that person's motives, values, needs, moods, goals, anxieties, and ideals.

Lewin believed that changes of an individual's "life space" depend upon that individual's
internalization of external stimuli (from the physical and social world) into the "life space." Although Lewin
did not use the word "experiential," (see experiential learning) he nonetheless believed that interaction
(experience) of the "life space" with "external stimuli" (at what he calls the "boundary zone") were
important for development (or regression). For Lewin, development (or regression) of an individual occurs
when their "life space" has a "boundary zone" experience with external stimuli. Note, it is not merely the
experience that causes change in the "life space," but the acceptance (internalization) of external stimuli.

Lewin took these same principles and applied them to the analysis of group conflict, learning,
adolescence, hatred, morale, German society, etc. This approach allowed him to break down common
misconceptions of these social phenomena, and to determine their basic elemental constructs. He used
theory, mathematics, and common sense to define a force field, and hence to determine the causes of
human and group behavior.

CONTROL CHARTS:
Control charts are a method of Statistical Process Control (SPC), a control system for production processes. They enable the control of the distribution of variation rather than attempting to control each individual variation. Upper and lower control and tolerance limits are calculated for a process, and sampled measures are regularly plotted about a central line between the two sets of limits. The plotted line corresponds to the stability/trend of the process. Action can be taken based on trend rather than on individual variation. This prevents over-correction/compensation for random variation, which would lead to many rejects.
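A minimal sketch of the limit calculation, using the common Shewhart convention of placing control limits three standard deviations from the central line (sample values are invented):

```python
import statistics

# Baseline samples taken while the process was known to be in control.
baseline = [10.1, 9.8, 10.0, 10.3, 9.9, 10.2, 9.7, 10.0, 10.1, 9.9]

center = statistics.mean(baseline)   # central line
sigma = statistics.stdev(baseline)
ucl = center + 3 * sigma             # upper control limit
lcl = center - 3 * sigma             # lower control limit

# New sampled measures are plotted against the limits; only points
# outside them trigger action, not ordinary random variation.
new_samples = [10.0, 10.2, 13.5, 9.9]
out_of_control = [x for x in new_samples if x > ucl or x < lcl]
print(out_of_control)  # only the 13.5 reading falls outside the limits
```

In practice the limits are computed from a baseline period judged to be in control, then held fixed while new samples are monitored, as done here.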

PARETO ANALYSIS:
The Pareto principle suggests that most effects come from relatively few causes. In quantitative
terms: 80% of the problems come from 20% of the causes (machines, raw materials, operators etc.); 80%
of the wealth is owned by 20% of the people etc. Therefore effort aimed at the right 20% can solve 80% of
the problems. Double (back to back) Pareto charts can be used to compare 'before and after' situations.
General use, to decide where to apply initial effort for maximum effect.
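The 80/20 screening step can be sketched as follows; the defect categories and counts are invented for illustration:

```python
# Hypothetical defect counts per cause.
defects = {"solder": 45, "misalignment": 25, "scratch": 12,
           "wrong part": 8, "missing screw": 6, "other": 4}

total = sum(defects.values())

# Sort causes by frequency (a Pareto chart's bar order) and accumulate
# until roughly 80% of all problems are covered.
cumulative = 0
vital_few = []
for cause, count in sorted(defects.items(), key=lambda kv: -kv[1]):
    cumulative += count
    vital_few.append(cause)
    if cumulative / total >= 0.8:
        break

print(vital_few)  # the few causes covering ~80% of defects
```

Effort applied to the causes in `vital_few` addresses the bulk of the problems, which is the point of the technique.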
