
Masters Level Module – Research Methods

Chapter 7

Principles of Analysing Data

Learning Outcomes

After studying this chapter, you should be able to:

1. Describe the role of data analysis in the research process as a whole.

2. Outline the core principles of data analysis.

3. Apply appropriate techniques for data analysis.

4. Discuss the principles which can be used to confirm or verify research findings.



7.1 Introduction
The analysis of captured data is traditionally the final exercise in
producing research results, leaving only their dissemination and
exploitation to complete an entire cycle of the research process. It is
during the analysis phase that the data are translated into information,
and the results are transformed into the findings. However, data analysis
is an integral part of the overall research process, and should
therefore be considered a component of the research itself rather than
an interruption to it or its simple culmination.

In this chapter, we examine the underlying principles of data analysis in

the context of the overall research process. The actual process of data
analysis is described in Chapter 7 of the module text, Research Methods
for Construction by Fellows and Liu (1997).

7.2 The Data Analysis Process And The Overall Research Process
The analysis of data has strategic relevance to the research process at
several points. It could even be argued that the analysis of data is one of
several longitudinal strands which affect the entire research process. This
begins with its formative role in the design of a researchable problem for
which data exists and is attainable. During the intermediate phase of the
process, the selection of the methodological approach and the
corresponding methodological technique will have been guided by the
anticipated analysis of the data and required results. Finally, the data
analysis supports the confirmatory function as the researcher compares
evidence with reflections and predictions.

By this stage of the module, you should be aware that the research
process is cyclical and involves a complexity of feedback and
feedforward relationships. These also manifest themselves via data
analysis in several contexts − research needs are identified as knowledge
needs, to be systematically resolved by the collection of targeted data and
its transformation into information. The resulting information supports
or challenges the existing field knowledge. It may also extend it, either
qualitatively or quantitatively, thereby extending the knowledge base and
offering the opportunity for the research cycle to recommence.

7.2.1 Approaching The Analysis Of Data

The process of analysing data should be characterised by flexibility and
rigour, and conducted in a reflective and systematic manner. The
alternative, of approaching the analysis of the data without any
recognition of the nature of its collection, or the preceding
conceptualisation which led to the design of that data collection, will
cause problems. Similarly, conceptualisation without recognition of the
limitations of data availability, or the process of its capture and analysis,
can also create problems for analysing the findings.


Record here your preliminary thoughts on what the purpose of research

data is and how this should guide the data analysis process.

7.2.2 The Frame Of Reference, The Explanatory Template, And The Analysis Of Data
The early stages of the research process which were concerned with the
conceptualisation and formation of the research question should have
generated a frame of reference for the data analysis. As the initial pilot
data is collected, the frame of reference can be used to assist in designing
the collection, storage and retrieval of information (the principles of

which were discussed in the first chapters when the concepts of
managing research information were first introduced). There may also
have been a sampling plan produced for the collection of data which will
correspond to the frame of reference. If so, this could assist in the
assessment of the data.


Using your chosen research topic, record your current frame of reference
here, and compare it with your initial frame of reference. How has it
changed in terms of assumptions, boundaries, identified processes,
variables, and theoretical constructs?

Record your thoughts here.

In the final data analysis phase, the conceptual frame of reference can be
used to produce templates for the description or explanation of the data
patterns which emerge. These templates will represent the link between
the researcher’s hypothetical conceptual framework and the analysed
findings. They should contain the key concepts which form the
conceptual model and the research question/hypothesis. These concepts
are usually described in terms of:

1. theoretical constructs which describe the hypothesised organisation

rules of the system under study

2. processes, which define the nature of operation of the system and the
way in which it changes or operates under change

3. contextual descriptions of the research field and focused research

topic, which will include a contextualised definition of where the
assumed boundary lies between the overall field and the focus of the
research project

4. the identification of the key dependent and independent variables and

their inter-relationships within the system and across the boundary

5. some hypothesised outcomes.

Looking, for instance, at (4), the key variables: depending on the style
of conceptual model being used, the explanatory template used for data
analysis may emphasise any or all of the following aspects:

a) the links between variables

b) their individual role

c) the criteria for their successful operation or influence on other
variables


d) the nature of the relationship between the independent variables and

the external agents of change which will be manipulated or observed
for the purposes of the study

e) the anticipated nature of outcomes that changes in the status of

independent variables may have on the system

f) any assumptions which are being incorporated into the research design

It may be useful to create the explanatory template graphically − this is

particularly useful for viewing the entire problem in a single picture. The
nature of relationships and priority variables can be visually overlaid and
checked. By overlaying the data onto a visual template, you will be able
to compare the completeness of coverage of your research study with the
nature of its focus.


Consider each of the key concepts involved in producing an explanatory

template. For each of them, record here what aspects you would include
in the explanatory template for your chosen research topic:

(1) theoretical constructs

(2) processes

(3) contextual descriptions, including the boundary to the research

(4) key dependent and independent variables and their inter-relationships

(5) hypothesised outcomes

7.2.3 Use Of Templates For Interim And Final Data Analysis
In a formative piece of research, where the purpose may be to generate
new theories, the descriptive or explanatory templates produced from
data analysis represent the refinement of the conceptual model. They may
even represent the generation of novel theory.

In an inductive piece of research conducted using a qualitative approach,

there may be several cycles of generating explanatory templates from
analysed data which then feed into new conceptual propositions, thereby
allowing the research concepts to evolve. A reiterative process emerges
of generating, reviewing, and altering the explanatory templates. This
will inform the development of the conceptual theory. By comparing the

various generations of conceptual framework and explanatory template,
the nature of changes in the research conceptualisation can be seen
together with any contradictions, overlaps, or refinements required for
the conceptual model. Assumptions and qualifications of findings can
also be identified using this technique.

Further data collection may allow new insights, which in turn lead to the
development of new explanatory templates, and so the process goes on.
Hence the interim assessment of data allows the progression of the
research concepts and makes an important contribution to the overall
research process.

At the conclusion of the interim cycles of data analysis, the findings

should be described and stored using the same conceptual frame of
reference which was used to describe the conceptualisation of the
research field. If the initial conceptualisation was robust, it will provide a
solid mechanism for the comparison of the findings with the existing
field knowledge.


Creating and Using a Template for Designing Data Analysis

This activity requires you to create a preliminary version of a data

analysis template from your conceptual frame of reference. There are
three steps in the process. It will take a little while to do them properly,
but it is potentially a very valuable exercise.

Step 1: Create a preliminary version of a data analysis template by

reconfiguring the conceptual frame of reference to indicate:

theoretical constructs which describe the hypothesised organisation
rules of the system under study

processes, which define the nature of operation of the system and the way
in which it changes or operates under change

contextual descriptions of the research field and focused research topic,

which will include a contextualised definition of where the assumed
boundary lies between the overall field and the focus of the research
project

the identification of the key dependent and independent variables and

their inter-relationships within the system and across the boundary

your hypothesised outcomes

It is best to do this graphically and to indicate clearly which are

theoretical constructs, processes, contextual descriptions, etc.

Step 2: Next highlight the data needs and data analysis opportunities
associated with each of these five issues. It is also best to do this
graphically and to use different colours for each issue.

Step 3: For each highlighted part of the diagram, summarise where your
data analysis opportunities lie. By studying the template in detail,
describe what you are trying to establish using the data (for example,
patterns, comparisons, clusters of evidence, identification of inter-
relationships between variables; or new conceptual links).

If any obvious limitations in your data set emerge, consider what you
can do about them.

7.2.4 The Underlying Principles Of Data Analysis

Data analysis represents one of the most significant steps in realising the
potential richness of a research project. Data collection and analysis
should have been considered when the research question was selected,
and again when the template for its analysis was devised.

Consider the following specific issues when approaching the analysis of

data, noting that this section relates to the principles of data analysis, not
the detail of particular techniques.

7.2.5 Overall Purposes Of Data Analysis

It is conventional to analyse data to look for meaning using a range of
parameters. These might include seeking patterns or themes to help
develop an outline conceptual theory. In a more defined field, you may

be searching for contrasts and comparisons within the data set and/or
between this data set and others. In an advanced field with a solid
theoretical foundation, the data may be assessed for conceptual
coherence, or examined for logical chains of evidence based upon the
existing theory.

The analysis techniques applied to the data will differ according to the
purpose of the analysis, and according to whether the data analysis is
being carried out within a qualitative or a quantitative approach to the
research. Consider in particular whether you are exploring the data for
trends or emergent theories, or whether you expect to be analysing data
to confirm an existing theory. This distinction is one of exploratory
data collection versus confirmatory data collection.

Exploratory data collection suits an exploratory research project, perhaps

undertaken for the purposes of theory building. Common instances
include qualitative data analysis from semi-structured or unstructured
interviews; or the analysis of anecdotal information or narratives for their
emergent trends, perhaps by replication or recurrence of identifiable
generic themes.

In contrast, confirmatory data is used for the specific provision of

evidence to support taking a theory further than it was at the start of
the research, or to support its calibration as an applied tool (rather than a
research tool, say).

Consider also the issue of data correlation. Think about the way in which
you will make comparisons within the data set. Are you looking, for
instance, at multiple cases with the aim of identifying overall patterns
or of making cross-case comparisons?

7.3 The Nature Of The Raw Data

Is the amount of data which has been collected so great as to cause a
logistic problem in terms of the handling or analysis process? If there is
an overload, it may be possible to selectively re-focus the analysis to

make it manageable and produce a richer data analysis in a narrower
context. Before doing so, study the initial research question carefully to
understand the implications of adjusting the frame of reference for the
data at this stage. It may be better to apply for further
funding/resources to allow the intended thoroughness and scope of data
analysis to be achieved.

Consider also the specific issue of data overload. Is the amount of data
that you have collected actually required for the purposes of its
originally-intended application? It is quite common for researchers to
generate immense data sets as a precaution against the charge of
under-evidencing their research. This often has the counter-productive
effect of swamping the researcher with data sets that are problematic
to assess but add little or no extra value to the research. This may be
an issue of rigour or simply a problem of scale. Looking carefully at
the limiting assumptions placed on the data collection exercise may
reveal a previously unforeseen distinction between large portions of
the data set, which could provide a valuable and sustainable means of
dividing the data to make it more manageable.

Look also at whether the data are structured or unstructured. Highly-
organised data will be open to a prescriptive, perhaps statistically-based
analysis. Unstructured data will not. The structuring of the data will be
determined by the methodological approach used for the data collection,
and also by the quality of the coding and retrieval system which you
created during the research project.

Are your data sources multiple or single, and does this create consistency
problems? Data gathered by more than one researcher should be
collected using a very tight procedural and coding design to prevent
differences in the collection or processing of the raw data occurring.
Making the links between the different data sources can be difficult
unless a generic set of criteria can be established for their assessment. If
this was not done at the formative stage of the research, then linking
the sources retrospectively will be very problematic.


7.3.1 The Processing Of Data

What is the appropriate fineness of analysis? This is a matter which
should have been considered at the early stage of the research and have
informed the choice of rigour for the methodological technique which
captured the data. In addition to matching the fineness of the data
analysis to the precision of the theory or question which it informs, is
your data fineness consistent and rich enough for the proposed analysis?

In an emerging field, the methodologies and data may be relatively

coarse and in these circumstances, data can be legitimately gathered
using a low level of rigour without necessarily compromising the
usefulness of this form of research. It is therefore important to match the
level of analysis and definition of correctness with the field as a whole
and the research project in particular.

7.3.2 Robustness Of Data

Consider issues such as internal and external validity, which can be
affected during the process of designing the collection of data, the
collection of data, and/or its analysis.

Internal validity relates to the correctness of the conceptual model or

experimental design which is being used to capture the data. An error in
the internal assumptions and selection of variables can produce errors of
a gross magnitude in findings if a causal link is unidentified (or of a
lesser scale if the nature of a link is more sensitive than anticipated).
Internal validity problems can also combine (invisibly) to
create a superficially plausible set of results when in reality the
underlying internal model could be fatally flawed. Such a fatal flaw may
only show up on cross-analysis between data generated from different
sets. If the combination of internal validity issues remains constant,
it may not be discovered at all.

Internal validity issues can also arise from the inappropriate handling of
the data, such as the scope and scale, the detail of recording of the data
and assumptions used for its collection and processing.


External validity is a related issue, but is concerned with factors which
are outside the control of the researcher. For example, the collection of
qualitative case data for an action research project may fail to identify
a profoundly important variable which is affecting the system, resulting
either in an apparently inconsistent effect for the (known) causes, or
even over-attribution of the effect to the planned cause. In a
highly-defined research
field with a body of existing data, it may be necessary to demonstrate
statistically that the data is correct and of the appropriate form.

Finally, review the overall ‘correctness’ of the data analysis, which can
be assessed in a number of ways. In attempting to analyse your data,
consider whether the data analysis feels appropriate for the purposes.
Does the proposed fineness of analysis and scale of data correspond with
the rigour of the research methodology and/or the conceptual detail
underpinning the research question? Are there assumptions in the
research design which affect the validity or reliability of the hypothesis
which the data is being used to test (and therefore the applicability of the
data)? Does the initial analysis match or contradict the picture in your
mind with which you started? Is the data collection and analysis process
trustworthy and credible?

If you do discover validity problems, explore them and be open about

discussing them. Rather than indicating a weakness in the research, this
could be a step forward in modelling the concepts upon which the data
collection was planned.

7.3.3 The Practice Of Data Analysis

For appropriate techniques of data analysis, you should now read Chapter
7 of the module text by Fellows and Liu. This will give you some ideas
about the best way of analysing the data that will be generated by your
research topic. After reading this section, you are advised to consult with
your supervisor to confirm that the proposed techniques are the most
appropriate for your situation.


7.4 Verifying Findings

It is essential to check data quality before being conclusive about your
findings. In extreme cases it will be clear that there is a problem − either
the data appears to directly contradict the expected findings but for no
good reason, or sometimes it simply fits too well. Usually researchers
find themselves in an intermediate position where the data is less than
entirely conclusive, but there is apparently some pattern or confirmation
emerging from the data analysis.

There are a number of principles which can be applied to appraising the

data quality. In some cases, it is clear from the collection phase that there
is some compromise in the data quality. Where this is fatal to its
usefulness, it may become clear very early on and an interim analysis of
the data will help highlight the problem and its seriousness to the test of
an hypothesis or the development of a proposal. Adjustments may then be
possible.

Assuming that the data appears superficially to be sound, the next issue
that arises is how to confirm the findings. Miles and Huberman (1994)
provide an excellent and very readable review of verification and
confirmation tests in their book entitled Qualitative Data Analysis: An
Expanded Sourcebook, on which the remainder of this section is based.
They indicate thirteen tools for confirming findings, which tend to
overlap considerably, and we have grouped them together here as six
distinct approaches.

7.5 Representativeness
Is the set of results typical, and does it represent a generalisable finding?
Using a non-representative sample of interviewees or questionnaire
respondents (perhaps because they were easily accessed), generalising
from non-representative events, or drawing inferences from
non-representative processes could each restrict the representativeness
of the findings.
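The representativeness of a sample can be given a rough quantitative check. The sketch below (invented figures and hypothetical respondent groups, not from the module text) compares the composition of an achieved sample against known population proportions before any generalisation is attempted.

```python
# Hypothetical check of sample representativeness: compare the make-up
# of the achieved sample against known population figures. Both the
# groups and the numbers are invented for illustration.
population_share = {"contractors": 0.50, "consultants": 0.30, "clients": 0.20}
sample_counts = {"contractors": 40, "consultants": 12, "clients": 8}

n = sum(sample_counts.values())
# Positive gaps mark over-represented groups, negative gaps
# under-represented ones.
gaps = {group: sample_counts[group] / n - share
        for group, share in population_share.items()}

for group, gap in gaps.items():
    print(f"{group}: gap {gap:+.2f}")
```

Large gaps would suggest either re-weighting the analysis or qualifying any generalisation from the findings.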


7.6 Researcher Effects

First, the presence of the researcher may create responses in interviewees
which distort the data as they are collected. They can become very
defensive, or over-enamoured with the importance of their opinion.
Either way, the data are likely to be distorted in some way at collection.

Secondly, the case being examined can affect the perception of the
researcher. This is a similar phenomenon to that which can occur in
literature reviews, where the first profound document that the researcher
reads can establish a mindset against which other subsequent theories are
compared. The researcher is no longer ‘disinterested’.

7.7 Triangulation
This is the use of an independent means of assessing the findings, in
other words an alternative perspective. The aim is to test whether the
independent measures agree with, or at least do not contradict, the
findings of the research. This is useful for testing whether it is
possible to
replicate a finding, for instance by using cross-case analysis, or in
extreme cases by re-running the test completely with new data sources.
This is most relevant in the qualitative approach for the replication
of generalisable or thematic findings. The Scientific Method applies
this approach as a common measure for experimental verification.

Good triangulation sources are characterised by having different biases
or different strengths, so that they compensate for those inherent in
the research project under review. Triangulation may thereby provide
corroboration.
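Triangulation can be made concrete with a simple agreement check. The following sketch (hypothetical cases and labels, invented for illustration) compares how two independent measures classify the same set of cases:

```python
# Hypothetical triangulation check: two independent measures (say,
# interview coding and document analysis) each classify the same cases
# as showing or not showing a theme. All labels are invented.
interviews = ["present", "present", "absent", "present", "absent", "present"]
documents  = ["present", "absent",  "absent", "present", "absent", "present"]

# Count the cases on which the two independent measures agree.
agreements = sum(a == b for a, b in zip(interviews, documents))
agreement_rate = agreements / len(interviews)
print(f"{agreements}/{len(interviews)} cases agree ({agreement_rate:.0%})")
```

A high agreement rate corroborates the finding; the disagreeing cases are worth examining individually, since they may reveal the different biases of the two sources.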

7.8 Weighting Research Evidence

All data are equal, but some data are more equal than others... Miles
and Huberman suggest that the following facets contribute to stronger
data, which should be given greater weighting: data which are collected
later, or after repeated contact; data which are seen or reported first
hand; observed behaviour; data from a field worker whose reports are
known to be trustworthy; information collected in an informal setting;
and information collected from the respondent when they are alone
rather than in a group.
An extension of weighting of research evidence can be made by looking

for negative evidence. Here you actively look for reasons to refute the
findings and hence to disconfirm them. This exercise helps strengthen
the conceptual analysis of findings, and is similar to exploring rival
explanations; both could be conducted using a negative brainstorming
approach. Ruling out spurious relations is another application of rival
explanations. It involves checking the apparent cause-effect
relationship yielded by the study to see if any other unforeseen links
between variables are causing or contributing to the observed effect.

7.9 Looking for ‘Outliers’

Outliers are defined by Miles and Huberman as extreme responses which
tend to give increased credibility to the bulk of the findings. In statistical
terms, they are the results which at first view you may wish were not
there, but on more detailed review help to define and confirm the
remainder of the findings − ‘the exception that proves the rule’.
Example sources of outliers include novel or unique processes or
treatments, and unusual events. Outliers appear to confound
generalisability; extreme cases operate like outliers. Following up
surprises is also related, but here the outliers are conceptual rather
than merely data-based, and therefore warrant particular consideration.
Miles and Huberman also suggest getting feedback from informants as a
further check on the findings.
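By way of illustration, one common statistical definition of an outlier is the 1.5 × IQR rule. The sketch below uses invented response data and is offered only as an example of flagging extreme values for closer review, not as the module's prescribed method:

```python
from statistics import quantiles

# Invented response data containing one extreme value.
responses = [12, 14, 15, 15, 16, 17, 18, 19, 20, 45]

# Quartiles via linear interpolation over the data range.
q1, _, q3 = quantiles(responses, n=4, method="inclusive")
iqr = q3 - q1
low, high = q1 - 1.5 * iqr, q3 + 1.5 * iqr

# Flag values outside the 1.5 * IQR fences for closer review.
outliers = [r for r in responses if r < low or r > high]
print(outliers)
```

Flagged values should then be reviewed individually, since (as noted above) an outlier may confirm the remainder of the findings rather than undermine them.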

7.10 If-Then Tests

These are a method of formalising the treatment of propositions. They
can be applied as a qualitative test to examine the application limits of a
finding, or earlier in the research process for the exploration of
propositions.
7.11 Summary
This chapter has dealt with designing and conducting the analysis of
research data, and with confirming the findings which emerge. This
exercise can occur
and reoccur throughout the research process. Once the final analysis of
results has been made it is next necessary for the research to be reported.
The final chapter of the workbook will deal with writing reports and
disseminating the findings of the research.

7.11.1 Personal Feedback Questions

PF 7.1

What is the difference between exploratory use of data analysis and

confirmatory use of data analysis?

PF 7.2

Summarise the key principles of a robust data analysis process.

PF 7.3

How would you deal with a mismatch between the collected data and the
data analysis parameters?

PF 7.4

Define representativeness in research findings.


PF 7.5

How can the researcher affect the research findings?

PF 7.6

What is triangulation in the context of analysing research findings?

PF 7.7

How do you weight research evidence?

PF 7.8

What are outliers and how can they affect the research findings?

PF 7.9

How can you use Extreme Cases to confirm findings?

PF 7.10

What are If-Then Tests used for?

7.11.2 Recommended Reading

Bell, J. 1993 Doing Your Research Project: A Guide For First Time
Researchers in Education and Social Science Open University Press (2nd
Edition) ISBN 0-335-19094-4. See Chapter 11 Interpretation and
Presentation of The Evidence (pp 127-150).

Mark, R., 1996 Research Made Simple: A Handbook for Social Workers.
Sage Publications. ISBN 0-8039-7427-2. Chapter 15 How to Analyze
Data (pp 300 - 321), deals with detailed data handling tools as well as
principles. See also Chapter 16 Statistical Hypothesis Testing (pp 322 -
343); and Appendix C, Presenting Data in Tables and Figures (pp 370 -


Mason, J., 1996 Qualitative Researching Sage Publications ISBN 0-

8039-8986-5. See Chapter 7 Producing Analyses and Explanations
Which are Convincing (pp 135 - 163).

Maxwell, J., 1996 Qualitative Research Design: An Interactive

Approach. Sage Publications Applied Social Research Methods Series
Vol. 41. ISBN 0-8039-7329-2. See Chapter 6 Validity: How Might You
Be Wrong? (pp 86 - 98).

Miles, M. B., Huberman, A.M., 1994 Qualitative Data Analysis: An

Expanded Sourcebook . Sage Publications. ISBN 0-8039- 5540-5. As
discussed in this chapter.

Further Reading on Data Analysis

Altheide, D.L., 1996 Qualitative Media Analysis Sage University Paper,

Qualitative Research Methods Series 38. ISBN 0-7619-0199-X. For
document analysis.

Bryman, A., 1993 Quantity and Quality in Social Research Routledge

Contemporary Social Research Series No 18. ISBN 0-415-07898-9. See
Chapter 5 The Debate About Quantitative and Qualitative Research (pp
93 - 126).

Howard, K., Sharp, J.A., 1983 The Management of a Student Research

Project Gower ISBN 0-566-00613-8. See Chapter 5 Analysing The Data
(pp 99 - 119).

Kvale, S., 1996 Interviews: An Introduction to Qualitative Research

Interviewing Sage Publications ISBN 0-8039-5820-X. See Chapter 11
Methods of Analysis (pp 187 - 209).

Moroney, M.J., 1980 Facts from Figures Pelican ISBN 0-14-02-0236-8.

Oppenheim, A.N., 1996 Questionnaire Design, Interviewing and Attitude

Measurement Pinter. ISBN 1 85567 044 5. See Chapter 14 Data
Processing (pp 261 - 278) and Chapter 15 Statistical Analysis (pp 279 -


Sapsford, R., Jupp, V., 1996 Data Collection and Analysis Sage
Publications ISBN 0-7619-5046-X. Authoritative.