
Exercise 5

Empirical Model Building and Methods


(Empirische Modellbildung und Methoden)
Liliana Guzmán
SS 2012
Chapter 3.3 Design
Exercise 5
Purpose
Gathering experience in
Operationalizing variables
Formulation of hypotheses
Sampling
Experimental design
Analysis of validity threats
Preparing for the examination by
Reviewing the most important aspects of the design phase
(using an example!)
Slide 2
Overview of design phase
Purpose
Operationalizing variables and formalizing hypotheses
Specifying experimental design: who? what? how? when?...
Steps
1. Operationalization: specifying variables to make them observable/measurable
2. Formalization of hypotheses: specifying hypotheses to make them testable (challengeable)
3. Sampling: specifying who can participate and how to get participants
4. Selecting experimental design: specifying what will be done and observed/measured, how, by whom, and when
5. Selecting & designing data collection methods
6. Selecting & designing material
7. Analyzing design validity/reliability: how good is the design? Can we draw conclusions? For which context?
Slide 3
Experimental terminology (Example!!!!)
Research goal
Analyze notations for requirement specification to compare them w.r.t.
efficiency, effectiveness and acceptance from the perspective of
requirement engineers in the context of IS development.
Object of study: Requirement notation
New graphical notation
Control: Structured natural languages (e.g. use cases) or graphical
notation (e.g. activity or sequence diagrams)
Population: Requirement engineers
Sample: Novices, e.g. students of the requirements engineering lecture.
Research purpose: comparison → (quasi-)experimental design
Setting: IS → material should be representative of this domain!
Variables: notations, efficiency, effectiveness and acceptance
SoP (state of the practice) and SoA (state of the art) support the selection of the control.
Slide 4
Experimental terminology (Example!!!!)
Research goal
Analyze notations for requirement specification to compare them w.r.t.
efficiency, effectiveness and acceptance from the perspective of
requirement engineers in the context of information system
development.
Variables: notations, efficiency, effectiveness and acceptance
Dependent variable (expected variation, response)
Efficiency, effectiveness and acceptance
Independent variable (expected cause(s))
Requirement notation
Control/Confounding variable
Characteristics of the sample, e.g. experience, skills, attitude
Characteristics of the setting and material, e.g. problems to be
modeled and tool support
Laboratory conditions, e.g. time, noise, light, fire alarm
Slide 5
Experimental terminology (Example!!!!)
Research goal
Analyze notations for requirement specification to compare them w.r.t.
efficiency, effectiveness and acceptance from the perspective of
requirement engineers in the context of information system
development.
Underlying hypotheses (expected relationship among variables!)
Requirements engineers using the new requirement notation are
more efficient than using activity diagrams.
Requirements engineers using the new requirement notation are
more effective than using (e.g.) activity diagrams.
Requirements engineers accept the new requirement notation more than (e.g.) activity diagrams.
Based on SoA and SoP we can specify the type of difference, i.e. more than, less than, equal to, at least as ... as, ...
Slide 6
Operationalization (Example!!!!)
Concept: Efficiency
Variable(s): Time required by a requirements engineer for modeling a set of requirements; time measured in minutes
Instrument(s): Time sheet (log file)
Concept: Effectiveness
Variable(s): Degree to which a requirements engineer correctly models a set of requirements
- Total N of defects: n ∈ ℕ
- N of defects per type: (c, n) with c ∈ C and n ∈ ℕ,
  C := {missing, extraneous, ambiguous, inconsistent, incorrect & miscellaneous information}
Instrument(s): Defect report form
Slide 7
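A minimal sketch (not part of the slides) of how the operationalized metrics above could be computed from raw records; the record layout and field names (e.g. "type") are assumptions for illustration only.

```python
# Sketch: per-subject efficiency and effectiveness metrics from hypothetical raw data.
from collections import Counter

DEFECT_TYPES = {"missing", "extraneous", "ambiguous",
                "inconsistent", "incorrect", "miscellaneous"}

def efficiency_minutes(start_minute, end_minute):
    """Efficiency metric: time (in minutes) needed to model the set of requirements."""
    return end_minute - start_minute

def effectiveness(defect_reports):
    """Effectiveness metrics: total N of defects and N of defects per type (c, n), c in C."""
    per_type = Counter(r["type"] for r in defect_reports if r["type"] in DEFECT_TYPES)
    return {"total_defects": sum(per_type.values()),
            "defects_per_type": dict(per_type)}

# Made-up data for one subject:
reports = [{"type": "missing"}, {"type": "ambiguous"}, {"type": "missing"}]
print(efficiency_minutes(10, 55))   # -> 45 minutes
print(effectiveness(reports))       # -> total 3 defects, counted per type
```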
Operationalization (Example!!!!)
Example of time sheets
In:
http://www.dummies.com/how-to/content/how-to-monitor-work-effort.html
http://www.corasystems.com/capabilities/project-time-sheet-tracking-software/
Slide 8
Operationalization (Example!!!!)
Example of defect report form
In:
http://www.cs.umd.edu/projects/SoftEng/ESEG/manual/pbr_
package/download.html
On the above webpage you will also find an example of an experimental design, including all material!
Slide 9
Operationalization (Example!!!!)
Acceptance
From the SoA: Unified Theory of Acceptance and Use of Technology (UTAUT)
Venkatesh, V.; Morris, M. G.; Davis, G. B.; Davis, F. D. (2003), "User Acceptance of Information Technology: Toward a Unified View", MIS Quarterly, 27(3), pp. 425–478. For the purpose of this exercise, we are interested in the final model! Nevertheless, this paper is also a good example regarding model building.
Slide 10
Operationalization (Example!!!!)
Variable(s)
Performance expectancy
Perceived usefulness := The degree to which a person believes
that using a particular system would enhance his or her job
performance.
Extrinsic motivation := The perception that users will want to perform
an activity because it is perceived to be instrumental in achieving
valued outcomes that are distinct from the activity itself, such as
improved job performance, pay, or promotions.
Job fit := How the capabilities of a system enhance an individual's job performance.
Relative advantage:= The degree to which using an innovation is
perceived as being better than using its precursor.
...
Effort expectancy
Social influence
Facilitating conditions
Behavioral intention
Self-efficacy
Anxiety
Attitude toward using technology
For the remaining definitions, see Venkatesh et al. (2003).
Slide 11
Operationalization (Example!!!!)
Variable(s)
Slide 12
Operationalization (Example!!!!)
Variable(s)
Slide 13
Operationalization (Example!!!!)
Instruments
Questionnaire
...
Question 10: Considering your performance expectancy with respect to <the new requirement notation/use cases>, to what degree do you agree or disagree with the following statements:
a) I would find the system useful in my job.
1: Strongly agree
2: Agree
3: Neither agree nor disagree
4: Disagree
5: Strongly disagree
b) Using the system enables me to accomplish tasks more quickly.
Strongly agree
Agree
Neither agree nor disagree
Disagree
Strongly disagree
Interval scale! It allows descriptive and tendency analysis
(H0: M1 = M2, M1 ≥ M2 or M1 ≤ M2; M := median)
Ordinal scale! It allows only descriptive analysis, e.g. frequency analysis, mode, ...
Slide 14
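A minimal sketch (not from the slides) of the kinds of analysis the two scale readings allow, using hypothetical coded answers (1 = strongly agree ... 5 = strongly disagree) for one item and one group of subjects.

```python
# Sketch: descriptive and tendency summaries for 5-point questionnaire answers.
from collections import Counter
from statistics import median, mode

answers = [1, 2, 2, 3, 1, 2, 4, 2, 3, 1]   # hypothetical coded responses

frequencies = Counter(answers)              # descriptive analysis (ordinal scale)
print("frequencies:", dict(frequencies))
print("mode:", mode(answers))               # most frequent answer

# Treating the scale as allowing a central-tendency analysis:
print("median:", median(answers))
```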
Formalization of hypotheses (Example!!!!)
Requirements engineers using the new requirement notation are more efficient than using activity diagrams.
H1: μ_NRN,efficiency > μ_A,efficiency
H0: μ_NRN,efficiency ≤ μ_A,efficiency
Requirements engineers using the new requirement notation are more effective than using (e.g.) activity diagrams.
H1: μ_NRN,effectiveness > μ_A,effectiveness
H0: μ_NRN,effectiveness ≤ μ_A,effectiveness
Requirements engineers accept the new requirement notation more than (e.g.) activity diagrams.
H1: μ_NRN,acceptance > μ_A,acceptance
H0: μ_NRN,acceptance ≤ μ_A,acceptance
But, e.g., efficiency → time! more efficient → less time!
Be aware that the formalization of hypotheses depends on the operationalization!
Slide 15
Formalization of hypotheses (Example!!!!)
Requirements engineers using the new requirement notation are more efficient than using activity diagrams.
Efficiency → time!
H1: μ_NRN,time < μ_A,time
H0: μ_NRN,time ≥ μ_A,time
Requirements engineers using the new requirement notation are more effective than using (e.g.) activity diagrams.
Effectiveness → total N of defects and N of defects per type!
H1: μ_NRN,defects < μ_A,defects
H0: μ_NRN,defects ≥ μ_A,defects
H1: μ_NRN,defects,category_i < μ_A,defects,category_i
H0: μ_NRN,defects,category_i ≥ μ_A,defects,category_i
with i := missing, extraneous, ambiguous, inconsistent, incorrect and miscellaneous information
How do you formalize the hypothesis(es) concerning acceptance?
Slide 16
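A minimal sketch (not part of the slides) of how the one-sided time hypothesis above could be tested once data are collected; the sample values are made up, and a Welch t-test is only one possible test choice (the one-sided alternative argument requires SciPy ≥ 1.6).

```python
# Sketch: testing H1: mu_NRN,time < mu_A,time with hypothetical modeling times (minutes).
from scipy import stats

time_nrn = [38, 42, 35, 47, 40, 39, 44, 36]   # new requirement notation group
time_ad  = [45, 50, 48, 41, 52, 47, 49, 46]   # activity diagram group

# Welch's t-test, one-sided: is the NRN mean time smaller than the AD mean time?
t, p = stats.ttest_ind(time_nrn, time_ad, equal_var=False, alternative="less")
print(f"t = {t:.2f}, p = {p:.4f}")            # reject H0 if p < alpha (e.g. 0.05)
```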
Sampling (Example!!!!)
What sampling and sample type will be used?
Population: Requirement engineers
Sample: Novices, e.g. students of the requirements engineering lecture.
Why? To avoid bias due to high experience with activity diagrams.
[Figure: population U of requirement engineers, with the subgroups of practitioners and novices]
How do you select
subjects from the
population?
Probability sampling
- random, systematic, stratified
Non-probability sampling
- quota, convenience
Be aware that we distinguish between randomization by:
1. Selecting subjects from the population
2. Assigning subjects to experimental treatments → this determines whether the study is an experimental or a quasi-experimental design!
Slide 17
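A minimal sketch (an assumption, not from the slides) of randomly assigning the sampled novices to the two treatments; as noted above, randomized assignment to treatments is what makes the study a true experiment rather than a quasi-experiment. The subject IDs are hypothetical.

```python
# Sketch: random assignment of subjects to the two treatments (NRN vs. activity diagrams).
import random

subjects = [f"S{i:02d}" for i in range(1, 21)]   # hypothetical subject IDs

random.seed(42)             # fixed seed so the assignment is reproducible
random.shuffle(subjects)
half = len(subjects) // 2
groups = {"NRN": subjects[:half], "AD": subjects[half:]}
print(groups)
```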
Sampling
What sample size is required?
Depends on:
Type of hypotheses: difference, change or causal
Expected effect size
Statistical test to be used, and ...
Slide 18
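A minimal sketch (assumption, not from the slides) of an a-priori sample-size estimate for the two-group time comparison; the effect size, significance level, power and choice of test are all assumed values for illustration.

```python
# Sketch: a-priori sample-size estimate for an independent two-sample t-test.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
n_per_group = analysis.solve_power(effect_size=0.8,      # assumed (large) Cohen's d
                                   alpha=0.05,            # significance level
                                   power=0.8,             # desired statistical power
                                   alternative="larger")  # one-sided test
print(round(n_per_group))  # subjects needed per group under these assumptions
```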
Sampling (Example!!!!)
What characteristics of the subjects should be collected?
Individual attributes? e.g.
Languages: it is assumed that master and bachelor students have an average English level.
Education: master and bachelor students from different countries
- Highest education degree, major and the corresponding university
Experience:
- In general: experience in software development, requirement elicitation, requirement documentation, requirement inspection, ...
- In particular: experience in graphical notations, activity diagrams, ...
What about gender, age, nationality, ...?
Slide 19
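A minimal sketch (not from the slides; all field names are assumptions) of a per-subject record for the individual attributes listed above, as they might be stored after the demographic questionnaire.

```python
# Sketch: a per-subject record of collected individual attributes.
from dataclasses import dataclass, field

@dataclass
class SubjectProfile:
    subject_id: str
    english_level: str                                         # e.g. "average"
    degree: str                                                # highest education degree
    major: str
    university: str
    experience_general: dict = field(default_factory=dict)     # e.g. {"requirement elicitation": "1 year"}
    experience_notations: dict = field(default_factory=dict)   # e.g. {"activity diagrams": "none"}

profile = SubjectProfile("S01", "average", "BSc", "CS", "TU X",
                         {"requirement elicitation": "1 year"},
                         {"activity diagrams": "basic"})
print(profile.subject_id, profile.experience_notations)
```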
Sampling (Example!!!!)
What characteristics of the subjects should be collected?
Project attributes?
Type, size
Team structure
Development environment
Application domain
What about organizational attributes?
Slide 20
Sampling
What characteristics of the subjects should be collected?
How? Demographic test:
One or more questionnaires with open and closed questions
If information is required for assigning subjects to treatments, ask for the corresponding information before or during the training.
Ask for the remaining information at the end of the study.
When should we use open and closed questions? Why?
Slide 21
Sampling (Example!!!!)
What characteristics of the subjects may be considered as
confounding or control variables?
Confounding variables
e.g. experience
When do you identify and analyze them? Why?
During design and analysis
What can you do if you identify a potential confounding variable during design?
Make it constant
Transform it into an independent variable (factor)
Use parallelization or matched sampling
Take the risk
(→ control variables)
How do you explore confounding variables during data analysis?
Slide 22
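A minimal sketch (assumption, not from the slides; the data are made up) of exploring a potential confounder such as experience during analysis by comparing the outcome within experience strata.

```python
# Sketch: stratifying the outcome by a potential confounder (experience).
import pandas as pd

data = pd.DataFrame({                      # hypothetical per-subject records
    "notation":   ["NRN", "NRN", "AD", "AD", "NRN", "AD"],
    "experience": ["low", "high", "low", "high", "low", "low"],
    "time_min":   [40, 33, 48, 41, 42, 50],
})

# If the notation effect looks similar within every experience stratum,
# experience is less likely to explain the observed difference.
print(data.groupby(["experience", "notation"])["time_min"].mean())
```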
Research design (Example!!!!)
Comparison (Quasi-) experiment?
How many factors and groups?
How will you assign subjects to groups?
What is the experimental treatment? Number and sequence of steps, tasks and sessions; time, ...
What materials are required?
What instruments are required?
Slide 23
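A minimal sketch (purely illustrative; every value is an assumption, not taken from the slides) of how the answers to these design questions might be written down as an explicit plan.

```python
# Sketch: one possible plan for a one-factor, two-group between-subjects design.
design = {
    "factor": "requirement notation",
    "treatments": ["NRN", "AD"],
    "groups": {"G1": "NRN", "G2": "AD"},       # subjects randomly assigned to G1/G2
    "session": ["training", "modeling task", "defect report", "questionnaire"],
    "time_per_session_min": 90,
    "materials": ["requirements description", "notation guide"],
    "instruments": ["time sheet", "defect report form", "acceptance questionnaire"],
}
print(design["groups"])
```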
Validity threats
Are observed relationships due to a cause-effect relationship?
Are correct conclusions drawn from (correct) statistical analyses?
Do the employed measures appropriately reflect the constructs they represent?
Can the findings of the study be generalized?
Slide 24
Validity threats (Example!!!!)
Internal Validity?
Selection
Maturation
History
Instrumentation
Mortality
Testing
External Validity?
Interaction of selection and treatment
Interaction of setting and treatment
Interaction between history and treatment
Slide 25
Validity threats (Example!!!!)
Conclusion Validity?
Low statistical power
Violated assumptions of statistical tests
Fishing for results and error rate
Reliability of measures
Reliability of treatment implementation
Construct Validity?
Inadequate operationalization
Mono-operation bias
Mono-method bias
Slide 26
What are the next steps?
Slide 27