Training Evaluation
• Training evaluation refers to activities aimed at
finding out the effectiveness of training programs
against the training objectives for which those
programs were organized
1 Reaction
Description: Reaction evaluation is how the delegates felt about the training or learning experience.
Typical measures: 'Happy sheets', feedback forms, verbal reaction, post-training surveys or questionnaires.
Practicability: Quick and very easy to obtain; not expensive to gather or to analyse.

2 Learning
Description: Learning evaluation is the measurement of the increase in knowledge - before and after.
Typical measures: Assessments or tests before and after the training.
Practicability: Relatively simple to set up; clear-cut for quantifiable skills.

4 Results
Description: Results evaluation is the effect on the business or environment by the trainee.
Typical measures: Measures are already in place via normal management systems and reporting - the challenge is to relate them to the trainee.
Practicability: Individually not difficult, unlike for the whole organisation; the process must attribute clear accountabilities.
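Level 2 above (Learning), the measurement of the increase in knowledge before and after training, can be illustrated with a small calculation. The delegate names and test scores below are purely hypothetical:

```python
# Level 2 (Learning): measure the increase in knowledge with
# before-and-after test scores. All scores here are hypothetical.
pre_scores = {"Asha": 55, "Ben": 60, "Chen": 48}
post_scores = {"Asha": 78, "Ben": 74, "Chen": 70}

# Per-delegate gain: post-training score minus pre-training score.
gains = {name: post_scores[name] - pre_scores[name] for name in pre_scores}

# Mean improvement across the cohort.
average_gain = sum(gains.values()) / len(gains)

print(gains)         # improvement per delegate
print(average_gain)  # mean improvement for the group
```

This kind of pre/post comparison is what makes Level 2 "clear-cut for quantifiable skills": the gain is a simple difference of two test scores.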
COMA Model
1. Cognitive
2. Organizational Environment
3. Motivation
4. Attitudes
CIPP Model
CIPP stands for Context, Input, Process,
and Product; these 4 main aspects comprise
the CIPP Evaluation Model. The intention of
this model is not to prove, but rather to
improve the programme itself. The CIPP
Evaluation Model may be applied to
educational / training programmes to
determine the merit and worth of the
training programme, as well as to
determine how to improve upon it.
Context Evaluation establishes the goals
of the programme. At this stage, the
beneficiaries and their needs are
identified, along with the resources
available on hand and the potential
problems that will need to be overcome.
The background of the programme is
evaluated, and any social / economic /
political / geographical / cultural factors
within the immediate environment are
accounted for.
*What needs to be done?
Input Evaluation encompasses the programme
planning. Stakeholders need to be engaged, and
suitable strategies for programme execution
identified; competing or conflicting strategies may
also surface. A budget needs to be allocated and
suitably apportioned. To ensure sufficient coverage
of the training programme, research may also have
to be carried out.
*How should it be done?
In the Process Evaluation stage of the CIPP
Evaluation Model, the actual actions are
evaluated. This can be cyclic, repeated
throughout the development stage or during
the implementation of the training
programme. Controls to monitor progress
have to be in place, as well as a system for
two-way feedback between learners,
stakeholders, and the programme team.
*Is it being done?
The Product Evaluation stage of the CIPP
Evaluation Model measures outcomes: the
impact and reach of the training programme,
and its effectiveness in fulfilling the
objectives. Transportability seeks to
determine whether the training programme can
be transferred, adapted, or used in a different
setting. Sustainability is another aspect to be
measured, accounting for how durable and
long-lasting the benefits were. Adjustments to
the training programme may also need to be
made at this stage.
*Did it succeed?
TVS Model (1994) – Training Validation
System (TVS) Approach
1. Situation: collect pre-training data to determine
current levels of performance within the
organisation, and define a desirable level of future
performance
2. Intervention: identify the reason for the gap
between the present and desirable performance,
to find out whether training is the solution to the
problem
3. Impact: evaluate the difference between the pre-
and post-training data
4. Value: measure differences in quality,
productivity, service, or sales, all of which can be
expressed in terms of dollars
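The four TVS steps above can be sketched as a short calculation. The performance levels and the dollars-per-point conversion rate below are hypothetical assumptions for illustration only:

```python
# Sketch of the TVS approach with hypothetical figures.
situation_baseline = 62.0   # 1. Situation: pre-training performance level
desired_level = 80.0        # 1. Situation: desirable future performance
post_training = 74.0        # measured after the intervention

# 2. Intervention: the gap that training is meant to close.
performance_gap = desired_level - situation_baseline

# 3. Impact: difference between pre- and post-training data.
impact = post_training - situation_baseline

# 4. Value: express the improvement in dollars
# (assumed conversion rate of $1,500 per performance point).
value_per_point = 1_500.0
value = impact * value_per_point

print(performance_gap, impact, value)  # 18.0 12.0 18000.0
```

The point of step 4 is that once impact is expressed in dollars, it can be compared directly with programme costs.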
IPO Model
Input
Process
Output
The IPO model can readily determine
whether training programs are
achieving the right purposes. It also
enables evaluators to detect the types of
changes they should make to improve
course design, content, and delivery.
Jack Phillips – Five-Level Model
Calculating ROI
• The next step is to convert the data to monetary
value
– Direct conversion of hard data – quantity, quality, cost
or time
– Conversion of soft data to place a monetary value on
improvements; techniques include:
• Historical costs
• Supervisor estimation
• Management estimation
• Expert opinion
• Participant estimation
• External studies
• Next calculate costs of the program
• The ROI formula is the annual net program
benefits divided by the program costs,
usually expressed as a percentage:
ROI (%) = (net program benefits / program costs) × 100
• Where net benefits are the monetary value of the
benefits minus the costs of the program
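The ROI calculation can be sketched as a one-line Python function; the benefit and cost figures used below are hypothetical:

```python
def roi_percent(annual_benefits: float, program_costs: float) -> float:
    """Phillips-style ROI: net program benefits (benefits minus costs)
    divided by program costs, expressed as a percentage."""
    net_benefits = annual_benefits - program_costs
    return net_benefits / program_costs * 100

# Hypothetical figures: $180,000 in monetised annual benefits
# against $90,000 of program costs.
print(roi_percent(180_000, 90_000))  # -> 100.0
```

An ROI of 100% means the program returned twice its cost: every dollar spent produced one dollar of net benefit on top of recovering the cost itself.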
CIRO Model
• Context – collect information about organizational
deficiency, identify needs and sets objectives at 3 levels
–
– Ultimate objectives (overcome particular deficiency)
– Intermediate objectives (changes in work behavior require for
ultimate objectives to be met)
– Immediate objectives (new knowledge, attitude, skills or attitude
employee requires to reach intermediate objectives)
• Input – involves obtaining and using information about
possible training resources to choose between
alternative inputs to training
• Reaction – involves obtaining and using information
about participants reactions
• Outcomes – involves obtaining and using information
about results
Methods of data collection
Interview
Advantages: flexible, opportunity for clarification, depth possible, personal interaction.
Limitations: high reactive effects, high cost, face-to-face threat potential, labour-intensive, time consuming.

Questionnaire
Advantages: low cost, honesty increased if it is anonymous, respondent sets the pace, variety of options.
Limitations: possibly inaccurate data, on-the-job responding conditions are not controlled, respondents set varying paces, return rate of the questionnaire is difficult to control.

Performance Test
Advantages: reliable, objective, close relation to job performance.
Limitations: time consuming, simulation often difficult, high development cost.

Performance Data
Advantages: reliable, objective, job based, easy to review, minimal reactive effects.
Limitations: lack of knowledge of the criteria for keeping or discarding records, information system discrepancies.
Designs of training evaluation