
HENRY JOHN N. NUEVA
ME 201: Strategic Management of Engineering Enterprise
Master in Management Engineering
Pangasinan State University
At the end of this lecture and presentation, we will be
able to:

• Verify the importance of Plan Evaluation and its concept as applied to every project proposal;
• Be aware of the procedures and composition of the Evaluation Study Committee;
• Be knowledgeable in identifying the activities involved in the conduct of Plan Evaluation;
• Understand the importance of Evaluation Results for a wider spectrum of development and the effect of the feedback process.
• The Concept of Evaluation
• The Plan Evaluation
• Primary Purpose
• Reasons for Neglectful Conduct
• Plan, Program & Project Evaluation
• Organization of the Evaluation Committee
• Preparing the Evaluation Proposal
• Implementing the Evaluation
• Data Gathering and Processing
• Presentation and Analysis of Data
• Findings & Conclusions
• Plan Update
Evaluation, as applied to a project proposal and planning management, is described as the “PROCESS OF ANALYZING PROJECT INPUTS, TRANSFORMATION TECHNIQUES AND THE EFFECT & IMPACT OF OUTPUTS AGAINST DEFINITE STATED GOALS AND OBJECTIVES”.

Hence, evaluation, in its simplest terms, is defined as the “SYSTEMATIC DETERMINATION OF THE WORTH AND SIGNIFICANCE OF SOMETHING USING CRITERIA AGAINST A SET OF STANDARDS”
- a quality determination of Strategic Planning
For the last twenty-five years, why has Plan Evaluation been so grossly disregarded by planners and managers, resulting in its neglectful conduct?
1. Planners' and managers' general opinion is that their main task is simply to put the project in place, with the hope that the expected results will follow;
2. They are quite reluctant to have their projects evaluated by an outside group because their motivation, integrity and competence are placed under scrutiny;
3. They presume that evaluation has no practical value because whatever results become known would not in any way be put to effective use.
CONTROLLING BUDGET
- During the planning process, the production
manager is forced to create a production plan
that fits the given budget offered by the
company. The budget should be evaluated and
assessed each time a decision is made in the
planning stages. Each decision may cost
money, whether it is in labor costs or
equipment fees. Examining the changing
budget in the planning stages can help the
production manager stay in control of the
spending.
ADDRESS RISKS UPFRONT
- Another important reason why evaluation
and assessments should be done during
any planning stages is the risks associated
with a given project. Each project
performed by a company may have a set of
risks, such as the lack of operational
equipment, sick or absent employees or
the lack of a flexible budget. Each of the
risks the production manager faces should
have a set of solutions, so the risks are
prevented upfront.
TIME FRAME
- Without a steady and solid plan for a project,
the time line can be extremely flexible. The
time line may not have a deadline or monthly
goals if a plan or schedule has not been
created. Once a deadline has been set by the
board of executives or business owner, the
production manager must evaluate the tasks
to determine whether a project can be
completed within the time frame provided.
The tasks must be assessed to ensure the
schedule is realistic.
QUALITY CONTROL
- Once the planning is complete, the manager
must go back and assess the schedule in terms of
the quality produced in the given time. If the
schedule is too packed, it may affect the quality
of the production. The assessment and
evaluation of the planning process is important
to ensure the quality of the product, as the
manager will be held responsible if it is not
satisfactory.
PRIMARY PURPOSE:

“To determine the quality of a program by formulating a judgment”

Two kinds of evaluation serve this purpose: FORMATIVE EVALUATIONS and SUMMATIVE EVALUATIONS.
FORMATIVE EVALUATIONS
An evaluation that is performed during the
development or implementation of a project
to help developers modify or improve the
project

Example:

An example of Formative Evaluation includes mid-term surveys and focus groups asking students about a new technology that has been introduced. During a class, an evaluator gave the students a brief survey and engaged them in a discussion on specific topics. The results from this research led to an increased awareness of some of the problems students were having, and suggested some short-term changes in how the programs were implemented and some larger, longer-term changes in the way the courses were designed.
SUMMATIVE EVALUATIONS
An evaluation that is performed near, at, or
after the end of a project or a major section of
a project.

Examples of summative evaluations can range from relatively simple and common IDQ (Instructor Designed Questionnaire) results to a careful study that compares two sections of the same class using different methods of instruction.
Careful planning is key to successful summative
evaluations. This is because determining the possible
outcomes and developing criteria need to occur well
in advance of the completion of the project to be
evaluated. ATL can provide support for planning and
implementation of this type of evaluation.
“Plan Evaluation, as a component of the strategic plan, provides the key to making strategic planning a cyclical and continuous process”

(Annual Project Basis)

ANNUAL PROJECT/PROGRAM TIMELINE
[Timeline: Q1 - Q2 - Q3 - Q4, with the Mid-Period Evaluation at mid-year and Year End at Q4]

In reference to the above model, the results of the Mid-Period Evaluation would have the specific program or project as its frame of reference.
ANNUAL PROJECT/PROGRAM TIMELINE
[Timeline: Q1 - Q2 - Q3 - Q4, with the Final Evaluation at Year End]

In reference to the above model, the results of the Final Plan Evaluation would partly tell whether the mission and the vision of the plan are achieved or not. Thus, accomplishment reports are integrated and consolidated by the planner in reference to the set objectives.
• It would identify conclusively whether program/project objectives are adequately and responsively attained or not.
• It conclusively resolves whether the plan's mission and vision are realized or not.
What is the use of, and what will happen to, the results of outputs and outcomes in terms of effects and impacts?

1. Planners could use these results for research and study, since they are eventually recycled.
2. They may be used as feedback or as an input in the planning process.
Since a medium- or long-term strategic development plan requires periodic evaluation, a group of evaluators or a committee is highly recommended, tasked to perform in-depth reviews of selected evaluation issues, strategies and methodologies.

The Evaluation Committee also discusses selected evaluation reports and makes suggestions for including evaluations of particular interest in the annual work program. It is also suggested that the committee be composed of individuals with a multi-disciplinary orientation, or of experts aligned with the focused project.
OVERVIEW OF THE EVALUATION.
•All experts are briefed orally or in writing before the evaluation in order to
inform them of the general evaluation guidelines and the objectives of the
research area under consideration.

• Each proposal is evaluated against the applicable criteria independently by experts who fill in individual evaluation forms, giving marks and providing comments.

• For each proposal a consensus report is prepared. The report faithfully reflects the views of the independent experts referred to in Step 2.

• A panel discussion may be convened, if necessary, to examine and compare the consensus reports and marks in a given area, to review the proposals with respect to each other, and to make recommendations on a priority order and/or on possible clustering or combination of proposals.
THE EVALUATION CRITERIA.
•In all circumstances, proposals are evaluated against the criteria for the instrument
for which they are submitted. In clear-cut cases a proposal may be ruled out of
scope by the Commission without referring it to experts.

• Any proposal for an indirect action which contravenes fundamental ethical principles, or which does not fulfil the conditions set out in the call, shall not be selected and may be excluded from the evaluation and selection procedure at any time.

•Any particular interpretations of the criteria to be used for evaluation are set out
in the work programme, in particular the way in which they translate into the issues
to be examined.
PROPOSAL MARKING.
• Evaluators examine the individual issues comprising each block of evaluation criteria and in general mark the blocks on a six-point scale from 0 to 5, or on any other marking scheme. An example of score markings is as follows:
0 - the proposal fails to address the issue under examination or cannot be judged against the criterion due to missing or incomplete information
1 - poor
2 - fair
3 - good
4 - very good
5 - excellent

• Where appropriate, half marks may be given. If appropriate, evaluators may also be asked to give a mark to each of the individual issues comprising the blocks of criteria. Only the marks for the blocks of criteria are taken into account (after applying any weightings) for the overall score for the proposal.
THRESHOLDS AND WEIGHTINGS.
•Thresholds may be set for some or all of the blocks of criteria, such that any
proposal failing to achieve the threshold marks will be rejected. The thresholds to
be applied to each block of criteria as well as any overall threshold are set out in
the call. If the proposal fails to achieve a threshold for a block of criteria, the
evaluation of the proposal may be stopped. The reasons will be detailed in the
consensus report. It may be decided to divide the evaluation into several steps
with the possibility of different experts examining different aspects. Where the
evaluation is carried out in several successive steps, any proposal failing a
threshold mark may not progress to the next step. Such proposals may
immediately be categorised as rejected.

•According to the specific nature of the instruments and the call, it may be decided
to weight the blocks of criteria. The weightings to be applied to each block of
criteria are set out in the call.
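To make the marking, weighting and threshold mechanics above concrete, here is a minimal sketch in Python. The block names, weights, thresholds and marks are assumptions invented for illustration, not values taken from any actual call.

```python
# A minimal sketch of block marking with weights and thresholds.
# Block names, weights, thresholds, and marks below are hypothetical
# illustrations, not values from any actual call for proposals.

BLOCKS = {                       # block of criteria: (weight, block threshold)
    "Relevance":      (0.20, 3.0),
    "Impact":         (0.30, 3.5),
    "Implementation": (0.50, 3.0),
}
OVERALL_THRESHOLD = 3.5          # assumed overall pass mark


def evaluate(marks):
    """Return (weighted overall score, passed?, reasons for rejection)."""
    reasons = []
    overall = 0.0
    for block, (weight, threshold) in BLOCKS.items():
        mark = marks[block]                       # 0-5, half marks allowed
        if mark < threshold:
            reasons.append(f"{block}: mark {mark} is below threshold {threshold}")
        overall += weight * mark                  # apply the block weighting
    if overall < OVERALL_THRESHOLD:
        reasons.append(f"overall score {overall:.2f} is below {OVERALL_THRESHOLD}")
    return overall, not reasons, reasons


score, passed, why = evaluate({"Relevance": 4.0, "Impact": 3.0, "Implementation": 4.5})
print(score, passed, why)   # 3.95, False - the Impact block fails its threshold
```

In this assumed setup a proposal is flagged for rejection as soon as any block of criteria, or the weighted overall score, falls below its threshold, mirroring the rule described above.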
“Implementation of the Evaluation is set and ready if and only if an organized team, an approved proposal, a released budget, and validated evaluation instruments are all prepared”

“Evaluation – an EVIDENCE in the Program Development Process”

Process flow: Evaluation Focus → Collect Data → Analyze & Interpret → Report
EVALUATION FOCUS
Guidelines for utility consideration in determining the correct evaluation focus:

• What is the purpose of the evaluation?

Purpose refers to the general intent of the evaluation. A clear purpose serves as the basis for the evaluation questions, design, and methods. Some common purposes:
Gain new knowledge about program activities
Improve or fine-tune existing program operations (e.g., program processes or strategies)
Determine the effects of a program by providing evidence concerning the program's contributions to a long-term goal
Affect program participants by acting as a catalyst for self-directed change (e.g., teaching)
EVALUATION FOCUS
Guidelines for utility consideration in determining the correct evaluation focus:

• Who will use the evaluation results?

Users are the individuals or organizations that will employ the evaluation
findings in some way. The users will likely have been identified during Step 1, in the process of engaging stakeholders. In this step, you need to
secure their input into the design of the evaluation and the selection of
evaluation questions. Support from the intended users will increase the
likelihood that the evaluation results will be used for program
improvement.
EVALUATION FOCUS
Guidelines for utility consideration in determining the correct evaluation focus:

• How will they use the evaluation results?
Uses describe what will be done with what is learned from the evaluation, and many
insights on use will have been identified in Step 1. Information collected may have
varying uses, which should be described in detail when designing the evaluation.
Some examples of uses of evaluation information:

To document the level of success in achieving objectives


To identify areas of the program that need improvement
To decide how to allocate resources
To mobilize community support
To redistribute or expand the locations where the intervention is carried out
To improve the content of the program’s materials
To focus program resources on a specific population
To solicit more funds or additional partners
COLLECT DATA
Collecting data is a major part of any evaluation, but we need to take note that the method follows the purpose.
• SOURCES OF EVALUATION INFORMATION
A variety of information sources exist from which to gather your evaluative data. Thus, in a major program evaluation, we may need more than one source.

The information source we select will depend upon what is available and what answers the evaluation questions effectively. The most common sources of evaluative information fall into 3 categories, namely:

1. EXISTING INFORMATION
2. PEOPLE
3. PICTORIAL RECORDS AND OBSERVATIONS
COLLECT DATA

EXISTING INFORMATION (might use):
• Program documents
• Log-books
• Minutes of the meeting
• Accomplishment reports
• Media releases
• Local statistics
• Agency data
• Etc.

PEOPLE (think about who can best answer the questions):
• Participants or beneficiaries (directly or indirectly)
• Nonparticipants, proponents, critics, victims
• Experts & specialists
• Collaborators & policy makers

PICTORIAL RECORDS & OBSERVATIONS (data collection via):
• Visual accounts
• Pictures and photographs
• Direct observation of situations
• Behaviors
• Program activities and outcomes
COLLECT DATA
Major methods for collecting information for an evaluation:

SURVEY
- Collecting standardized information through structured questionnaires to generate quantitative data. Surveys may be mailed or administered online through web pages, completed on-site, or administered through interviews conducted either face to face, by telephone or electronically.

Sample surveys use probability sampling, which allows us to generalize findings to a larger population, while informal surveys do not.
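As a quick illustration of the probability-sampling point above, the sketch below draws a simple random sample from a sampling frame; the frame of 500 participants and the sample size of 50 are assumptions made up for this example.

```python
import random

# Hypothetical sampling frame: IDs of all program participants (assumed 500).
frame = [f"participant-{i:03d}" for i in range(500)]

# Simple random sample: every member of the frame has an equal chance of
# selection, which is what lets survey findings be generalized to the population.
sample = random.sample(frame, k=50)   # assumed sample size of 50

print(len(sample), sample[:3])
```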
COLLECT DATA
Major methods for collecting information for an evaluation:

CASE STUDY
- An in-depth examination of a particular case: a program, a group of participants, a single individual, a site or location.

Case studies rely on multiple sources of information and methods to provide as complete a picture as possible.
COLLECT DATA
Major methods for collecting information for an evaluation:

INTERVIEWS
- Information collected by talking with and listening to people. Interviews range on a continuum from those which are tightly structured (as in a survey) to those that are free-flowing and conversational.
COLLECT DATA
Major methods for collecting information for an evaluation:

GROUP & PEER ASSESSMENT
- Collecting evaluation information through the use of group processes such as the nominal group technique, focus groups, brainstorming and community forums.
Analyze &
Interpret

What is meant by ANALYZING DATA?

• Analyzing data involves examining it in ways that reveal the relationships, patterns, trends, etc. that can be found within it.

• That may mean subjecting it to statistical operations that can tell you not only what kinds of relationships seem to exist among variables, but also to what level you can trust the answers you're getting.

• It may mean comparing your information to that from other groups to help draw some conclusions from the data. The point, in terms of evaluation, is to get an accurate assessment in order to better understand the work, its effects, and the overall situation.
Analyze &
Interpret

Two types of data and how to analyze them as applied to planning:

• QUANTITATIVE DATA
• QUALITATIVE DATA
QUANTITATIVE DATA
- Refers to information that is collected as, or can be translated into, numbers, which can then be displayed and analyzed mathematically.

Examples include:
➢ Frequencies
➢ Test scores
➢ Survey results
➢ Numbers or percentages

These data allow us to compare changes to one another, to changes in another variable, or to changes in another population. They can tell us, with a particular degree of reliability, whether those changes are likely to have been caused by your intervention or program, or by another factor, known or unknown. And they can identify relationships among different variables, which may or may not mean that one causes another.
QUALITATIVE DATA
- Data collected as descriptions, anecdotes, opinions, quotes, interpretations, etc., which generally either cannot be reduced to numbers or are considered more valuable or informative if left as narratives.

The challenge of translating qualitative into quantitative data has to do with the human factor. Even if most people agree on what 1 (lowest) or 5 (highest) means in regard to rating “satisfaction” with a program, ratings of 2, 3, and 4 may be very different for different people.
Analyze &
Interpret

How to analyze & interpret gathered data?

•Record data in the agreed-upon ways. These may include pencil and paper,
computer (using a laptop or handheld device in the field, entering numbers into a
program, etc.), audio or video, journals, etc.

•Score any tests and record the scores appropriately

•Sort your information in ways appropriate to your interest. This may include
sorting by category of observation, by event, by place, by individual, by group, by
the time of observation, or by a combination or some other standard.

• When possible, necessary, and appropriate, transform qualitative into quantitative data. This might involve, for example, counting the number of times specific issues were mentioned in interviews, or how often certain behaviors were observed (see the sketch below).
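A minimal sketch of that last step, turning qualitative interview notes into counts; the interview excerpts and the issue keywords are invented for illustration.

```python
from collections import Counter

# Invented interview excerpts (qualitative data) for illustration only.
interviews = [
    "The schedule was too tight and the budget ran out early.",
    "Training was useful, but the schedule kept changing.",
    "We lacked equipment and the budget was unclear.",
]

# Hypothetical issues to count, each defined by a single keyword.
issue_keywords = {"schedule": "schedule", "budget": "budget", "equipment": "equipment"}

counts = Counter()
for note in interviews:
    text = note.lower()
    for issue, keyword in issue_keywords.items():
        if keyword in text:
            counts[issue] += 1   # count each interview that mentions the issue

print(counts)   # Counter({'schedule': 2, 'budget': 2, 'equipment': 1})
```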
Analyze &
Interpret

How to analyze & interpret gathered data?

• Simple counting, graphing and visual inspection of frequency or rates of behavior, events, etc., over time.

• Calculating the mean (average), median (midpoint), and/or mode (most frequent) of a series of measurements or observations.

• Finding patterns in qualitative data. If many people refer to similar problems or barriers, these may be important in understanding the issue, determining what works or doesn't work and why, or more.

• Comparing actual results to previously determined goals or benchmarks. One measure of success might be meeting a goal for planning or program implementation, for example.
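A small sketch of the mean/median/mode and benchmark-comparison bullets above, using Python's statistics module; the survey scores and the benchmark value are assumed for illustration.

```python
import statistics

# Assumed post-program survey scores on a 1-5 satisfaction scale (invented data).
scores = [4, 5, 3, 4, 4, 2, 5, 4, 3, 4]

mean = statistics.mean(scores)       # average
median = statistics.median(scores)   # midpoint
mode = statistics.mode(scores)       # most frequent value

BENCHMARK = 3.5                      # hypothetical goal for mean satisfaction

print(f"mean={mean:.2f}, median={median}, mode={mode}")
print("goal met" if mean >= BENCHMARK else "goal not met")
```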
Report

Report – the final stage of Evaluation Implementation.

• Depending on the nature of the research or project, results may be statistically significant or simply important or unusual. These may or may not also be socially significant.
Once we’ve organized the results and run them through whatever statistical or
other analysis we’ve planned for, it’s time to figure out what these mean for the
evaluation. Probably the most common question that evaluation research is
directed toward is whether the program being evaluated works or makes a
difference.
Report

“What were the effects of the independent variable (the program, intervention, etc.) on the dependent variable(s) (the behavior, conditions, or other factors it was meant to change)?”
• Findings in the report should be stated in a clear, straightforward and objective fashion.

• They should also be in agreement with the facts presented, briefly stated in answer to the questions raised, and preferably arranged sequentially in accordance with the order of the problems or objectives of the project.

• In the report, conclusions should be presented in a more detailed manner, resulting directly from the findings or from the tested hypotheses, if there are any.
Recommendations advanced and proposed should
be further verified and substantiated in the light of
study findings and conclusions.
Once validated, said recommendations provide
useful inputs to planners and managers in the
planning and decision-making processes.
Said inputs not only update the plan but also
make the programs and projects more
responsive and relevant.
HENRY JOHN N. NUEVA
PLAN EVALUATION & IMPLEMENTATION
Master in Management Engineering
