
Introduction to Program Evaluation

Professor: Dr. Jay Wilson


Course name: ECUR 809.3
Fall 2009

Planning an Evaluation Program: an Improvement Approach


The case of a Community Center in Ottawa

By
Nelson Dordelly Rosales
Graduate Student
Educational Communications and Technology

E-mail:
nrd537@mail.usask.ca
nelson_dordelly@yahoo.com

December 6th, 2009



OUTLINE

Introduction

I. Review of Literature

- What is Evaluation?
- Evaluation Models, Approaches and Methods
- Practical Steps of Getting a Program Evaluation Planned

II. Description of the Case Study: the Community Center in Ottawa

- Description of the Center: Structure
- Diversified Programs and Values
- Vision
- Mission, Goal or Purpose

Description of Selected Program: the Spanish Intermediate Program (SIP)

- Outline and Main Goal
- Program Structure: content, activities, evaluation and materials
- Pre-requisite and Clientele or Participants

III. Outlining the Program Evaluation Plan for the Assessment of SIP

A Suitable Approach: The Improvement Model

- Stage 1 - Preparation: Focusing the Evaluation on the Improvement of SIP
- Stage 2 - Assessment: Collecting and Using the Information
- Stage 3 - Data Collection
- Stage 4 - Performing an Evaluation Assessment Process
  Work Plan Objective (intervention objective)
  Users of Evaluation
  Plan Development: Questions and Indicators
  Data Sources and Data Collection Method
  Budget, Resources and Timelines
  Data Analysis, Communicate Results and Designate Staff Responsibility
  Using the Information
- Stage 5 - Evaluation: Role of the Evaluator
- Stage 6 - Reflection

Summary

Conclusion

Bibliography

Appendices
Introduction

Careful planning of an evaluation program helps to start the whole process successfully.

This paper is a report of four months of work (September to December 2009) as the planner of an

evaluation program. The purpose is to explain the design of the evaluation program to assess

success and possible improvements to the Spanish Intermediate Program (SIP), which is one of the

“General Interests Programs” offered this past Spring/Summer of 2009 by the non-profit

Community Center in the City of Ottawa. The plan is a theoretical paper that outlines the program

to be evaluated, integrates the different tools and theories addressed recently (in the Course ECUR

809) into an evaluation plan, explains why it is a suitable evaluation plan to assess the level of

satisfaction of clients and proposes a survey instrument to conduct the analysis. Essentially the

purpose of this evaluation plan is to evaluate the level of clients’ satisfaction with the

organization, design and teaching of SIP, and to convince the Coordinator of the “General Interests

Programs” of the Community Center that an internal evaluator, with the help of the clients or

students, teachers and the Coordinator, should be “the evaluator for the evaluation” of the

Program. An important piece of this evaluation plan is to describe, or elaborate upon, the main reasons for selecting the improvement approach and the logic model, which is useful for describing group work, team work, community-based collaboration and other complex organizational processes as it seeks to promote improvement (University of Wisconsin, 2009). Through a case study, this paper will show how a logic-based “improvement model” can facilitate a “holistic” approach to the evaluation.

This paper is organized in three parts comprising a review of literature, a description of the case and an outline of the selected program, and an explanation of the process of

developing the program evaluation plan. A survey questionnaire was developed and applied to a

sample of students or clients of the program. The paper includes appendices with the preliminary

and final versions of the survey questionnaire. Other appendices such as the flowchart that shows

the logic model and the preliminary program evaluation plan are also included in this paper. These

documents were posted at www.researchphilosophy.blogspot.com. Data analysis and three

specific suggestions for improvement of SIP were collected and published electronically in the

web site: https://survey.usask.ca/results.php?sid=17783.

Review of Literature

Making a careful search of the literature before designing or developing new instruments is

important. According to Posavac (1990), evaluators can learn from the successes and failures of

others and get a picture of the methodological, political and practical difficulties that must be

overcome. The focus of this review of literature is on the conception of evaluation and the

different models, methods and approaches for evaluating and assessing the effectiveness of

programs. Below I will explain each of these important aspects of the review of literature.

What is Evaluation? Evaluation is a term used in many different ways. Scriven (1996)

identified sixty different ways to define the evaluation of a program that range from “appraise,

analyze, assess, critique, examine, grade, inspect, judge, review, study, testing, among others”

(pp. 151-162). Talmage (1982) notes that “three purposes appear most frequently in definitions of

evaluation: (1) to render judgments on the worth of a program; (2) to assist decision makers

responsible for deciding policy; and (3) to serve a political function” (p. 594).

Indeed, there is only one overall purpose for program evaluation activities, which is

“contributing to the provision of quality services to people in need” (Posavac et al., 2004, pp. 13-14). Evaluation work can be defined as a decision-making process whose focus is on “envisioning

and enacting a good educational journey for all students” (Wiggins, 1996, p. 20). It requires the

practice of sophisticated judgments and suggestions by the students themselves in order to

continually improve those programs. As Dewey (1966 edn.) once suggested, when we make

educational decisions, we must think broadly about the consequences of our actions. The purpose of program evaluation is “to assess academic success” (Hewett, 2008, p. 3204). In this sense, evaluation must be inclusive and generous; in other words, evaluation aims to value success, not failure. Hewett (2008) says that evaluation is a multi-faceted process. The author describes

evaluation as a continuous and ongoing process, providing both formative (ongoing) and

summative (culminating) opportunities to monitor progress toward achieving essential

outcomes. Thus, no longer is assessment perceived as a single event. For example, recently Sherry

Y. Chen (2008) evaluated students’ learning performance and their perceptions in a Web-based

instructional program that was applied to teach students how to use HTML at Brunel University.

Students’ task achievements were affected by the levels of their previous system experience and

their post-test and gain scores were positively influenced by their perceptions and attitudes toward

the instructional program. Chen’s study has shown the importance of understanding individual

differences in the development of a program evaluation. The author suggests testing and

modification of the tests used in her research to build better instruments that can accommodate

individual differences. Thus, evaluation is a never-ending cycle of educational effort.

Evaluation uses inquiry and judgment methods, including, among others, “1) determining standards for judging quality and deciding whether those standards should be relative or absolute, 2) collecting relevant information, and 3) applying the standards to determine value, quality, utility, effectiveness, or significance” (Fitzpatrick et al., 2004, p. 5). Through these methods it will be possible to discover discrepancies between program objectives and the needs of the target population, between program implementation and program plans, and between expectations of the

target population and the services actually delivered (Posavac, et al., 2004, p. 29). These methods

lead to recommendations intended to optimize the evaluation object in relation to its intended

purpose(s) or to help stakeholders determine whether the evaluation object is worthy of adoption,

continuation, or expansion. For example, Krista Breithaupt and Colla J. McDonald (2008) described the development and pilot testing of a survey measure and proposed their survey as a means of assessing the quality of e-learning programs against an established quality standard. Their findings

provide practical insights into how to support adult learners with varying needs and capabilities as

they engage in learning programs. Their findings offer insights for the improvement of e-programs

to meet the needs of clients. This type of program evaluation contributes to quality services by

providing feedback from program activities and outcomes to those who can make changes in

programs or who decide which services are to be offered. Indeed, without feedback, human service

programs (or any activity) cannot perform effectively. In this sense, Chelimsky (1997) uses the

terms ‘evaluation for development’ and ‘evaluation for accountability’ to refer to the formative

and summative purposes, respectively.

James Henderson and Kathleen Kesson (2004) talk about evaluation as “wisdom.” The

evaluation worker must consciously inquire into “the quality of educational experiences in a

comprehensive, penetrating, and far-sighted way” (p. 2). Evaluation involves both personal soul

searching and discerning criticism for the improvement of education. It is an affirmation of hope

and aspiration. “Evaluation is necessarily grounded in a humble, pragmatic openness; it takes

boldness and a deep sense of responsibility to translate our visions into action” (p. 3). During

evaluation we must act with integrity. This way of looking at evaluation places an enormous

challenge to our capacities to exercise good judgment. Evaluation is a never-ending cycle of



educational effort. In this sense, program evaluation is important because “information is needed

to meet the obligation of providing effective services, verify that planned programs do provide

services, examine the results, determine which services produce the best results, select program[s] that offer the most needed types of services, survey clients’ reactions and judgments to maintain

and improve quality and watch for unplanned side effects, among others” (Posavac and Carey,

2003, p. 3). Those are primary goals of program evaluation that can be met using a number of

different types of program evaluations: “the evaluation of need, the evaluation of process, the

evaluation of outcome, and the evaluation of efficiency” (Posavac and Carey, 2003 p. 10).

Different Models, Approaches and Methods. Through the history of evaluation, a number of different models and approaches have been put forward to guide the design and implementation of program evaluation. According to Posavac and Carey (2003), they include the following: the traditional model (informal evaluations made by supervisors, or self-evaluations, without disciplined analysis), social science research (experimental approaches to determine a program’s degree of success, which introduced rigor and objectivity), industrial inspection (depends on inspecting the product at the end of the production line, much like using a single final examination), black box evaluation (examines the output of a program without examining its internal operation, resembling a consumer or product evaluation), objective-based evaluation (examines the program’s goals, objectives and particular structure), goal-free evaluation (focuses on the program as administered: the staff, the clients, the setting, the records, and the impacts of the program), fiscal evaluation (focuses on calculating the financial investment and increasing output), the accountability model (focuses on compliance with regulations), the expert opinion model (uses art and literary criticism to examine a work and render a judgment about its quality), the naturalistic model (qualitative methods in which the evaluators themselves, rather than surveys or records, serve as the data-gathering instruments to gain a rich understanding of the program), empowerment evaluation (requires close contact with community stakeholders, inviting clients to participate actively to improve their own community), theory-driven evaluation (carefully controlled research in which the analysis consists of calculating the correlations among variables), and the improvement-focused model (aims to bridge the gap between what is or can be observed and what was planned).

Jody L. Fitzpatrick, James R. Sanders and Blaine R. Worthen (2004) provide an explanation of alternative approaches and practical guidelines for program evaluation. The authors identified five approaches: objectives-oriented (the purposes are specified, and then evaluation focuses on the extent to which those purposes are achieved; it uses logic models, and the information can be used to reformulate the program or a part of it), management-oriented (a systems approach in which decisions are made about inputs, processes, and outputs to serve decision makers, much like the logic models, highlighting levels of decisions), consumer-oriented (predominantly a summative evaluation that uses checklists and the criteria of the consumer), expertise-oriented (evaluation administered by a team of professional experts, members of committees, to produce a sound evaluation), and participant-oriented (centered on participants’ opinions; clients are the focus and orientation of the evaluation, and they organize and perform evaluation activities).

An adequate way of reporting the "success and failure" of a program seems to be, according to Stake (1975), the "responsive approach," or the "clock" model, which reflects the prominent recurring events in a responsive evaluation: “talk with clients, program staff, audiences; identify program

scope; overview program activities; discover purposes, concerns; conceptualize issues, problems;

identify data needs re issues; select observers, judges, instruments, if any; observe designated

antecedents, transactions and outcomes; thematize: prepare portrayals, case studies; validate,

confirm, attempt to disconfirm; winnow, for audience use; and assemble formal reports, if any” (p.

19).

Practical steps of getting a program evaluation planned: The Student Evaluation: A Teacher Handbook by Saskatchewan Education (1991) explains five stages: a) preparation, b) assessment, c) performing an evaluation assessment process, d) evaluation, and e) reflection. Posavac and Carey (2003) concur that the steps in preparing to conduct a program evaluation are the following: identify the program and its stakeholders, become familiar with its information needs, and plan the evaluation. More specifically, the authors highly value the importance of the criteria and standards
evaluation. More specifically, the authors highly value the importance of the criteria and standards

that that are chosen for specific programs. To learn how to improve a program, staff members need

to find out the extent to which those purposes are achieved, so that information can be used to

reformulate the program or a part of it. As an evaluator, one needs to know what is not occurring

as expected: “Do the clients have needs that were not anticipated? Has it been more difficult than

expected to teach the staff needed skills? Has less support materialized than was promised? Has

experience led the staff to question the conceptual basis of the program?” (p. 12). According to

these authors, objective information is needed, but such information should be interpreted using

qualitative information as well. In this sense, the primary goals of program evaluation can be met using a number of different types of program evaluations, which in logical sequence are: “the evaluation of need, the evaluation of process, the evaluation of outcome, and the evaluation of efficiency” (p. 10). The authors have found that personal observations provide direction in selecting

what to measure and in forming an integrated understanding of the program and its effects; that is,

does the program or plan match the values of the stakeholders? Does the program or plan match

the needs of the people to be served? Does the program as implemented fulfill the plans? Do the

outcomes achieved match the goals? Are the resources devoted to the program being expended

appropriately? In short, planning a program evaluation focuses on improvement, and implementing the plan means being wise in managing students, time, materials, budget and resources.

II

Description of the Case Study: The Community Center in Ottawa

The Glebe Neighborhood Activities Group (GNAG) was selected as the organization to use as a

model for the work of program evaluation. This section will focus on the description, goals or

objectives, stakeholders, philosophy and analysis of the programs’ structure of this selected

Community Center in the City of Ottawa. Below is the description of the Center, based on existing

materials and its Web site: http://www.gnag.ca/index.php

Structure: GNAG is a non-profit community group working in partnership with the City of

Ottawa and other community organizations to deliver social, cultural and recreational activities in

the heart of the Glebe. Their mission is to enhance and enrich life in our community by creating

opportunities through dynamic, innovative and affordable activities and services. GNAG is organized hierarchically in the following manner: a) Ex-Officio Board Members; b) the Chair, Vice Chair and Treasurer, and elected Board Members who attend monthly board meetings and the annual general meeting to oversee the financial, personnel and operational management of the Group; c) Committee Members and special committees of the Board dealing with such issues as personnel management, strategic planning, and special projects or events; d) a Special Event Co-ordinator who takes the lead in coordinating a special event (i.e. Glebe House Tour or Taste of the Glebe), which involves event planning, organizing and coordinating the tasks of other members who help out at special events such as the Glebe House Tour, Glebe Fall Craft and Artisan Fair, Snow Flake Special, Taste of the Glebe and Family Dances, among many others; and e) Secretaries and Volunteers who attend Committee meetings or help to complete assignments or tasks between meetings. GNAG invites the community to its Annual General Meeting at the Glebe Community Centre on a periodic basis.



Diversified Programs: The Glebe Community Centre opened its doors on October 2, 2004, almost thirty years after the first official opening of the Centre on November 28, 1974.

GNAG has provided feedback and input to The City of Ottawa in the development of a new set of

principles to guide the delivery of recreation services for the next 10 to 20 years, specifically about

how cultural and recreation services are provided and the challenges in meeting the demands of the

future, focusing on Service Delivery, Accessibility and Inclusion, Tax Support, Subsidization and

Revenue Generation. The resulting responses were incorporated into a final strategic direction

presented to City Council for consideration and approval (available on the web site at www.gnag.ca).

Under the leadership of GNAG, in partnership with the city's Recreation and Parks Department,

the centre became a hub of activity offering special events and a full slate of cultural, educational

and recreational programs for all ages. Currently it offers the following: Community Theater, Craft & Artisan, Jewelry show, Lobster Kitchen Party,

Exercise with baby, Infants, Preschooler, Parents & Caregivers, Breakfast Club, Family Children

and Youth, Birthday Party Central, Dance Studio, Pottery Studio, Workshop for all ages, Health,

Wellness & Group Fitness and Adults General Interests, which include the following specific

programs: Spanish Beginner, Spanish Intermediate/Conversational, Spanish Advanced,

Photography, Sports, and Painting/Drawing.

Values: The aim of GNAG is the inclusion of all. To GNAG a community is better off when

its members: a) care for each other, b) participate and contribute, c) share their skills and talents,

d) celebrate together. The purpose is to serve the community with compassion, caring and

commitment through a hands-on approach by volunteers. It offers creative and innovative

programming that keeps up with trends and demographic changes. Overall, the goal is having a

rich cultural environment within the community.



Vision - In ten years’ time, GNAG (Strategic Plan of September 25th, 2008) envisions its future in terms of people, community, programs and organization. Below is GNAG’s vision for each of them:

People:
- Staff members are happy, fulfilled, challenged and well-rewarded.
- Visitors and program participants consistently report high levels of satisfaction with our centre, personnel and programs.
- Volunteers are growing in number and are integral to the spirit of the Community centre.
- The Board of Directors is empowered, well-qualified, involved and dynamic.

Community:
- Our clients live in our community, the city-at-large and beyond.
- All members of our community feel a connection to and ownership of their community centre.
- Our community centre is the cultural, social, and recreational heart of the Glebe.

Programs:
- GNAG will be Ottawa’s flagship centre, offering the most innovative, responsive and wide-ranging programs in the City.
- The most frequented facility, which now includes satellite centers in partnership with local schools, churches and seniors’ residences.
- Create a gathering place for all ages, especially for children and youth.
- Ensuring that GNAG programming reflects the needs of our community by monitoring closely demographic change in the neighborhood.
- An ongoing programming evaluation and review process to ensure that courses are relevant, have the best possible quality, and are cost effective or meet the goals of our strategic plan.

Organization: GNAG has facilitated all the achievements in the People, Community and Programs categories by:
- Having efficient, up-to-date business tools.
- The on-going recruitment, training and development of excellent staff.
- Having effective and engaging partnerships with local businesses and Government.
- Ensuring strong and stable financial operations.

Mission: GNAG’s mission is to enhance and enrich life in our community by creating

opportunities through dynamic, innovative, and affordable activities and services. GNAG achieves

this by engaging highly competent, experienced and friendly staff and dedicated and committed

volunteers in alliances and partnerships with the City of Ottawa, local businesses, churches,

schools and other community organizations.

Goal or Purpose: GNAG is a community-driven, not-for-profit, volunteer organization

working in the heart of the Glebe in the City of Ottawa to deliver social, cultural and recreational

activities in cooperation with other groups in the community.

Outline of a Selected Program: Spanish Intermediate Program (SIP)

GNAG considers the study of Spanish to be an essential component in the acquisition of a liberal arts education for community members, right at the heart of the City of Ottawa. At

present, Spanish is the third most widely spoken language in the world. The Spanish speaking

population of Europe and North and South America is estimated at 500 million people. These

numbers increasingly include the thriving Latino communities in the United States. These are

demographic realities that Canada cannot ignore. Relatively few Canadians know Spanish and

have an understanding of Hispanic culture. The mission of GNAG is to lay a conversational

foundation and promote the knowledge of the literature and cultures of Spanish and Latin

American societies, to which many Canadians have strong historic ties.

SIP’s Main Goal: The purpose is to achieve conversational Spanish at the intermediate level. With this program the student will acquire fluency and precision in speaking Spanish in a very friendly atmosphere. The student will develop both spoken and written communication skills, as well as listening and reading comprehension.



Program structure: The Spanish Intermediate Program (SIP) is designed to be used not only at the computer and in the Community Center classrooms, but also on the commute, at the gym, at tennis lessons, on a walk—anywhere the student can take guiding materials, a CD player or iPod. Because the program materials are so convenient to use, the student will find himself/herself connected with the program.

Content: Arts, language and literature; it is a step-by-step, systematic method to get participants speaking Spanish conversationally.

1. Listen and Repeat


2. Substitution Drills
3. Replacement Drills
4. Translation Drills
5. Variation Drills

Activities: Students learn real-life Spanish. The goal is always to learn to have meaningful conversations. The vocabulary, selected very carefully, only includes high-frequency

words that are going to be very useful. Rather than distracting the participant with lots of words,

the focus is more on structures—structures that can be expanded and used in a variety of

situations. The sequence is as follows:

1. Practice with the written transcript.


2. Practice without the transcript with Spanish-speaking people.
3. Speak out loud, at normal conversational volume.

Continuous Evaluation and Improvement: the program covers a relatively small body of material so well that it becomes easy for the student to reproduce it. The focus is on pronunciation, vocabulary, grammar, verb drills, and conversational dialogues.

Materials: illustrative materials, guides, games, outdoor activities, music, and audio CDs, combined with the interactivity of the Internet and Spanish-speaking instructors.



Pre-requisite: Spanish level 1 (Beginner). Other criteria: authenticity. The Spanish language as it is spoken in actual conversations—things students can really use when they travel and interact with native speakers. The program offers guided conversation and discussions, and varied practical exercises with emphasis on vocabulary and grammar (theory and practice).

Clientele, Participants or Students: The clientele of the Spanish conversational programs consists mostly of Canadians who live in the community.

III
Outlining the Program Evaluation Plan

In the past year, 2008/2009, the Spanish Intermediate Program (SIP) was successfully implemented. There is a need to evaluate whether the clientele (students) are satisfied with its organization, design and implementation, and to learn what should be improved for next year. The stakeholders are the Coordinator of the ‘Adults General Interests Programs,’ two Spanish teachers or instructors, and the participants or students.

This section will (1) explain the best approach to carry out a program evaluation for the Community Center, particularly the Spanish Intermediate Program, in order to determine the level of clients’ satisfaction; (2) describe a suitable outline for planning the program evaluation, which explains how to develop a program evaluation plan, including: a) preparation: focusing the evaluation on the main three things that should be improved, b) assessment: collecting and using the information, c) performing an evaluation assessment process, d) evaluation: the role of the evaluator, and e) reflection; and (3) summarize the whole process with the specific case of SIP. As a community member of the non-profit Community Center in the City of Ottawa, and as a voluntary member, my contribution in this preliminary program evaluation plan is to explain a suitable approach for program evaluation -- the improvement-oriented evaluation -- and why and how it can be applied to SIP, which is expected to be a novelty with a positive impact on participants’ satisfaction and achievement.

A Suitable Approach: Outlining an evaluation program plan means first having clear reasons for initiating the evaluation. Sometimes the evaluation client can help us find out: whose need is it? What does s/he want to know? Why? What is its purpose? Listening closely to the client’s reasons for initiating the evaluation and talking to other stakeholders is important for determining the reasons for initiating the evaluation and its purpose (Fitzpatrick et al., 2004, p. 175). In this sense, Chen (1996) proposed ‘needs assessment, process and outcome’ to refer to the types of questions on which an evaluation program should focus. Thus, needs assessment questions are concerned with a) establishing whether a problem or need exists and describing that problem, and b) making recommendations for ways to reduce the problem. Process or monitoring studies typically describe

how the program is delivered. Such studies may focus on whether the program is being delivered

according to some delineated plan or model or may be more open-ended, simply describing the

nature of delivery and the successes and problems encountered. “Outcome studies are concerned

with describing, exploring, or determining changes that occur in program recipients, secondary

audiences, or communities as a result of a program” (Fitzpatrick, et al., 2004, p.21).

It is important to have a clear evaluation philosophy, theory or model. What approach will be taken to accomplish an effective program evaluation? According to Posavac and Carey (2003), to assess the “evaluability” of the program we need to review the literature, examine previous experiences, and select a model of evaluation that suits the needs of the case. It is also necessary to define the role of the evaluator, and the purpose and steps in preparing to conduct the evaluation. For purposes of this work, as previously stated in the review of literature, the ‘improvement-oriented approach’ is

the best way to start evaluation of the Community Center Programs. This model takes some insights from different theories, such as the improvement-focused approach of Posavac and Carey

(2003), the ‘objectives-oriented evaluation’ model of Fitzpatrick et al., (2004) and the approach

“State Program Evaluation Guides: Developing an Evaluation Plan” taken by CDC/Department of

Health and Human Services (2009) and The Student Evaluation: A Teacher Handbook

(Saskatchewan Education, 1991), which consider that the main phases are the following:

preparation, assessment, evaluation (formative, diagnostic, and summative) and reflection. Another

important source is the University of Wisconsin (Program Development and Evaluation). These

models provide insights for outlining a holistic program evaluation focusing on the ‘improvement-oriented approach.’ The focus is on evaluating the organization, design and

teaching of the program: on how well the program is done, its strengths and weaknesses, on how well specific aspects are done (e.g., inputs, activities, and outcomes), and on discovering possible discrepancies between the design of program objectives and its practice, that is, between plan and program implementation, and between expectations of the target population and the services actually delivered. The focus is also on what improvements can be made to a specific program. This approach takes a more formative than summative orientation. The primary purpose is to provide information for program improvement. Because formative evaluations are designed to improve programs, it is critical that the primary audience be the clients or students. The target is also the people who are in a position to make changes in the program and its day-to-day operations, e.g., coordinators, teachers and students. So, the focus is not on potential consumers (although at the end of the day they will benefit from better programs) or policy makers or administrators (Fitzpatrick et al., 2004, p. 18). In sum, this paper is concerned with theory but primarily with process: how the program was organized, designed and delivered.



The use of an “improvement-oriented approach” for program evaluation is important because

information is needed to meet the obligation of providing effective services, verify that objectives,

contents, strategies, and activities of evaluation, and resources are properly organized and

designed, “devoted to meeting unmet needs, verify that planned programs do provide services,

examine the results, determine which services produce the best results, provide information needed

to maintain and improve quality and watch for unplanned side effects, among others” (Posavac and

Carey, 2003, p. 3). In this sense, the key is to conduct or manage the evaluation successfully. The stages and tasks that are necessary in order to succeed in this endeavour comprise: a) preparation: focusing the evaluation and defining the objectives, b) assessment: collecting and using the information, c) performing an evaluation assessment process, d) evaluation, in which the role of the evaluator is explained, and e) reflection. These stages inform the program evaluation plan and help in the assessment process of the Spanish Intermediate Program (SIP). Each stage is briefly described below:

Stage 1 - Preparation: Focusing the evaluation. According to Ellen Taylor-Powell, Sara Steele and Mohammad Douglah (1996) of the University of Wisconsin, this phase requires answering questions such as: What is to be evaluated? What is the purpose of the evaluation? Who will use the evaluation and how will they use it? What questions will the evaluation seek to answer? What information do you need to answer them (the questions, indicators, and kinds of information, qualitative or quantitative)? When is the evaluation needed? What resources (time, money and people) do you need? How does one determine whether a program is evaluable? According to Fitzpatrick et al. (2004), this

means that as evaluator, one should clarify “the intended program model or theory, examine the

program in implementation, and explore approaches and priorities” (p.183). In the Student

Evaluation: A Teacher Handbook (Saskatchewan Education, 1991), the preparation phase requires focusing on the type of evaluation (formative, summative, or diagnostic) to be used, the criteria

against which student learning outcomes will be judged, and the most appropriate assessment

strategies with which to gather information on student progress. Decisions made during this phase

form the basis for planning during the remaining phases. Evaluations are carried out to learn about

programs. Chelimsky and Shadish (1997) call this ‘evaluation for knowledge.’ In order to determine the overall scope and design of this evaluation plan, one must first obtain a complete program description, meet with stakeholders, and become familiar with information needs and previous evaluation reports: Who wants an evaluation? What should be the focus of the evaluation? Why is an evaluation wanted? When is an evaluation wanted? What resources are available? For Rouda (1995), focusing the evaluation means conducting an initial “gap analysis”: assessing the current skills, abilities, and knowledge of the major stakeholders and comparing them to the desired skill levels and knowledge base of all stakeholders. The main differences, or “gaps,” between the current and the desired determine the nature and direction of future evaluation. In this sense,

the Improvement-Focused Model seems to be an adequate way of reporting the "success and failure" of a program, and it will help us in reporting the evaluation assessment of the Spanish Intermediate Program. It is important to note that survey questionnaires are not the only source of feedback on the quality of the program; they can be complemented with other activities such as specific tests and sample work portfolios, among others. After assessing the “evaluability” of the program, reviewing the literature and completing the gap analysis, the evaluator should be ready “to make some methodological decisions regarding sampling procedures, research design, data collection, and statistical analysis” (Posavac and Carey, 2004, p. 39).
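To make Rouda’s “gap analysis” concrete, here is a minimal sketch in Python (not part of the original plan; the skill names and the 1-5 ratings are hypothetical) that compares current and desired levels and ranks the largest gaps, which would then direct the focus of the evaluation:

    # Minimal gap-analysis sketch. Ratings are hypothetical, on a 1-5 scale;
    # the largest gaps between desired and current levels suggest where the
    # evaluation should focus first.
    current = {"listening": 3, "speaking": 2, "grammar": 4, "vocabulary": 3}
    desired = {"listening": 4, "speaking": 5, "grammar": 4, "vocabulary": 5}

    gaps = {skill: desired[skill] - current[skill] for skill in desired}
    for skill, gap in sorted(gaps.items(), key=lambda kv: kv[1], reverse=True):
        print(f"{skill}: gap of {gap}")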

Stage 2 - Assessment: collecting and using the information. Once one has selected a program -- in this case, the Spanish Intermediate and Conversation Program (SIP) of the "Adults General Interest Program" of the Community Center, City of Ottawa -- and identified the focus of the evaluation, which in this case is to assess the clients’ satisfaction with the organization, design and teaching of the selected program, one can proceed to identify information-gathering strategies, construct or select instruments, administer them, and collect information from a sample of clients to evaluate the instruments.

Instruments to collect the information: the Planning Evaluation Worksheets by Taylor-Powell et al. (1996) of the University of Wisconsin are excellent examples of instruments for focusing on the main aspects of an evaluation. These instruments were used as models for constructing our instrument. Thus, a new instrument, a survey questionnaire, was developed specifically for the evaluation of the SIP.

Constructing the survey questionnaire: According to Ellen Taylor-Powell et al. (1998) of the University of Wisconsin-Extension, four types of information may be distinguished: (a) knowledge, (b) beliefs-attitudes-opinions, (c) behaviours (what people do) and (d) attributes (what people are or have); I add a fifth one, (e) reactions (the impacts of programs on the participants and their suggestions to improve them). These types of information were taken into consideration in designing the first version of the survey questionnaire (preliminary version in Appendix # 3). It was designed to evaluate students’ or participants’ reactions. The purpose was to obtain information regarding the level of satisfaction with the program and the aspects that, in their own views, need to be improved.
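As a minimal sketch of how these five information types can organize questionnaire items (the grouping below is illustrative only; the actual instrument appears in the appendices and at the survey link), consider:

    # Illustrative grouping of survey items by information type. The five
    # categories follow Taylor-Powell et al. (1998) plus the added "reactions"
    # category; the item-to-category assignment here is an assumption.
    SCALE = ["Excellent", "Good", "Fair", "Poor"]

    survey_items = {
        "knowledge": ["What is your level of study?"],
        "beliefs_attitudes_opinions": ["Overall, does the program meet your expectations?"],
        "behaviours": ["How many years/months have you participated in program activities?"],
        "attributes": ["How old are you now?", "What is your gender?"],
        "reactions": ["Please make any suggestions that can help us to improve our program."],
    }

    for category, items in survey_items.items():
        for item in items:
            print(f"[{category}] {item}")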

Identifying and selecting the evaluation questions and criteria: Generally, according to Fitzpatrick et al. (2004, p. 234), the single most important source of evaluation questions is the program’s stakeholders. To obtain such input or information, the evaluator needs to identify individuals and groups who are affected by the program, e.g., policy makers, administrators or managers, practitioners, primary and secondary consumers, the clients, participants, students, teachers or members of the community. Once stakeholders are identified, they should be interviewed to find out what concerns or questions they have and how they perceive the program to be evaluated. The information gathered during the preliminary meetings or interviews helped to design a suitable survey questionnaire. This first version, or list of possible questions, was then corrected with the help of a sample of participants. An improved version (see Appendix 4) was then evaluated

by instructors and participants. Indeed, a survey questionnaire serves as a major source of information. The improvement-oriented approach guides the evaluator to ask whether specific objectives are clearly stated and connected to contents, teaching strategies and activities of evaluation, what would prevent their achievement, and what specific aspects should be improved to lead the program to success (see third version in Appendix # 5). The information gathered during the assessment phase with the final version of the instrument (see Appendix # 6 and the web site: www.researchphilosophy.blogspot.com) was used to make judgments about what things to improve.

Stage 3 – Data Collection means that the evaluator should pay attention to important issues such as: What sources will you use? What data collection methods will you use? What collection procedures will you use? (Taylor-Powell et al., 1996). Also important are the identification and elimination of bias (such as gender and culture bias) from the assessment strategies and instruments, and the determination of where, when, and how assessments will be conducted (Saskatchewan Education, 1991). These criteria were taken into account in the process of collecting information for the evaluation of the Spanish Intermediate Program. Collecting information during the month of November and the first days of December made it possible to write this report.



Stage 4 – Performing the Evaluation Assessment Process of the Spanish Intermediate Program

It has been brought to this program evaluator’s attention that the current Coordinator of the “Interest Programs” had some concerns regarding the improvement of the Programs. In the case of the Spanish Intermediate Program (SIP), no previous evaluation had been made of this specific program. There were some concerns about whether the organization and design of the program were satisfactory, whether the students were satisfied with the program, and whether there were improvements to make. Following consultations with the major stakeholders of this program (mainly the Coordinator, the teachers and the students), it was determined that there was indeed a need to perform an evaluation assessment process: to find out whether or not the program required certain improvements, whether or not students’ satisfaction was being met, and whether or not the program was doing what it had set out to do. The purpose of performing the evaluation assessment process of SIP is thus to find reasons for improving the program and, therefore, to increase clients’ satisfaction.

Intervention Objective

This paper uses the following SMART objective to develop the evaluation plan:

By December 5th, 2009, identify from 1 to 3 the number of improvements that should be made on
the organization, design and teaching of the “Spanish Intermediate Program,” according to The
Community Center’s members and clients.

Overall, the purpose is to find out if the clientele is satisfied with the quality of the program organization, design and teaching, and what should be improved, that is, what changes or improvements should be made to satisfy the clientele’s needs. At the end of the day, the idea is to find out a number of possible improvements to be made to the current program.

Users of Evaluation are the Coordinator of the Adults’ General Interests Programs, the two teachers of the Spanish Intermediate Program, and the students, participants or clients. For this report only 5 of a total of 15 students were active contributors, responding to the survey questionnaire. The survey questionnaire remains open electronically on the web site to collect information until January 2010.

Plan Development: Once I had clarified the intervention objective and the use and users of the evaluation, developing the plan included these steps:

(a) Develop evaluation questions. Different versions were developed during the process of designing and testing the survey questionnaire. Each one included a sample checklist and a variety of question types such as scale rating, short answer, and open-ended. Two preliminary versions were designed and evaluated together with four students. The students answered the questionnaire, but their concern was with correcting some discrepancies between items and the characteristics of the actual program. They provided suggestions regarding the following issues: clarity of questions, wording, style, and importance. These aspects were taken into account in developing a third version of the survey questionnaire. This third version was posted on the web site of the University of Saskatchewan. The respondents made suggestions to improve clarity in the organization and design of SIP. All three versions are posted on the web site of the ECUR 809 Course: www.researchphilosophy.blogspot.com. The fourth version is the final one, now available at https://survey.usask.ca/survey.php?sid=17783 as part of the web site of the University of Saskatchewan. See the appendices and www.researchphilosophy.blogspot.com.

(b) Determine indicators. See the following graphic.

Graphic: Questions and Indicators

Organization (indicator: level of participant satisfaction on a scale of Excellent, Good, Fair, Poor; data source: participant survey)
- Are the outcomes or objectives clearly stated?
- Is the program content up-to-date?
- Is the program level appropriate for most students?
- Are competencies or tasks satisfactorily stated?
- Is the program presented in a logical sequence?
- Do performance checklists match with objectives?

Design (indicator: level of participant satisfaction on a scale of Excellent, Good, Fair, Poor; data source: participant survey)
- Are directions or instructions for how students are to proceed through the materials clearly explained?
- Are the content-map and competencies covered by the program?
- Is prerequisite knowledge applied?
- Are the materials and academic guides to students useful?
- Does the academic strategy fit with the knowledge base and the program?
- Are the visuals, videos, games, experiences and practices meaningful?
- Is the overall design of the learning activities satisfactory for individualized instruction?
- Do learning activities and objectives match?
- Do the tests and rubrics match with objectives?

Teaching (indicator: level of participant satisfaction on a scale of Excellent, Good, Fair, Poor; data source: participant survey)
- How is the preparation and punctuality?
- Overall, does the program meet your expectations?

Registration Process (indicator: number of yes/no responses; data source: follow-up written survey of attendees)
- Would you register again for this program or recommend it to your friends?
- Do you find our customer service staff knowledgeable and courteous?

Demographic (indicator: participant survey/demographic questions; data source: participants’ demographic answers)
- How many years/months have you participated in Community Center program activities?
- How old are you now?
- In which county do you live?
- What is your gender?
- What is your level of study (elementary, secondary, university, post-graduate)?
- How would you describe the area where you live?
- What is your ethnic background?

Suggestions/Improvements (indicator: list of suggestions or recommendations provided by the participants; data source: follow-up written survey of attendees)
- In which program or activity would you like to participate in the future?
- Please feel free to make any suggestions or comments that can help us to improve our Program(s).
- If you would like us to respond to your comments, please write your name, phone and/or e-mail.

(c) Identify data sources. Existing information: the Program Kit and the survey questionnaire; the web site; the programs; written materials provided by the Community Center; teachers’ materials; and samples of students’ work and/or experiences (videos, photos, etc.). People: teachers or instructors and students or participants (clientele) of the Neighborhood or Community Center in Ottawa.

(d) Determine the data collection method. How was the data gathered? Using a final survey questionnaire (click here: https://survey.usask.ca/survey.php?sid=17783). This process of evaluation implied conducting the survey questionnaire firstly with the teacher(s) of SIP, and secondly with a sample of participants of the Spanish Intermediate and Conversation Programs as part of the "Adults General Interest Program" of the Community Center, City of Ottawa, to get their feedback. This required an appropriate methodology to evaluate the program.

(e) Budget, Resources and Timelines: Specify the time frame for data collection. When will I collect the data? The budget comes from students’ payment of tuition or registration, contributions of the community members, and donations from the City of Ottawa.
Time frame for the evaluation: the preliminary version of this program evaluation was developed on a voluntary basis, as my contribution to the Community Center. However, continuing the ongoing process of evaluation, or its full implementation, requires a specific budget to pay an evaluator (at least $22.50 per hour as a graduate student). Number of required hours: 40 hours of work over four months.
Preliminary versions of the survey questionnaire are presented on the web site: www.researchphilosophy.blogspot.com.

(f) Plan the data analysis. How will data be analyzed and interpreted? Preliminary data was collected during the month of November, 2009, with a deadline of December 6, 2009. The analysis counts the number of answers (multiple-choice items on a scale of level of satisfaction) and uses percentages, together with analysis of the demographic information. Data was analyzed electronically by percentages (see Appendix # 7 and the results of the survey questionnaire at the following link: https://survey.usask.ca/survey.php?sid=17783). Data will continue to be collected until January 2010. The collected data will be used to improve parts of the program, or the whole program, if necessary. Corrections and changes will be made in accordance with students’ reactions and suggestions. Based on the judgments (evaluations), changes and decisions to improve the program, a new program will be offered to students, parents, and appropriate community center personnel.
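As a minimal sketch of the counting and percentage analysis described in step (f) (the responses below are made up for illustration; the Excellent/Good/Fair/Poor scale is the one used in the survey):

    from collections import Counter

    # Hypothetical responses from 5 participants to one satisfaction item,
    # rated on the survey's Excellent/Good/Fair/Poor scale.
    responses = ["Good", "Excellent", "Good", "Fair", "Good"]

    counts = Counter(responses)
    total = len(responses)
    for level in ["Excellent", "Good", "Fair", "Poor"]:
        n = counts.get(level, 0)
        print(f"{level}: {n} ({100 * n / total:.0f}%)")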

(g) Communicate results. With whom and how will results be shared?
Results were shared with the Coordinator of General Interest Programs, with two other teachers of Spanish, and with some participants and members of the community.

(h) Designate staff responsibility. Who will oversee the completion of this evaluation?
Voluntary Evaluator: Nelson Dordelly Rosales will oversee the completion of this evaluation.

In general, it seems the survey questionnaire really helped the program evaluator “to discover discrepancies between expectations of the target population and the services actually delivered” (Posavac and Carey, 2003, p. 29). Once the instrument was applied to the selected group of participants and instructors, the content information was tabulated and the data presented to the Coordinator.

Using the Information: How will the information be interpreted, and by whom? How will the evaluation be communicated and shared? How will data be analyzed? (Taylor-Powell et al., 1996). Information collected from a sample of participants (pre-testing the questionnaire) was used to make appropriate corrections to the instrument, and a final version emerged: https://survey.usask.ca/survey.php?sid=17783. The questionnaire in its final version is now ready for application to the whole group of students or participants. The information collected will be used to improve the SIP for next year.

Stage 5 - Evaluation and the role of the evaluator: According to Fitzpatrick et al. (2004), the evaluator’s primary responsibility is to interpret information that can help key individuals and groups improve efforts, make enlightened decisions, and provide credible information to the public. These authors distinguish between internal evaluators (program employees) and external evaluators (outsiders). Among the advantages of internal evaluators are that they have “more

knowledge of the program model and its history; they are more familiar with the various

stakeholders and their interests, concerns and influence; his/her knowledge can help increase the

use of the evaluation; and they will remain with the organization after the evaluation and can

continue to serve as advocates for use of its findings” (p. 187). In terms of knowledge about a

program, “internal evaluators have an advantage since they have better access to program directors

and to the administrators of the organization” (Posavac and Carey, 2004, p. 17). A person who is physically present on a regular basis is likely to see the program in action, to know its staff, and to learn about its reputation from other people in the organization. Such information is unavailable

to an external evaluator. The more that is known about the actual workings of the program, the

easier it is to ask relevant questions during the evaluation planning and interpretation. An internal

evaluator often works with a small group of two or three; some evaluators work alone. In this work, I consider myself an insider rather than an outsider, because I am one of the instructors of the Community Center, and I volunteered to evaluate the Spanish Intermediate program, which is the program that I taught this past summer. This work was done alone, but the whole process was developed with the help of 5 students or participants who responded to the survey questionnaire and provided some suggestions. The other teacher and the Coordinator of the “General Interests Programs for Adults” of the Community Center were also of great help.

In this work, I took on a variety of roles. Evaluators often take on many roles, including facilitator, collaborator, teacher, management consultant, specialist, etc. (Patton, 1996). In this sense, Ryan and Schwandt (2002) describe the evaluator’s role as that of a teacher, helping practitioners develop critical judgment. Technical expertise, skills in sampling, qualitative analysis, or statistics, and experience with the services being evaluated are also assets. In addition to technical competence, an evaluator’s personal qualities are important, among them objectivity, fairness, trustworthiness and credibility. Having these qualities increases “the likelihood that people are more willing to devote time to the evaluation, to admit problems, to share confidences and to participate in the improvement of programs” (Patton, 1996, p. 18). Overall, according to Fitzpatrick et al. (2004), the key role for evaluators is helping policy makers and managers select the evaluation model and the performance dimensions to be measured, as well as the tools to use in measuring those dimensions. There are

two general ways an evaluator can relate to an organization needing an evaluation: a) evaluators

can work for the organization and do many evaluations, or b) they can come to the organization

from a research firm or a university to work on a specific project. Evaluators in program planning play a particularly important role in helping articulate theories or logic models (Fitzpatrick et al., 2004, p. 13). A logic model starts with the long-term vision of how program participants will be changed or satisfied by the quality of the program. In this work, as evaluator, I had to take on different roles. A preliminary plan and a logic model, as a way of planning how to conduct the evaluation, were outlined as part of this paper (see Appendices # 1 and # 2) and at the following link: www.researchphilosophy.blogspot.com.
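As a rough sketch of how such a logic model can be represented (the inputs-activities-outputs-outcomes chain follows the University of Wisconsin materials cited above; the specific entries paraphrase this plan and are illustrative, not the actual model in Appendix # 2):

    # A minimal logic-model sketch for the SIP evaluation. The standard
    # inputs -> activities -> outputs -> outcomes chain is assumed; the
    # entries below are illustrative paraphrases of this plan.
    logic_model = {
        "inputs": ["Coordinator, two instructors, volunteer evaluator",
                   "program materials and budget"],
        "activities": ["deliver SIP lessons",
                       "design and administer the survey questionnaire"],
        "outputs": ["survey responses from a sample of students",
                    "a list of 1 to 3 suggested improvements"],
        "outcomes": ["improved organization, design and teaching of SIP",
                     "higher client satisfaction"],
    }

    for stage, entries in logic_model.items():
        print(f"{stage}: {'; '.join(entries)}")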

Stage 6 - Reflection allows pondering the successes and shortfalls of the previous phases. Specifically, reflection helps to evaluate the utility and appropriateness of the assessment strategies used, and helps in making decisions concerning improvements or modifications to subsequent teaching and assessment. The instruments contain questions that encourage reflection on student assessment, on teachers' planning, and on the structure of the curriculum.

Reflection on the Strengths and Potential Limitations: Indeed, careful planning of an evaluation program helps to start the whole process successfully; this program evaluation plan is useful, but it is obviously not the ultimate solution to every problem. Evaluation, in this case, served to identify the level of satisfaction of a sample of students with the organization, design and implementation of the SIP, and its strengths and weaknesses; it highlights the good and exposes the faulty, but the results of the survey questionnaire obviously cannot correct the problems, for that is the role of management and other stakeholders, using evaluation findings as one tool that will help them in that process of change (Fitzpatrick et al., 2004, p. 27). The main strength of this work is the quality of the whole process, method, people and sources used. The results of this planning can be one of many influences on improving the policies and/or organizational practices and decisions in the Community Center. Perhaps the main constraint or weakness nowadays is money.

Summary

Overall, the main aspects of evaluating the "Spanish Intermediate Program" are listed below:

What questions did the evaluation seek to answer? What information was needed to answer them? See the attached survey questionnaire or click here: https://survey.usask.ca/survey.php?sid=17783.

Indicators: the students' responses to the survey questionnaire and testimonials about their experience with the Spanish Intermediate Program (SIP), gathered in order to improve it (see the graphic on page 24). Knowledge, beliefs, opinions, reactions: how will I know them? Through answers to the multiple-choice and scale-rating questions.

When was the evaluation needed? December 2009.

What evaluation approach was used? A survey questionnaire assessing the level of clients' satisfaction with the organization, design, and teaching of SIP, which also helps identify suggestions for the improvement of the program.

Collection of the information: What sources of information were used?


Existing information: the program kit and the survey questionnaire; the Center's website and program listings; written materials provided by the Community Center; teachers' materials; and samples of students' work and/or experiences (videos, photos, etc.).
People: teachers (instructors) and students (participants or clientele) of the Neighbourhood/Community Center in Ottawa.

What data collection method(s) were used?


Mainly a survey questionnaire administered to a sample of students. Corrections and improvements were made to previous drafts of the instrument. Data were collected electronically from a small sample of students and teachers: https://survey.usask.ca/survey.php?sid=17783
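As an illustration only, the following is a minimal sketch (in Python) of how the exported responses could be tallied once downloaded. It assumes a hypothetical CSV export named responses.csv with one column per survey item; the column names and the yes/no and +/0/- codings are illustrative assumptions, not the actual export format of the survey tool.

import csv
from collections import Counter

# Hypothetical column names; the real export headings may differ.
YES_NO_ITEMS = ["content_up_to_date", "level_appropriate", "logical_sequence"]
SCALE_ITEMS = ["activities_match_objectives", "tests_match_objectives"]
SCALE_SCORES = {"+": 1, "0": 0, "-": -1}  # map the +/0/- ratings to numbers

with open("responses.csv", newline="", encoding="utf-8") as f:
    rows = list(csv.DictReader(f))

# Tally yes/no answers for each knowledge item.
for item in YES_NO_ITEMS:
    counts = Counter(row[item].strip().lower() for row in rows)
    print(f"{item}: {counts['yes']} yes / {counts['no']} no")

# Convert the +/0/- ratings to numeric scores and report the mean per item.
for item in SCALE_ITEMS:
    scores = [SCALE_SCORES[row[item].strip()]
              for row in rows if row[item].strip() in SCALE_SCORES]
    if scores:
        print(f"{item}: mean score {sum(scores) / len(scores):+.2f} (n={len(scores)})")

With only five respondents, such counts are descriptive rather than statistical, which suits the improvement-oriented purpose of this plan.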

Who was involved, or who should be involved?

Stakeholders: teachers, administrators or coordinators, and the students or participants.
How were they engaged? Through staff meetings, email correspondence, volunteering, and survey questionnaires.

Focus of the evaluation: the description, organization, design, and teaching of the program; the level of satisfaction of clients; and suggestions for the improvement of SIP (see page 14 of this paper and the attached logic model). Also considered: participants' reactions or answers to the survey questionnaire and their written suggestions for the program's improvement.

Goals or objectives to be evaluated: What was the purpose of the evaluation?


The purpose was to evaluate the extent or level of satisfaction of students of the Spanish Intermediate Program with the organization, design, and implementation of the program. In other words:

By December 5th, 2009, identify one to three improvements that should be made to the organization, design, and teaching of the "Spanish Intermediate Program," according to the Community Center's members and clients.

Responsibility: Who will use the evaluation? How will they use the information?
Administrators, coordinators, and teachers might use the information to assess the level of satisfaction of students and to identify their suggestions for the improvement of SIP. This evaluation provided insights for assessing the quality of the program's organization, design, and teaching: a list of three main suggestions was created on the basis of the participants' responses. This list will help coordinators and teachers make changes in the organization and redesign of the program and meet its goals.
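As a sketch of how such a "top three" list could be derived once the open-ended answers have been hand-coded into categories, the snippet below (Python, with hypothetical category labels) simply ranks the coded suggestions by frequency; the coding step itself is done by reading each comment.

from collections import Counter

# Hypothetical hand-coded categories, one per respondent comment;
# each open-ended answer is read and assigned a code by the evaluator.
coded_suggestions = [
    "clearer_objectives", "better_materials", "more_practice",
    "clearer_objectives", "more_practice", "clearer_objectives",
]

# Keep the three most frequently mentioned categories.
for rank, (category, count) in enumerate(
        Counter(coded_suggestions).most_common(3), start=1):
    print(f"{rank}. {category} (mentioned {count} time(s))")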

Specific Suggestions or recommendations provided by respondents:


https://survey.usask.ca/results.php?sid=17783

1. Improve clarity in the statement and matching of objectives, and update the content map and the tests/rubrics.
2. Enhance materials and academic guides: provide better directions for how students are to proceed through the materials and through the logical sequence (prerequisites).
3. Provide more practice or application of academic strategies: improve fit with the knowledge base and the program, and make meaningful use of visuals, videos, games, experiences, and practice of competencies, tasks, and activities.

Conclusion

The paper explains how to perform an evaluation to assess the merit, worth, quality, and significance of a program. To that end, it integrates different tools and theories of program evaluation into an evaluation plan for a selected program. Evaluations are conducted to answer questions concerning program adoption, continuation, or improvement; this study focuses on the last of these. Specifically, this paper dealt with planning an evaluation project for the improvement of the Spanish Intermediate Program (SIP), one of the 'interests programs for adults' offered by the Community Center in Ottawa. To that end, it took a theoretical approach, the 'improvement model.' The focus was on the organization, design, and teaching of SIP; in this sense, evaluation was undertaken, using a survey questionnaire, to identify and apply defensible criteria to determine worth, merit, or quality (Scriven, 1991) and to list a number of suggestions to further the improvement of SIP. In this work, program evaluation meant disciplined searching and caring imagination by the evaluator, envisioning a better educational journey for clients. This approach best meets the criteria necessary for effective evaluation, which requires inquiring of clients to learn their judgements and involving them in the process of enhancing the quality of the program.

Bibliography

CDC. "State Program Evaluation Guides: Developing an Evaluation Plan." Retrieved December 2, 2009, from: http://www.cdc.gov/DHDSP/state_program/evaluation_guides/evaluation_plan.htm

Chelimsky, E., & Shadish, W. R. (1997). Evaluation for the 21st Century: A Handbook. Thousand Oaks, CA: Sage.

Dewey, J. (1966 edn.). Democracy and Education: An Introduction to the Philosophy of Education. New York: Free Press.

Fitzpatrick, Jody L., Sanders, James R., & Worthen, Blaine R. (2004). Program Evaluation: Alternative Approaches and Practical Guidelines. Boston: Allyn and Bacon.

Hewett, Stephanie M. (2008). Electronic portfolios and education: A different way to assess academic success. In L. Tomei (Ed.), Online and Distance Learning: Concepts, Methodologies, Tools, and Applications (1st ed., Vol. 6, pp. 3200-3213). Hershey, PA: Information Science Reference.

Henderson, James G., & Kesson, Kathleen R. (2004). Curriculum Wisdom: Educational Decisions in Democratic Societies. New Jersey: Merrill Prentice Hall.

http://www.managementhelp.org/evaluatn/chklist.htm

Breithaupt, Krista, & MacDonald, Colla J. (2008). Qualitative standards for e-learning: The Demand Driven Learning Model. In L. Tomei (Ed.), Online and Distance Learning: Concepts, Methodologies, Tools, and Applications (1st ed., Vol. 2, pp. 1165-1177). Hershey, PA: Information Science Reference.

Patton, M. Q. (1996). Utilization-Focused Evaluation: The New Century Text (3rd ed.). Thousand Oaks, CA: Sage.

Patton, M. Q. (1987). How to Use Qualitative Methods in Evaluation. Newbury Park, CA: Sage.

Posavac, Emil J., & Carey, Raymond G. (1990). Program Evaluation: Methods and Case Studies. New Jersey: Prentice Hall.

Posavac, Emil J., & Carey, Raymond G. (2003). Program Evaluation: Methods and Case Studies. New Jersey: Prentice Hall.

Posavac, Emil J., & Carey, Raymond G. (2004). Program Evaluation: Methods and Case Studies. New Jersey: Prentice Hall.

Rouda, Merrill. (1995). "Needs Assessment: The First Step." Retrieved October 12, 2009, from: http://alumnus.caltech.edu/~rouda/T2_NA.html

Ryan, Katherine E., & Schwandt, Thomas A. (2002). Exploring Evaluator Role and Identity. Greenwich, CT: Information Age Publishing.

Saskatchewan Education (Ed.). (2009). "Student Evaluation: A Teacher Handbook." Retrieved October 24, 2009, from: http://www.thephysicsfront.org/items/detail.cfm?ID=6650 and http://www.sasked.gov.sk.ca/docs/policy/studeval/index.html

Scriven, M. (1996). Types of evaluation and types of evaluator. Evaluation Practice, 17, 151-162.

Chen, Sherry Y. (2008). Evaluating the learning effectiveness of using web-based instruction: An individual differences approach. In L. Tomei (Ed.), Online and Distance Learning: Concepts, Methodologies, Tools, and Applications (1st ed., Vol. 3, pp. 1740-1751). Hershey, PA: Information Science Reference.

Stake, R. E. (1975). Evaluating the Arts in Education: A Responsive Approach. Columbus, OH: Merrill.

Talmage, H. (1982). Evaluation of programs. In H. E. Mitzel (Ed.), Encyclopedia of Educational Research (5th ed.). New York: Free Press.

Taylor-Powell, E., Steele, S., & Douglah, M. (1996). Planning a Program Evaluation. University of Wisconsin-Extension, Cooperative Extension, Program Development and Evaluation Unit. Retrieved November 27, 2009, from: http://learningstore.uwex.edu/Planning-a-Program-Evaluation--P1033C0.aspx

University of Wisconsin-Extension. "Program Development and Evaluation." Retrieved September 2, 2009, from: http://www.uwex.edu/ces/pdande/evaluation/evallogicmodel.html and http://www.uwex.edu/ces/pdande/

Wiggins, G. (1996). Practicing what we preach in designing authentic assessments. Educational Leadership, 55(1), 18-21.

Example: http://www.geegees.ca/forms/program_evaluation

Appendix # 1 Logic Model


www.researchphilosophy.blogspot.com

Appendix #2 Preliminary Plan of Program Evaluation


www.researchphilosophy.blogspot.com

Appendix # 3 First Version


Survey Questionnaire

Short Survey: Design and test a short survey that includes a sample checklist and a variety of question types, such as scale rating, short answer, and open-ended.
A. Original version
Short answer: yes or no
1. Are objectives, competencies, or tasks stated in the student materials?
2. Does the content cover a significant portion of the program competencies?
3. Is the content up-to-date?
4. Is the course level appropriate for most students?
5. Is a student’s guide included that offers how to manage and perform the course theory
and practice?
6. Is the material presented in a logical sequence?
7. Are performance checklists included?
8. Are tests included in the materials?
9. Is evaluation an integral part of (a) the development and
(b) the implementation of the program?
10. Are the resources devoted to the program being expended appropriately?
Scale rating: Quality and satisfaction judgments. Use +, 0, or - to rate the quality of, or your satisfaction with, specific aspects of the course:
1. Quality of and satisfaction with objectives, competencies, and/or tasks_____
2. Degree of match between learning activities and objectives______
3. Quality of tests and degree of match with objectives________
4. Quality of and satisfaction with performance checklists and degree of match with objectives________
5. Quality of and satisfaction with directions for how students are to proceed through the materials_______
6. Quality of visuals, videos, games, experiences, practices_______
7. Overall design of the learning activities for individualized instruction_____
8. Quality of and satisfaction with safety practices_____
9. Satisfaction with degree of freedom from bias with respect to sex, race, origin, age, religion, etc.________
10. Quality of and satisfaction with the content list or course content map and the competencies covered by the course_________
Short answer: brief comment
Does the program have basic elements, such as those listed below? Please mark with an "x" and make a comment if necessary:
1. Clearly stated outcome objectives____
2. Sufficient directions_____
3. Prerequisite knowledge base___
4. Fit with knowledge base and existing programs___
5. Materials required___
6. Relevance and frequency of interactions among participants: do they help to achieve the goals? Have these interactions been evaluated?_______
Open ended questions: Please explain or illustrate
- What aspects of the program require improvement?
- Do the outcomes achieved match the goals?
- Is there evidence of effectiveness available regarding the program?

- Does the program or plan match the values of the stakeholders?

Reflection: the internal evaluator should reflect on the following:


- Does the program or plan match the needs of the people to be served?
- Does the program as implemented fulfill the plans?
Adapted from Fitzpatrick, Sanders, & Worthen (2004, p. 100).

Appendix # 4 Second Version:


Modified Survey Questionnaire (after a trial application with a sample)

A. Knowledge: short answers - yes or no -


1. Is the program content of Intermediate Spanish up-to-date?_____
2. Is the program level appropriate for most students?_____
3. Are objectives, competencies, or tasks satisfactorily stated?____
4. Is the program presented in a logical sequence?_____
5. Are you satisfied that the program has basic elements, such as those listed below?

B. Judgments/Opinions: Please write the appropriate letter in each space below:
Very Good (VG), Good (G), or Bad (B), and make a comment if necessary:
a) Outcomes, objectives, competencies or tasks____
b) Directions or instructions for how students are to proceed through the materials___
c) Materials ____
d) Prerequisite knowledge base ___
e) Performance checklists____
f) Student’s guide_____
g) Fit with knowledge base and program___
h) Tests and Rubrics___________

C. Behaviors/Reactions - Scale rating: Use +, 0, or - to rate the quality of, or your satisfaction with, specific aspects of the course:
- Degree of match between learning activities and objectives______
- Quality of tests and degree of match with objectives________
- Quality of and satisfaction with performance checklists and degree of match with objectives_____
- Quality of visuals, videos, games, experiences, practices_______
- Overall design of the learning activities for individualized instruction_____
- Quality of and satisfaction with safety practices_____
- Satisfaction with degree of freedom from bias with respect to sex, race, origin, age, religion, etc.__
- Quality of and satisfaction with the content list or course content map and the competencies covered by the program___
Open ended questions: Please feel free to make any suggestions or comments that can help us improve our Spanish Intermediate Program:

Appendix # 5
Third Version Survey Questionnaire

Thank you for taking the time to complete this evaluation. Your input will help us continue to offer quality programs, making changes as we can to better serve your needs. Please mark an "x" or fill in the blanks wherever necessary.

Program Name: __________________________________________

Instructor Name: _________________________________________

I am:
- Alumni
- Teacher
- Staff
- Student
- Member of Community
Session:
- Fall
- Winter
- Spring/Summer

PROGRAM DESIGN
Objectives - Scale rating: Use +, 0, or - to rate the quality of, or your satisfaction with, specific aspects of the course:
- Degree of match between learning activities and objectives______
- Quality of tests and degree of match with objectives________
- Quality of and satisfaction with performance checklists and degree of match with objectives_____
Content/Knowledge: short answers - yes or no -
1. Is the content up-to-date?_____
2. Is the program level appropriate for most students?_____
3. Are competencies or tasks satisfactorily stated?____
4. Is the program presented in a logical sequence?_____

Materials/Guidelines: Please write the appropriate letter in each space below:
Very Good (VG), Good (G), or Bad (B)
a) Outcomes, objectives, competencies or tasks____
b) Directions or instructions for how students are to proceed through the materials___
c) Materials ____
d) Prerequisite knowledge base ___
e) Performance checklists____
f) Student’s guide_____
g) Fit with knowledge base and program___
PROGRAM DEVELOPMENT (if registered for or taken in a previous session)
_____________________________________________________________________
Organization:
-excellent

-good
-adequate
-unsatisfactory
Comments:____________________________________________________________
Content:
-excellent
-good
-adequate
-unsatisfactory
Comments:____________________________________________________________
Program Quality:
-excellent
-good
-adequate
-unsatisfactory
Comments:____________________________________________________________
Visuals, videos, games, experiences, practices:
-excellent
-good
-adequate
-unsatisfactory
Degree of freedom from bias with respect to sex, race, origin, age, religion, etc.:
-excellent
-good
-adequate
-unsatisfactory
Comments:____________________________________________________________
Content-map and competencies covered by the program:
-excellent
-good
-adequate
-unsatisfactory
Comments:____________________________________________________________
Activities: Overall design of the learning activities for individualized instruction
-excellent
-good
-adequate
-unsatisfactory
Comments:____________________________________________________________
Tests and Rubrics:
-excellent
-good
-adequate
-unsatisfactory
Comments:____________________________________________________________

INSTRUCTOR:
Rapport with participants:
-excellent
-good
-adequate
-unsatisfactory
Comments:____________________________________________________________
Professionalism:
-excellent
-good
-adequate
-unsatisfactory
Comments:
Organization:
-excellent
-good
-adequate
-unsatisfactory
Comments:_____________________________________________________________
Preparation and punctuality:
-excellent
-good
-adequate
-unsatisfactory
Comments:
Open ended questions:
Did the program meet your expectations? Yes___ No_____ Undecided_____
Comments:____________________________________________________________
Would you register again for this program or recommend it to your friends? Yes___ No___
Program: ________________________________________________________________
Comments:______________________________________________________________
Please feel free to make any suggestions or comments that can help us improve our Spanish Intermediate Program: _______________________________________________________
ABOUT YOU
How did you hear about this program?
- Activity Brochure
- Internet/Website
- Friend/Family
- Live in the Neighbourhood
- Students
- Other:______________
Are you a first-time participant in this program? Yes____ No____
REGISTRATION PROCESS
Did you find our customer service staff knowledgeable and courteous? Yes____ No____

Were the times offered for this program/class convenient? Yes_____ No_____
What other programs/classes would you like us (Languages/Sports Services) to run?

________________________, _______________________, ______________________

When would you like them to run? ___________, _____________, _________________

If you would like us to respond to your comments, please complete the section below:

Name:__________________________________________________________________
Day-time phone number:___________________________________________________
E-mail:_________________________________________________________________
 
Appendix # 6
Final Version of Survey Questionnaire

www.researchphilosophy.blogspot.com

Appendix # 7
Data analysis
www.researchphilosophy.blogspot.com
