Abstract – Two interrelated methodological transformations involved in the current transition of European universities towards the European Higher Education Area (EHEA) are the role of applied capabilities and the evaluation process. In this context, this paper presents the results of a structured comparison, throughout a five-course period, of the impact of alternative evaluation methods in courses aimed at the development of applied engineering capabilities. The comparison perspective is twofold: how accurately does the evaluation method measure the competence level attained by the students, and how does it affect their active learning. The experiment was conducted in a simulation course from the Industrial Engineering curriculum, and the aim was the evaluation of the capability to use simulation software. Evaluation was traditionally based on a written final exam; two other evaluation methods were then introduced: a computer-based exam and a team project assignment. The assessment of the evaluation methods was carried out by both faculty members and students (through anonymous surveys). Results suggest that both group assignments and the computer exam perform far better, in this environment, than written exams. The comparison between group assignments and the computer exam is less straightforward, being dependent on which criterion is being appraised.

Index Terms – Evaluating capabilities, on-line testing, evaluation methodologies, problem-based learning.

978-1-4244-1970-8/08/$25.00 ©2008 IEEE
38th ASEE/IEEE Frontiers in Education Conference, October 22 – 25, 2008, Saratoga Springs, NY

INTRODUCTION

The current transition of European universities towards the European Higher Education Area (EHEA) requires a move towards student-centered higher education and away from teacher-driven provision, as well as a renewed emphasis on employability and the development of transferable skills and capabilities [1], [2], [3]. Out of the many methodological transformations involved, two significant and interrelated components are the role of applied capabilities and the assessment of learning outcomes.

EHEA's recommendations encourage a shift from the highly theoretical approach widespread in most national higher education systems, such as the Spanish university system, towards placing a higher emphasis on applied capabilities. That is in turn related to the major overhaul proposed for the evaluation procedures; the currently prevailing approach, based solely on written final exams, is postulated to encourage learning by rote and to be inappropriate for appraising applied capabilities. According to the European University Association's Trends V report to the Conference of Ministers of Education meeting in London on 17/18 May 2007 to discuss the culmination of the Bologna process by 2010, a majority of the participating institutions continue to rely on traditional end-of-year examinations to assess student knowledge [2]. Progress is, however, being made, as shown by the comparison with the equivalent figures in the earlier Trends III report.

The recently approved legal framework aimed at revamping the Spanish higher education system to adapt it to the EHEA's requirements highlights the focus on the development of capabilities, as opposed to the mere accumulation of knowledge, and the need to establish appropriate evaluation procedures for these capabilities [4].

In the USA, the Accreditation Board for Engineering and Technology (ABET), among the criteria it applies for accrediting engineering programs during the 2007-2008 accreditation cycle, requires that engineering programs demonstrate that their students attain applied capabilities such as "an ability to design and conduct experiments, as well as to analyze and interpret data" and "an ability to use the techniques, skills, and modern engineering tools necessary for engineering practice" [5]. It also requires the implementation of an appropriate assessment process, with documented results, demonstrating that the degree of achievement of these capabilities is being measured. There are, however, some worrying indicators, such as the sustained "grade inflation" reported for a wide sample of US universities [6].

Appropriate assessment and evaluation procedures contribute to the effectiveness of the educational process through two complementary mechanisms. On the one hand, students' expectations about the evaluation system heavily condition their chosen course of action. On the other hand, the evaluation's results will only be used to continuously improve the educational process if the quality
of the evaluation is perceived as being high. Additionally, in highly competitive educational environments, such as the Spanish engineering schools, evaluation procedures also determine which students do, and which do not, finally obtain the engineering degree; the net impact of this filtering is again contingent on the appropriateness of the assessment and evaluation procedures.

The choice of the most appropriate assessment method(s) is dependent on a number of parameters, such as the specific educational outcome to be measured and the resources available, since the resource requirements of the various assessment approaches differ widely. Proponents of mastery exams point at options, such as applying Item Response Theory to analyze the exam results in order to assess student learning, and a focus on the feedback loop to continuously improve the educational program, that can lead to an overall satisfactory result under certain circumstances [7]. However, for some educational outcomes, such as ABET's "soft" professional skills, conventional assessment approaches are clearly not up to the task [8].

OBJECTIVES AND RESEARCH DESIGN

Within this framework, the research project presented in this paper was started in 2003 at the Engineering School of the University Carlos III de Madrid (UC3M). Its goal was the structured comparison of the impact of alternative evaluation methods in courses in which some of the objectives are linked to acquiring practical capabilities in the use of a software tool. The incidence of the evaluation methods was compared from two perspectives: how accurately do they measure the actual competence level attained, and how do they affect active learning by the students. These two basic perspectives had to be complemented with an estimation of resource consumption, in terms of both student time and instructor time, and of the parameters on which this resource usage depends (e.g. the number of students enrolled), in order to understand the feasibility of their implementation.

The course chosen, "Quantitative Methods in Management II", from the Industrial Engineering curriculum, covers discrete event simulation and optimization (60% of the credits devoted to simulation and 40% to optimization). The experiment was conducted over the discrete event simulation part of the course. As programming is unavoidable in simulation, a substantial part of the students' effort is devoted to developing the capability of constructing models and carrying out experiments using a commercial simulation software package (Witness®). Traditionally, the evaluation was based solely on a written final exam. This approach fits well for theoretical and numerical exercises, but it was considered less adequate for assessing the capabilities associated with the use of a software tool.

Two other evaluation methods were then introduced. The group project assignment (team development of a simulation project) was used as a major evaluation element for two years. In the other three years, the evaluation involved a practical, computer-based exam, whereby students were summoned into a computer lab and assigned a practical case, for which they individually had to develop a model and carry out experiments using the simulation software. The resulting model was then uploaded to the instructor's system for grading.

The results have been appraised from both perspectives (measurement accuracy and impact on active learning). Assessments were carried out by both faculty members and students (through anonymous surveys). In each case, the first year was considered a "warm-up" period, during which initial difficulties were ironed out, so the comparative measurements took place in the second year. Therefore, there are three sets of data to be compared: pre-2003 data from the steady-state, final-examination-based alternative, and data from 2004 and 2006 corresponding to the second year of each of the alternative evaluation methods.

ASSESSMENT THROUGH A PRACTICAL, COMPUTER-BASED EXAM

Until 2002, grading for this course was based on a conventional written final exam. Since a large percentage of the coursework was devoted to hands-on simulation work in the laboratory, 40% of the simulation part of the written exam consisted of questions aimed at assessing the competence of the students in actually designing and developing simulation models. Additionally, attendance at the practical sessions was monitored, and students were required to carry out a set of structured exercises using the simulation software.

To overcome the limitations of written exams in assessing this type of applied capability, the simulation evaluation was then split into two different exams. Theoretical concepts were still tested through a conventional written final exam, accounting for 50% of the grade. For the remaining 50%, an on-line, computer-based exam was designed.

For the design of the computer exam there was little prior experience to draw on, so a careful design phase was required before implementation. The exam takes place in the same labs as the practical sessions. This has two main advantages: the students are familiar with the context, which helps reduce the stress of facing this new type of exam, and the reliability of the computers has been evaluated before the exam, so that the real capacity of the lab (in terms of the number of computers expected to be available) is known and corrective actions in case of a computer failure can be better planned.
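As an illustrative sketch only, the 50/50 weighting between the written final exam and the computer-based practical exam could be expressed as follows; the 0–10 grade scale (the usual convention in Spanish universities) and the function name are assumptions, not details given in the paper:

```python
def simulation_grade(written_exam: float, computer_exam: float) -> float:
    """Combine the two exam grades with the 50/50 weighting
    described in the text.

    Assumption: grades are on the 0-10 scale customary in Spain;
    the paper itself does not state the scale.
    """
    for grade in (written_exam, computer_exam):
        if not 0.0 <= grade <= 10.0:
            raise ValueError("grades are expected on a 0-10 scale")
    return 0.5 * written_exam + 0.5 * computer_exam

# Example: a 6.0 on the written exam and an 8.0 on the
# computer-based exam average to a 7.0.
```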
As for the open questions, in the case of the computer-based practical exam, most students stated that this evaluation procedure was more appropriate for the subject matter, and therefore provided a fairer and more precise assessment. A significant number of responses also stated that it led to deeper learning, even though it required additional effort. 83% of the students were in favor of maintaining the computer-based practical test, while only 10% preferred a conventional written final exam and 7% had mixed feelings.

Regarding the project assignment, student feedback was quite similar, highlighting the positive impact on the learning outcome. However, in this case the perception that workload requirements were higher than with traditional methods was much more acute; 100% of the students thought so, and over 50% described the workload requirements as "a lot heavier".

From the faculty members' perspective, the feedback was very similar, with a very positive perception of the effectiveness of the alternative evaluation methods in promoting the active learning of the students, but at the same time a much heavier assessment workload, particularly for the project assignment option. As for their ability to precisely and fairly measure the knowledge acquired by the students, both methods were considered superior to conventional exams. The project assignment and the computer-based practical exam allowed the faculty to properly assess the level acquired by the students, although in the case of the group assignment what was accurately graded was the team as a whole, not the individuals. In an attempt to mitigate this intra-team blurring, 16.7% of the grade was evaluated through an individual, assignment-related question in the final exam.

REFERENCES

[1] Crosier, D., Purser, L., Smidt, H., "Trends V: Universities Shaping the European Higher Education Area", European University Association report, 2007.
[2] Education Ministers of Bologna Process countries, "London Communiqué – Towards the European Higher Education Area: Responding to Challenges in a Globalised World", 2007.
[3] Huba, M. E., Freed, J., "Learner-Centered Assessment on College Campuses: Shifting the Focus from Teaching to Learning", Needham Heights, MA: Allyn-Bacon, 2000.
[4] Ministerio de Educación y Ciencia de España, "Real Decreto 1393/2007, de 29 de octubre, por el que se establece la ordenación de las enseñanzas universitarias oficiales", Boletín Oficial del Estado, No. 260, 2007, pp. 44037-44048.
[5] ABET, "Criteria for Accrediting Engineering Programs. Effective for Evaluations During the 2007-2008 Accreditation Cycle", Engineering Accreditation Commission, Baltimore, MD, 2007.
[6] Rojstaczer, S., "Grade Inflation at American Colleges and Universities", accessible at www.gradeinflation.com/, 2003.
[7] Qualters, D. M., et al., "Improving Learning in First-Year Engineering Courses Through Interdisciplinary Collaborative Assessment", Journal of Engineering Education, Vol. 97, No. 1, 2008.
[8] Shuman, L. J., Besterfield-Sacre, M., McGourty, J., "The ABET 'Professional Skills' – Can They Be Taught? Can They Be Assessed?", Journal of Engineering Education, Vol. 94, No. 1, 2005, pp. 41-55.
CONCLUSIONS