
RESEARCH AND TEACHING

Assessing and Refining Group Take-Home Exams as Authentic, Effective Learning Experiences
By Corey M. Johnson, Kimberly A. Green, Betty J. Galbraith, and Carol M. Anelli

The learning goals of a lower division honors course, Science as a Way of Knowing, include critical thinking, scientific literacy, quantitative reasoning, communication, and teamwork. To help students develop skills and competencies for the course learning outcomes, we used a case study and developed scaffolded activities and assignments that targeted discipline-relevant tasks, for example, primary literature search, evaluation of source credibility, hypothesis construction, data interpretation, and restatement of scientific content into lay terminology. We then implemented group take-home exams, which feature rigorous, open-ended questions in authentic contexts, requiring students to apply knowledge and competencies cooperatively to new situations. Data from five semesters show that, in comparison to traditional exams, many students feel that group take-home exams reduce test anxiety, foster interpersonal skills, are more rigorous, and better enable them to apply and synthesize knowledge and deepen their comprehension of the subject matter. Our study augments research on group exams that use an open-ended response format.

Advocates for improving undergraduate science education call for pedagogies that engage students in relevant, "real-world" problem solving and cooperative learning (American Association for the Advancement of Science, 2011; Handelsman et al., 2004). From his literature review, including a meta-analysis of 164 studies of cooperative learning methods, Michael (2006, p. 162) concluded that "little doubt [exists] that students in groups learn more." Studies also report the effectiveness of group exams as a learning tool. Group exams that require critical and higher order thinking skills deepen contextualization and improve retention (Drouin, 2010; Michael, 2006; Zipp, 2007). Challenging exam questions promote student discussion, fostering communication skills (oral, written, listening) and facilitating learning, in part through students serving as peer instructors (Michael, 2006; Simkin, 2005). Group exams can reduce test anxiety (Morgan, 2005; Simkin, 2005; Zipp, 2007) and enable students to practice interpersonal skills such as collaboration, reciprocity, team building, troubleshooting, leadership, conflict resolution, and trust (Rao, Collins, & DiCarlo, 2002; Simkin, 2005; Zipp, 2007). Students can build on one another's academic and personal strengths and demonstrate enhanced motivation and engagement, behaviors driven by feelings of responsibility to the group (Cortright, 2003; Zipp, 2007). Educators and learning experts increasingly view fixed-choice exams as limited measures of student learning as they create artificial situations that do not reflect learners' responses in real-world situations (Oakleaf, 2008; Simkin, 2005). Similarly, hourly or "midterm" exams fall short because they impose unrealistic time limits and often do not target higher level cognitive skills. In contrast, performance-based tasks that simulate real-life application of skills, knowledge, and competencies enable assessment in authentic contexts (Mueller, 2012; Oakleaf, 2008). By emphasizing what students can do with what they know, authentic assessment complements traditional approaches that emphasize what students know about a body of knowledge.

The literature on group work, together with that on authentic assessment, suggests that group take-home exams could provide both a valuable learning experience and a means to assess student performance on real-world tasks. The authors had the opportunity to experiment with such exams when our honors college adopted a new curriculum in 2008, in which introductory courses focus on scholarly literature use in preparation for the required thesis. (Honors courses fulfill general education

Vol. 44, No. 5, 2015 61



requirements for the baccalaureate degree.) We were asked to develop a new course, Science as a Way of Knowing, emphasizing information and scientific literacies, critical thinking, quantitative reasoning, and communication. We hypothesized that group take-home exams, which relieve the time constraint of hourly exams, could provide real-world problems while fostering learning and cooperative work. Our literature review yielded no published reports on exams like ours, which student groups complete entirely outside of class and which feature authentic, discipline-relevant tasks.

Here we report our findings on student attitudes and perceptions and our research on implementation and refinement of group take-home exams used in lieu of midterm or hourly exams. Our exams feature rigorous, open-ended questions in authentic contexts and require students to apply knowledge and competencies cooperatively to new situations. Such exams embody four of seven long-recognized essentials of good practice in undergraduate education: time on task, communication of high expectations, active learning techniques, and reciprocity and cooperation (Chickering & Gamson, 1987). In addition, good practice for effective assessment involves representatives from across the educational community (American Association for Higher Education and Accreditation, 1996). Our team comprises a subject expert (professor and instructor of record), an assessment specialist, and two instruction librarians, one specializing in science.

Methods of instruction and assessment

Nonscience majors constituted 20%–25% of course enrollment, with class size limited to 25 lower division students. We used backward design (Wiggins & McTighe, 2005) to align course learning goals and outcomes (contextualized within evolutionary theory and the history of biology; sample syllabus, Appendix A, www.nsta.org/college/connections.aspx), created appropriate activities, and provided timely written feedback (Fink, 2003). Earlier assessment efforts (Johnson, Anelli, Galbraith, & Green, 2011) guided our pedagogy. Interactive lectures and discussions focused on scientific database use, types of sources, credibility of sources and expertise, and interpretation and evaluation of research articles. To move them progressively through skills and competencies emphasized on exams, we had students work in small groups (three to five members each), in and outside of class, to complete scaffolded assignments that included a case study (Hollister, 2005) and several question sets (samples, Appendix B, www.nsta.org/college/connections.aspx), most of which scrutinized Moran (2004), a primary article selected for its rigor and design (i.e., test of predictions of evolutionary theory using several experimental approaches), minimal use of disciplinary jargon, and conceptual accessibility. Gillen (2007) served as a supplemental "how to" guide on the structure, stylistic conventions, and critical reading of primary scientific articles.

For group take-home exams, the instructor selected research articles to which students had not been previously exposed and whose subject matter had not been discussed. The exams featured a mix of brief essays and short answers, requiring students to search the primary literature (contemporary and historical), interpret data, and restate scientific content and theories into lay terminology (sample exam, Appendix C, www.nsta.org/college/connections.aspx). Because our students comprise a spectrum of backgrounds, majors, and interests, for each exam the instructor assembled students into small groups, comparably mixed (gender, students' self-reported coursework, quality of student's individual work to date, major, and experience; Siciliano, 2001). The instructor assembled new groups for each exam. Within the course online learning management system (LMS), each group had its own workspace, accessible only to its student members and the current authors. On the day that students gained access to the exam, the instructor allowed 10–15 minutes of class time for groups to meet and make organizational plans. The instructor exhorted students to be "good team members" and to contact her with unresolvable group issues/questions. She monitored groups' progress informally by querying groups during class or via the LMS; clarification issues were resolved at once. Most groups did their own troubleshooting, but the instructor intervened at her discretion when asked. With the exam, the instructor posted a dissension form for students who wished to submit their own answer(s) if they disagreed with their group's answer(s); in our five semesters of teaching the course, no student ever used the form.

When the deadline for the completed exam expired, the instructor posted the answer key (sample grading key, Appendix D, www.nsta.org/college/connections.aspx) to the LMS site. She devoted the next class period to discussion of the

62
Journal of College Science Teaching
exam, facilitated by visual projection of the grading key, and emphasized problematic questions as revealed by student feedback. Discussion sometimes led to improvements in how future exams were written (e.g., providing detailed breakdown of point allocations for exam questions; also see Appendix E, www.nsta.org/college/connections.aspx).

We typically administered two group take-home exams per semester, in Weeks 6 and 11 of a 15-week semester. Groups had 7–10 days for each exam, each of which contributed 14% ± 2% (M ± SD) to the overall course grade, depending on semester. To encourage groups to take ownership of their working relationship, individual effort was not factored into group exam grades; within a given group, all members received the same exam grade. All students took an individual, in-class, closed-book final exam, worth 16% ± 3% (M ± SD) of the overall course grade, depending on semester. Individual effort (attendance, various assignments; see syllabus, Appendix A, www.nsta.org/college/connections.aspx) contributed 48% ± 4% (M ± SD) to a student's overall course grade.

To assess students' attitudes, background knowledge, and performance, we used pre- and postcourse anonymous questionnaires (Appendix F, available at www.nsta.org/college/connections.aspx), honors college anonymous end-of-course evaluations, and exam scores (generated by the instructor using grading keys; see sample, Appendix D, www.nsta.org/college/connections.aspx). We did not track individual students in pre–post pairings. For Likert-scale pre- and postcourse questionnaire prompts, which did not change from semester to semester, we pooled data from all five semesters that we taught the course: fall 2008 (initial course offering), spring 2009, fall 2010, fall 2011, and fall 2012. For open-ended prompts as to the positive and negative aspects of group exams, we coded and summarized data by semester and across all five semesters. Beginning with fall 2010, students also completed self- and peer-performance forms after each exam (based on criteria in Isaacs, 2002), which the instructor used to assess group dynamics (student grades were not impacted). Our institutional review board approved our protocols and instruments prior to implementation.

FIGURE 1
Template for small group take-home exam contract

GROUP CONTRACT FOR EXAM #___
(COURSE NAME & NUMBER, DATE)
GROUP NAME ______________________________

We, the undersigned, have together devised and agreed to an initial plan (below) for working on the exam (insert initial deadline(s) for work to be shared via Google docs). We will indicate individual contributions (tracking, color coding, etc.) and contact (instructor name) ASAP with concerns.

Plan: (Group inserts details here)

Each of us also agrees to do the following:
1. Attend group meetings (virtually or in person)
2. Maintain contact w/group members
3. Contribute constructively to group discussion & answers
4. Be cooperative and understanding
5. Take a leadership role as needed
6. Encourage and assist my team members
7. Complete all tasks agreed upon by the group on time
8. Complete/upload my exam portion and share it with the team by (date, time)
9. Read, comment on, and edit the entire exam by the time agreed upon by the group
10. Ensure that the exam final version is ready to be uploaded by (date, time)
11. Notice and work to curtail whatever tendencies I may sometimes exhibit that others may perceive as uncooperative

If any one of us causes difficulty with the group, and/or breaks the contract in any way, we understand that the other team members have the right and are expected to contact (instructor name) and inform her of the situation. We further understand that (instructor name) will serve as arbiter and may decide to penalize the teammate in question by lowering his/her grade in accord with the situation, or making the teammate complete the exam alone.

Signed,

___________________________ ____________________________

___________________________ ____________________________
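The Methods above describe assembling exam groups "comparably mixed" on gender, coursework, quality of prior work, major, and experience, but no procedure is specified. A minimal sketch of one common way to approximate such balance is a snake-order deal over a single composite score; the `quality` field and the routine itself are illustrative assumptions, not the authors' method:

```python
def assemble_groups(students, n_groups):
    """Deal students into n_groups in 'snake' order so that each group
    receives a comparable mix of stronger and weaker performers.
    `students` is a list of dicts carrying a hypothetical composite
    "quality" score (higher = stronger prior work)."""
    ranked = sorted(students, key=lambda s: s["quality"], reverse=True)
    groups = [[] for _ in range(n_groups)]
    for i, student in enumerate(ranked):
        k = i % (2 * n_groups)
        # forward on even passes, backward on odd passes,
        # which balances the top and bottom of the ranking
        idx = k if k < n_groups else 2 * n_groups - 1 - k
        groups[idx].append(student)
    return groups
```

A fuller implementation would also stratify on the categorical attributes the instructor balanced (gender, coursework, major); the snake deal only keeps group-average quality comparable.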



FIGURE 2
Pre- and postcourse questionnaire responses regarding group work and communication (Likert scale responses).

FIGURE 3
Comparison of exam-type preference, pre- vs. postcourse questionnaires.

Our experiences and formative assessment over five semesters led us to implement changes to improve the group-exam process: (a) designate a group leader to keep members on task and assemble their work (we assigned leaders randomly; no student served more than once; implemented with Exam 1, fall 2011); (b) schedule the exam over a 10-day period (instead of 7), spanning two weekends (ours is a rural campus and students' co-curricular activities often occur at distant locations; this change mitigated stress; implemented with Exam 1, fall 2011); (c) require groups to submit a completed contract detailing plans for exam completion and participation, with frequent updates (implemented with Exam 1, fall 2012; Figure 1); and (d) implement a draft deadline on Day 7 for groups' answers to be uploaded to their LMS site (implemented with Exam 2, fall 2012). Although these changes were important to the group exam experience, the fundamental structure of the process remained the same throughout the five semesters. Assessment of improvement based on these specific changes is not possible because, given our enrollment, we taught only one course section per semester and had no "nontreatment" group.

Results

Pre- and postcourse anonymous questionnaires

At the course outset, the majority of students indicated that they were comfortable working in groups, and most appeared motivated to gain experience in oral and written communication of scientific research (Figure 2a and b). Most postcourse respondents felt they had made gains in these areas (Figure 2c and d).

We asked students if they preferred exams that ask for information learned/memorized or exams that require the application of skills or knowledge. The data did not indicate a statistically significant shift in the means, but there was a significant difference in the distribution of responses between pre- and postcourse observations (cross-tabulation analysis: χ2(4) = 10.24, p < .05; Figure 3). At the course outset, 41% of students expressed neutrality on exam preference versus 32% in postcourse analyses. A comparable percentage of students pre- versus postcourse preferred learned/memorized exams (39% vs. 35%), whereas the percentage of students who dissented from that preference increased by 13 percentage points from pre- to postcourse (20% vs. 33%).

A set of six questions in the postcourse questionnaire targeted students' perceptions of our group take-home exams compared with traditional exams (hourly, in-class, completed individually). Fifty-five percent of respondents indicated that group take-home exams were more rigorous than traditional exams, compared with 15% who dissented from that view (31% being neutral); student perceptions were split on whether take-home exams required more (40%) or less (35%) individual effort than traditional exams (Figure 4a). The vast majority reported that compared with hourly exams, group take-home exams enabled them to apply and synthesize knowledge, deepened their comprehension, benefitted their interpersonal skills, and increased their awareness of course relevance (Figure 4b).

Course activities and resources

Our postcourse questionnaire included a list of course activities and resources from which we asked students to select those that helped them learn. Of 77 respondents, 71% (N = 55) selected the entry "group work for exams," making it the fourth




most selected item after PowerPoints (84%), class discussions (84%), and the textbook (80%; 5 semesters; data not shown).

FIGURE 4
Postcourse questionnaire responses regarding group take-home exams.

Comments on group take-home exams

Postcourse questionnaires prompted students to "comment on the positive and/or negative aspects of group take-home exams." In five semesters' data, 43 of 77 respondents (56%) yielded a pool of 32 negative and 41 positive comments, which we categorized and summarized (Table 1). Most negative comments focused on group members' unequal contributions to the exam (quality and/or quantity), followed by exam length/workload. The largest number of positive comments cited various benefits of the group experience, followed by the learning quality afforded by group exams. Analysis showed no significant trend in positive-to-negative comment ratios in successive semesters (logistic regression: χ2(1) = 1.73, p = .19). The largest positive-to-negative comment ratios occurred in the last two semesters, by which time some or all improvements to the exam process (see Methods section) had been implemented.

Honors course evaluations

Students completed honors college anonymous course evaluations for three semesters (fall 2010 to fall 2012). Respondents (58.6% response rate, N = 58) indicated that they had improved at collaborating with classmates (62.1% strongly agree / 32.8% agree) and that the course had positively impacted their skills in critical thinking (64% strongly agreed / 36% agreed), writing (33% strongly agreed / 59% agreed), and quantitative reasoning (53% strongly agreed / 36% agreed; data not shown).

Exam scores

Group take-home exams required transfer of skills in information and science literacies, critical thinking, quantitative reasoning, and communication about science to new situations (sample exam, Appendix C, www.nsta.org/college/connections.aspx). On our grading scale, group exam grades averaged B+ to A– (M = 88.3% ± 3.7% SD for 10 exam averages; data not shown). For informational (not comparative) purposes, final exam grades (closed book, completed by individual students) over five semesters averaged B to B+ (M = 82.3% ± 3.0% SD; range of averages for 5 semesters = 76.8%–85.6%). No exams were graded on a "curve."

Postexam self- and peer-performance forms

Students rated their own and each group member's performance on exams according to 12 criteria using a Likert scale (1 = very well, 2 = adequate, 3 = poor). Eighty-six students completed self-performance forms, yielding a total of 1,032 ratings (86 × 12; data not shown). The vast majority of students rated their own efforts highly (very well = 815 [79%], adequate = 183 [18%], poor = 23 [2%], no response = 11 [1%]). The criteria with the most 3s (poor) were "I took a leadership role as needed" and "I read and commented in a timely manner on drafts of the exam." Peer-performance forms (N = 260) yielded 3,120 ratings (260 × 12), with aggregate ratings mirroring those for self-performance forms (very well = 2,609 [84%], adequate = 415 [13%], poor = 87 [3%], no response = 9 [<1%]). The criteria with the most 3s (poor) were "Read and commented in a timely manner on drafts of the exam" and "Encouraged and assisted other group members."

Comments on group take-home exams

Performance forms prompted students to state positive and negative things about the group exam and to suggest improvements. Of 81 students who completed the forms, 68 (83%) responded to the prompts, yielding a total of 41 negative and 85 positive comments (Table 2). To categorize and summarize students' comments, we used the same criteria as those for Table 1. In contrast to postcourse questionnaire data in Table 1, the largest number of negative student comments postexam focused on group dynamics (e.g., problems with scheduling, communication, leadership, peers' work habits/behavior), followed by general/specific criticism of the exam (e.g., word limits, certain exam questions, challenges dividing workload). The largest number of positive comments cited greater time to complete the group take-home versus an in-class exam, followed by benefits of the group experience (e.g., collaboration/brainstorming, greater exposure to learning styles, practice with conflict resolution, authenticity of tasks, sharing of workload, knowing classmates bet-

TABLE 1
Student response rates on postcourse questionnaires, with distribution and coding of open-ended comments about the group take-home exams.

Negative comment codings:
1. Unequal group sizes and individual contributions
2. Worse than in-class exams
3. Doesn't prepare for in-class, individual final exam
4. Group dynamics issues
5. Criticism of exam, general and specific
6. Requirement for draft exam on day 7

Positive comment codings:
1. Benefits of group collaboration
2. Conducive to quality learning
3. Praise for exam, general and specific
4. Better than in-class exams
5. More time to complete vs. in-class exam

Note: An individual student may have offered >1 comment, both positive and negative.
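The trend analyses of positive-to-negative comment ratios reported above (logistic regression, reported as χ2(1) statistics) can be sketched as a likelihood-ratio test on a logistic regression of comment valence against semester. The implementation below, and the use of SciPy, are assumptions for illustration only; the authors' raw comment data and software are not published.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import chi2

def lr_trend_test(semester, positive):
    """Likelihood-ratio chi-square (1 df) for a linear semester trend in
    the log-odds that a comment is positive, fit by direct optimization
    of the logistic log-likelihood."""
    y = np.asarray(positive, dtype=float)
    x = np.asarray(semester, dtype=float)

    def nll(beta, X):
        # negative log-likelihood; logaddexp(0, z) = log(1 + e^z), stable
        z = X @ beta
        return np.sum(np.logaddexp(0.0, z) - y * z)

    X1 = np.column_stack([np.ones_like(x), x])  # intercept + trend
    X0 = X1[:, :1]                              # intercept only (null)
    ll1 = -minimize(nll, np.zeros(2), args=(X1,)).fun
    ll0 = -minimize(nll, np.zeros(1), args=(X0,)).fun
    stat = 2.0 * (ll1 - ll0)
    return stat, chi2.sf(stat, df=1)
```

Passing one 0/1 valence entry per comment and the semester index (0–4) in which it was made yields the χ2(1) statistic and its p value; a nonsignificant result, as in the article, means the positive-to-negative ratio shows no reliable linear trend across semesters.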




ter, working harder for the group). The ratio of positive-to-negative comments trended toward increasing over the last three semesters, although this trend was not significant (logistic regression analysis: χ2(1) = 1.63, p = .20); most positive comments for fall 2012 Exam #2 (32%, N = 10) cited implementation of the Day 7 draft deadline.

TABLE 2
Student response rates on postexam performance forms, with distribution and coding of open-ended comments about the group exam.

Negative comment codings:
1. Unequal group sizes and individual contributions
2. Worse than in-class exams
3. Doesn't prepare for in-class, individual final exam
4. Group dynamics issues
5. Criticism of exam, general and specific
6. Requirement for draft exam on day 7

Positive comment codings:
1. Benefits of group collaboration
2. Conducive to quality learning
3. Praise for exam, general and specific
4. Better than in-class exams
5. More time to complete vs. in-class exam

Note: An individual student may have offered >1 comment, both positive and negative.

Discussion

The abilities to evaluate and make sense of scientific findings and to interact productively with others represent critical life skills and embody our course learning outcomes. We were eager to explore group take-home exams as impactful student learning experiences and as tools to gauge students' performance, but cognizant of challenges. Honors students are academically motivated and have good track records as independent achievers, as reflected in their scores on the cumulative final exam (which included higher order questions), and many decry group work. In addition, nonmajors' attitudes toward science can dampen their interest, and some students in our lower division course are still adjusting to the demands of college coursework.

Challenges notwithstanding, our results suggest that group take-home exams can provide positive learning experiences and simultaneously facilitate assessment and evaluation of students' ability to apply science knowledge and skills collaboratively to authentic, real-world tasks. Initially our students expressed anxiety that their semester exam grades would reflect their group's collective efforts, partly because they lacked experience with this type of exam, but also because, as highly engaged individuals, their busy lives presented scheduling difficulties. Yet most students left the course feeling more comfortable about performing group work, and the majority felt that group take-home exams enabled them to apply and synthesize knowledge, deepened their comprehension of the course material, made them more aware of its relevance, and helped them hone interpersonal skills. The vast majority also indicated that our course positively affected their critical thinking, writing, quantitative reasoning, and science communication skills (Anelli, Johnson, Galbraith, & Green, 2015). As group take-home exams and related group work constituted a considerable portion of course activities and final grade computation, these findings lend support to the positive impact of cooperative work. Our exams did not compromise rigor; groups generally performed well but seldom achieved 100%, and about half of our students felt our exams were more rigorous than traditional exams.

Students' open-ended comments provide insight into their perceptions of group take-home exams. Some students expressed excitement that the exams required critical thinking and application of knowledge, and some characterized the exams as a great way to learn and retain information, practice concise writing, have fruitful discussions, and learn conflict resolution. Others saw the group take-home format as superior because it relieved the pressure of individual performance and "cramming" for exam preparation and/or completion. One student described our exams as "fun," another claimed to "love them," and yet another called them "one of [his/her] best experiences for college exams."

As anticipated, based on the literature and our teaching experiences, challenges surfaced and students voiced complaints and concerns. We strove to address issues proactively, and as students will undoubtedly encounter undesirable peer behaviors throughout their lives, we do not see challenges as reason to avoid group exams. Interestingly, although the types of positive and negative comments made on questionnaires versus performance forms were comparable, the relative rankings of comments differed between the two types of data. The top two negative comments on postcourse questionnaires focused on unequal group sizes/individual contribution issues and the group exam format, compared with group dynamics issues and criticism of the exam itself on postexam performance forms. Similarly, the top positive comment category on postcourse questionnaires was "benefits of group collaboration," whereas on postexam forms, "greater time allowance" was the most popular commendation, accounting for 30% of student comments compared with 5% on postcourse questionnaires. Not to overemphasize these data, but they do suggest that instructors should consider that students' feedback can vary over time. Perhaps when the exam experience is relatively fresh, students value most the greater time allowance for take-home exams, but with the passage of time they may recall both the benefits of teamwork and their discontent with particular teammates. Teaching and assessment expert Maryellen Weimer (2002) urged instructors not to ask students whether they "liked" a particular activity:

That is an irrelevant criterion . . . The questions you need answered are these: "How did that activity . . . affect your learning?" "What about it needs to change so that if we do it again, you will learn more?" (p. 199)

Almost three quarters of our students selected "work for group exams" as a course activity that helped them learn, underscoring group take-home exams as impactful learning experiences. In the last two semesters that we taught the course, on both postexam performance forms and postcourse questionnaires, we received more positive than negative comments, suggesting that the assessment-based changes we implemented helped improve the exam experience.

It does appear that group take-home exams were on students' minds, as they were the most prolific topic for comment among postcourse respondents who provided "one or two concrete suggestions" to improve the course. Many students wanted at least one take-home exam to be a solo effort; this would have significantly increased our grading effort and diminished the opportunity for students to hone collaborative skills. Others suggested that we account for individual effort by giving students individual grades plus the group's grade, an approach many instructors use (Simkin, 2005; Rao et al., 2002); we chose not to do so because we wanted to incentivize students to improve at working cooperatively. Still other students believed that the group exam ill-prepared them for the final exam, administered all five semesters as an individual, closed-book, in-class exam. To address this concern we provided a study sheet that showed point allocations for the various topics and types of questions that would appear on the final. On the basis of five semesters' worth of data, students earned on average 5 percentage points less on the final exam versus group take-home exams. We embedded questions in the final that required application of knowledge and skills per the syllabus, and students' performance and their self-reported gains suggest




that course learning outcomes were adequately met (Anelli et al., 2015). Most groups functioned well, even excellently (e.g., one student wrote, "I love my group!"), but we instituted changes to diminish procrastination and project management issues. Our Day 7 draft policy with a Day 10 exam deadline garnered many positive responses and improved collaboration. Students used the "extra" time to review and edit teammates' answers and complained less about the perceived need to "divide-and-conquer" exam questions, which they felt negatively impacted their learning. We concur, and through assessment we discovered that our students needed practice developing the habit of reviewing teammates' answers. Assessment also led us to designate a leader for each group, causing "lack of leadership" complaints to disappear. Technological advances also facilitated group work. In 2008, students had difficulties communicating with group members and tracking the latest version of their group's exam answers; both problems diminished with the greater prevalence of mobile phones, texting, and students' familiarity with and use of Google Docs.

For instructors, group take-home exams offer plusses and minuses. The format affords development of complex, in-depth questions that can draw on an array of materials: the current scientific literature, online archival materials, the New York Times Tuesday science section, scholarly

many pairs of student eyes reviewed each exam answer for proper idea attribution. We monitored student progress, conceptual blocks, and/or misperceptions by informally asking how the exam was going. We could assess each student's contribution by having groups indicate in the e-version of their exam who contributed what. Having only 5 or 6 exams to grade versus 25 represented a "plus." Preparing a grading key with detailed point allocations before evaluating groups' answers promoted instructor objectivity, saved grading time, and helped provide feedback during discussion of the exam. For large enrollment courses with significant grading demands, O'Dowd (2011) provided guidance on implementing a choice of fixed answers for questions that target higher levels of understanding.

On the negative side, we found that developing a group take-home exam required more effort than a traditional hourly exam, as did writing a student-friendly grading key for open-ended responses. Locating research articles that were relatively free of scientific jargon yet amenable to testing students' skills (as opposed to specific content knowledge), and creating exam questions aimed at the application of scientific competencies, took time. Scientific research in the media (including the "Ig Nobel Prizes") often provided leads to provocative, suitable articles.

Conclusions and

positive learning experience. We have not found a published report on exams that feature authentic, discipline-relevant tasks for student groups to complete entirely outside of class. We believe instructors can use such exams to measure and enrich student learning and transferability of science knowledge and competencies, a need increasingly drawing the attention of educators (Mervis, 2013). We offer these recommendations. First, build motivation: Articulate your course learning goals to your students and explain how their attainment of those goals will be demonstrated by their performance. Second, build skills and give practice: Have students practice teamwork and self/peer evaluations on scaffolded, low-stakes activities, for example, homework assignments worth minimal points, before administering a group take-home exam (Morgan, 2005). Third, assess and guide: Assess students' collaborative and work skills and provide prompt guidance and intervention as needed; give feedback regularly. Fourth, be clear and fair: State your exam policies and implement them consistently and fairly. Finally, refine: Use assessment recurrently to make needed adjustments to your exams and policies. ■

Acknowledgments
We acknowledge Dr. Erica Suchman, Colorado State University, whose 2008 presentation on group exams for her microbiology course inspired the instructor
essays, YouTube videos, etc. Tak- recommendations to experiment with and expand on this
ing advantage of this flexibility, On balance, our experiences with pedagogy, and Dr. David Sloan Wilson,
we designed creative (dare we say group take-home exams lead us to Binghamton University, who suggested
“fun”?) exams that were amenable conclude that the positive aspects the use of contracts for our exams.
to assessment of student achievement outweigh the negative. Students can
of outcomes. Plagiarism concerns practice cooperative skills and per- References
were essentially nonexistent because form discipline-relevant tasks, and American Association for the
answers could not be “googled,” and many come to view the exams as a Advancement of Science. (2011).
Journal of College Science Teaching
References

American Association for the Advancement of Science. (2011). Vision and change in undergraduate biology education: A call to action. Washington, DC: Author. Retrieved from http://visionandchange.org/files/2013/11/aaas-VISchange-web1113.pdf

American Association for Higher Education and Accreditation. (1996). Nine principles of good practice for assessing student learning. Washington, DC: Author. Retrieved from http://assessment.uconn.edu/docs/resources/AAHE_Principles_of_Good_Practice.pdf

Anelli, C. A., Johnson, C. M., Galbraith, B. J., & Green, K. A. (2015). Using group take-home exams to develop collaborative learning, scientific literacy, and higher-order cognitive skills. Unpublished manuscript.

Chickering, A. M., & Gamson, Z. A. (1987). Seven principles for good practice in undergraduate education. American Association for Higher Education Bulletin, 120, 3–7.

Cortright, R. N. (2003). Student retention of course content is improved by collaborative group testing. Advances in Physiology Education, 27, 102–108.

Drouin, M. (2010). Group-based formative summative assessment relates to improved student performance and satisfaction. Teaching of Psychology, 37, 114–118.

Fink, L. D. (2003). A self-directed guide to designing courses for significant learning. San Francisco, CA: Jossey-Bass.

Gillen, C. M. (2007). Reading primary literature: A practical guide to evaluating research articles in biology. San Francisco, CA: Pearson/Benjamin Cummings.

Handelsman, J., Ebert-May, D., Beichner, R., Bruns, P., Chang, A., DeHaan, R., . . . Wood, W. B. (2004). Scientific teaching. Science, 304(5670), 521–522.

Hollister, C. (2005). Rising temperatures, differing viewpoints: A case study on the politics of information. Buffalo, NY: University of New York at Buffalo’s National Center for Case Study Teaching in Science. Retrieved from http://sciencecases.lib.buffalo.edu/cs/index.asp

Isaacs, G. (2002). Assessing group tasks. Queensland, Australia: University of Queensland, Teaching and Educational Development Institute. Retrieved from http://www.itl.usyd.edu.au/assessmentresources/pdf/Link11.pdf

Johnson, C. M., Anelli, C. A., Galbraith, B. J., & Green, K. A. (2011). Information literacy instruction and assessment in an honors college science fundamentals course. College and Research Libraries, 72, 533–547.

Mervis, J. (2013). Transformation is possible if a university really cares. Science, 340(6130), 292–295.

Michael, J. (2006). Where’s the evidence that active learning works? Advances in Physiology Education, 30(4), 159–167.

Moran, A. (2004). Egg size evolution in tropical American arcid bivalves: The comparative method and the fossil record. Evolution, 58, 2718–2733.

Morgan, B. M. (2005). Cooperative learning in higher education: A comparison of undergraduate and graduate students’ reflections on group exams for group grades. Journal on Excellence in College Teaching, 16, 79–95.

Mueller, J. F. (2012). What is authentic assessment? Retrieved from http://jfmueller.faculty.noctrl.edu/toolbox/whatisit.htm

Oakleaf, M. (2008). Dangers and opportunities: A conceptual map of information literacy assessment approaches. Portal: Libraries and the Academy, 8, 233–253.

O’Dowd, D. (2011). How to write good multiple-choice exam questions. Retrieved from http://pdfooz.org/k-28158359.html

Rao, S. P., Collins, H. L., & DiCarlo, S. E. (2002). Collaborative testing enhances student learning. Advances in Physiology Education, 26, 37–41.

Siciliano, J. (2001). How to incorporate cooperative learning principles in the classroom: It’s more than just putting students in teams. Journal of Management Education, 25(8), 8–20.

Simkin, M. G. (2005). An experimental study of the effectiveness of collaborative testing in an entry-level computer programming class. Journal of Information Systems Education, 16, 273–280.

Weimer, M. (2002). Learner-centered teaching: Five key changes to practice. San Francisco, CA: Jossey-Bass.

Wiggins, G., & McTighe, J. (2005). Understanding by design (2nd ed.). Upper Saddle River, NJ: Pearson Education.

Zipp, J. T. (2007). Learning by exams: The impact of two-stage cooperative tests. Teaching Sociology, 35, 62–76.

Corey M. Johnson (coreyj@wsu.edu) is head, Library Instruction Team; Kimberly A. Green is director, Office of Assessment and Teaching; and Betty J. Galbraith is science librarian and instruction coordinator, all at Washington State University in Pullman. Carol M. Anelli is a professor and associate chair in the Department of Entomology at The Ohio State University in Columbus.

Vol. 44, No. 5, 2015


