
Evaluating innovation in studio physics

Karen Cummings and Jeffrey Marxa)


Rensselaer Polytechnic Institute, Troy, New York 12180

Ronald Thornton and Dennis Kuhl


Center for Science and Mathematics Teaching, Tufts University, Medford, Massachusetts 02155

(Received 7 January 1999; accepted 8 April 1999)


In 1993, Rensselaer introduced the first Studio Physics course. Two years later, the Force Concept
Inventory (FCI) was used to measure the conceptual learning gain ⟨g⟩ in the course. This was found
to be a disappointing 0.22, indicating that Studio Physics was no more effective at teaching basic
Newtonian concepts than a traditional course. Our study verified that result, ⟨g_FCI,98⟩ = 0.18
± 0.12 (s.d.), and thereby provides a baseline measurement of conceptual learning gains in Studio
Physics I for engineers. These low gains are especially disturbing because the studio classroom
appears to be interactive and instructors strive to incorporate modern pedagogies. The goal of our
investigation was to determine if incorporation of research-based activities into Studio Physics
would have a significant effect on conceptual learning gains. To measure gains, we utilized the
Force Concept Inventory and the Force and Motion Conceptual Evaluation (FMCE). In the process
of pursuing this goal, we verified the effectiveness of Interactive Lecture Demonstrations [⟨g_FCI⟩
= 0.35 ± 0.06 (s.d.) and ⟨g_FMCE⟩ = 0.45 ± 0.03 (s.d.)] and Cooperative Group Problem Solving
(⟨g_FCI⟩ = 0.36 and ⟨g_FMCE⟩ = 0.36), and examined the feasibility of using these techniques in the
studio classroom. Further, we have assessed conceptual learning in the standard Studio Physics
course [⟨g_FCI,98⟩ = 0.18 ± 0.12 (s.d.) and ⟨g_FMCE,98⟩ = 0.21 ± 0.05 (s.d.)]. In this paper, we will
clarify the issues noted above. We will also discuss difficulties in implementing these techniques for
first-time users and implications for the future directions of the Studio Physics courses at Rensselaer.
© 1999 American Association of Physics Teachers.

I. INTRODUCTION
A. Discussion of Studio Physics at Rensselaer
Introductory physics at Rensselaer Polytechnic Institute is
taught in a studio format with nearly 1000 students per
term enrolling in Physics I or Physics II.1 The defining characteristics of Studio Physics are integrated lecture/laboratory
sessions, small classes of 30–45 students, extensive use of
computers, collaborative group work, and a high level of
faculty–student interaction. Each section of the course is led
by a professor or experienced instructor, with help from one
or two teaching assistants. The teaching assistants' role is
to circulate throughout the classroom while the students are
engaged in group work. There is currently no explicit training of teaching assistants. As a result, there is great variation
in their effectiveness. Introductory Studio Physics is a
calculus-based, two-semester sequence equivalent to the
standard physics for engineers and scientists. Classes meet
twice a week for sessions lasting 110 min each. The studio
model has reduced the number of contact hours, from 6.5 h
per week to less than 4 h, without significantly reducing
course content. An expectation of some independent learning
on the part of the students has become the norm.
The studio format was first introduced at Rensselaer in
1993. During the Fall of 1995, Cooper used the Force Concept Inventory (FCI) to measure conceptual learning gains.2,3
The fractional gain in conceptual learning, ⟨g⟩, was found to
be a disappointing 0.22. This indicated that the studio format
was no more effective than a traditional course structure in
regard to teaching for conceptual learning. The fractional
gain, ⟨g⟩, is defined as follows:

⟨g⟩ = (%Correct_post-instruction − %Correct_pre-instruction) / (100 − %Correct_pre-instruction).
Phys. Educ. Res., Am. J. Phys. Suppl. 67 ~7!, July 1999

This expression is often referred to in the literature as the
g or Hake factor, and is the ratio of the actual gain to
the maximum possible gain.4 The low gain in student understanding in Studio Physics classes was puzzling because the
studio classroom appeared to be interactive.
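To make the definition concrete, here is a minimal Python sketch (ours, not the authors'; function names and sample scores are illustrative) of the Hake gain and of the two class-averaging conventions, gain on averages versus average of gains, that are discussed later in Sec. IV:

```python
# Illustrative sketch (not from the paper): the normalized (Hake) gain
# and two ways of averaging it over a class, as discussed in Sec. IV.

def hake_gain(pre_pct, post_pct):
    """Normalized gain <g>: actual gain divided by maximum possible gain.
    pre_pct and post_pct are percent-correct scores on a 0-100 scale."""
    return (post_pct - pre_pct) / (100.0 - pre_pct)

def gain_of_averages(pre_scores, post_scores):
    """'Gain on averages': <g> computed from the class-average pre- and
    post-test scores."""
    pre_avg = sum(pre_scores) / len(pre_scores)
    post_avg = sum(post_scores) / len(post_scores)
    return hake_gain(pre_avg, post_avg)

def average_of_gains(pre_scores, post_scores):
    """'Average of the gains': mean of each student's individual <g>.
    Undefined for a student with a perfect (100%) pre-test score."""
    gains = [hake_gain(p, q) for p, q in zip(pre_scores, post_scores)]
    return sum(gains) / len(gains)

# A student who moves from 40% to 70% gains half of what was available:
# hake_gain(40, 70) = 0.5.
```

Note that the two conventions generally differ: a class of strong gainers with low pre-test scores pulls the average of gains up more than the gain on averages.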
One noticeable difference between Studio Physics and
more successful curricula is that the activities used in the
studio classroom are predominantly traditional activities
adapted to fit the studio environment and incorporate the use
of computers. Standard laboratory exercises were simply
modified to allow the data to be collected or analyzed via the
computer without making any real changes to the nature of
the exercise. For example, a traditional laboratory activity
which uses a spark timer or photogates to measure the acceleration of an object in free fall has been replaced with a
video analysis activity in which students measure the acceleration of an object in free fall. In general, the activities used
are not based on the findings of physics education research in
that they do not attempt to directly address known student
misconceptions and employ neither cognitive conflict5 nor
bridging techniques.6
Since the introduction of the studio format in 1993, a standard approach to instruction in these courses has evolved.
Although the Physics I and Physics II courses are broken
into a total of approximately 20 class sections taught by 10
different instructors, all students enrolled in a given course
follow the same syllabus, do the same homework assignments, and take common exams as a single group, both at
finals and during the semester. A standard course design including daily lectures, in-class activities and solutions,
homework assignments and solutions, and reading assignments is provided by a course supervisor for use by all instructors. The course supervisor is also responsible for exam
development. The motivation for this approach is twofold.

First, it reduces redundancy in class preparation. Second, it
provides for consistency in the material covered by the various instructors. Nearly all instructors adhere to this standard
course design. Nevertheless, inherent to the Studio Physics
model is a certain flexibility that enables motivated instructors to include diverse approaches.
B. Methods of inquiry
The physics education community assiduously develops
creative techniques for engaging students in their own learning process. Two of the authors7 were Studio Physics I instructors during the Spring semester of 1998, and merged
two such techniques, Interactive Lecture Demonstrations
(ILDs)8 and Cooperative Group Problem Solving (CGPS),9,10
with the standard set of activities in the Studio classrooms at
Rensselaer. These approaches are discussed in more detail
below. Our intent was to ascertain the effectiveness of these
techniques while concurrently establishing the feasibility of
incorporating these methods into Studio Physics.
As part of our study, we divided the Studio Physics I
sections into two broad categories: experimental and standard. Standard classes were taught by instructors who delivered the standard studio instruction that was described
above. Experimental classes were taught by instructors who
modified the course design to include either Interactive Lecture Demonstrations, Cooperative Group Problem Solving or
both. There were seven standard classes and five experimental classes. Two of the experimental sections were taught by
one of the authors and the other three experimental sections
were taught by another author.7 The seven standard classes
were taught by four different instructors, one of whom was
the course supervisor. There was no overlap between the
instructors for experimental and standard sections. To offer
some objective measure of the conceptual learning gains in
the two groups, the authors administered two diagnostic exams, the Force Concept Inventory3 and the Force and Motion
Conceptual Evaluation (FMCE),11 to every student both pre- and post-instruction.
II. REVIEW OF CURRICULA
As noted above, two techniques developed by the physics
education community were incorporated into the standard
studio model of instruction on an experimental basis.
A. Interactive Lecture Demonstrations
Interactive Lecture Demonstrations (ILDs) are an instructional technique developed by R. K. Thornton and D. R.
Sokoloff.8 They were developed and refined based on the
findings of education research, and exploit the real-time data
acquisition and display powers of microcomputer-based
laboratory (MBL) tools.12 Interactive Lecture Demonstrations were designed to create an active learning environment
in large lecture classes, or any class where providing equipment for student use is an obstacle.
The steps currently prescribed by Thornton and Sokoloff
for use during all Interactive Lecture Demonstrations are as
follows:
(1) The instructor describes and performs the demonstration
without the use of MBL tools.
(2) Students record their names and individual prediction on
a Prediction Sheet that is to be collected at the end of the
demonstration period.
(3) Students engage in small group discussion about these
predictions.
(4) Students make any desired changes to their predictions
on the prediction sheet.
(5) The instructor elicits common student predictions from
the class.
(6) The instructor performs the demonstration with MBL
tools which allow the data to be displayed to the entire
class. Attention is called to the most important features
of the graphs, and how they relate to the physical situation.
(7) Students record and discuss the results.
(8) The instructor discusses analogous physical situations.
Interactive Lecture Demonstrations have been developed
for a wide range of topics. The Interactive Lecture Demonstration packet which was commercially available in 1998
was used in this study.13 The purchased package provided a
teacher's guide, computer files for the Mac or PC, student
Prediction Sheets and Result Sheets, and teacher's presentation notes for each of four Interactive Lecture Demonstration
sequences. These sequences cover kinematics (through units
on Human Motion and Motion with Carts) and dynamics
(through units on Newton's First and Second Laws, and
Newton's Third Law). Each sequence takes between 40 and
50 min to perform. The commercially available package was
implemented right out of the box at Rensselaer with little
preparation by Cummings and Marx, who were motivated
but had no prior experience in performing Interactive Lecture Demonstrations.
Deviations from the prescribed implementation of the Interactive Lecture Demonstrations occurred and were a consequence of two overlapping areas of difficulty. The first area
of difficulty was the instructors' inexperience in performing the
Interactive Lecture Demonstrations, which was compounded
by the use of new software and hardware. Consequently,
they paid less attention than desired to following the pedagogical suggestions set forth in the teacher's notes. Especially notable were consistent failures to make analogies to
similar situations and to have students discuss the results. Furthermore, students were routinely allowed too much time to
make their predictions. This was primarily a result of the
instructors' desire to have every student complete the prediction before continuing. The additional prediction time was
counterproductive because some students lost interest and
moved on to other things, rather than staying focused on the
demonstration.
The second area of difficulty encountered was performing
Interactive Lecture Demonstrations in physically small studio classrooms with a student population accustomed to interacting with one another and their instructors. An important
hindrance was that the classrooms used have level floors, as
opposed to the raised tiers of seats present in most lecture
halls. Consequently, students in the back of the room had
trouble seeing the demonstration. Additionally, since
student–faculty interaction is the norm, instructors answered
questions on an individual basis that would have been more
constructive had they been asked and answered
publicly. Furthermore, students in the studio courses at Rensselaer have begun to expect interaction with their peers, and
hence tended to share predictions too soon, or even to make
initial predictions as small groups rather than individually.
As a result, some students never made a personal intellectual commitment to their predictions. It was also routinely
difficult for instructors to elicit incorrect answers from the
class. This made discussion of common misconceptions
awkward.
B. Cooperative group problem solving
Cooperative Group Problem Solving (CGPS) is a strategy
for peer instruction in quantitative problem solving which
was developed by P. Heller et al.9,10 This instructional approach involves the following:
(1) formation of well-functioning cooperative groups,
(2) presentation, modeling, and reinforced use of an explicit
and effective problem-solving strategy,
(3) use of context-rich problems for group practice,
(4) rewarding cooperative learning through grading.
In order to facilitate learning, instructors organize teams of
three students in which high, average, and low achievers are
represented. (Students are not informed of this grouping procedure.) Further, students are informed about the roles and
responsibilities taken on by members of well-functioning cooperative groups and are encouraged to assign these roles to
members of their group. In this study, a student's initial
achievement level was based on the results of conceptual
diagnostic tests (FMCE and FCI), and groups were rearranged every five to seven weeks following each course
exam.
A key tenet of Cooperative Group Problem Solving is that
students, as opposed to expert problem solvers, have not yet
developed a generalized strategy for quantitative problem
solving. Hence, students are explicitly taught such a strategy
during the first days of the course. The strategy adopted for
use in the experimental group was the same as that used at
the University of Minnesota14 and involves a five-step process described as follows:
(1) Comprehend the problem by drawing a picture and noting the given information.
(2) Represent the problem in formal terms by translating the
picture into a diagram or graph.
(3) Decide on the general approach and write down equations that you believe appropriate.
(4) Combine equations in such a way as to solve for the
target quantity before substituting in the given information. Check to make sure that units are consistent. If
units are consistent, substitute in the given information
and solve for a numerical answer.
(5) Check to make sure that your numerical answer is reasonable, that you have answered the correct question, and
that units are correct.
The problem-solving strategy outlined above is modeled by
the instructor during lectures and then practiced by students
in their group.
Deviations from the prescribed approach discussed above
resulted predominantly from testing and grading constraints
within the Studio Physics course structure. As previously
mentioned, students in all sections took common exams
throughout the semester. The course supervisor frequently
included material based on the standard activities on these
exams. Hence, the investigators felt that it was imprudent to
completely displace these activities. Instead they opted to
spend one class period per week on Cooperative Group Problem Solving using recitation-style context-rich problems
available from the University of Minnesota,14 and one class
period per week doing the same activity that students in the
standard sections did. Further, context-rich problems were
not included on exams, although part of the students' class-activity grade was based on their Cooperative Group Problem Solving work.
The investigators encountered three other major difficulties. The first was that due to time constraints, the instructor
in the Cooperative Group Problem Solving sections did not
model the problem-solving technique as often as desired. The
two other difficulties we encountered appear to be more general in nature. One was student resistance to the assignment
of roles within the group. Several students expressed that
they were very uncomfortable with this process. As a whole,
the students could not be encouraged to adopt roles within
the groups, and hence this aspect of the technique soon died
out. The second general difficulty we encountered was that
the problem-solving procedure outlined above is typically
not relevant when solving textbook-style homework problems like those assigned in the Studio Physics course at
Rensselaer. Students were required to use the problem-solving procedure on all their homework assignments and
resented having to use this procedure if they could easily
have solved the problem without it.
III. ASSESSMENT AND EVALUATION
As mentioned above, the authors administered two diagnostic examinations, the Force Concept Inventory and the Force
and Motion Conceptual Evaluation (FMCE). These two exams were labeled as Part A and Part B of a single
exam packet, and both exams were given pre- and post-instruction. Pre-instructional testing was done during the first
class session. The authors allotted 60 min for the students to
finish the two exams: 25 min for the Force Concept Inventory and 35 min for the Force and Motion Conceptual Evaluation. The time allotted for each exam was determined by
dividing the total exam time (60 min) by the total number of
questions (77) and then multiplying by the number of
questions on the particular exam in question (47 for the
Force and Motion Conceptual Evaluation and 30 for the
Force Concept Inventory). The latest version of the Force
Concept Inventory was used. The version of the Force and Motion Conceptual Evaluation used had 43 questions on force
and motion topics and 4 questions on the conservation of
energy. This yielded an allotted time which was less than
that suggested by the exams' authors. However, most students finished both exams and all were given the same
amount of time to work on the exams during pre- and post-testing periods. One of the authors was present for every
administration of the exam. Post-instructional testing was
done after all of the relevant material had been covered in the
course. This was approximately two-thirds of the way
through the semester, or about ten weeks after pre-instructional testing.
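The allotment arithmetic above can be checked with a short sketch (the question counts and total time are from the text; rounding the results to the nearest 5 min, which reproduces the 25/35 split, is our inference):

```python
# Sketch of the proportional time allotment described above.
# Question counts and total time are taken from the text.

TOTAL_TIME_MIN = 60.0
N_FMCE, N_FCI = 47, 30
TOTAL_QUESTIONS = N_FMCE + N_FCI  # 77

def allotted_minutes(n_questions):
    # total exam time divided by total questions,
    # multiplied by the number of questions on this exam
    return TOTAL_TIME_MIN / TOTAL_QUESTIONS * n_questions

fmce_minutes = allotted_minutes(N_FMCE)  # about 36.6 min
fci_minutes = allotted_minutes(N_FCI)    # about 23.4 min
# Rounding each to the nearest 5 min gives the 35/25 split quoted above.
```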
In the Spring 1998 semester, studio classes were divided
into the categories discussed below based on the nature of
the instruction they received. The division of students into
these categories was essentially a random assignment. Students chose to enroll in a particular section of the course
based on scheduling issues and before any information as to
which professor would be assigned to teach the class became
available. Division of class sections into these categories was
based on the section instructors' willingness to experiment
with new methods and materials.

Table I. Instructional techniques used. Sections referred to as experimental are those in which either Interactive Lecture Demonstrations (ILDs), Cooperative Group Problem Solving (CGPS), or both techniques were used.

Section   Full ILD sequence   Incomplete ILD sequence   CGPS
4         ×                                             ×
6                             ×
7                             ×
8                             ×
9                                                       ×
11        ×
A. Standard group
Seven class sections comprised the standard group. These
students were taught the standard studio course. In this
model, the first 30 min of class was devoted to answering
questions and working problems on the board. The next
10–20 min were devoted to a brief lecture, complete with
derivations and examples. The remainder of class time was
used by the students to work on an in-class assignment based
on the days material. The scope of these assignments ranged
from pen and paper exercises to spreadsheet exercises to
computer-based laboratories. For the most part, students
were able to complete the in-class assignments before the
class hour was through. Some instructors found activities to
occupy the students' time; others simply let students leave
early.

B. Experimental group
Five sections were taught by either Cummings or Marx
and comprised the experimental group. This group of students had an instructional experience which was predominantly the same as that of the standard group. However, these
groups were also exposed to Interactive Lecture Demonstrations, Cooperative Group Problem Solving or both. Students
in sections 4 and 11 were given all four Interactive Lecture
Demonstration sequences. Students in sections 6 and 8 did a
simplified version of the human motion Interactive Lecture Demonstrations in small groups as an in-class activity.
They were then given the last three Interactive Lecture Demonstration sequences. A Cooperative Group Problem Solving
model was implemented in sections 4 and 9. Sections 6 and
8 were given context-rich problems as extra, in-class activities on three occasions throughout the semester. Table I summarizes the breakdown of the experimental groups.
Primarily, experimental activities were done in place of
the work performed by the standard group. On the occasions
that time allowed, they were done in addition to that work.
Hence, we estimate that the inclusion of these activities resulted in an increase in instruction time of about 1% for
Interactive Lecture Demonstrations and about 5% for Cooperative Group Problem Solving. The additional time came
about because the experimental sections did not leave class
early, while the standard sections occasionally did. The topics covered by the experimental and standard groups were
identical; the two groups remained synchronized throughout
the semester and took common exams.

Fig. 1. Fractional gain on the Force and Motion Conceptual Evaluation
(FMCE) and the Force Concept Inventory (FCI) for groups of students
having had various instructional experiences.

C. Intermediate group
Inclusion of a group of students who could act as an indicator of instructor influence, separating the effects of the
instructional techniques and curricular materials from the influence of the instructor (an instructor control group), was
not part of this study's design. Nevertheless, such a group
serendipitously formed. Section 7 began the semester as a
standard section, taught by a professor other than one of the
authors. However, due to low enrollment in this class and in
section 8 (which ran concurrently in another room), section 7
was merged with section 8 approximately three weeks into
the semester. From this point on, these students were taught
by one of the authors. Aside from having missed the first
three sequences of Interactive Lecture Demonstrations (out
of four), they had an identical educational experience to the
students who began the semester in section 8. Hence, we do
not consider section 7 to be part of the standard or experimental group; rather, it serves as a weak control for the
influence of the authors on the outcome. We will refer to
section 7 as the intermediate group.
IV. RESULTS AND DISCUSSION
Figure 1 provides an overview of the results of this investigation. We have assessed conceptual learning in the standard Studio Physics course during the Spring 1998 semester
to be ⟨g_FCI⟩ = 0.18 ± 0.12 (s.d.) and ⟨g_FMCE⟩ = 0.21 ± 0.05 (s.d.). In the studio sections in which Interactive Lecture Demonstrations were performed we found ⟨g_FCI⟩ = 0.35
± 0.06 (s.d.) and ⟨g_FMCE⟩ = 0.45 ± 0.03 (s.d.). In studio sections in which Cooperative Group Problem Solving was used
we found ⟨g_FCI⟩ = 0.36 and ⟨g_FMCE⟩ = 0.36. The fractional
gains discussed here and represented by the height of each
bar in Fig. 1 are the ⟨g⟩ factor discussed in Sec. I. In this
analysis, we considered only students for whom there were
both pre- and post-test scores (matched samples). The fractional gain for a group of students was calculated using the
average of post-test scores for the group and the average of
pre-test scores for the group. (This is referred to as calculating the gain on averages.) For two reasons, we chose to
calculate average gain in this manner, rather than to average
the individual student gains (referred to as calculating the
average of the gains). First, it allowed us to keep students
who achieved a 100% correct score on the pre-test in the
study. Individual gains cannot be calculated for such students, and so they cannot be included in the investigation if
one chooses to calculate the average of individual gains. Second, calculating the average in this way reduces the skewing
which occurs when students who pre-test with quite high
scores then post-test with somewhat lower scores. The error
bars shown in Fig. 1 represent the standard deviation of the
averages of class sections, with each section weighted
equally.
Every question on the Force Concept Inventory was considered and equally weighted in the calculation of pre-test and
post-test scores on this exam. In contrast, several questions
on the Force and Motion Conceptual Evaluation were disregarded in score calculations. The disregarded questions are
those which have been found to be important for identifying
students' conceptual models. Hence, they remain a part of
the assessment. However, these questions are not appropriate
for inclusion in an overall measurement of the level of student understanding. Additionally, several groups of closely
related questions on the Force and Motion Conceptual
Evaluation are considered as a unit when calculating a total
score on the exam.11,15 This method of calculating a single-number score on the Force and Motion Conceptual Evaluation was done on the advice of the exam's developers and is
discussed in detail in Ref. 15.
Average fractional gain for each section on the Force and
Motion Conceptual Evaluation and Force Concept Inventory
is shown in Figs. 2(a) and (b), respectively. Standard sections
are on the left-hand side while experimental sections are on
the right-hand side; between them is the intermediate section
(section 7). Standard and experimental group averages for
the Force and Motion Conceptual Evaluation and the Force
Concept Inventory are indicated by a thin, horizontal line
spanning their respective groups. On both exams the average
gain for the experimental group was approximately twice
that of the standard group. Moreover, the lowest-scoring experimental section was at least one standard deviation away
from the average of the standard group. Table II contains the
section-by-section data.

Fig. 2. (a) Fractional gain on the FMCE by class section number. Standard
sections are on the left; experimental sections are on the right. The average
gain with standard deviation is indicated for each of the two categories. (b)
Fractional gains on the FCI by class section number. Standard sections are
on the left; experimental sections are on the right. The average gain with
standard deviation is indicated for each of the two categories.

It is interesting to note results for the intermediate group
and compare them to those for section 8. Despite the fact that
these two groups had the same instructor, and were students
in the same class for most of the semester, the intermediate
group had conceptual learning gains which were more in line
with the standard studio sections than with section 8 or other
experimental sections. Recall that section 8 had a complete
sequence of Interactive Lecture Demonstrations, while the
intermediate group did not.
Figure 3 shows scatter plots of students' post-test score
versus their pre-test score on the Force and Motion Conceptual Evaluation. Figure 3(a) shows this result for the standard
sections and Fig. 3(b) shows this result for the experimental
sections.
Table II. Section-by-section data for the Force and Motion Conceptual Evaluation (FMCE) and the Force Concept Inventory (FCI).

Group          Section   FMCE N   FMCE pre ave   FMCE post ave   FMCE ⟨g⟩   FCI N   FCI pre ave   FCI post ave   FCI ⟨g⟩
Standard       1         27       37.3           55.1            0.28       28      49.6          63.9           0.28
Standard       2         17       25.4           44.1            0.25       18      43.0          51.7           0.15
Standard       3         27       25.7           39.9            0.19       28      42.1          54.8           0.22
Standard       5         28       43.2           51.7            0.15       29      61.0          59.0           −0.05
Standard       10        30       31.8           43.0            0.17       31      42.0          56.6           0.25
Standard       12        23       43.6           55.8            0.22       24      57.5          63.9           0.15
Standard       13        32       40.9           54.9            0.24       33      53.7          66.2           0.27
Intermediate   7         13       34.8           54.3            0.30       13      56.7          66.7           0.23
ILD            6         39       37.3           63.7            0.42       40      53.3          69.8           0.35
ILD            8         22       33.0           65.1            0.48       23      52.2          66.4           0.30
ILD            11        40       29.6           60.7            0.44       41      46.3          68.2           0.41
CGPS           9         32       40.3           61.6            0.36       33      49.9          68.0           0.36
CGPS and ILD   4         30       32.2           67.6            0.52       28      50.1          66.8           0.33

Fig. 4. (a) Average ⟨g⟩ on the Force and Motion Conceptual Evaluation
(FMCE) for students divided into groups based on whether their pre-test
scores were in the upper, middle, or lower third of the entire pool for their
group (experimental or standard). (b) Average ⟨g⟩ on the Force Concept
Inventory (FCI) for students divided into groups based on whether their
pre-test scores were in the upper, middle, or lower third of the entire pool
for their group (experimental or standard).

Fig. 3. (a) Post- vs pre-test score on the Force and Motion Conceptual Evaluation (FMCE) for students in standard sections. The size of the bubble
indicates the number of students represented by the point. The lines shown
are lines of constant gain. The lowest of the four lines shown corresponds to
a gain of −0.20, the line which passes through the origin corresponds to a
gain of zero, and the highest line corresponds to a gain of +0.40. The
associated table indicates the percentage of students who increased their
exam score by the percentage shown. (b) Post- vs pre-test score on the Force
and Motion Conceptual Evaluation (FMCE) for students in experimental
sections. The size of the bubble indicates the number of students represented
by the point. The lines shown are lines of constant gain. The lowest of the
four lines shown corresponds to a gain of −0.20, the line which passes
through the origin corresponds to a gain of zero, and the highest line corresponds to a gain of +0.40. The associated table indicates the percentage of
students who increased their exam score by the percentage shown.

The size of the bubble indicates how many students
fell on the same coordinate: the smallest bubble indicates
that one student had that set of scores, the next largest bubble
means there were two students on that coordinate, and so on.

Since there were fewer students in the experimental group, the tables below the graphs indicate the percentage of students that raised (or lowered) their grade by the
amount indicated from pre- to post-test. The diagonal lines
shown are lines of constant gain. It is apparent from these
graphs that students in the experimental group did better in
terms of absolute gains on the Force and Motion Conceptual
Evaluation. A strikingly similar trend was seen for the Force
Concept Inventory results.
Furthermore, when we plotted these data for either exam,
we found that there were fewer experimental students around
or below the zero gain line. There were also many more
students in the upper left-hand region (low pre-test scores
and gains of over 50%) in the experimental group than in the
standard group.
Upon viewing the data in Fig. 3, we note that weaker
students (i.e., students with low pre-test scores) benefited
from being in the experimental sections. What about the
strongest students (i.e., those with high pre-test scores)? Figure 4 is a graph of the average ⟨g⟩ for students divided into
groups based on whether their pre-test scores were in the
upper, middle, or lower third of the entire pool for their
group (experimental or standard). Figure 4(a) represents this
result on the Force and Motion Conceptual Evaluation and
Fig. 4(b) is for the Force Concept Inventory. These figures
clearly show that all students, whether they pre-tested high,
middle, or low, benefited from these experimental teaching
techniques. This result was consistent on both exams. The

correlation coefficient between pre-test score and gain, for the entire group of students (experimental and standard taken together, N = 347), is −0.06 for the Force Concept Inventory and +0.16 for the Force and Motion Conceptual Evaluation. Somewhat stronger correlations seem to exist for subsets of the population.
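The tercile split and the pre-test/gain correlation described above can be sketched in a few lines. This is a hypothetical reconstruction of the analysis, not the authors' code, and the student scores below are invented for illustration.

```python
from statistics import mean

def gain(pre: float, post: float) -> float:
    """Normalized gain for percentage scores (Hake, Ref. 4)."""
    return (post - pre) / (100.0 - pre)

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

def tercile_gains(scores):
    """scores: list of (pre, post) pairs.  Returns the mean normalized gain
    for the lower, middle, and upper thirds ranked by pre-test score."""
    ranked = sorted(scores, key=lambda s: s[0])
    n = len(ranked)
    thirds = [ranked[: n // 3], ranked[n // 3 : 2 * n // 3], ranked[2 * n // 3 :]]
    return [mean(gain(p, q) for p, q in group) for group in thirds]

# Invented (pre, post) percentage scores for six students:
students = [(20, 55), (30, 60), (40, 65), (50, 75), (60, 80), (75, 90)]
print(tercile_gains(students))                       # mean gain per pre-test tercile
pres = [p for p, _ in students]
gains = [gain(p, q) for p, q in students]
print(pearson(pres, gains))                          # pre-test/gain correlation
```

A correlation near zero between pre-test score and gain, as reported for the FCI, indicates that the normalized gain is roughly independent of incoming preparation across the pooled sample.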

As a result of our investigations, we are optimistic about the future of the Studio Physics program at Rensselaer. The entire infrastructure necessary for true interactivity in the classroom is in place; we feel we need only to adopt research-based student activities.
V. CONCLUSION

Overall, this study implies that the standard studio format used for introductory physics instruction at Rensselaer is no more successful at teaching fundamental concepts of Newtonian physics than traditional instruction. The average ⟨g⟩ on the Force Concept Inventory reported here for standard studio sections falls within the range of earlier reported values for traditionally taught courses.4 This result is disappointing in light of the fact that Rensselaer has expended the effort and resources necessary to break up large (500+ student) classes into small (35–45 student) sections. Rensselaer has introduced group work and computer use as components of in-class instruction. Furthermore, lecture time has been reduced. In general, the Studio Physics classrooms appear to be interactive and students seem to be engaged in their own learning. Nevertheless, use of the studio format alone does not produce improvement in conceptual learning scores as compared to those measured on average in a traditionally structured course.
The implication of this study, that ostensibly interactive classrooms do not necessarily result in above-average levels of conceptual learning, verifies the work of others. For example, Redish, Saul, and Steinberg found that even lectures with much student interaction and discussion had very little impact on student learning.16 After lengthy investigations, Kraus reported:

    In many of our efforts to improve student understanding of important concepts, we have been able to create an environment in which students are mentally engaged during lecture. While we have found this to be a necessary condition for an instructional intervention to be successful, it has not proved sufficient. Of equal importance is the nature of the specific questions and situations that students are asked to think about and discuss.17
However, introduction of research-based techniques and activities does have clear beneficial effects. Interactive Lecture Demonstrations generated significant gains in conceptual understanding with remarkably little instructional time. Cooperative Group Problem Solving resulted in similar conceptual learning gains and also seemed to provide a mechanism that fostered improved quantitative problem-solving skills.

Students in Cooperative Group Problem Solving sections not only had significant gains on the Force and Motion Conceptual Evaluation and the Force Concept Inventory but also performed better on the problem-solving section of the last course exam. Nevertheless, implementing Cooperative Group Problem Solving required a semester-long commitment on the part of the instructor.


a) Current address: Department of Physics, University of Oregon, Eugene, OR.
1. J. Wilson, "The CUPLE Physics Studio," Phys. Teach. 32(12), 518–522 (1994).
2. M. A. Cooper, "An Evaluation of the Implementation of an Integrated Learning System for Introductory College Physics," Ph.D. thesis, Rutgers, The State University of New Jersey, 1993.
3. D. Hestenes, M. Wells, and G. Swackhamer, "Force Concept Inventory," Phys. Teach. 30(3), 141–158 (1992).
4. R. Hake, "Interactive-engagement versus traditional methods: A six-thousand-student survey of mechanics test data for introductory physics courses," Am. J. Phys. 66(1), 64–74 (1998).
5. P. W. Hewson and M. G. A'Beckett Hewson, "The role of conceptual conflict in conceptual change and the design of science instruction," Instr. Sci. 13, 1–13 (1984).
6. J. Clement, "Using bridging analogies and anchoring intuitions to deal with students' preconceptions in physics," J. Res. Sci. Teach. 30(10), 1241–1257 (1993).
7. K. Cummings and J. Marx were instructors in Studio Physics I for engineers during the Spring 1998 semester and experimented with the use of research-based activities in the studio classroom. Cummings taught sections 4, 9, and 11. Marx taught sections 6 and 8 as well as a weak control group, section 7.
8. D. Sokoloff and R. Thornton, "Using Interactive Lecture Demonstrations to Create an Active Learning Environment," Phys. Teach. 35(10), 340–347 (1997).
9. P. Heller, R. Keith, and S. Anderson, "Teaching Problem Solving through Cooperative Grouping. Part 1: Group versus Individual Problem Solving," Am. J. Phys. 60(7), 627–636 (1992).
10. P. Heller and M. Hollabaugh, "Teaching Problem Solving through Cooperative Grouping. Part 2: Designing Problems and Structuring Groups," Am. J. Phys. 60(7), 637–644 (1992).
11. R. Thornton and D. Sokoloff, "Assessing student learning of Newton's Laws: The Force and Motion Conceptual Evaluation and the Evaluation of Active Learning Laboratory and Lecture Curricula," Am. J. Phys. 66(4), 338–352 (1998).
12. R. K. Thornton and D. R. Sokoloff, "Learning motion concepts using real-time microcomputer-based laboratory tools," Am. J. Phys. 58(9), 858–867 (1990).
13. Mechanics Interactive Lecture Demonstration Package (ILD), Vernier Software, 8565 S.W. Beaverton-Hillsdale Hwy., Portland, OR 97225-2429, 503-297-5317.
14. Instructor's Handbook, identified as "TA Orientation," School of Physics and Astronomy, Fall 1997. See also http://www.physics.umn.edu/groups/physed/
15. K. Cummings, D. Kuhl, J. Marx, and R. Thornton, "Comparing the Force Concept Inventory and the Force and Motion Conceptual Evaluation" (unpublished).
16. E. F. Redish, J. M. Saul, and R. N. Steinberg, "On the effectiveness of active-engagement microcomputer-based laboratories," Am. J. Phys. 65, 45–54 (1997).
17. Pamela Ann Kraus, "Promoting Active Learning in Lecture-Based Courses: Demonstrations, Tutorials, and Interactive Tutorial Lectures," Ph.D. dissertation, University of Washington, 1997, University Microfilms, UMI No. 9736313.

