
First Steps in Understanding Engineering Students' Growth of Conceptual and Procedural Knowledge in an Interactive Learning Context

ROMAN TARABAN
Department of Psychology, Texas Tech University

EDWARD E. ANDERSON
Department of Mechanical Engineering, Texas Tech University

ALLI DEFINIS
Department of Psychology, Texas Tech University

ASHLEE G. BROWN
Department of Psychology, Texas Tech University

ARNE WEIGOLD
Department of Psychology, Texas Tech University

M.P. SHARMA
Department of Chemical and Petroleum Engineering, University of Wyoming

ABSTRACT

The development of procedural knowledge in students, i.e., the ability to effectively solve domain problems, is the goal of many instructional initiatives in engineering education. The present study examined learning in a rich learning environment in which students read text, listened to narrations, interacted with simulations, and solved problems using instructional software for thermodynamics. Twenty-three engineering and science majors who had not taken a thermodynamics course provided verbal protocol data as they used this software. The data were analyzed for cognitive processes. There were three major findings: (1) students expressed significantly more cognitive activity on computer screens requiring interaction compared to text-based screens; (2) there were striking individual differences in the extent to which students employed the materials; and (3) verbalizations revealed that students applied predominantly lower-level cognitive processes when engaging these materials, and they failed to connect the conceptual and procedural knowledge in ways that would lead to deeper understanding. The results provide a baseline for additional studies of more advanced students in order to gain insight into how students develop skill in engineering.

Keywords: cognitive processing, instructional software, skill development

I. INTRODUCTION

A. Cognitive Influence on Engineering Education Research

Some recent initiatives in engineering education research have adopted a cognitive framework for designing and implementing studies of student learning behaviors and outcomes [1]. This is an encouraging development because it allows engineering education reform to benefit from the basic research on cognition and learning that has been going on since the 1970s. According to cognitive theories, learning results in changes in mental representations and processes and depends critically on learners' prior knowledge and their ability to effectively synthesize and store what they gained from problem-solving episodes [2]. Indeed, curriculum reform efforts affirm the centrality of students' experiences and attempt to identify those experiences that maximize student gains within the available time for learning.

As one significant manifestation of the cognitive approach, there are several ongoing efforts to advance classroom practice and outcomes by identifying and understanding misconceptions held by engineering students regarding basic engineering concepts, like rate and energy [3]. One way this research is being pursued is through the development and implementation of concept inventories [4, 5]. In recent work, Streveler et al. [6] have expanded this effort and have proposed to identify misconceptions that occur across multiple curricular areas and to repair these misconceptions through remediation involving changes in mental schemas.

Misconceptions are part of a learner's prior knowledge (more technically, declarative knowledge) in a specific knowledge domain. In cognitive theories of skilled problem solving [2, 7], acquiring and using declarative knowledge is crucial for effective performance. However, problem solving is largely procedural, i.e., action-oriented, knowledge and draws on a distinct form of memory that stores procedural knowledge [2]. The development of procedural knowledge in students, i.e., the ability to effectively solve domain problems, is the goal of many instructional initiatives in engineering education. Gray et al. [8] outline five steps for solving equilibrium and kinetics problems that they assert should be followed without exception. These steps are guided largely by the problem statement and include (1) stating what needs to be found, (2) considering the problem assumptions, (3) finding the equations needed for solution, (4) computing the solution, and (5) verifying the solution.

Litzinger et al. [7] also stress the structured nature of setting up, solving, and checking the equations involved in problem solving. Importantly, their Integrated Problem Solving Model explicitly combines declarative knowledge with procedural knowledge by treating the activation and application of prior knowledge as a definite step at each stage of problem solving. Other research, like the work of Zywno and Stewart [9], has incorporated additional cognitive factors. They investigated computer-based learning using measures of cognitive complexity based on Bloom's taxonomy [10], and learning styles based on the Felder-Solomon Index of Learning Styles [11].

B. Theory Behind the Practice

Structured approaches to problem solving, like the Integrated Problem Solving Model [7], are well aligned with cognitive theories of the development of expertise at the postsecondary level and beyond [12, 13, 14, 15]. These theories assert that skill is the result of declarative knowledge (facts associated with the domain) being integrated and transformed into procedures in a continuous process of refinement over time, as the result of deliberate practice on the part of the learner [14]. It needs to be emphasized that, from a cognitive theoretical perspective, one's skill is not fully characterized by problem-solving performance in one's area of training, but also by how one's conceptual knowledge expresses itself in problem solving, and ultimately by one's grasp of the deep principles governing the domain [16, 17].

Some of the most compelling research on the development of skilled performance has involved comparisons between novice and expert problem solvers. This research has shown how novice problem solvers are guided by superficial problem features (e.g., involving an inclined plane), whereas experts draw on theoretical principles (e.g., conservation of energy) when asked to sort problems into open-ended categories [18]. Other findings have identified characteristic differences in constructing mental representations and solutions for problems, with experts constructing a representation of the problem as they read through it and reasoning forward to the desired quantity (e.g., velocity v). Conversely, novices reason backward from the desired quantity, trying to solve for the desired variable value (e.g., v) by patching together knowledge of equations [19; see also 20]. This research has been successful in identifying critical differences between experts and novices in terms of conceptual knowledge and procedural skill by examining differences in how individuals approach and solve discrete problems. It is less clear how students develop from novices to experts through engagement with the learning resources in their training programs.

C. Rationale for the Present Study

This study builds on prior research examining students' use of learning resources in thermodynamics courses [21, 22], which showed that students devoted the majority of their study time to developing problem-solving skill and less time to reading their textbooks, suggesting a bias for developing procedural skill and less inclination toward increasing conceptual knowledge. Other research has suggested that students strive to develop conceptual knowledge, but do so at lower cognitive levels.
Zywno and Stewart [9] studied the learning effects of a computer-based module on the topic of control systems.

Comparing pre-test to post-test results, they found greater gains at lower cognitive levels of Bloom's taxonomy [10] (Level 2: Comprehension; Level 3: Application) than at higher levels (Level 4: Analysis; Level 6: Evaluation). Streveler et al. [6] used a Delphi method with experienced engineering faculty in order to identify concepts in engineering mechanics and electric circuits that were difficult and poorly understood by students. The researchers found that faculty underestimated the difficulty that students faced in understanding many of the concepts. This research and related work by Miller and colleagues [5, 23] show that academically successful engineering students often lack deep understanding of the concepts and principles that underlie their areas of training.

The present study is concerned with the early stages of learning in an engineering area. The analyses address what students know in terms of definitions, facts, and concepts (i.e., declarative knowledge), how they use that knowledge, and how they solve problems (i.e., procedural knowledge). The data were collected while students were learning, and a coding system was developed for capturing what students know and do. These were the questions that were addressed:
- What cognitive processes do students use when comprehending text and solving problems in thermodynamics?
- Are there differences in the cognitive processes that students apply in different learning contexts, specifically, those involving text, those involving student interactions, and those requiring problem solving?
- Do students' cognitive processes help to distinguish between relatively good and weak learners?
- Are some learning contexts relatively more evocative of higher-order cognitive processes?

II. CASE STUDY


The study examined learning in a rich visual, auditory, and print environment in which students read text, listened to narrations, interacted with simulations, and solved problems using instructional software. This software implemented active-learning methods and exploited state-of-the-art technology and authoring tools for learning. It was important to explore learning in this context because it represented a major trend in contemporary teaching: using technology to provide students with engaging and evocative learning materials and aids. Critically, these kinds of learning materials were consistent with theories of skill development, which demand that students be provided with relevant factual knowledge and the means to transform that knowledge into skill through applications to problems. These materials differed from traditional lecture and textbook learning resources in their ability to engage students' senses and cognitive faculties more fully and to provide immediate and constructive feedback to student inputs.

The goals of the study were to apply a method for describing and understanding the cognitive processing of students as they interacted with introductory engineering learning materials and to provide exploratory data and analyses that yielded insights into the general nature of these students' cognitive processing, as well as the capacity of this method to characterize individual differences between students. The method, known as verbal protocol analysis, is a data-collection method used in human factors usability studies and in psychological studies, primarily in the fields of expert/novice research and text comprehension research [24, 25].

Figure 1. Example of an interactive screen.

Verbal protocols are open-ended think-aloud reports, through which participants are asked to verbalize what they are thinking as they complete a task, without attempting to interpret or summarize the materials for the experimenter, unless those interpretations or summaries are a natural part of their thought processes. Atman and Bursic [26] showed how the method could be used to document design processes applied by engineering students, and how to use those detailed descriptions in order to evaluate the quality of students' solutions, to look for growth in students' performance over time, and to evaluate whether engineering curricula were meeting their stated objectives. As another example, Litzinger et al. [7] collected verbal protocol data from engineering majors in order to confirm elements of the Integrated Problem Solving Model.

A. Participants, Materials, and Procedure

Twenty-five undergraduate students at Texas Tech University were recruited through General Psychology classes. Science and engineering students often enroll in General Psychology to fulfill a general education requirement, and they participate in experiments to earn extra credit in the course. All volunteers for this study were science or engineering majors who had not taken an introductory thermodynamics course. Two participants were eliminated due to low audibility in the tape recordings that provided the main data for this study. Basic demographic data for the remaining participants were as follows. Eighteen participants were male and five were female. Their mean age was 19.61 years (SD = 1.88), the mean number of self-reported completed college credits was 37.65 (SD = 30.11), and the mean number of self-reported science and engineering credits was 10.26 (SD = 11.04). These participants were considered appropriate for this study because they had academic backgrounds typical of students who would be required to learn the concepts of introductory thermodynamics at an early point in their academic training.

The materials used in this study were computer-based instructional supplements authored by E.E. Anderson for the textbook Thermodynamics: An Engineering Approach, 4th Edition [27]. The computer screens present students with text content, tables, figures, and graphs. They also include active-learning screens with interactive exercises, graphical modeling, physical world simulations, exploration, and quiz screens [cf. 28]. Each content screen includes a voice narration related to the subject matter on the screen. The materials were reviewed by engineering faculty during development for the textbook and after publication, and data were collected from students on usability and comparisons to other media [29]. Figure 1 provides an example of an interactive exercise screen and Figure 2 provides an example of a quiz screen. Chapter 1, "Introduction to Thermodynamics," and Chapter 2, "Thermodynamic Properties," were selected for this study so that the participants, who had not had the thermodynamics course, would find the material comprehensible. Roughly 43 percent of the total number of screens in these chapters contained an interaction of some sort. Due to this relatively high proportion of screens requiring student interaction, these materials provided many opportunities for active learning and problem solving, which are considered essential to the development of conceptual knowledge and procedural skill.

Figure 2. Example of a quiz screen.

Participants took part in the experiment through individual meetings in a quiet room with the experimenter. They were given detailed instructions for the verbal protocol (think-aloud) task and were (falsely) informed that they would take a short test on the covered material after working through it, in order to ensure that they applied themselves to learning the material as if studying for an exam. They then proceeded to complete the think-aloud task, which took between 30 and 60 minutes. Participants worked with one half of one chapter: eleven participants completed the first half of Chapter 1 and twelve completed the first half of Chapter 2. The data were tape-recorded for later transcription, with the permission of participants. During data collection, the primary role of the experimenter was to prompt participants regularly to continue to verbalize their thoughts.

B. Results

The twenty-three protocols were transcribed by two of the experimenters and an assistant. The initial phase of the analysis consisted of establishing a rubric for parsing participants' utterances into segments for coding and then developing the codes in Table 1. The convention adopted for parsing, i.e., for segmenting the protocols, was to code idea segments, which were often indicated by noticeable pauses in a participant's speech pattern. The parsed segments were typically clauses or sentences. The second task in analyzing the protocols was to develop a set of codes. This was accomplished through individual coding and group discussion among three of the experimenters using the two longest protocols. The task of developing codes was approached in a bottom-up fashion [30], with two goals in mind that emerged through the preliminary coding and discussion.

One goal was to distinguish the context of an utterance, as indicated in the columns of Table 1. The six contexts were these:
- Comments associated with the process of moving between computer screens (Navigation).
- Comments made in response to the spoken narration that initiated content screens (Narration).
- Comments made on screens that contained only text (Text Only Screen).
- Comments made on screens that contained text and a related table, figure, or graphic, but that did not allow for student interaction (Text plus Table, Figure, or Graphic Screen).
- Comments made on screens with interactive graphics or simulations that allowed or required student interactions (Interactive Screen).
- Comments made on screens that required students to process text and tables and answer one or more questions related to the preceding content screens (Quiz Screen).

The other goal was to identify descriptive labels at appropriate levels of abstraction for the participants' verbalizations, as indicated by the rows in Table 1. Selection of these labels was guided by prior research on comprehension strategies [31, 32, 33], but many of the labels were composed in a bottom-up manner [30] in direct response to the data. The twenty-three protocols were then coded independently by two experimenters, with the understanding that codes could be added to the table as needed. After coding began, only one major code was added: the Vague code in Table 1 (#8a). The process of transcribing and coding the 23 protocols was time-intensive and took approximately 900 person-hours.
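To make the tabulation concrete, the following minimal sketch shows how coded segments of this kind can be compiled into a context-by-code frequency table analogous in structure to Table 1. The sketch is ours; the utterance codes and contexts below are illustrative stand-ins, not records from the study's data set.

```python
import pandas as pd

# Hypothetical coded segments: one (context, code) pair per parsed utterance.
segments = pd.DataFrame({
    "context": ["Text Only", "Interactive", "Interactive", "Quiz", "Text Only"],
    "code":    ["2c Paraphrases", "4e Explains Content",
                "5a Anticipates/Predicts", "5c Applies Mental Math",
                "2a Reads/Repeats Verbatim"],
})

# Context-by-code frequency table with row and column sums,
# structurally analogous to Table 1.
print(pd.crosstab(segments["code"], segments["context"], margins=True))
```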

Table 1. Frequencies of each code, summed across the 23 participants, for each type of context (see Appendix for example statements).

The data set consisted of 3501 coded utterances. In the initial coding, the raters agreed on parsing decisions 92.35 percent of the time; that is, in fewer than 8 percent of cases one rater coded a piece of text as a separate segment while the other rater combined it with a contiguous piece of text. For the 92.35 percent of cases in which both raters assigned a code, the raters agreed on codes 79.13 percent of the time, which was a moderately high level of agreement.

To further analyze the raters' level of agreement, a Kappa statistic [34] was calculated. The use of Kappa is often advocated by researchers because it adjusts the agreement measure for chance. The Kappa statistic for these data was 0.77, which falls in the range of Substantial Agreement (0.61–0.80) [34].
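The chance correction that motivates Kappa can be made concrete with a short sketch of the standard (Cohen's) two-rater statistic. This is our illustration, not the study's computation, and the five-segment example codes are hypothetical.

```python
from collections import Counter

def cohen_kappa(rater_a, rater_b):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    n = len(rater_a)
    p_observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: the probability that two raters assigning codes
    # independently, at their own marginal rates, happen to agree.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_chance = sum(freq_a[c] * freq_b[c] for c in freq_a) / n**2
    return (p_observed - p_chance) / (1 - p_chance)

# Hypothetical codes assigned to the same five segments by two raters.
print(cohen_kappa(["2c", "4a", "2a", "4e", "2c"],
                  ["2c", "4a", "2a", "4b", "2c"]))   # approx. 0.74
```

When both raters use the same code very often, the chance term is high and raw percent agreement overstates reliability; Kappa discounts exactly that.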

Discrepancies in parsing and coding were resolved through discussion among the raters, followed by mutual agreement. These final codes were used in subsequent analyses. Representative examples of verbalizations associated with codes can be found in the Appendix. The complete transcripts and a more comprehensive list of examples can be obtained from the first author.

The development of the codes and a coding rubric address the first question in this study and make explicit the cognitive processes that students use when comprehending text and solving problems. The raw frequencies of the final codes are shown in Table 1, with column sums indicating the frequencies of codes for the six contexts in which an utterance might have been made, and row sums indicating the frequencies of specific kinds of utterances, summed across contexts. An examination of column sums shows that about 13 percent of the comments related to navigation, i.e., the transitions of the user from screen to screen. Only about 30 percent of the Navigation comments (n = 126) signaled some difficulty on the part of the user with screen controls, indicating that, in general, the software was quite usable. The comments related to difficulty in navigating sometimes indicated potential improvements to the software that are discussed elsewhere [35] and will not be considered further here. Less than 1 percent (n = 30) of the total comments indicated difficulty using or comprehending the narration provided on content screens, indicating that audio information, as opposed to the more familiar visual information in computer-based sources, was a workable element of the software [cf. 36].

Narration comments were related to the content of computer screens and could be made on any type of screen (e.g., a text screen). Navigation comments related to the process of movement from screen to screen and could occur in transitioning from one type of screen to another (e.g., from a text screen to an interactive screen). In order to further analyze the comments made on the four types of content screens (Text, Text-Table-Figure-Graphic, Interactive, and Quiz), the Narration comments, coded for type of comment, were incorporated into the counts for those screens. The Navigation comments were not analyzed further. These modified data (which now included Narration comments) were used to calculate the average number of comments that each participant made on the four types of screens. The means are displayed in Table 2. Participants made the fewest comments on Text-Table-Figure-Graphic screens and on Text screens. The number of comments increased substantially on Interactive and Quiz screens. These data were subjected to further statistical tests by first confirming that the data for each of the screen types were normally distributed, using the Kolmogorov-Smirnov test [37]. The data were then submitted to a repeated measures analysis of variance (ANOVA).

Table 2. Mean number of comments made by each participant per screen on each screen type. (This table excludes the Navigation comments in Table 1. *Text Plus is Text with a Table, Figure, or Graphic. **The mean number of comments is based on the number of screens of each type that each participant viewed.)

The ANOVA showed a significant effect of screen type [F(3, 63) = 15.08, p < 0.001]. Pairwise comparisons of means, using a Bonferroni adjustment for the number of tests [38], showed that the mean number of comments on Text-Table-Figure-Graphic screens did not differ significantly from Text screens, but significantly more comments were made on Interactive and Quiz screens compared to Text and Text-Table-Figure-Graphic screens. Means for Interactive and Quiz screens were not significantly different from each other. As a follow-up to these statistical tests, an examination of individual means showed that for 21 of the 23 participants, the number of comments made on Interactive and Quiz screens combined exceeded the number of comments made on Text and Text-Table-Figure-Graphic screens combined.

Addressing the second question in this study, there were significant differences in the number of verbalizations associated with different kinds of screens. Screens that allowed or required students to do something evoked more overt cognitive activity: participants produced more verbalizations, and were presumably more cognitively active, on screens that required interaction compared to text-based screens.

In a second analysis, the codes were divided into those associated with lower-level cognitive processes (Codes 1–3) and higher-level cognitive processes (Codes 4–5), based on a search of cognitive research on comprehension and problem solving [18, 25, 31–33, 39, 40]. Summing the total number of the two types of cognitions by participant revealed substantial individual differences in the types of cognitions utilized, as shown in Figure 3. Individuals differed in two ways. First, some participants were more active than others in processing the materials, as indicated by their overall number of comments. Second, individuals also differed in the relative frequency of lower-level versus higher-level cognitive comments. Addressing the third question in this study, the results showed that an analysis of cognitive processes on an individual basis has the potential to distinguish between good and weak learners, based on the frequency with which they engage in lower-order and higher-order cognitive processes. Predominant use of lower-order processes is suggestive of a weak learner. This point will be discussed further in Section III.

Table 3 compiles the data in Figure 3 into the average number of lower- and higher-level cognitive comments made on each of the four screen types. An examination of the means shows that, overall, students made few higher-level cognitive comments and many more lower-level comments. Kolmogorov-Smirnov tests [37] showed that the data for lower-level cognitive codes, averaged across the four screen types, were normally distributed, but the data for higher-level cognitive codes were not; therefore, Wilcoxon rank-order tests, which are appropriate for data that are not normally distributed, were used to compare the average number of lower- and higher-level cognitive comments. The statistical results showed that students made significantly more lower-level than higher-level cognitive verbalizations [Z = 3.68, p < 0.001], suggesting that they were processing the material in a shallow fashion. To the extent that higher-level comments were made, they were especially more frequent on quiz screens, and somewhat more frequent on interactive screens, compared to text-based screens. This latter observation is consistent with the analyses for the data summarized in Table 2.
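The following sketch outlines this kind of analysis pipeline in Python. It is a reconstruction under stated assumptions: the study's analyses were run in SPSS [37], and the data layout, column names, and values below are hypothetical.

```python
import pandas as pd
from scipy import stats
from statsmodels.stats.anova import AnovaRM

# Hypothetical long-format data: one row per participant x screen type,
# holding that participant's mean number of comments per screen.
df = pd.DataFrame({
    "participant": [p for p in (1, 2, 3) for _ in range(4)],
    "screen_type": ["text", "text_plus", "interactive", "quiz"] * 3,
    "comments":    [1.2, 1.0, 3.4, 4.1,
                    0.8, 1.1, 2.9, 3.7,
                    1.5, 1.3, 3.0, 4.4],
})

# 1. Kolmogorov-Smirnov check of normality for each screen type.
for screen, grp in df.groupby("screen_type"):
    d, p = stats.kstest(grp["comments"], "norm",
                        args=(grp["comments"].mean(), grp["comments"].std()))
    print(f"{screen}: D = {d:.3f}, p = {p:.3f}")

# 2. Repeated-measures ANOVA across the four screen types.
print(AnovaRM(df, depvar="comments", subject="participant",
              within=["screen_type"]).fit())

# 3. Paired Wilcoxon signed-rank test: per-participant lower- vs.
#    higher-level comment totals, for data that fail the normality check.
lower, higher = [30, 45, 22], [4, 10, 2]
print(stats.wilcoxon(lower, higher))
```

Bonferroni-adjusted pairwise comparisons of the kind reported above would then amount to paired t-tests between screen types with the significance threshold divided by the number of comparisons.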
Addressing the fourth question in this study, the data in Table 3 suggest that screens requiring student interactions are relatively more effective in evoking higher-order cognitive processes than are text screens.

Figure 3. Sums of coded statements by participant. Lower-Level Cognition includes statement types 1–3 from Table 1; Higher-Level Cognition includes statement types 4–5 from Table 1 (Navigation statements are excluded from the sums).

Table 3. Mean number of lower-level and higher-level cognitive comments made by each participant per screen on each screen type. (This table excludes Navigation comments and comment types 6, 7, and 8 in Table 1.)

III. DISCUSSION AND CONCLUSIONS


The present research is part of new initiatives in engineering education to incorporate pedagogical innovations into the classroom, and to do so with the benefit of cognitive theories and methodologies, in order to advance instructional practice and effectiveness in engineering education. An assumption of the present research was that engineering education takes place through multiple media and through a variety of opportunities to interact with learning tools. There are strong precedents for using students' overt verbalizations to identify the cognitive representations that they construct to complete a task [7, 19, 20, 24–26, 41], as in this study. Sampling students' verbalizations while they complete academic tasks in engineering holds strong promise for providing powerful and directive insights for curricular activities and learning objectives. The present analyses revealed significant differences in the number and kind of cognitions students engage in, depending on the nature of the materials. The situations demanding overt actions from students (interactive exercises and quiz problems) were also the ones that evoked the larger measure of cognitive activity, as evidenced through more verbal activity.

These preliminary findings provide an argument for further investigation into the quantity and quality of learning interactions that come about due to faculty choices and provision of activities for students. The results also revealed clear individual differences in responding. An examination of student learning behaviors showed that the majority of students approached the materials using very simple cognitive strategies: pulling words from the screens and constructing rudimentary paraphrases of the materials. Chi et al. [39] have described students like these as poor learners. The present set of verbalizations revealed little use of metacognitive reading strategies, like activating background knowledge to increase comprehension, or revising background knowledge based on new information [42]. The most striking finding in these data was the virtual absence of inference, explanation, and drawing conclusions from the information provided in the text, interactive, and problem-solving formats. These missing elements warrant a more comprehensive investigation in future studies, as these are the kinds of cognitive processes that underpin scientific practice across all disciplines [43] and that are necessary to build deep conceptual understanding and the ability to solve novel problems in a domain [2].

An alternative explanation of students' reliance on lower-level processes is that these are beginning students, for whom lower-level processing is most appropriate: they strive to comprehend the content, but are not yet cognitively prepared for finding connections between pieces of information, making inferences, posing questions, self-explaining, drawing conclusions, and ultimately constructing rich cognitive representations. From a cognitive perspective, this alternative explanation is not credible. Good learners are metacognitive even when the material is unfamiliar. That is, they activate background knowledge that may be relevant, seek to infer definitions for unknown words from context, paraphrase, self-explain, and generally try to maintain a coherent representation of the material. In future studies, it will be important to examine cognitive processes for students who are more advanced in their engineering major than the students in this study, to ascertain whether the good-versus-weak-learner distinction persists for those students. If it does, there will be a strong incentive to search for ways to better identify processing weaknesses in students (even advanced students) and to more deliberately assist them in becoming more effective engineering learners.

There are clear limitations to the conclusions that can be drawn from the present data. This study was not meant to be definitive in any sense, but rather a starting point in a promising direction of research that examines the cognitive practices of students presented with rich and complex learning materials. Nevertheless, the present study invites further probing into the complex cognitive landscape that underpins the emergence of professional skill and expertise over the course of many years of training and practical experience. The next steps in this line of research should include sampling a wider range of student abilities and examining the connections between the breadth and depth of students' cognitive engagement with course materials and their performance in the class, as indicated by more conventional measures like tests, homework, and project grades.

The claim that engineering is a profoundly creative process [44] seems entirely apt as a description of the nature of professional engineering. It also conveys a sense of the mindset and skill levels that are set as goals for advanced students in engineering through the ABET engineering standards. But how does a student become a reflective thinker and effective problem solver? How does a student get started on the path to becoming facile at tackling problems in his or her domain, and what are the signs of advancement? This paper applies a cognitive theory of the development of expertise [2, 13, 14] and a bottom-up approach [30] in an attempt to address what the early stages of development might be like for engineering students. The knowledge that we gain as researchers about the early stages of knowledge and skill development in engineering students will be important to the development of teaching methodologies and learning aids and to curriculum reform initiatives.
Formidable challenges still lie ahead, as researchers pursue better theoretical and pedagogical knowledge of students' misconceptions [3–6], how students acquire and develop the rhetorical structures and schemas for scientific comprehension [5–7, 44, 45], and how they develop the problem-solving skills necessary to become skilled engineers [7, 8].

ACKNOWLEDGMENTS
This research was supported, in part, by a grant from the National Science Foundation (NSF-CCLI 0088947) and a grant from the Texas Tech University Graduate School. We would like to thank Krystal Blankenship for assistance in transcribing the data.

REFERENCES
[1] DiGregorio, J., Advancing Scholarship in Engineering Education: Launching a Year of Dialogue, Proceedings, American Society for Engineering Education Annual Conference and Exposition, Chicago, IL, 2006.
[2] VanLehn, K., Cognitive Skill Acquisition, Annual Review of Psychology, Vol. 47, 1996, pp. 513–539.
[3] Prince, M., and M. Vigeant, Using Inquiry-Based Activities to Promote Understanding of Critical Engineering Concepts, Proceedings, American Society for Engineering Education Annual Conference and Exposition, Chicago, IL, 2006.
[4] Steif, P.S., An Articulation of Concepts and Skills Which Underlie Engineering Statics, Proceedings, 34th Frontiers in Education Conference, Savannah, GA, 2004.
[5] Miller, R.L., R.A. Streveler, B.M. Olds, M.A. Nelson, and M.R. Geist, Concept Inventories Meet Cognitive Psychology: Using Beta Testing as a Mechanism for Identifying Engineering Student Misconceptions, Proceedings, American Society for Engineering Education Annual Conference and Exposition, Portland, OR, 2005.
[6] Streveler, R., M.R. Geist, R. Ammerman, C. Sulzbach, R.L. Miller, B.M. Olds, and M. Nelson, Identifying and Investigating Difficult Concepts in Engineering Mechanics and Electric Circuits, Proceedings, American Society for Engineering Education Annual Conference and Exposition, Chicago, IL, 2006.
[7] Litzinger, T., P. Van Meter, M. Wright, and J. Kulikowich, A Cognitive Study of Modeling During Problem Solving, Proceedings, American Society for Engineering Education Annual Conference and Exposition, Chicago, IL, 2006.
[8] Gray, G.L., F. Costanzo, and M.E. Plesha, Problem Solving in Statics and Dynamics: A Proposal for a Structured Approach, Proceedings, American Society for Engineering Education Annual Conference and Exposition, Portland, OR, 2005.
[9] Zywno, M.S., and M.F. Stewart, Learning Styles of Engineering Students, Online Learning Objects and Achievement, Proceedings, American Society for Engineering Education Annual Conference and Exposition, Portland, OR, 2005.
[10] Bloom, B.S., and D.R. Krathwohl, Taxonomy of Educational Objectives: The Classification of Educational Goals, New York, NY: Longmans, Green, and Co., 1956.
[11] Felder, R.M., and B.A. Solomon, Index of Learning and Teaching Styles in Engineering Education, accessed at http://www2.ncsu.edu/unity/lockers/users/f/felder/public/ILSdir/ILS-a.htm.
[12] Anderson, J.R., Cognitive Psychology and Its Implications (6th ed.), New York, NY: Worth Publishers, 2005.
[13] Bedard, J., and M.T.H. Chi, Expertise, Current Directions in Psychological Science, Vol. 1, 1992, pp. 135–139.
[14] Ericsson, K., and A. Lehmann, Expert and Exceptional Performance: Evidence of Maximal Adaptation to Task Constraints, Annual Review of Psychology, Vol. 47, 1996, pp. 273–305.


[15] Sahdra, B., and P. Thagard, Procedural Knowledge in Molecular Biology, Philosophical Psychology, Vol. 18, No. 4, 2003, pp. 477–498.
[16] National Research Council, How People Learn, Washington, D.C.: National Academy Press, 2000.
[17] Chi, M.T.H., Common Sense Conceptions of Emergent Processes: Why Some Misconceptions Are Robust, Journal of the Learning Sciences, Vol. 14, 2005, pp. 161–199.
[18] Chi, M.T.H., P.J. Feltovich, and R. Glaser, Categorization and Representation of Physics Problems by Experts and Novices, Cognitive Science, Vol. 5, 1981, pp. 121–152.
[19] Larkin, J.H., Enriching Formal Knowledge: A Model for Learning to Solve Textbook Physics Problems, in J.R. Anderson (Ed.), Cognitive Skills and Their Acquisition (pp. 321–335), Hillsdale, NJ: Erlbaum Associates, 1981.
[20] Priest, A.G., and R.O. Lindsay, New Light on Novice-Expert Differences in Physics Problem Solving, British Journal of Psychology, Vol. 83, 1992, pp. 389–405.
[21] Taraban, R., M.W. Hayes, E.E. Anderson, and M.P. Sharma, Giving Students Time for the Academic Resources That Work, Journal of Engineering Education, Vol. 93, No. 3, 2004, pp. 205–210.
[22] Taraban, R., E.E. Anderson, M.W. Hayes, and M.P. Sharma, Developing On-Line Homework for Introductory Thermodynamics, Journal of Engineering Education, Vol. 94, No. 3, 2005, pp. 339–342.
[23] Miller, R.L., R.A. Streveler, B.M. Olds, M.T.H. Chi, M.A. Nelson, and M.R. Geist, Misconceptions about Rate Processes: Preliminary Evidence for the Importance of Emergent Conceptual Schemas in Thermal and Transport Sciences, Proceedings, American Society for Engineering Education Annual Conference and Exposition, Chicago, IL, 2006.
[24] Ericsson, K.A., and H.A. Simon, Protocol Analysis: Verbal Reports as Data, Cambridge, MA: MIT Press, 1984.
[25] Pressley, M., and P. Afflerbach, Verbal Protocols of Reading: The Nature of Constructively Responsive Reading, Hillsdale, NJ: Erlbaum Associates, 1995.
[26] Atman, C.J., and K.M. Bursic, Verbal Protocol Analysis as a Method to Document Engineering Student Design Processes, Journal of Engineering Education, Vol. 87, No. 2, 1998, pp. 121–132.
[27] Cengel, Y.A., and M.A. Boles, Thermodynamics: An Engineering Approach, 4th Edition, Boston, MA: McGraw-Hill, 2001.
[28] Anderson, E.E., R. Taraban, and M.P. Sharma, Implementing and Assessing Computer-Based Active Learning Materials in Introductory Thermodynamics, International Journal of Engineering Education, Vol. 21, No. 6, 2006, pp. 1168–1176.
[29] Taraban, R., E.E. Anderson, M.P. Sharma, and A. Weigold, Developing a Model of Students' Navigations in Computer Modules for Introductory Thermodynamics, Proceedings, American Society for Engineering Education Annual Conference and Exposition, Nashville, TN, 2003.
[30] Strauss, A.L., and J.M. Corbin, Basics of Qualitative Research, Newbury Park, CA: Sage Publications, 1990.
[31] Taraban, R., K. Rynearson, and M. Kerr, College Students' Academic Performance and Self-Reports of Comprehension Strategy Use, Journal of Reading Psychology, Vol. 21, 2000, pp. 283–308.
[32] Saumell, L., M. Hughes, and K. Lopate, Underprepared College Students' Perceptions of Reading: Are Their Perceptions Different Than Other Students'? Journal of College Reading and Learning, Vol. 29, 1999, pp. 123–135.
[33] Nist, S.L., and J.L. Holschuh, Comprehension Strategies at the College Level, in R. Flippo and D. Caverly (Eds.), Handbook of College Reading and Study Strategy Research (pp. 75–104), Mahwah, NJ: Erlbaum Associates, 2000.

[34] Landis, J.R., and G.G. Koch, The Measurement of Observer Agreement for Categorical Data, Biometrics, Vol. 33, 1977, pp. 159–174.
[35] Taraban, R., A. Weigold, E.E. Anderson, and M.P. Sharma, Students' Cognitions When Using an Instructional CD for Introductory Thermodynamics, Proceedings, American Society for Engineering Education Annual Conference and Exposition, Portland, OR, 2005.
[36] Mayer, R.E., Multimedia Learning, Cambridge, UK: Cambridge University Press, 2001.
[37] SPSS Inc., SPSS Advanced Statistics User's Guide, Chicago, IL: Author, 1990.
[38] Box, G., H. Hunter, and J. Hunter, Statistics for Experimenters, New York, NY: John Wiley & Sons, 1978.
[39] Chi, M.T.H., M. Bassok, M. Lewis, P. Reimann, and R. Glaser, Self-Explanations: How Students Study and Use Examples in Learning to Solve Problems, Cognitive Science, Vol. 13, 1989, pp. 145–182.
[40] Kintsch, W., Comprehension, New York, NY: Cambridge University Press, 1998.
[41] Hmelo-Silver, C.E., and M.G. Pfeffer, Comparing Expert and Novice Understanding of a Complex System from the Perspective of Structures, Behaviors, and Functions, Cognitive Science, Vol. 28, 2004, pp. 127–138.
[42] Taraban, R., The Growth of Text Literacy in Engineering Undergraduates, Proceedings, American Society for Engineering Education Annual Conference and Exposition, Chicago, IL, 2006.
[43] Taraban, R., A. Pietan, and R. Myers, Discourse Functions in Student Research Reports: What Can We Say About What Students Know and Learn Through Research Experiences, paper presented at To Think and Act Like a Scientist, a conference at Texas Tech University, Lubbock, TX, 2006.
[44] National Academy of Engineering, The Engineer of 2020, Washington, D.C.: National Academies Press, 2004.
[45] Otero, J., J. León, and A. Graesser (Eds.), The Psychology of Science Text Comprehension, Mahwah, NJ: Erlbaum Associates, 2002.

AUTHORS' BIOGRAPHIES
Roman Taraban is associate professor in the Department of Psychology at Texas Tech University. He received his Ph.D. in cognitive psychology from Carnegie Mellon University. His interests are in how undergraduate students learn, and especially, how they draw meaningful connections in traditional college content materials.
Address: Department of Psychology, Mail Stop 2051, Texas Tech University, Lubbock, TX 79409; telephone: (+1) 806.742.3711 ext. 247; fax: (+1) 806.742.0818; e-mail: roman.taraban@ttu.edu.

Edward E. Anderson is professor of Mechanical Engineering at Texas Tech University. He received his B.S. and M.S. degrees in Mechanical Engineering from Iowa State University and his Ph.D. degree from Purdue University. His research interests are in applying technology to teaching.
Address: Department of Mechanical Engineering, Mail Stop 1021, Texas Tech University, Lubbock, TX 79409; telephone: (+1) 806.742.0133; fax: (+1) 806.742.0134; e-mail: ed.anderson@ttu.edu.

Alli DeFinis is a graduate student in the Psychology program at Texas Tech University.

January 2007

Address: Department of Psychology, Mail Stop 2051, Texas Tech University, Lubbock, TX 79409; telephone: (+1) 806.742.3711; fax: (+1) 806.742.0818; e-mail: alli.definis@ttu.edu.

Ashlee G. Brown is a graduate student in the Psychology program at Texas Tech University.
Address: Department of Psychology, Mail Stop 2051, Texas Tech University, Lubbock, TX 79409; telephone: (+1) 806.742.3711; fax: (+1) 806.742.0818; e-mail: ashlee.g.brown@ttu.edu.

Arne Weigold is a graduate student in the Psychology program at Texas Tech University.

Address: Department of Psychology, Mail Stop 2051, Texas Tech University, Lubbock, TX 79409; telephone: (+1) 806.742.3711; fax: (+1) 806.742.0818; e-mail: arne.weigold@ttu.edu.

M.P. Sharma is professor of Chemical and Petroleum Engineering at the University of Wyoming. He received his Ph.D. degree in Mechanical Engineering from Washington State University. A current area of interest is conducting research on teaching and learning methods, especially on the use of synchronous and asynchronous tools using Web technology.
Address: 1000 E. University Avenue, Department of Chemical and Petroleum Engineering, University of Wyoming, Laramie, WY 82071; telephone: (+1) 307.766.6317; fax: (+1) 307.766.6777; e-mail: sharma@uwyo.edu.


APPENDIX
Examples of Participant Statements Associated with Codes

1b Navigation: Simply Describes Action (neither the statement nor context suggests a specific purpose). "started it over."
3a Navigation: Signals Comprehension. "Ok, back to Chapter 2."
4a Navigation: Makes Inference About Content. "So, it says TOC. I'm assuming that it's Table of Contents."
4g Navigation: Expresses Comprehension Strategy. "I think I may go back because I don't remember the equation right now."
7c Navigation: Signals Difficulty Using. "Ok, now I don't even think I'm in the right place anymore."
3a Narration: Signals Comprehension. "Well, now that makes a little more sense."
4g Narration: Expresses Comprehension Strategy. "so now I'm going to play page three over to hear the description of the graph again"
7b Narration: Signals Difficulty Comprehending. "Ok, I don't get that one."
1b Text Only: Simply Describes Action (neither the statement nor context suggests a specific purpose). "I am reading the paragraph."
2a Text Only: Reads/Repeats Verbatim. "When a system consisting of a chemically homogenous substance is divided into"
2b Text Only: Shows Early Comprehension. "So there are solid, liquid, and gaseous phases."
2c Text Only: Paraphrases. "Ok, pure substance is when it's the same substance all the way through."
3a Text Only: Signals Comprehension. "It looks like I am going to learn about chemicals called pure substances."
4a Text Only: Makes Inference About Content. "and the 1 minus x is to compensate for the m sub f"
4b Text Only: Connects Information in Current Screen. "The mass of everything is m sub f plus m sub g."
4c Text Only: Connects To Previous Screens. "x is the other equation that they used."

8a Text Only: Vague. "Let's see, hmm"
1a Text plus Table, Figure, or Graphic: Orients. "Ok, and there's a little graph"
2c Text plus Table, Figure, or Graphic: Paraphrases. "The graph is explaining that water will flat line at different levels of pressure"
4a Text plus Table, Figure, or Graphic: Makes Inference About Content. "but it looks like we combined a few of those equations."
4c Text plus Table, Figure, or Graphic: Connects To Previous Screens. "And now they're doing the same thing with the log specific volume graph."
4e Text plus Table, Figure, or Graphic: Explains Content. "by looking at the diagram, by comparing the pressure and specific volume you can tell states at which the substance is both liquid and vapor form or when it's in neither"
4f Text plus Table, Figure, or Graphic: Draws Conclusion. "they all rise about to, uh, at the same slope."
7b Text plus Table, Figure, or Graphic: Signals Difficulty Comprehending. "I'm not really sure what z sub f and z sub fg stand for."
4a Interactive Screen: Makes Inference About Content. "but since there's not as much pressure I guess it allows the temperature to increase more than it did with the 50 kPa."
4b Interactive Screen: Connects Information in Current Screen. "and the log volume behaves similarly to before but it's a lot smaller now."
4d Interactive Screen: Connects To Outside Knowledge. "I think I remember the triple point from high school."
4e Interactive Screen: Explains Content. "the temperature is staying the same because the pressure of the gas coming up from the water kept on pushing the pressure"
4f Interactive Screen: Draws Conclusion. "so basically, the lower the pressure, the least likely it's going to be any kind of liquid"
5a Interactive Screen: Anticipates/Predicts. "I guess it's going to the left of the critical point."
5b Interactive Screen: Checks/Confirms Prediction. "Let's see if I was right... alright, got it!"


6a Interactive Screen: Makes Metacognitive Comment. "That's kind of weird."
4c Quiz Screen: Connects To Previous Screens. "If I remember right, the s's are enthalpy."
5a Quiz Screen: Anticipates/Predicts. "I'm going to say 1.312 because that's what it's lined up with, so I'm going to try it."

5c Quiz Screen: Applies Mental Math. "so I got 600, um, 2133 times .22 times .1 would be, uh, 213; 213 times 2 is 426, plus 604 is about 1009."
6a Quiz Screen: Makes Metacognitive Comment. "Ok, well I guess that shouldn't be too hard if that's how the questions are going to be asked."
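Several of the examples above refer to the quality relations for a saturated liquid-vapor mixture (e.g., "the 1 minus x is to compensate for the m sub f"). For reference, the standard textbook forms of these relations, reconstructed here rather than quoted from the software, are:

$$x = \frac{m_g}{m_f + m_g}, \qquad y = (1 - x)\,y_f + x\,y_g = y_f + x\,y_{fg},$$

where $x$ is the quality, $m_f$ and $m_g$ are the masses of saturated liquid and saturated vapor, $y$ stands for any specific property (e.g., specific volume $v$), and $y_{fg} = y_g - y_f$.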

