Teaching of Psychology, 38(4), 278-281
© The Author(s) 2011
Reprints and permission: sagepub.com/journalsPermissions.nav
DOI: 10.1177/0098628311421330
http://top.sagepub.com
Abstract
Although digital texts are growing in popularity, few studies have systematically examined whether students learn as well from a digital text as from a paperbound textbook. The present study examined several variables related
to comprehension of digital versus paper textbooks, including text complexity, engagement with the text, and long-term retention. Seventy-four students, randomly assigned to read an entire chapter from either a paperbound text or its digital equivalent,
completed a 20-item multiple-choice quiz immediately after reading and 1 week later. No differences in comprehension
emerged across any variables except for a main effect of test time. More important, there were no interactions with text
medium. Therefore, the key to student comprehension appears not to be the method of text delivery but, rather, getting students to read in the first place.
Keywords
digital text, reading comprehension
Corresponding Author:
Annette Taylor, Department of Psychology, University of San Diego, 5998
Alcala Park, San Diego, CA 92110
Email: taylor@sandiego.edu
Method
Participants
Seventy-four introductory psychology students, primarily
Caucasian, female, and traditional college age, participated for
subject pool credit. When asked directly, all claimed never to have taken a course in economics. Results for two students
were later eliminated when a review of transcripts showed that
they had, in fact, completed a prior course in microeconomics.
Materials
From the publisher's representative, we obtained multiple copies of two microeconomics texts marketed by the publisher as introductory college-level texts: one a basic (core) text (Stone, 2008) and the other a comprehensive text (Krugman & Wells, 2005). Both texts have a paperbound version and an equivalent e-text, and the publisher's representative provided access to the digital counterparts for both texts. The target reading was the chapter on supply and demand from each text: 25 pages in the Stone text and 24 pages in the Krugman and Wells text.
Procedure
Following random assignment and informed consent, each participant received either the paperbound text or instruction on
how to navigate the digital text. Participants read the chapter
under similar environmental conditions in individual testing
rooms. We instructed participants in the clean engagement condition not to make any marks or take any notes. We encouraged those in the annotated condition to underline, highlight, and make margin notes (paper text condition) or to take notes and use digital highlighting (digital text condition). We provided pens, pencils, highlighters, and paper. Participants alerted us when they had completed the reading.
Each participant then received the 20-item pencil-and-paper
quiz. One week later, we sent an e-mail to each participant with
a link to the same quiz items posted at an online survey site.
Results
A 2 × 2 × 2 × 2 analysis of variance (ANOVA) with one within-subjects variable (test time: immediate vs. 1-week delay) and three between-subjects variables (medium: digital vs. paper; complexity: basic vs. comprehensive; engagement: clean vs. annotated) showed a single main effect for test time. Participants scored lower on the quiz after 1 week (mean accuracy = 58.3%) than immediately after reading (63.5%), F(1, 64) = 9.49, p = .003. In particular, the critical comparison between digital
and paper texts was nonsignificant with mean quiz scores for the
paper text at 61.5% and for the digital text at 60.4%. The difference between the basic and comprehensive texts was nonsignificant, with 59.8% quiz accuracy for the basic text and 62.2% for
the comprehensive text. For the engagement variable, the mean
quiz scores were 59.1% for clean reading and 62.8% for annotated reading. This difference was also nonsignificant.
Furthermore, there were no significant interactions, at any level, among any of the variables. Thus, neither the immediate nor the delayed testing interval produced any difference in
comprehension between the digital and the paperbound texts.
Finally, the complexity of text did not produce any difference
in comprehension between the digital and the paperbound text.
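For readers who wish to see the analytic design concretely, the mixed design above can be sketched with simulated data (all values hypothetical, generated only from the summary statistics reported; the original raw scores are not published). Because the within-subjects time factor has only two levels, its main effect reduces to a paired t test, and the three between-subjects factors can be illustrated as a factorial ANOVA on per-subject mean scores:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm
from scipy import stats

# Hypothetical stand-in data for 74 participants.
rng = np.random.default_rng(42)
n = 74
subjects = pd.DataFrame({
    "medium": rng.choice(["digital", "paper"], n),
    "complexity": rng.choice(["basic", "comprehensive"], n),
    "engagement": rng.choice(["clean", "annotated"], n),
    "immediate": rng.normal(63.5, 12, n),  # % correct right after reading
})
# Scores decline by about 5 points at the 1-week retest.
subjects["delayed"] = subjects["immediate"] - rng.normal(5, 6, n)

# Within-subjects effect of test time: paired t test (equivalent to the
# time main effect here, since time has only two levels).
t_stat, p_time = stats.ttest_rel(subjects["immediate"], subjects["delayed"])

# Between-subjects factors: 2 x 2 x 2 factorial ANOVA on each
# subject's mean quiz score across the two test times.
subjects["mean_score"] = subjects[["immediate", "delayed"]].mean(axis=1)
model = smf.ols("mean_score ~ medium * complexity * engagement",
                data=subjects).fit()
table = anova_lm(model, typ=2)
print(table)
```

This decomposition is for illustration only; the article's single F(1, 64) for test time comes from the full mixed-model ANOVA, so its degrees of freedom differ from the paired t test shown here.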
Discussion
The purpose of this study was to determine whether students can
learn new information equally well from a digital text as from a
paperbound text. The first research question we asked was, After
Funding
The author disclosed receipt of the following financial support for the
research, authorship, and/or publication of this article: This research
was supported in part by a Faculty Research Grant from the University
of San Diego.
References
Beach, K. L. (2009). The effect of media, text length, and reading rates on college student reading comprehension levels (Doctoral dissertation, University of Phoenix). Available from ProQuest Digital Dissertations. (AAT 3322886)
Belmore, S. M. (1985). Reading computer-presented text. Bulletin of
the Psychonomic Society, 23(1), 12-14.
Brown, G. J. (2001). Beyond print: Reading digitally. Library Hi Tech,
19(4), 390-399.
Clyde, L. A. (2005). Electronic books. Teacher Librarian, 32(5), 45.
Retrieved July 28, 2009, from ERIC database. (EJ728078).
Coiro, J. (2003). Exploring literacy on the Internet. The Reading
Teacher, 56(5), 458-464.
Coyle, K. (2008). Managing technology. The Journal of Academic
Librarianship, 34(2), 160-162.
Dillon, A. (1992). Reading from paper versus screens: A critical
review of the empirical literature. Ergonomics, 35(10), 1297-1326.
Joly, M. C. R. A., Capovilla, A. S. G., Bighetti, C., Neri, M. L., & Nicolau, A. F. (2009). Reading comprehension of freshmen students: Comparing printed and digital texts. Paper presented at the International Conference on Multimedia and ICT in Education, April 20-24, 2009, Lisbon, Portugal.
Kang, Y., Wang, M. J., & Lin, R. (2009). Usability evaluation of e-books. Displays, 30, 49-52. doi:10.1016/j.displa.2008.12.002
Kingsbury, A., & Galloway, L. (2006, October 16). Textbooks enter
the digital era. U.S. News & World Report, 141(14), 63-65.
Krugman, P., & Wells, R. (2005). Microeconomics. New York: Worth
Publishers.
Liu, Z. (2005). Reading behavior in the digital environment: Changes
in reading behavior over the past ten years. Journal of Documentation, 61(6), 700-712.
Longhurst, J. (2003). World history on the World Wide Web: A student satisfaction survey and a blinding flash of the obvious. The
History Teacher, 36(3), 343-356.