I, Michael Halloran, declare that the attached essay/project is entirely my own work, in my own words, and that all sources used in researching it are fully acknowledged and all quotations properly identified.
Course Name: Graduate Certificate in Technical Communication
Module Code: TW5221
Supervisor: Dr. Yvonne Cleary, University of Limerick
Date of Submission: 29/11/2013
Abstract
This research report details a usability study of an online grammar course for teachers of English as a foreign language. The purpose of the study was to investigate whether participants' understanding of course concepts improved after completing self-reflection exercises. Ten teachers, divided into two teams, acted as test participants. All participants did the first unit of the course. However, the test intervention required one team to complete all embedded self-reflection exercises while the other team ignored them. Afterwards, the researcher tested each participant on grammar concepts in a post-course exam of ten questions. The researcher used these exam results to carry out a t-test of two independent means in order to evaluate the test hypothesis. This report includes an extensive literature review, a methodology section, a discussion section, and three appendices.

KEY WORDS: Constructivism, e-learning, hypothesis test, online course, self-reflection, usability test.
Acknowledgements
I want to thank Prof. Philip Rubens for helping me to refine this research project, and Dr. Yvonne Cleary for her advice, encouragement, and mentorship over the course of TW5221. I also want to thank the participants for setting aside time to take part in the test. Finally, I want to thank Liam Halloran and Bryna Greenlaw for proofreading the final report.
Contents
Abstract ....................................................... i
Acknowledgements ............................................... i
Section 1: Introduction ........................................ 1
Section 2: Literature review ................................... 2
  Section 2.1: Overview ........................................ 2
  Section 2.2: Theories of education and e-learning: behaviourism, cognitivism, constructivism and connectivism ... 2
  Section 2.3: Usability, user experience and e-learning ...... 5
  Section 2.4: Data analysis of usability tests (hypothesis testing) ... 8
  Section 2.5: Conclusion ...................................... 8
Section 3: Methodology ......................................... 9
  Section 3.1: Hypothesis ...................................... 9
  Section 3.2: Quantitative usability testing: an empirical research method ... 9
    Section 3.2.1: Sample selection: choosing participants ..... 9
    Section 3.2.2: Control conditions .......................... 10
    Section 3.2.3: Ethical considerations ...................... 10
    Section 3.2.4: Procedure ................................... 10
    Section 3.2.5: Statistical data analysis ................... 11
Section 4: Results and discussion .............................. 12
  Section 4.1: Testing the difference between two means ........ 12
  Section 4.2: Discussion of research questions ................ 13
  Section 4.3: Possible reasons for test results ............... 13
  Section 4.4: Conclusion and recommendations .................. 14
Section 5: References .......................................... 15
Appendix 1: Exam sheet ......................................... 18
Appendix 2: Instruction sheet .................................. 19
List of Figures
Figure 1. The User Experience Honeycomb (Morville 2004) ........ 5
Figure 2. Post-course exam results ............................. 12
Section 1: Introduction
This is a report on a usability study of e-learning interfaces, specifically whether self-reflection exercises should be included in online courses. The usability study was quantitative in that it gathered numeric data from user testing. Its purpose was to test the impact of embedded self-reflection exercises on users of online courses and to establish whether self-reflection helps users understand the concepts they encounter.

The course that I tested was Grammar for Teachers: Language Awareness. Teachers at the university where I work did the course as part of their professional development. However, I tended to ignore the self-reflection exercises when I did the course myself, and I wanted to know why. Through this study, I wanted to find out why the designer included the self-reflection exercises in the course, whether the exercises were really necessary, and whether they raised or lowered student attrition rates. These questions proved to be subjective and not easily tested, so Professor Philip Rubens helped me refine my ideas into a quantitative study based on a post-course exam of ten questions (see Appendix 1).

This report details the background, the methodology of data collection, the control conditions, the ethical considerations, the results of the study, and a discussion of the data. Ten participants volunteered to do the first unit of the course as well as the exam. The number of participants, the scope, and the length of the study were limited by time, location, and resources.

The layout of this report is as follows: Section 2 is a review of literature dealing with learning theory, usability testing and hypothesis testing. Section 3 deals with the test hypothesis and the methodology used to carry out the usability study. Finally, Section 4 presents the results of the study and offers some recommendations.
Section 2.2: Theories of education and e-learning: behaviourism, cognitivism, constructivism and connectivism
Early online courses were designed based on behaviourism (Ally 2011, p.19). This school of thought focuses on learners' observable and measurable behaviours (Ally 2011, p.19). A behaviourist online course would:

- Inform students of course outcomes so they can set expectations.
- Have regular tests as an embedded part of course design.
- Introduce learning material in a sequenced way.
- Encourage students to provide feedback.
Cognitivism looks at learning from an information-processing point of view (Ally 2011, p.20). A cognitivist online course would:

- Place important information in the centre of the screen for reading.
- Highlight critical information with headings and clear formatting.
- Tell learners why they should take the course.
- Match the difficulty level to the learners' cognitive level.
Jakob Nielsen, in an interview with elearningpost (Nichani 2001), touches on an aspect of cognitivism:
You need to keep all the content fresh in learners mind [sic]... For example, response time. Even after a few seconds you always forget what was the track or sequence you were following... It is important that your brain keeps the context. (Nichani 2001, para. 4)
Ally also says a cognitivist course would encourage students to use their existing knowledge to help them make sense of the new information (Ally 2011, p.24). Some other aspects of this approach include:

- Chunking information to make it more memorable (Miller 1956).
- Varied learning strategies to accommodate different kinds of learners.
- Varied modes of information delivery: textual, visual and verbal.
- Learner motivation strategies: intrinsic motivation and extrinsic motivation.
- Metacognition: making students aware of their learning capabilities.
- Assignments that have real-life application and information.
As mentioned above, constructivism sees the learner as active. Stimuli are received from the outside, but it is the learner who actively creates the knowledge (Ally 2011, p.30). A constructivist online course would:

- Give learners meaningful activities in practical situations.
- Provide first-hand information, without the contextual influence of an instructor, so that students can personalise the information themselves.
- Encourage cooperative learning.
- Provide guided discovery activities.
- Use embedded questions (or a learning journal) to encourage learner reflection.
- Provide a high level of interactivity.
Finally, connectivism is a theory for the digital age, where individuals learn and work in a networked environment (Ally 2011, p.34). Ally sketches some general guidelines based on this theory. Learners need to:

- Be autonomous and independent; appropriate use of the internet is encouraged.
- Unlearn old information and models in favour of the most up-to-date information and models; learners need to identify the most important information.
- Be active in a network of learning and acquire knowledge on an ongoing basis.
- Be allowed to connect with others around the world in order to share knowledge and opinions.
- Gather information from many resources to reflect the networked world and the diversity of thinking within it.
Ally states that further work needs to be done on how this theory can be used by educators to design learning materials (Ally 2011, p.38). Finally, Ally suggests these different theories can be used to deal with different aspects of a course:

- Behaviourism to teach facts.
- Cognitivism to teach principles and processes.
- Constructivism to teach real-life applications of learning.
Anderson (2011) provides a framework of how people learn:

- Knowledge-centred learning gives access to a vast selection of content and activities, but quality information is highlighted and filtered by the community of users.
- Assessment-centred learning is based on formative and summative assessment by self, peer and teachers (Anderson 2011, p.66).
- Learner-centred learning changes in response to group and learner models, and content is changed based on student and teacher use.
- Community-centred learning uses many formats for collaborative and individual interaction.
In relation to knowledge-centred and community-centred learning, Nielsen discusses the usability, design and aesthetics of good discussion forums for learners:
I actually believe much more in discussion groups than I believe in chat rooms as ways of allowing students to interact... real-time chat effectively becomes very thin and not nearly as valuable as discussion groups where people can think a little bit before they post and the instructor can moderate it, which is also good. (Nichani 2001, para. 10)
User experience (UX) is related to usability in that it focuses on having a deep understanding of users, what they need, what they value, their abilities, and also their limitations (U.S. Department of Health & Human Services 2013a, para. 1). Morville (2004) uses a honeycomb to illustrate the facets of user experience (Figure 1).
The most basic way to improve usability is user-testing (Nielsen 2012, How to Improve Usability, para. 1) and this process is three-fold:

- Get representative users to test the interface.
- Ask users to do representative tasks.
- Observe what the users do, where they succeed, and where they have difficulties with the user interface.
The U.S. Department of Health & Human Services (2013b) outlines some other evaluation methods:

- Focus groups: moderated discussion involving five to ten participants.
- Card sort testing: participants organise topics into categories that make sense to them.
- Wireframing: creating a two-dimensional illustration of a page's interface.
- First click testing: examines what a test participant would click on first on the interface in order to complete their intended task.
- Satisfaction surveys.
The U.S. Department of Health & Human Services goes on to discuss what the researcher should do after gathering data from one of the above methods:

- Evaluate the usability of the website.
- Recommend improvements.
- Implement recommendations.
- Re-test the site to measure the effectiveness of your changes.
While these methods can help researchers test usability, there are recognised usability principles for interaction design, often referred to as heuristics (Nielsen 1995a). Heuristic evaluation is part of an iterative design process and usually involves a team of evaluators. Nielsen recommends the use of five evaluators, but three at the least (Nielsen 1995b, para. 2). Jeffries and Desurvire (1992, p.39) found that just one evaluator was the least powerful evaluating technique when they experimented with different usability tests. Heuristic evaluation does not provide a systematic way to generate fixes, but rather aims to solve design issues by reference to established usability principles (Nielsen 1995b, para. 12). Nielsen (1995a) provides a list of 10 Usability Heuristics for User Interface Design:

- Visibility of system status.
- Match between system and the real world.
- User control and freedom.
- Consistency and standards.
- Error prevention.
- Recognition rather than recall.
- Flexibility and efficiency of use.
- Aesthetic and minimalist design.
- Help users recognize, diagnose, and recover from errors.
- Help and documentation.
Do these principles apply to e-learning? A team at The University of Georgia found that Nielsen's list needed to be augmented (Benson et al. 2002). They evaluated an e-learning program designed for the American Red Cross, and created a protocol for e-learning heuristic evaluation along with fifteen usability and instructional design heuristics for the evaluation of e-learning programs. Their augmented evaluation heuristics included Nielsen's original ten and five new principles:

- Learning Design.
- Media Integration.
- Instructional Assessment.
- Resources.
- Feedback.
Zaharias and Poylymenakou (2009) suggest that a usability evaluation method for e-learning needs to place motivation above functionality. They split e-learning usability attributes in two: usability and instructional design. Under usability they include navigation, learnability, accessibility, consistency and visual design. Under instructional design they include interactivity/engagement, content and resources, media use, learning strategies design, feedback, instructional assessment, and learner guidance and support (Zaharias and Poylymenakou 2009, p.80). This ultimately feeds into the most important part of e-learning, motivation by students: attention, relevance, confidence and satisfaction. Intrinsic motivation can be characterised as the drive arising within the self to carry out an activity whose reward is derived from enjoyment of the activity itself (Zaharias and Poylymenakou 2009, p.80). The learning interface needs to encourage intrinsic learning motivation. Ally says that extrinsic motivation should also be used, citing Keller's ARCS model (Attention, Relevance, Confidence and Satisfaction) (Keller 1987; Ally 2011, p.28).
Section 3: Methodology
Grammar for Teachers: Language Awareness is a course designed for future, inexperienced and experienced teachers of English as a foreign language. To successfully complete the course, users must reflect upon [their] own knowledge of the English language... participate in discussion forums with other teachers... keep a learner journal (Cambridge University Press and UCLES 2013). All of these components suggest a strong constructivist basis for the course.
I observed all participants to ensure they followed this intervention protocol. I prompted them when they strayed from the instructions or provided them with assistance when they asked.
Registration and unit completion took forty to eighty minutes depending on the participant. After they completed the unit, I removed the laptop and task sheet and gave the participants a pencil, an eraser and the exam sheet. The exam consisted of ten questions which tested participants' comprehension of the course concepts. Participants had to attempt all questions. The exam took less than five minutes per participant.
Post-Course Exam Results

        Team A    Team B
        8         18
        17        12
        15        14
        16        14
        11        15
n       5         5
mean    13.4      14.6
SD      3.78153   2.19089
p            0.28018
By reviewing the means of both groups, we can see that the average score for Team A was 13.4 out of 20, while for Team B it was 14.6 out of 20. Already we see that the hypothesis is not supported by the raw averages. If the probability is less than 0.1, then the null hypothesis can be rejected. However, the p value is 0.28018, meaning the null hypothesis cannot be rejected. Therefore, based on the available data, the test hypothesis is not accepted.
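The reported statistics can be checked directly from the raw scores. The following is a minimal sketch in Python (not part of the original study's analysis) of the t-test of two independent means with pooled variance; the assignment of individual scores to each team is inferred from the reported means and standard deviations, using only the standard library.

```python
import math
from statistics import mean, stdev

# Post-course exam scores out of 20, taken from the results table.
# The split between teams is inferred from the reported means/SDs.
team_a = [8, 17, 15, 16, 11]
team_b = [18, 12, 14, 14, 15]

def pooled_t(a, b):
    """t statistic for a t-test of two independent means (pooled variance)."""
    na, nb = len(a), len(b)
    # Pool the two sample variances, weighted by their degrees of freedom.
    sp2 = ((na - 1) * stdev(a) ** 2 + (nb - 1) * stdev(b) ** 2) / (na + nb - 2)
    se = math.sqrt(sp2 * (1 / na + 1 / nb))
    return (mean(a) - mean(b)) / se

print(mean(team_a), mean(team_b))                        # 13.4 14.6
print(round(stdev(team_a), 5), round(stdev(team_b), 5))  # 3.78153 2.19089
print(round(pooled_t(team_a, team_b), 3))                # -0.614
```

With n_A + n_B - 2 = 8 degrees of freedom, a t statistic of about -0.61 corresponds to a one-tailed p of roughly 0.28, consistent with the reported value of 0.28018.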
In hindsight, the hypothesis is flawed. A better hypothesis might be worded to include the dependent variable, in this case the results of the post-course exam: Users will have stronger recall of online course concepts in a post-course exam if they complete embedded self-reflection exercises.
Allowing participants to access the online journal of their self-reflections may have altered their test results. Allowing participants to discuss the reflections in the course forum could also have altered the results. The self-reflection exercises may themselves have been poorly designed; they may have discouraged participants from really considering the course concepts.
Section 5: References
Ally, M. (2011) 'Foundation of Educational Theory for Online Learning', in Anderson, T., ed., The Theory and Practice of Online Learning, 2nd ed., Edmonton: AU Press, 15-44.

Anderson, T. (2011) 'Towards a Theory of Online Learning', in Anderson, T., ed., The Theory and Practice of Online Learning, 2nd ed., Edmonton: AU Press, 45-74.

Benson, L., Elliott, D., Grant, M., Holschuh, D., Kim, B., Kim, H., Lauber, E., Loh, S. and Reeves, T.C. (2002) 'Usability and Instructional Design Heuristics for E-Learning Evaluation', in Barker, P. and Rebelsky, S., eds., Proceedings of World Conference on Educational Multimedia, Hypermedia and Telecommunications 2002, Chesapeake, VA: AACE, 1615-1621.

Cambridge University Press and UCLES (2013) Grammar for Teachers: Language Awareness, Cambridge English Teacher [online], available: http://www.cambridgeenglishteacher.org/courses/details/18606 [accessed 2 Nov 2013].

David, A. and Glore, P. (2010) 'The Impact of Design and Aesthetics on Usability, Credibility, and Learning in an Online Environment', Online Journal of Distance Learning Administration [online], 13(4), available: http://www.westga.edu/~distance/ojdla/winter134/david_glore134.html [accessed 13 Oct 2013].

Hughes, M. and Hayhoe, G. (2007) A Research Primer for Technical Communication: Methods, Exemplars, and Analyses, New York: Lawrence Erlbaum Associates.

Jeffries, R. and Desurvire, H. (1992) 'Usability testing vs. heuristic evaluation: was there a contest?', SIGCHI Bulletin, 24(4), 39-41.

Keller, J. (1987) 'Development and use of the ARCS model of instructional design', Journal of Instructional Development, 10(3), 2-10.

Koohang, A., Riley, L. and Smith, T. (2009) 'E-learning and Constructivism: From Theory to Application', Interdisciplinary Journal of E-Learning and Learning Objects [online], 5, 91-109, available: http://ijklo.org/Volume5/IJELLOv5p091-109Koohang655.pdf [accessed 2 Nov 2013].

Miller, G.A. (1956) 'The magical number seven, plus or minus two: Some limitations on our capacity for processing information', Psychological Review, 63, 81-97.

Morville, P. (2004) User Experience Design, Semantic Studios [online], 21 June, available: http://semanticstudios.com/publications/semantics/000029.php [accessed 24 Nov 2013].

Nichani, M., ed. (2001) Jakob Nielsen on e-learning, elearningpost [online], 16 January, available: http://www.elearningpost.com/articles/archives/jakob_nielsen_on_e_learning/ [accessed 13 Oct 2013].

Nielsen, J. (1995a) 10 Usability Heuristics for User Interface Design, Nielsen Norman Group [online], 1 January, available: http://www.nngroup.com/articles/ten-usability-heuristics/ [accessed 13 Oct 2013].

Nielsen, J. (1995b) How to Conduct a Heuristic Evaluation, Nielsen Norman Group [online], 1 January, available: http://www.nngroup.com/articles/how-to-conduct-a-heuristic-evaluation/ [accessed 13 Oct 2013].

Nielsen, J. (2001) First Rule of Usability? Don't Listen to Users, Nielsen Norman Group [online], 5 August, available: http://www.nngroup.com/articles/first-rule-of-usability-dont-listen-to-users/ [accessed 13 Oct 2013].

Nielsen, J. (2012) Usability 101: Introduction to Usability, Nielsen Norman Group [online], 4 January, available: http://www.nngroup.com/articles/usability-101-introduction-to-usability/ [accessed 13 Oct 2013].

U.S. Department of Health & Human Services (2013a) User Experience Basics, Usability.gov [online], available: http://www.usability.gov/what-and-why/user-experience.html [accessed 23 Nov 2013].

U.S. Department of Health & Human Services (2013b) Usability Evaluation Basics, Usability.gov [online], available: http://www.usability.gov/what-and-why/usability-evaluation.html [accessed 24 Nov 2013].

Zaharias, P. and Poylymenakou, A. (2009) 'Developing a Usability Evaluation Method for e-Learning Applications: Beyond Functional Usability', International Journal of Human-Computer Interaction, 25(1), 75-98.
Appendix 1: Exam sheet

Total Score:        /20

Appendix 2: Instruction sheet
FACULTY OF ARTS, HUMANITIES AND SOCIAL SCIENCES
RESEARCH ETHICS COMMITTEE
CONSENT FORM

Consent Section:

I, the undersigned, declare that I am willing to take part in research for the project entitled TW5221 Usability Study.

- I declare that I have been fully briefed on the nature of this study and my role in it, and have been given the opportunity to ask questions before agreeing to participate.
- The nature of my participation has been explained to me and I have full knowledge of how the information collected will be used.
- I am also aware that my participation in this study may be audio recorded and I agree to this. However, should I feel uncomfortable at any time I can request that the recording equipment be switched off.
- I am entitled to copies of all recordings made and am fully informed as to what will happen to these recordings once the study is completed.
- I fully understand that there is no obligation on me to participate in this study.
- I fully understand that I am free to withdraw my participation at any time without having to explain or give a reason.
- I am also entitled to full confidentiality in terms of my participation and personal details.
__________________________ Date