Running Head: EVALUATION AND ASSESSMENT OF eLEARNING MODULE

Evaluation and Assessment of the Evaluating Information Sources Using Authority eLearning Module
Michele Alaniz & Cynthia Sargent
California State University Monterey Bay

IST622 Assessment and Evaluation


Professor Su
July 28, 2015

TABLE OF CONTENTS

INTRODUCTION
METHOD
    Learners
    Participants
    Prototype
    Process
    Observation
RESULTS
    Entry Conditions
    Instruction
    Outcomes
    Usability
    Evaluation
    Recommendations
REFERENCES
APPENDICES
    Appendix A
    Appendix B
    Appendix C

FIGURES AND TABLES

Table 1. Measures of entry conditions
Table 2. Individual results and summary of scores before and after the lesson
Table 3. Statistical analysis of difference in means
Figure 1. Individual results and summary of scores before and after the lesson
Figure 2. Participants' recommendation of eLearning module use in a classroom

Introduction
Though college students are encouraged by their instructors to use credible
sources in their college-level research, they do not always know how to
evaluate information sources (Taylor & Dalal, 2014, p. 2). One goal of college libraries
and library faculty is to ensure that students receive instruction in information literacy
and learn to critically evaluate the sources they find in their research. According to the
Association of College and Research Libraries (ACRL) standards, information literacy
entails knowing how to 1) define an information need, and 2) access, evaluate, and use
information ethically. Currently, in the library of the City College of San Francisco
(CCSF), instruction occurs on a one-on-one basis, via library workshops, or through
online tutorials. The Capstone project of one of the authors of this paper involves
redesigning and adding to the online tutorials to provide a more effective learning
experience for the CCSF students. The product featured in the assessment process
described here is a prototype of one portion of this Capstone. The module is intended to
instruct learners on how to evaluate the authority of information sources that they use in
their college research, and thus ultimately improve the quality of sources students choose
to use. The purpose of this evaluation and assessment is to test the effectiveness of the
instruction in this lesson and to assess the product's usability.
Method
Learners
The target audience of the lesson is the CCSF student body. CCSF is a large,
urban community college with 9 campuses dispersed throughout the city of San
Francisco. The student body is very diverse ethnically, culturally, linguistically and

socioeconomically. The age range varies greatly from 18 to 65 and older. Students also
vary in terms of academic levels, backgrounds and readiness. A typical student can be
working towards a certificate, a GED, or an associate degree, but it is not uncommon
for students with bachelor's degrees and advanced degrees to be enrolled at CCSF. The
college also has a large population of students learning English as a second language.
Participants
Ideally the prototype would be tested with the target audience, CCSF students.
Summer session provided the opportunity to invite such students to participate, and a
small incentive was offered; however, at the time the assessment data were aggregated,
only one such student had taken part in the assessment process. Data included in this
assessment come from a variety of other participants: students in the MIST program at
CSU Monterey Bay, college graduates who are now working professionals, and some
college or high school students who attend schools other than CCSF. The age range
and varied academic levels of the participants did mirror what could be expected of a
random sampling of CCSF students. Some of these participants were also offered a small
incentive, but their participation was voluntary. Overall, ten volunteers participated in
this evaluation.
Prototype
Evaluating Information Sources Using Authority is the prototype lesson used in
this evaluation. It is designed to help community college students at CCSF find and use
more authoritative sources in their research. This is part of a larger module that will teach
users how to evaluate sources using the following criteria: authority, currency,
objectivity, and relevance. The lesson is web-based and will eventually be delivered via

Moodle to the target audience. The module was created in Adobe Captivate and utilizes
multimedia-rich elements, including video, graphics, animation, and audio. Learners have
the opportunity to practice through activities during the lesson. It also includes
assessment and feedback throughout the lesson to reinforce the learning. There are plans
for improvement or modifications based on the usability testing and evaluation.
Process
A Google Document was created as the means to provide instructions to the
participants. They were provided information about the purpose of the assessment
process, an approximation of the time requirement for their participation, and directions
and web links for completing the three parts of the process: pre-test survey, online lesson,
and post-test survey. The pre-test survey (Appendix A) gathered basic demographic
information from participants and also tested their current knowledge of evaluating the
authority of information sources. We asked participants to provide a name to ensure the
ability to perform a paired t-test as part of the data analysis. However, participants were
assured that their information and results would remain anonymous. The lesson contained
practice questions embedded in the instruction but the scores of these questions were not
reported for the purpose of this assessment. Following their completion of the lesson,
participants completed a post-test survey (Appendix B) that consisted of questions
identical to those of the pre-test. The post-test survey also included questions that served
as a Level 1 (Reaction) evaluation of the lesson (Kirkpatrick & Kirkpatrick, 2006, pp. 21-22), as well as questions to evaluate usability.
The pre-test and post-test were administered through Google Forms, so participant
responses were automatically recorded in a Google Sheet when they submitted each form.
However, scores reported for participants' responses were determined by the authors and
not computer-generated. The data used in the paired t-test
came from 12 questions that comprised a total of 15 points. Multiple-choice, true/false,
and some constructed response questions were each assigned 1 point, while other
constructed response questions and questions in which there was more than one correct
answer to select, were assigned 2 points. When scoring the responses, partial credit was
possible for all of the 2-point questions, as well as a few of the 1-point questions. Scores
were initially assigned by one of the authors and then independently reviewed by the other
author to check for reliability. There was strong interrater reliability, as almost all scores
assigned by each author matched those of the other. The scores that did differ only
differed by a marginal amount, such as one person assigning a 0.75 and the other a 0.5.
Scoring differences were resolved, and the data were compiled into Excel for statistical
analysis.
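For illustration only, the brief sketch below shows one way such a two-rater comparison and reconciliation step could be scripted; the question labels and point values are hypothetical and do not come from the actual survey data.

```python
# Hypothetical sketch: comparing two raters' item scores for one participant
# and flagging disagreements to reconcile before totals are computed.
rater_a = {"Q1": 1.0, "Q5": 1.5, "Q7": 2.0, "Q11": 1.5}   # author 1's scores
rater_b = {"Q1": 1.0, "Q5": 1.75, "Q7": 2.0, "Q11": 1.5}  # author 2's scores

disagreements = {q: (rater_a[q], rater_b[q])
                 for q in rater_a if rater_a[q] != rater_b[q]}
print("Items to reconcile:", disagreements)  # e.g. {'Q5': (1.5, 1.75)}

# Once differences are resolved, a participant's total is the sum of item scores.
reconciled = dict(rater_a, Q5=1.75)
print("Total:", sum(reconciled.values()))
```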
A total score was calculated for each participant on the pre-test and post-test and a
group average was calculated for each testing instance as well. A paired t-test compared
the group mean score of the pre-test to the group mean score of the post-test. The null
hypothesis was that post-test scores would be equal to or less than pre-test scores. The
alternate (experimental) hypothesis was that post-test scores would be greater than pre-test scores.

null hypothesis: μ1 ≥ μ2

alternate hypothesis: μ1 < μ2

where μ1 is the pre-test group mean and μ2 is the post-test group mean.

Since the alternate hypothesis is directional, the paired t-test results were analyzed by
looking at the p-level and by comparing the t-stat result and t critical value for the
one-tailed distribution reported in Excel by StatPlus.
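The same directional test can be reproduced outside of Excel. The sketch below is a minimal illustration using SciPy (assumed to be version 1.6 or later, which supports the alternative argument of ttest_rel) and the raw scores reported in Table 2; StatPlus in Excel remains the tool actually used for the analysis.

```python
# Minimal sketch reproducing the one-tailed paired t-test with SciPy
# (assumes SciPy >= 1.6 for the `alternative` argument).
from scipy import stats

# Raw scores out of 15 for participants 1-10 (see Table 2).
pre = [9.25, 13, 12.5, 11.5, 12.75, 11.75, 9.5, 6, 9, 10.75]
post = [13, 14, 14, 14, 14.25, 13.75, 11.75, 10.5, 13.25, 11.75]

# Directional alternative: the post-test mean is greater than the pre-test mean.
t_stat, p_value = stats.ttest_rel(post, pre, alternative="greater")
print(f"t = {t_stat:.5f}, one-tailed p = {p_value:.5f}")
# Expected output is approximately t = 5.86593, p = 0.00012 (compare Table 3).
```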
Observation
The Level 1 evaluation of the lesson and the usability testing were conducted mainly
through Google Forms; however, one of the 10 participants was observed to provide
additional insight. The observer (one of the authors) took on the role of a nonparticipant
observer, also known as an onlooker or outsider (Russ-Eft & Preskill, 2009, pp. 246-247). The
observer encouraged the participant to talk aloud during his participation, and she made
detailed notes throughout the process. She also recorded the participant's testing session
using Camtasia screen-recording software. The observer asked permission to record the
session, and the participant agreed. Only audio and screen movement were captured.
The video function of the software was not utilized. While video-taping participants can
be useful, it can possibly make the tester nervous, thus possibly invalidating the collected
data (Russ-Eft & Preskill, 2009, p. 251).

Results
Entry conditions
The authors intended to recruit participants who were not experts at evaluating the
authority of information sources, since a goal of this evaluation was to determine the
lesson's effectiveness in teaching the topic. Two measures were used to determine the
entry-level knowledge of the participants: a self-reported prior knowledge score and a
pre-test score. The results of both measures (see Table 1) provide evidence that our
intended entry conditions were similar to the observed entry conditions. While one
person reported herself as an expert, her pre-test score indicates that she had room
for growth in the subject matter.
Instruction
The intended and observed instructions were similar. The intent was for
participants to complete all aspects of the online lesson module. The intentions of the
authors (evaluators) were met by at least 70% of the participants. This conclusion is
based on the number of participants indicating they spent 10 to 15 minutes on the lesson.
Those that took only 5 to 10 minutes likely did not watch all of the videos or they may
not have answered all of the embedded practice questions. Responses on the usability
questions on the post-test survey indicate that the lesson was functional; no participants
reported difficulties with functionality such as navigating the lesson or watching the
videos. The main issue encountered was that users could click "Next" instead of
"Submit" and move on without receiving feedback on whether their answer to the practice
question was right or wrong; only "Submit" provides that feedback.


Outcomes
The raw scores of the pre-test and post-test and a summary of the results are
provided in Table 2 and Figure 1. The standard deviation was greater in the pre-test than
the post-test, so scores were less variable in the assessment following the lesson. The
group mean score was 2.425 points greater for the post-test compared to the mean of the
pre-test, indicating that the lesson did have an effect on participants' knowledge. To
determine whether the difference in means was statistically significant, a paired t-test
was performed. Table 3 shows the results of this statistical analysis. The t-stat result is
greater than the t critical value (5.86593 > 1.83311), and the probability level result was
much less than the probability level used to determine statistical significance (0.00012 <
0.05). The t-stat result and p-level both lead to the conclusion that the null hypothesis
should be rejected.
The very low p-level (see Table 3), much lower than the 0.05 needed to reject the
null hypothesis, indicates that the lesson had a significant effect on the participants'
knowledge of evaluating authority. Additionally, the decrease in the standard deviation of
scores indicates that the lesson helped bring a more uniform understanding of authority to
the group. While those who had higher pre-test scores still showed growth, those with lower
prior knowledge had more room for growth and made larger gains, and the scores were
less variable as a result.
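As a cross-check, the summary statistics in Table 2 can be recomputed from the raw scores with Python's standard library; the short sketch below is illustrative and was not part of the original analysis.

```python
# Sketch recomputing the descriptive statistics summarized in Table 2
# (statistics.stdev uses the n - 1 sample standard deviation).
from statistics import mean, median, stdev

pre = [9.25, 13, 12.5, 11.5, 12.75, 11.75, 9.5, 6, 9, 10.75]
post = [13, 14, 14, 14, 14.25, 13.75, 11.75, 10.5, 13.25, 11.75]

for label, scores in (("Pre-test", pre), ("Post-test", post)):
    print(label, round(mean(scores), 3), round(median(scores), 3),
          round(stdev(scores), 5))
# Means are 10.6 and 13.025 (a 2.425-point gain); standard deviations fall
# from 2.18327 to 1.27176, matching Table 2.
```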
Usability
One of the ten participants was observed in a controlled setting on a laptop using
the recommended web browser. The participant was able to complete the pre-test survey,
lesson, and post-test survey in 32 minutes, and was able to complete the lesson, including

the embedded activity, in twelve minutes. Based on the observation, the author plans to make
modifications to the lesson. These include potentially removing the "Next" button from
the quiz slides and adding instructions to the web evaluation page to avoid confusion for
the user. The tester selected the "Next" button instead of the "Submit" button after
answering quiz questions, so he did not receive credit for his answers. As a result, the
tester believed that all of his answers were incorrect. Two other students reported
confusion with the "Next" button, or a lack of feedback from the system when selecting
the incorrect button during the quiz. In such instances, it would be advantageous for the
system to prompt the user or send them back to complete the question correctly.
The author also intends to add a video-loading prompt and provide instructions
for learners to watch the video, in order to avoid confusion if the video is slow to load.
The author would also consider adding more direction to the book comparison activity, but
would like to do more in-person observations in the future to see how other learners react.
In addition to having users inspect linked pages for background information on sources
during the activity, it would be ideal for users to also conduct their own search; this
might need to be explicitly stated.
There were a few positive observations from the usability test. As the learner was
very vocal during the trial, it was evident that he was engaged during the short activity
and quiz that took place during the lesson. Additionally, he was motivated by the
positive feedback. The author will explore providing more opportunities for feedback in
the revised version of the lesson.


Evaluation
During the evaluation phase in the post-test survey, a majority of the learners
indicated that they would somewhat recommend this module for use in a classroom by
assigning it a four on a scale of one to five (see Figure 2). While not a majority, 30% of
the learners indicated that they would strongly recommend this lesson.
One useful piece of feedback came from the sole CCSF student who participated
in this evaluation. In response to what she liked best about the lesson, she indicated
(1) its clarity, (2) the helpfulness of the questions in the lesson, and (3) that it was
more succinct than the current CCSF library research workshops. This feedback is particularly interesting
because this lesson is being developed to replace the outdated and rather lengthy
workshops currently in use at the library. This line of questioning will be developed and
utilized in future evaluations.
Recommendations
While the authors are pleased with the learning gains that were achieved by the
participants, it is advisable to make some modifications based on results and user
feedback before testing again with CCSF students. Testing would hopefully occur at the
beginning or middle of the fall 2015 semester. Ideally, a larger sample of students would
be identified for usability testing and evaluation of the lesson, and the testing would take place in a
controlled setting to better gauge how learners are interacting with the eLearning module.
Another goal, based on feedback, is to make this lesson more visually appealing in terms
of graphics and color design. Additionally, the "Next" button in the quiz slides will be
removed or altered so that it prompts learners to return to the question and use the
"Submit" button. Unless the user clicks the "Submit" button, their answer will not be submitted

and their results will reflect an incorrect answer. The ideal solution would be to remove
the "Next" button altogether, so that the learner may focus on lesson content rather than
navigation.


References
Alaniz, M. (2015, July). Evaluating information sources using authority. Retrieved from
http://itcdland.csumb.edu/~malaniz/evaluatingSources/evaluatingSources.htm
Association of College and Research Libraries. (n.d.). Information literacy competency
standards for higher education. Retrieved from http://www.ala.org/
acrl/standards/informationliteracycompetency
Kirkpatrick, D. L., & Kirkpatrick, J. D. (2006). Evaluating training programs: The four
levels. San Francisco, CA: Berrett-Koehler.
Russ-Eft, D. F., & Preskill, H. S. (2009). Evaluation in organizations: A systematic
approach to enhancing learning, performance, and change. New York: Basic
Books.
Taylor, A., & Dalal, H. A. (2014). Information literacy standards and the world wide web:
Results from a student survey on evaluation of internet information sources.
Information Research: An International Electronic Journal, 19(4), 1-33. Retrieved
from http://files.eric.ed.gov/fulltext/EJ1050475.pdf


Table 1
Measures of entry conditions
Participant   Education Status¹                        Self-reported          Pre-test score
                                                       prior knowledge²       (out of 15)
1             MIST student                             3                      9.25
2             MIST student                             2                      13
3             MIST student                             4                      12.5
4             Working professional (B.A.)              3                      11.5
5             High school student*                     4                      12.75
6             Working professional* (B.A.)             3                      11.75
7             Undergraduate student*                   3                      9.5
8             Working professional* (some college)                            6
9             Working professional (some college)                             9
10            CCSF student*                                                   10.75

¹ MIST students are students in the Master's in Instructional Science and Technology
program at CSU Monterey Bay. For working professionals, the highest degree earned is
indicated. CCSF students attend City College of San Francisco. An asterisk (*) indicates
participants who were close to the age and/or status of the target audience.
² Participants selected a number in the range of 1-5, with 1 being beginner and 5 being
expert.


Table 2
Individual results and summary of scores before and after the lesson
Participant          Pre-test Score¹   Post-test Score¹
1                    9.25              13
2                    13                14
3                    12.5              14
4                    11.5              14
5                    12.75             14.25
6                    11.75             13.75
7                    9.5               11.75
8                    6                 10.5
9                    9                 13.25
10                   10.75             11.75

Descriptive Statistics: Group Results
Count                10                10
Mean                 10.6              13.025
Median               11.125            13.5
Minimum              6                 10.5
Maximum              13                14.25
Range                7                 3.75
Standard Deviation   2.18327           1.27176

¹ The pre-test and post-test assessments consisted of identical questions and were both
scored out of 15 points.
Figure 1
Individual results and summary of scores before and after the lesson




Table 3
Statistical analysis of difference in means
Paired t-test (one-tailed distribution)
Degrees of freedom 9
t-stat value 5.86593
t critical value (5%) 1.83311
p-level 0.00012
Conclusion Reject the null hypothesis



Figure 2
Participants' recommendation of eLearning module use in a classroom

Note: The bar graph shows, on a scale of one to five, whether learners would recommend
the use of this eLearning module for a classroom.


Appendix A
Evaluating Sources Using Authority: Pre-Test
This pre-test is one component of the usability testing and learning evaluation process
you volunteered to participate in. We appreciate your time and willingness to evaluate the
product and help improve it for future learners. This pre-test helps us gauge your prior
knowledge of the topic and, eventually, how the module improves your knowledge, if at
all. Do not worry if you do not know the answers to these preliminary questions about
evaluating sources for authority. Just give it your best effort and take your best guess if
necessary. [* Required]
Introductory Questions:
Please provide a little information about yourself.
If you are a college student, which college do you attend? *
First and Last Name *
You will remain anonymous. Your name will not be included in the report. We will only
use your name to match pre- and post-survey data.
What is your age? *
What is your gender? *
What is your education level? *
How will you complete this e-learning activity? *
Note: This lesson will not work on devices, such as iPhone and iPad, that do not support
Flash.



How do you prefer to learn? *
o Traditional face-to-face classroom
o Online classroom
o Hybrid classroom (mixture of online and face-to-face)
o No preference

Pre-Test Questions:
Answer the following questions as best you can based on your current knowledge.
1. What does authority refer to when evaluating information sources for a research
project? *
2. It is necessary to examine the following sources for authority. *
Please check all that apply.
o Books
o Websites
o Magazine articles
o Online periodicals

3. Which of the following choices is the best match for the URL berkeley.edu? *
o URL for an education website
o URL for a non-profit website
o URL for a government website
4. Which of the following choices is the best match for the URL sierraclub.org? *
o URL for an education website
o URL for a non-profit website
o URL for a government website
5. Are magazine articles or academic journals more authoritative? Why? *


6. When examining the authority of a source you are determining *
o the expertise of the author
o when the source was published
o whether you agree with the author's point of view
o the relevancy of the source

7. You need to write a research paper and your professor says that you need an
authoritative source. You've searched the library catalog and found a book that might be
useful. How do you determine the book's authority? *
Please check all that apply.
o Find bibliographic information about the author(s)
o Look inside the book to find the identity of the book publisher
o Look online for reviews of the book
o Find the publication date to determine if the source is current
o Look inside the book to see if there is a bibliography that cites the author's sources
o Go to the college bookstore and see if they sell the book

8. Magazines like Time and National Geographic are considered scholarly periodicals. *
o True
o False

9. Trade publications, such as ComputerWorld, are considered scholarly periodicals. *
o True
o False

10. Newspapers are considered popular periodicals, not scholarly sources. *
o True
o False

11. Which of the following are characteristics of articles published in scholarly
periodicals? *
Please check all that apply.
o Authors of the articles are experts in their field
o Articles are reviewed by experts
o The writing contains specialized vocabulary
o The article contains an abstract
o Tend to be published by a university press

12. If you find information on a website and the author's name is not given, what can you
do to help you determine if the source is authoritative? *
o Nothing; if there is no author name then it's a bad idea to use the website as a source.
o Explore the website's "About" or "Contact" sections to examine the authority of the
organization publishing the information.
o Find a contact email on the website and email them to ask whether they are an
authoritative source.
o Ask your friends if they think the website is credible.


Appendix B
Evaluating Sources Using Authority: Survey and Post-Test
Thank you for volunteering to participate in usability testing and a learning evaluation
process for the library research module. We appreciate your time and willingness to
evaluate the product and help improve it for future learners. This survey and post-test
helps us gauge whether the module improved your knowledge. Please answer the
questions as best you can based on what you learned in the module. Then, provide your
honest feedback in the survey that follows the post-test.
* Required
Introductory Questions
Please provide a little information about yourself.
If you are a college student, which college do you attend? *
First and Last Name: *
You will remain anonymous. Your name will not be included in the report. We will only
use your name to match pre- and post-survey data.
Post-Test Questions:
Answer the following questions as best you can based on the information presented in the
e-learning module you tested.
1. What does authority refer to when evaluating information sources for a research
project? *
2. It is necessary to examine the following sources for authority. *
Please check all that apply.
o Books
o Websites
o Magazine articles
o Online periodicals

3. Which of the following choices is the best match for the URL berkeley.edu? *
o URL for an education website
o URL for a non-profit website
o URL for a government website


4. Which of the following choices is the best match for the URL sierraclub.org? *
o URL for an education website
o URL for a non-profit website
o URL for a government website
5. Are magazine articles or academic journals more authoritative? Why? *
6. When examining the authority of a source you are determining *
o the expertise of the author
o the relevancy of the source
o whether you agree with the author's point of view
o when the source was published

7. You need to write a research paper and your professor says that you need an
authoritative source. You've searched the library catalog and found a book that might be
useful. How do you determine the book's authority? *
Please check all that apply.
o Find bibliographic information about the author(s)
o Look inside the book to find the identity of the book publisher
o Look online for reviews of the book
o Find the publication date to determine if the source is current
o Look inside the book to see if there is a bibliography that cites the author's sources
o Go to the college bookstore and see if they sell the book

8. Magazines like Time and National Geographic are considered scholarly periodicals. *
o True
o False

9. Trade publications, such as ComputerWorld, are considered scholarly periodicals. *
o True
o False

10. Newspapers are considered popular periodicals, not scholarly sources. *
o True
o False


11. Which of the following are characteristics of articles published in scholarly
periodicals? *
Please check all that apply.
o Authors of the articles are experts in their field
o Articles are reviewed by experts
o The writing contains specialized vocabulary
o The article contains an abstract
o Tend to be published by a university press

12. If you find information on a website and the author's name is not given, what can you
do to help you determine if the source is authoritative? *
o Nothing; if there is no author name then it's a bad idea to use the website as a source.
o Explore the website's "About" or "Contact" sections to examine the authority of the
organization publishing the information.
o Find a contact email on the website and email them to ask whether they are an
authoritative source.
o Ask your friends if they think the website is credible.

Lesson Evaluation:
Please provide feedback on the content and functionality of the lesson module.
How much time did it take you to complete the Evaluating Authority lesson? *
At any point did you get stuck in the navigation and have to use the playbar at the bottom
of the lesson to move past the point where you were stuck? If so, describe these
instances. *
Did you find any problems with the functionality of click boxes, videos, assessment
answers, etc.? If so, describe these instances. *
Did you answer any of the practice questions in the module incorrectly? If so, how would
you rate the feedback you were provided? *
1 = Not helpful at all (I still don't know why my answers were wrong)
5 = Very helpful! (I learned more about the topic through the feedback)


From the learner's perspective, would you recommend this eLearning for a
classroom? *
1 = Do not recommend at all
5 = Strongly recommend

If you chose "do not recommend," what are your main concerns with the
lesson? *
o Lack of functionality
o Difficulty in navigating the lesson
o Lack of clarity in directions or questions
o Spelling or grammar mistakes
o Lesson content
If you chose "recommend," do any of the following need to be addressed to finalize the
lesson for classroom use? *
o Lack of functionality
o Difficulty in navigating the lesson
o Lack of clarity in directions or questions
o Spelling or grammar mistakes
o Lesson content
For either of the questions above, please provide any additional information that you have
not yet provided in other questions.

If you are a CCSF student, how would you compare this to the current CCSF library
research workshops (e.g., Workshops A, B, and W)? *
o Strongly prefer this module over current workshops
o Somewhat prefer this module over current workshops
o Neutral
o Somewhat prefer current library research workshops over this module
o Strongly prefer current library research workshops over this module
o Other:

Usability Survey:
Please provide your honest feedback and be as specific as you can for any questions in
which you type a response. Rank the following statements from "Strongly Disagree" to
"Strongly Agree."


I think that I would like to use more eLearning lessons like this one. *
1 = Strongly disagree
5 = Strongly agree

I found the lesson unnecessarily complex. *
1 = Strongly disagree
5 = Strongly agree

I thought the lesson was easy to use. *
1 = Strongly disagree
5 = Strongly agree

I thought there was too much inconsistency in this lesson. *
1 = Strongly disagree
5 = Strongly agree

I found the lesson very cumbersome to complete. *
1 = Strongly disagree
5 = Strongly agree

I felt very confident working through the lesson. *
1 = Strongly disagree
5 = Strongly agree

I think students would need a lot of support or the help of a technical person to be able to
complete this lesson. *
1 = Strongly disagree
5 = Strongly agree



Final Thoughts?
What did you like best about the lesson? *
What did you like least about the lesson? *
Would you like to make any additional comments about your experience with the
Evaluating Authority lesson?



Appendix C
Answer Key
1. Evaluating level of author or organization expertise; author is a subject expert and
has published extensively in the field; peer reviewed
2. Choose all 4 answers: Books, websites, magazine articles, online periodicals
3. URL for education website
4. URL for non-profit website
5. Academic articles are more authoritative because they are written by subject
experts, and are usually peer reviewed. (2 points)
6. Expertise of author
7. Find bibliographic information, Identify book publisher, Look for
bibliography (2 points)
8. FALSE
9. FALSE
10. TRUE
11. Authors are experts, reviewed by experts, writing uses specialized vocabulary, has
an abstract (2 points)
12. Explore the website's "About" or "Contact" sections to evaluate the organization's
authority
