IC3 Core Certification
The Internet and Computing Core Certification (IC3) program is the world's
first digital literacy training and certification program, and the only globally
validated computer literacy credential. IC3 certification consists of three
individual exams:
Computing Fundamentals
Key Applications
Living Online
Each exam covers a different and significant area of digital literacy as developed
by the Global Digital Literacy Council (www.gdlcouncil.org). Together they form
a composite of the knowledge and skills every computer user should possess.
IC3 exam questions and skill tasks are psychometrically validated and consistent
across languages and geographies.
Your customers have the freedom and convenience to deliver certification
exams at their own on-site lab in a proctored environment. Ease and accessibility
are a tremendous advantage in motivating candidates to pursue IC3 certification
by more effectively addressing schedule and budget issues.
Your customers' labs can provide candidates with a printed report of their test
results immediately after they complete an exam. Test results are also uploaded to a global
candidate database from which candidates can view their Online Digital
Transcripts. Candidates can also share this data by authorizing others, such as
prospective employers, to view the transcripts.
Why IC3?
The program will generate additional revenue streams for your company.
Certification meets the market demand for a globally recognized credential.
Exams are created by Certiport, which is also the developer and
administrator of the Microsoft Certified Application Specialist program and
the Adobe Certified Associate program.
Certification may be verified through a global candidate database.
The program is supported by a 24/5 call center which is accessible
worldwide.
Certiport maintains partnerships with leading courseware (training material)
vendors.
Features of IC3 Certification:
READY-TO-USE EXAM FORMAT
IC3 exams are consistent and psychometrically valid, and are easy to implement
and administer at a local testing lab.
PERFORMANCE-BASED EVALUATIONS
Objective, quantifiable evaluation of candidates' performance capabilities
provides instructors with an accurate assessment of candidates' skills and abilities.
VALIDATION CODES
IC3 certificates include a validation code printed on the certificate. By going to
http://verify.certiport.com and entering this code, anyone can view the associated
digital certificate, along with the IC3 objectives and what the candidate learned.
Candidates can include validation codes on their resumes and academic
applications to provide a powerful and professional way to verify IC3 certification.
A UNIVERSAL STANDARD
IC3 provides instructors with a reliable and validated measurement tool to
understand the digital literacy skills of their candidates. It is also a standard that
generates consistent results from school to school, geography to geography.
Learn about the standard at www.gdlcouncil.org.
About Certiport
Certiport prepares individuals with current and relevant digital skills and
credentials for the competitive global workforce. These solutions are delivered
by more than 12,000 Certiport Centers worldwide and include Certiport
Internet and Computing Core Certification (IC3), the official Microsoft Office
certification programs and the Adobe Certified Associate certification program.
Certification Pathway
Step 1: BENCHMARK
Step 2: LEARN
Internet & Computing Mentor
IC3 Approved Courseware
IC3 Practice Test
Step 3: VALIDATE
Step 4: ADVANCE
Adobe Certified Associate
Microsoft Certified Application Specialist
Microsoft Office Specialist
Sample Questions
Key Applications
Living Online
IC3 Success Stories
RESULT
"There are many IT assessments in the industry, but as far as I can see, IC3 is the
only certification that evaluates skills and knowledge to meet what today's
businesses demand," Watanabe said. "Computing Fundamentals teaches very
fundamental IT-related troubleshooting, exactly what the real world requires
most. Living Online covers the essential security and morals for an information
society, and Key Applications allows students to learn basic application software
operation. IC3 allows us to understand and learn the various aspects of IT
specifically and practically."
Nakatsu Commercial High School has already seen positive results since
implementing the IC3 program. "One of the benefits we see is that students now
require less support from instructors during the application operation classes,"
Watanabe reported. "They are now able to solve by themselves or among
themselves some of the problems they have during operation. I feel students
have become more confident about themselves by solving the small problems
on their own, and this experience has further motivated them to learn."
In fact, more than the usual number of students passed the First Certificate in
Zensho Information Processing Exam in September this year. Watanabe
attributes this success to students anticipating their IC3 study. "They are now
aware that IC3 is coming after the Zensho exam, and therefore strongly
recognize the importance of passing and finishing the Zensho exam in their
academic calendar," he said. "IC3 motivates students to rise to the challenge of
new certifications and qualifications. Because of the rather big gap between the
difficulty level of the Zensho exam and that of the System Administrator exam,
quite a few students used to perceive the Zensho exam as their last and as the
prime target to attain. Now, they have another goal, IC3, and the Zensho
exam has become just a milestone."
One third-year student who has acquired IC3 said the certification helped
prepare him for the System Administrator exam. "I used to think the System
Administrator exam appeared too difficult for me, and I had almost given up on
taking the exam," he said. "Now, having acquired IC3, I am more confident about
myself. While studying for IC3, I gained useful study skills, such as reading
textbooks carefully and thoroughly and understanding several related issues in
connection to each other. These skills will be quite helpful when studying for
other exams in the future. I am going to try taking the System Administrator
exam someday. IC3 certification proves my fundamental IT knowledge and skills. I
look forward to building on the fundamentals that I have learned through IC3."
Watanabe said Nakatsu Commercial High School students are proud to attend
an academic institution that offers IC3 certification. "They feel proud of their learning
environment because they have been given a chance to strive for a qualification that
cannot be acquired in other schools," he said. "It builds their confidence and pride in
what they learn and gives them further motivation to study."
Certiport and IC3 are registered trademarks of Certiport, Inc. in the United States and other countries. Microsoft is either a registered trademark or trademark of Microsoft
Corp. in the United States and other countries. The names of actual companies and products mentioned herein may be the trademarks of their respective owners.
Kofu Commercial College Adds IC3 to Curriculum to Provide
Solid Foundation for Information Technology Learning
Two-year program culminates in IC3, produces confident
and employment-ready students
January 2007
Kofu Commercial College students studying information technology (IT) have an
exciting new opportunity this year: Certiport Internet and Computing Core
Certification (IC3), the world's only globally recognized standards-based
certification for basic computer skills, has been added to the college's curriculum.
To provide a foundation for continued IT instruction and to meet the high
expectations of potential employers, the three IC3 exam components,
Computing Fundamentals, Key Applications and Living Online, are now
included in a two-year program that offers future commercial business
technicians a basis on which they can build and validate their computing skills.
BACKGROUND
Established in 1991 by Kofu City of Yamanashi Prefecture, Kofu Commercial
College is one of the few public technical colleges in Japan that focuses its
academic study on IT and accounting. Using one-on-one instruction and
emphasizing the importance of qualifications, the college strives to educate its
students with practical abilities to become commercial business technicians, an
academic title given to graduates of selected technical colleges.
With only 120 students, Kofu Commercial College's small classes provide an
educational environment equipped with the latest facilities and programs.
Starting this academic year, IC3 is part of the college's curriculum, providing
students opportunities to gain familiarity with computers and to earn a
worldwide certification.
"I first learned about IC3 a year ago while reading Gokaku Joho Shori (Successes
in ITEE magazine)," said Takahiro Harada, the director of IT-related subjects at Kofu
Commercial College. IC3 is now a requirement for second-year students to earn
course credit. Students pay their own exam fees, which are collected each April
along with fees for reference books and workbooks.
Kofu Commercial College's intensive instructional style has already prepared
many students to pass IC3 exams. Forty students are expected to earn IC3 by
the end of the college's first two-year cycle of the course. "The new classes are
running successfully so far," Harada reported.
Masakazu Hatsushika, a student participating in the program, appreciates the
opportunity to earn IC3 at Kofu Commercial College. Through his study,
Hatsushika has learned computer management skills, software operation,
equipment maintenance and computer configuration. "Taking IC3 exams has
made me more familiar with computers and has given me the knowledge to
solve common personal computer problems, change system settings through the
control panel and install and remove software by myself."
RESULT
Harada hopes Kofu Commercial College students who earn IC3 are prepared to
continue their IT study and pursue employment.
"Acquisition of IC3, I think, is a milestone," he said. "Individuals have to acquire
qualifications to be the employees needed by the society in which they live. IC3
is an international certificate, and I have no doubt that acquiring this certification
will help build student confidence. More will be required of them if they intend
to be experts in their field. To this end, it is very important they have a solid
foundation. I sincerely hope our students will take full advantage of IC3 as a
step toward training themselves to be higher-value-added personnel."
Hatsushika said IC3 is great preparation for the future. "Acquiring computer skills
is essential for me to live in the future society, and I will continue to improve my
IT skills."
ABOUT CERTIPORT
Certiport prepares individuals and communities with current and relevant digital
skills and credentials for the competitive global workforce. These solutions
include Certiport Internet and Computing Core Certification (IC3) and the
Microsoft Office Specialist certification programs delivered by more than 12,000
Certiport Centers worldwide. For more information, visit www.certiport.com.
Japanese University Includes IC3 in Curriculum to Standardize
Digital Literacy and Inspire Student Development
Students prepare for future employment by certifying current
and relevant computing skills
Last year, Kyoto Koka Women's University in Kyoto, Japan, realigned its Media
Information curriculum to match the requirements of Certiport Internet and
Computing Core Certification (IC3), a globally recognized standard for digital
literacy that validates the fundamental computer and Internet skills and
knowledge required to be successful in school, work and life. With this
curriculum update, the university now offers its students a worldwide standard
that develops digital literacy, helps students set learning goals and increases
confidence in their ability to participate effectively in the digital community.
BACKGROUND
Established in 1940, Kyoto Koka Women's University offers a variety of courses
focusing on information education and vocational guidance. Its Media
Information program, which has an enrollment of approximately 70 first- and
second-year students, offers a wide-ranging curriculum to match the social
needs of students who may someday find employment in industry.
Searching for a unified standard for digital literacy, Kyoto Koka Women's
University recently updated the Media Information curriculum to match IC3.
"In today's information-oriented society, information literacy is required of all
students," explained associate professor Issei Abe. "Fundamental knowledge of
information equipment is vital, so we looked for an appropriate standard to
match our expectations."
Abe said other Japanese universities use certification programs to evaluate
information literacy, but Kyoto Koka Women's University found IC3 was the only
program that included the well-balanced skill set required in actual office
environments. In addition, Abe said IC3's three components, Computing
Fundamentals, Key Applications and Living Online, make it a valid evaluation of
computing knowledge and skills. IC3's worldwide recognition made it even more
attractive to the university.
"IC3 is recognized worldwide as an objective standard to be achieved to prove
one's information literacy," he said.
PROCESS
One year ago, Kyoto Koka Women's University aligned its Media Information
curriculum with the exam objectives and skill standards required of the
Computing Fundamentals and Living Online IC3 components. Students study
Computing Fundamentals for one semester, followed by Network
Fundamentals in a second semester. Finally, students take a one-year
Information Processing course to learn Microsoft Word and Excel and
prepare for the IC3 Key Applications exam. Upon conclusion of all three courses,
students are prepared to take and pass IC3 exams.
All IC3 exam fees are paid by students interested in pursuing the certification.
Kyoto Koka University has been a Certiport Center, an authorized testing center,
for two years.
CERTIFICATION
In the last year, seven Kyoto Koka Women's University students participating in
the Media Information program earned IC3 certification. Although the
certification is not required for course completion or graduation from the
university, 20 students passed at least one IC3 exam.
RESULT
Kyoto Koka Women's University students appreciate the opportunity they now
have to earn IC3 certification. Abe reported that many students are pleased with
the program and the opportunity it gives them to set and achieve their
computing goals.
"We have received many positive comments about how easy it is to prepare for
IC3 because its content is very standardized," Abe said. "Our students are trying
very hard to pass the exams. IC3 is important to them, as is the opportunity to
learn more about computers. I believe we have been successful in providing our
students a very useful credential and skill set they can use right after graduation."
Abe said that by offering IC3 as part of the Media Information curriculum, Kyoto
Koka Women's University has enhanced the overall digital literacy level of
students at the university and has developed the credibility of the Media
Information courses. Further, students in the program are now more interested
in computer and networking technology and have increased confidence, which
may result in pursuit of additional certifications.
To learn more about Certiport and the IC3 certification, visit www.certiport.com or
call Certiport Customer Services and Support at 1-888-999-9830.
Certiport Set to
Strengthen ICT Skills of
Bruneian Workforce
SEAMEO Voctech in Brunei upgrades to Certiport IC3 and
Microsoft Office Specialist programs from ICDL assessments to
provide workers world-class skills
BRUNEI, October 25, 2006
Certiport today announced the signing of an agreement that will pave the way
for members of the Bruneian workforce to learn and validate information and
communications technology (ICT) skills while earning globally recognized
credentials. Parties representing the South-East Asian Ministers of Education
Organization Regional Center for Vocational and Technical Education and Training
(SEAMEO Voctech) and Wordware, a Certiport Solution Provider, indicated the
scope of the agreement will include programs for both the private and public
sectors in Brunei.
"In the face of increasing regional and global competition, the stakes for
developing a digitally proficient workforce are very high," said Randy Pierson,
executive vice president of Certiport. "Teaching computer skills alone is not
enough. ICT training must be comprehensive and measured against
internationally accepted standards."
After extensive evaluation, a selection committee from SEAMEO Voctech
elected to adopt Certiport Internet and Computing Core Certification (IC3)
and Microsoft Office Specialist (MOS) certifications for their relevancy and
global recognition over International Computer Driving License (ICDL) products.
Both IC3 and MOS programs are based on validated standards updated on
regular cycles with the help of a global array of subject matter experts from
business and academia. Because of their worldwide portability, Certiport IC3 and
MOS are proof of current and relevant skills that can be verified with Digital
Transcripts from the global Web site http://verify.certiport.com.
Brunei borders Malaysia, and its economy has largely developed around the
production of crude oil and natural gas resources. New opportunities ushered in
by the Information Age have prompted leaders to seek out ICT training and
certification for the 90,000 members of the Bruneian workforce.
"Although the ability to use a computer is as important as reading and writing,
the exposure does not necessarily translate into understanding ICT concepts,"
Wilson Wong, president of Wordware, said. "Providing the opportunity to earn
Philippine Digital
Literacy Program to
Train and Certify Up
to 500,000 Public
School Teachers
Philippine government chooses Certiport IC3 as international
standard to empower Filipinos to compete in global workforce
PATHWAYS HAWAII, July 2008
As the Philippine government launches a massive digital literacy program to
develop and validate public school teachers' information and communications
technology (ICT) skills using Certiport Internet and Computing Core
Certification (IC3), the man behind the initiative understands clearly the
importance of ICT competency. Department of Education Secretary Jesli Lapus,
who is tasked with preparing the country's next generation of
globally competitive workers, said technology "is no longer a luxury;
it is now a standard." And, he continued, digital literacy is a
vital component of the Philippine Department of Education's
literacy requirements.
[Charts: GDP in US$ billions of the ASEAN-5 nations (Indonesia, Thailand, Malaysia, Singapore, Philippines) and of the major advanced economies plus ASEAN (United States, Japan, Germany, United Kingdom, France, Italy, Canada)]
Compared to the world's seven major advanced economies, the emerging and
developing ASEAN nations, which share a combined GDP of almost US$1.1
trillion, clearly have room to grow.i
To ensure his country's international competitiveness, Secretary Lapus has
mandated that Philippine public school teachers participate in ICT training and
certification programs, an integral part of the department's overall education
agenda. "We must equip both our teachers and our students with 21st century
skills that can empower all Filipinos to become competitive in this digital age,"
he said.
In 2007, 300 technical and vocational public school teachers were trained and
certified with IC3. Because of the favorable results of this pilot program, IC3
became a qualification for grant eligibility under the country's Partnership for
Technology Access program, which makes ICTs more affordable, accessible and
relevant to underserved citizens. More recently, another digital literacy pilot
program launched to provide ICT skills and IC3 credentials to public school
teachers. To date, more than one-fifth of the Philippines' more than 500,000
teachers have received digital literacy training by private-sector partners.
Secretary Lapus said teachers' responses to these ICT initiatives have been very
positive. "Many teachers have expressed their desires to participate in ICT
trainings as well as to be able to benchmark their ICT knowledge through
various kinds of globally recognized certification programs like IC3," he said.
The training and certification have also enabled the Department of Education to
classify teachers into levels, facilitating additional training programs according to
skills. In addition, digital literacy training and certification have been implemented
at all levels of the Philippine educational system:
The Bureau of Elementary Education has partnered with Intel to implement
the Classmate PC program, which provides personal computers to all
elementary school students.
The Bureau of Secondary Education is working to create computer
laboratories in 100 percent of the countrys high schools.
The Bureau of Alternative Learning System is digitizing its modules so mobile
teachers have easier access to instruction materials to better meet the
needs of learners outside the formal school system.
The technical-vocational curriculum has been strengthened with ICT to
give thousands of skilled workers the necessary skills to be considered
digitally literate.
Secretary Lapus said Philippine President Gloria Macapagal Arroyo
understands the importance of digital literacy, and it is included as a tool for the
acquisition of life skills in the country's development plans. "The Philippine
government recognizes that demand for technology-savvy workers has increased
and that this poses a challenge to educational institutions," Secretary Lapus said.
Certiport and IC3 are either registered trademarks or trademarks of Certiport, Inc. in the United States and/or other countries. Intel is a trademark of Intel Corporation in the U.S.
and other countries. The names of actual companies and products mentioned herein may be the trademarks of their respective owners.
i International Monetary Fund World Economic Outlook Database, April 2008 Edition.
College Requires Students Earn IC3 to Cope with Struggling
State Economy and Prepare for 21st Century Jobs
Southwestern Michigan College faculty and staff are also
required to earn the IC3 digital literacy credential
PATHWAYS HAWAII, July 2008
When Brian Leonard encountered Certiport Internet and Computing Core
Certification (IC3), the world's only globally recognized standards-based
certification for fundamental computing and Internet skills, he quickly realized the
digital literacy standard had the potential to meet a critical need with which he
and his colleagues at Southwestern Michigan College (SMC) were struggling:
effectively articulating what it means to the computing and business industries to
be digitally literate.
"In today's global marketplace, it's become imperative that individuals have a
foundation of computer literacy to be successful," said Leonard, dean of
academic development and assessment at SMC. "Although much of our
curriculum was already aligned with moving students toward digital literacy, we
had no anchorage with the industry's perspective related to this need. And,
although we utilized advisory boards to make sure what we were providing in
the academic setting was meeting local needs, we wanted to broaden our
perspective to a national and global level so our students would have
opportunities to be successful."
Ensuring students are certified and work-ready for 21st century jobs is good
news for a state with a struggling economy. Michigan's unemployment rate has
increased from 7.4 percent to 8.5 percent in the last six months, and layoffs
across all industries have increased 84.8 percent in the past year. Michigan jobs
in the natural resources, mining, construction and manufacturing fields are down
a total of 24.3 percent from this time last year; however, jobs available in
professional and business services, education and health services have slightly
increased.i Clearly, preparing students for employment in new fields within the
state and for entrance into a more global workplace is becoming more and more
important.
Leonard said many businesses in southwestern Michigan provide services in
global industries, including local machine shops with U.S. military contracts and
other companies that outsource services worldwide. "The idea of a global
worker is vital," he said.
So now SMC uses IC3 in myriad ways to ensure its nearly 3,000 students and
300 faculty and staff develop critical computing skills and experience success.
"Our training serves not only our students but also advances the professional
development of our faculty and staff," Leonard said.
In March 2007, Dr. Diane Chaddock, executive vice president and chief operating
officer, announced IC3 would be the standard used to define computer literacy as
a degree requirement at SMC. Later that year, Dr. David M. Mathews, college
president, gave a presentation about IC3 and recognized employees who were
certified. Mathews, who is IC3 certified, announced he expects all SMC full-time
faculty and staff to earn IC3 by 2009. So far, the college has administered more
than 1,500 IC3 exams to students, faculty and staff, and more than 200
individuals have certified.
SMC's emphasis on digital literacy skills has also motivated local high schools to
add IC3 to their curricula. Whether they earn IC3 at the college or before they
enroll, students need it to clearly demonstrate digital literacy. "As students come
into the college environment, they are required to engage in a digital world,"
Leonard said. "It's financially advantageous for individuals enrolling at
Southwestern Michigan College to have the certification, because we accept it as
proof of digital literacy."
When SMC developed its digital literacy program, the IC3 exam objectives
aligned very naturally with its introductory computer technology course. "Having
the opportunity in our introductory courses to complete an externally validated
certificate was a great advantage," Leonard said. "What IC3 provides us is
validation that our curriculum is aligned with industry standards not only on a
local level, but on a national and global level."
In addition, the college's English as a Second Language (ESL) department
implemented IC3 to benefit its 45 program participants. "They identified another
important component of ESL: exposure to as well as understanding of
technology," Leonard said. "As such, when we instigated IC3 as a general
education and graduation requirement for students, we included it as a capstone
to the ESL program. And, because many of our ESL students are international,
they understand the value of a globally recognized certificate of this nature."
Leonard said students really start to realize the value of IC3 when they seek
employment. Their experiences with the interview and hiring process have
served as great examples of the value of certification. One student was told the
biggest difference between her and the other candidates applying for the job
was that the hiring manager had no doubt she was functional in relation to the
things she would be asked to do on computers. The hiring decision was based
strictly on the certification, and she received the position.
As his work with digital literacy continues, Leonard said his greatest experiences
have been seeing students' faces when they emerge from the college's testing
center having successfully earned IC3. "They have accomplished something, and
it's non-disputable," he said.
ABOUT CERTIPORT
Certiport prepares individuals with current and relevant digital skills and
credentials for the competitive global workforce. These solutions are delivered
by more than 12,000 Certiport Centers worldwide and include Certiport
Internet and Computing Core Certification (IC3), the official Microsoft Office
certification programs and the Adobe Certified Associate certification program.
For more information, visit www.certiport.com.
i U.S. Bureau of Labor Statistics, 20 June 2008.
Certification Standard
IC3 Validation Brief
2002-2003
DOCUMENT PURPOSE
This Validation Summary document has been created to inform those reviewing or evaluating the Internet
& Computing Core Certification (IC3) program of the processes and procedures used to develop and
validate the IC3 examinations.
True certification-level exams undergo an in-depth and rigorous development process. In summary
form, this document outlines the steps taken by the exam developers to ensure the IC3 program meets the
highest industry standards of quality and validity for test and certification program development. This
document is not intended to be an exhaustive report of the research, analysis, and developmental steps
taken to create the IC3 certification program. The full validation report, prepared by The Donath Group,
is available to qualified parties under a non-disclosure agreement.
INTRODUCTION
Internet & Computing Core Certification Overview
The Internet & Computing Core Certification (IC3) is a standards-based certification program for basic
computing and Internet literacy. IC3 provides specific guidelines for the knowledge and skills required to
be a functional user of computer hardware, software, networks and the Internet. By establishing this
vendor-independent standard, IC3 provides a reliable, universal measure of basic computing and Internet
skills.
IC3 consists of three different competency exams. Passing all three IC3 exams qualifies an individual to
receive IC3 certification.
Computing Fundamentals: this exam covers foundational knowledge of computer hardware, software
and operating systems.
Key Applications: this exam evaluates examinee proficiency in two computer applications (a word
processor and spreadsheet) and the common features of different applications.
Living Online: this exam measures basic skills in using networks, electronic mail, the Internet, and
Web browsing software, as well as an understanding of how computers and the Internet affect
society.
Each exam uses various test-question methods. Whenever possible, testing the ability to use specific
product functions (such as file and system management functions of Windows) is done with
performance-based test items, where candidates are asked to perform specific software tasks in a realistic
simulation of the software environment. Performance-based testing has proven to have a high degree of
statistical reliability and user satisfaction. Testing of other knowledge types (such as knowledge of
hardware and software) is done with traditional linear-type questions, such as multiple choice, multiple
response and matching test items.
The appropriate mix of linear and performance-based testing questions to measure the knowledge, skills
and abilities of candidates for IC3 ensures a high degree of validity, reliability and impartiality for all
participants in the program.
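As a loose illustration of how a mix of item formats can feed one score, the sketch below grades multiple-choice, multiple-response and performance-based items together. The item formats, answer key, partial-credit rule and function names are illustrative assumptions only; the brief does not specify Certiport's actual scoring algorithm.

```python
# Hypothetical mixed-format scorer (not the actual IC3 scoring rules).
def score_item(answer, expected):
    if isinstance(expected, frozenset):          # multiple response item
        if not answer <= expected:               # any wrong selection scores 0
            return 0.0
        return len(answer) / len(expected)       # partial credit for a subset
    return 1.0 if answer == expected else 0.0    # multiple choice / task outcome

def score_exam(responses, key):
    # Proportion of available credit earned across all item types.
    return sum(score_item(responses[i], exp) for i, exp in key.items()) / len(key)

key = {
    "mc1": "b",                      # multiple choice: one correct option
    "mr1": frozenset({"a", "c"}),    # multiple response: select all that apply
    "task1": "file_saved",           # performance task: observed end state
}
responses = {"mc1": "b", "mr1": {"a"}, "task1": "file_saved"}
print(score_exam(responses, key))  # (1 + 0.5 + 1) / 3
```

Keeping each format's grading rule in one place makes it easy to weight performance tasks differently from linear items if a program's blueprint calls for it.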
IC3 Program Partners
The IC3 program was developed through the partnership of Certiport, Inc., the leading provider of global,
performance-based certification programs and services, and SkillCheck, Inc., a leading provider of
assessment and testing products to the education and training, human resources, and staffing services
industries.
The exam development process was guided by The Donath Group, a leading psychometric and evaluative
research consulting organization with over fifty years of highly specialized experience in test
construction, measurement, and statistical analysis.
IC3 Validation Brief
IMPORTANCE OF VALIDATION
Certification Validation Overview
Exams developed as industry recognized certifications must meet high demands of rigor in the test
development, validation and analysis processes. By publishing certification exams that have followed the
most credible development standards and methodologies, test developers can ensure that certificate
holders possess the clearly defined knowledge and/or skill sets corresponding to that specific certification.
In short, certifications purporting to be industry standards must also be standards driven: they must
adhere to the testing industry's highest guidelines for acceptable professional test development
processes, as represented by The Standards for Educational and Psychological Testing
and the Uniform Guidelines on Employee Selection Procedures.
Exam Validity
Test validity is the most important consideration in evaluating exams for a particular purpose,
especially when exams are used for industry certifications. The concept of validity refers to the
meaningfulness, usefulness and appropriateness of inferences made from test scores. Test validation,
therefore, is the process of gathering evidence to support the inferences made by test scores.
Validity cannot be adequately summarized by a single set of evidence, such as a reliability coefficient.
This is particularly important today, as the term "certification" is usually used to make an inference about
probable job behavior performance based on the resulting test score. Because of this, it is critically
important that validity for a particular test score be supported through an accumulation of empirical,
theoretical, statistical, and conceptual evidence.
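As a purely illustrative aside (the data, and the choice of Cronbach's alpha as the example statistic, are this sketch's own assumptions and are not drawn from the IC3 program), one common reliability coefficient can be computed from dichotomously scored items as follows:

```python
# Illustrative sketch: Cronbach's alpha for 0/1-scored test items.
# The score matrix below is invented for demonstration only.

def variance(xs):
    """Population variance of a list of numbers."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def cronbach_alpha(score_matrix):
    """score_matrix: one row per candidate, one 0/1 column per item."""
    n_items = len(score_matrix[0])
    item_vars = [variance([row[i] for row in score_matrix]) for i in range(n_items)]
    total_var = variance([sum(row) for row in score_matrix])
    return (n_items / (n_items - 1)) * (1 - sum(item_vars) / total_var)

scores = [
    [1, 1, 1, 0],  # each row: one candidate's item scores
    [1, 1, 0, 0],
    [1, 0, 0, 0],
    [1, 1, 1, 1],
    [0, 0, 0, 0],
]
alpha = cronbach_alpha(scores)
print(round(alpha, 2))  # 0.8 for this invented data
```

A single number like this is exactly the kind of evidence the paragraph above says is insufficient on its own; it must be combined with content, construct, and criterion evidence.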
Types of Validity
The following discusses the main types of validation evidence for interpreting test scores.
Content-oriented validation
Content validity refers to the extent to which test scores measure the content they are intended to
measure. Content-related validity evidence can be gathered by examining the degree of
congruence between test items and the content domains purportedly measured by the test items.
This typically requires convening a panel of subject matter experts and asking them to rate the
item-objective congruence according to some established criteria.
Construct-related validation
Construct validity refers to the extent to which test scores measure the construct they are intended
to measure. It focuses on the relationship between the specific research operations used and the
abstract constructs (whether cause or effect) used to label them. Construct validity can be
investigated using factor analysis or a multitrait-multimethod matrix procedure. A construct is
usually a theoretical, unobservable dimension of a measurement procedure. Test question
responses are used to assess whether there is a statistical underlying factor represented by the
responses.
Criterion-related validation
Criterion-related validity evidence refers to how well test scores correlate or predict other
measures of importance, such as some level of job performance, experience, knowledge or skills.
Criterion-related validity can be determined by contrasting groups of known masters and
nonmasters in the content area and comparing the two groups' test score distributions and
reliabilities. This approach has the benefit of being entirely empirical once the two groups are identified.
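The known-groups contrast described above can be sketched numerically. In this toy example (all scores invented; the choice of Cohen's d as the contrast statistic is an illustrative assumption, not something stated in this brief), a large standardized mean difference separates masters from nonmasters:

```python
# Known-groups sketch: compare score distributions of masters vs. nonmasters
# using a standardized mean difference (Cohen's d with a pooled SD).
# All scores below are invented for illustration.

def mean(xs):
    return sum(xs) / len(xs)

def sample_var(xs):
    """Sample variance (n - 1 denominator)."""
    m = mean(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cohens_d(group_a, group_b):
    pooled_var = (
        (len(group_a) - 1) * sample_var(group_a)
        + (len(group_b) - 1) * sample_var(group_b)
    ) / (len(group_a) + len(group_b) - 2)
    return (mean(group_a) - mean(group_b)) / pooled_var ** 0.5

masters = [82, 88, 90, 85, 95]     # candidates known to master the content
nonmasters = [55, 60, 48, 62, 50]  # candidates known not to
d = cohens_d(masters, nonmasters)  # a large d means the exam separates the groups
```

If an exam cannot separate known masters from known nonmasters, its scores cannot support the inferences a certification claims to license.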
Standard Validation Methodology
The following are brief descriptions of well-established exam development methodologies used to fulfill
the main types of validity and publish high quality certification exams. These activities formed the basis
for the validation of the IC3 program.
Job Task Analysis - Identify the knowledge, skills and abilities required of a certified employee or
individual.
Blueprint Development - Define the scope and content of the skills to be measured by the exam.
Survey Analysis - Gather supporting evidence from a blueprint survey of subject matter experts.
Pilot Tryout & Analysis - Pilot all test items through a complete tryout with a representative sample
of certification candidates. The tryout demonstrates empirically how each item behaves under
standardized testing conditions.
Pilot Test Analysis - Evaluate key indices such as the item difficulty value, the discrimination, and
the correlation with external criteria and background groups.
Final Exam Construction - Construct the final exam using the best performing items fitting the exam
blueprint.
Standard Setting - Establish cut scores based on an analysis of candidate data and exam scores by
using a regression analysis.
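Two of the pilot-analysis indices named above, item difficulty and item discrimination, can be sketched in a few lines. This is a generic classical-test-theory illustration with an invented response matrix (difficulty as proportion correct, discrimination as the point-biserial correlation of an item with the total score), not the IC3 program's actual analysis:

```python
# Classical item analysis sketch: difficulty (proportion correct) and
# discrimination (point-biserial correlation of item with total score).
# The 0/1 response matrix is invented for illustration.

responses = [  # one row per candidate, one column per item
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [1, 0, 0, 0],
    [1, 1, 1, 1],
    [0, 0, 0, 0],
]
totals = [sum(row) for row in responses]

def difficulty(item):
    """Proportion of candidates answering the item correctly."""
    return sum(item) / len(item)

def point_biserial(item, totals):
    """Pearson correlation between 0/1 item scores and total scores."""
    n = len(item)
    mi, mt = sum(item) / n, sum(totals) / n
    cov = sum((i - mi) * (t - mt) for i, t in zip(item, totals)) / n
    sd_i = (sum((i - mi) ** 2 for i in item) / n) ** 0.5
    sd_t = (sum((t - mt) ** 2 for t in totals) / n) ** 0.5
    return cov / (sd_i * sd_t)

item0 = [row[0] for row in responses]
p = difficulty(item0)              # 0.8: a relatively easy item
r = point_biserial(item0, totals)  # positive: high scorers tend to get it right
```

In practice the item is often removed from the total before correlating (a corrected item-total correlation); the uncorrected form is used here only to keep the sketch short.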
The following section describes the many steps the IC3 program went through to ensure the highest levels of
validation.
IC3 VALIDATION
IC3 Validation Overview
From its conception, the mission of the IC3 program was to develop state-of-the-art exams that meet or
exceed industry validation standards. To this end, The Donath Group guided the IC3 program's
development and ensured the IC3 program's compliance with the highest test development methods and
procedures, including those outlined by the following standards organizations:
The Standards for Educational and Psychological Testing (American Educational Research
Association, American Psychological Association and National Council on Measurement in
Education)
The Uniform Guidelines on Employee Selection Procedures (Equal Employment Opportunity
Commission, Civil Service Commission, Department of Labor and Department of Justice)
IC3 Exam Validity
In commitment to its mission, the IC3 program took steps to accumulate ample empirical, theoretical,
statistical, and conceptual evidence to support its claims of achieving the highest levels of exam validity.
The IC3 exams were developed, created, and validated over a two-year period, utilizing the expertise of
three leading testing, validation, and evaluation corporations, drawing on the knowledge of over 270
subject matter experts in 19 countries, and pilot tested in over 40 different locations worldwide, with
over 1,500 exams delivered. The IC3 exams are completely vendor-independent, and have garnered
endorsements and recognition from recognized industry and government organizations like CompTIA
(Computing Technology Industry Association) and the NSSB (National Skill Standards Board).
The result of the IC3 program's validation efforts is a true certification program that can accurately and
reliably be used to make solid inferences about an individual's knowledge, skills, and applicable job
performance based on the resulting exam scores. The IC3 program is well suited to academic institutions,
workforce development programs, and organizations needing a reliable means of ensuring individual
computing literacy in an increasingly digital world.
Types of Validity Fulfilled by IC3
The IC3 exams fulfilled all necessary processes to ensure coverage of the main types of validation
evidence for interpreting test scores.
Content-oriented validation
Content validity refers to the extent to which test scores measure the content they are intended to
measure. The IC3 examinations were developed from research in the field of computer and
Internet literacy, which empirically established the most important areas in which to measure the
skills and knowledge of this behavioral domain.
Additionally, subject matter experts (SMEs) carefully reviewed the IC3 test objectives and test
items for item-objective congruence. The blueprint survey review of the content defined the
appropriate content of the examination and the test item reviewers verified that the test items
measure and represent the content of each of the test objectives covered in the examination.
Content-oriented validation evidence is provided in points 1, 2, 3, 4 and 5 under sub-section IC3
Validation Methodology.
Construct-related validation
Construct-related validity refers to the extent to which test scores measure the construct they are
intended to measure. The construct being measured by the IC3 exams is basic knowledge and
skills in computing as it exists today for most entry-level jobs using computers. This construct is
supported by current research literature, qualitative evaluations by SMEs, and a factor analysis
that determined there is an underlying statistical construct for the IC3 test data.
Construct-related validation evidence is provided in points 1, 2, 3, 4, 5 and 6 under sub-section
IC3 Validation Methodology.
Criterion-related validation
Criterion-related validity evidence refers to how well test scores correlate or predict other
measures of importance, such as some level of job performance, experience, knowledge or skills.
Criterion-related validity was established by comparing and analyzing survey responses from
certification candidates against their IC3 exam score distributions. IC3 exam scores were found to
correlate highly with a candidate's computing and other relevant experience levels.
Additionally, when pass and fail decisions were analyzed against candidate experience, the
decisions were very consistent with candidates' levels of experience. Each IC3 exam had strong
relationships with these predictor variables.
Criterion-related validation evidence is provided in points 6, 7 and 8 under sub-section IC3
Validation Methodology.
IC3 Validation Methodology
The sections below summarize the steps taken in the development of the IC3 exams. This process
follows, and in some cases exceeds, standards for test validation established in such documents as the
APA Standards and the Uniform Guidelines on Employee Selection Procedures.
1. Industry and Academic Research
Research was completed identifying the knowledge, skills and abilities required of IC3-certified
individuals
A thorough literature review was performed of industry training and educational
programs that relate to computer literacy and the latest training and educational
methodologies (including Digital Literacy, Information Literacy, Fluency in
Information Technology, Media Literacy and Digital Divide)
A study was completed of existing national and international programs and
curricula that clearly define needed competencies in hardware, software and
operating systems, applications, networking, electronic mail, and use of the Internet
An analysis was conducted of training programs from courseware, CBT, training
vendors and book publishers which cover material related to this subject matter
Sample Result: Based on survey results of the exam blueprint for the IC3
Computing Fundamentals exam, the following weighting of domains was
used to develop a pilot exam that would meet content validation requirements:
Table 1: Computing Fundamentals - Domain and Objective Weighting
Domain 1: Computer Hardware
1.1 Identify different types of computers, how computers work (process information) and how
individual computers fit into larger systems
1.2 Identify the function of computer hardware components and common problems associated
with individual components
1.3 Identify issues relating to computer performance and how it is affected by different
components of the computer
1.4 Identify the factors that go into a decision on how to purchase a computer or select a
computer for work or school
Domain weights: 43%, 22%, 35% (Total: 100%)
Objective weights: 12%, 12%, 10%, 10%, 10%, 12%, 10%, 13%, 12%
Sample Result: The Donath Group's psychometric and editorial analysis of
over 250 test items created to meet objectives in the IC3 blueprint determined
that 180 items (60 for each pilot exam) best met industry test-item standards.
These items (a mix of performance-based and linear test items) were further
reviewed for clarity and adherence to industry item-writing and formatting
standards. Final items were automated and used to construct the three exams
used for the IC3 pilot.
6. Pilot Tryout & Analysis
All accepted IC3 test items were pilot tested in a standardized computer format with
over 500 potential certification candidates
Pilot tests were conducted at over 40 different testing sites under the exact same
conditions in which actual certification testing would take place
After taking each pilot exam, each candidate completed a survey of their self-assessed
technical skill proficiency and demographic background information
Candidate survey results formed the basis for test and item analysis performed by The
Donath Group
All items were analyzed for item difficulty, item discrimination, and analysis of
distracters
Items demonstrating statistically aberrant behavior were flagged for possible removal in
the final exam, or for further detailed review
SMEs conducted additional reviews of questionable items, and assisted in the selection
of the final set of items
Scores from each pilot exam were reviewed for potential bias in gender, race, age, or
any other variable that defines a protected group
A mastery composite score for the pilot tryout was calculated and correlated with the
pilot test scores
A regression analysis of the predictor variables and composite score was used to assess
the relationship between the pilot exam and the survey
Sample Result: No pattern of statistical differences was determined to exist
that would indicate that the IC3 exams are functioning differently for any
protected groups.
7. Final Exam Construction
Based on the results and analysis of the IC3 exams' pilot tryout, test items were selected
for the final IC3 exams' item pools
After detailed analysis, test items demonstrating statistically deviant behavior, or
potential bias toward gender, race, age, or any other protected group, were discarded
A comparison of the remaining test items to the determined IC3 exam content (final
blueprint) was conducted to ensure percentage representation remained consistent with
content validation requirements
The remaining accepted items were included in sets of 44-45 questions to be used
as the final IC3 exams
A mastery composite score for the final exam was calculated and correlated with the
pilot test scores
A regression analysis of the predictor variables and composite score was used to assess
the relationship between the pilot exam and the survey
Each test candidate taking part in the original beta test had their test results rescored
based on the final selection of items in the three IC3 exams
IC Validation Brief
Sample Result: From the original set of 60 questions for each pilot exam,
final IC3 exams were created that included 44-45 high-performing, high-quality
items that met content validity requirements based on the original content study.
8. Standard Setting
IC3 final exam cut scores were determined by considering level of mastery,
standard deviation, test score means, and decision error
All test performance results, as well as candidates' self-reported assessments of their
skill level, were analyzed together; this analysis provided the mechanism to guide the
standard setting, or cut score
IC3 certification exams were published for delivery on November 8, 2001
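The standard-setting step, which the methodology section describes as using a regression of candidate survey data against exam scores, can be illustrated with a toy example. The data, the 1-5 experience scale, and the mastery threshold below are all invented; this sketches only the general idea of locating a cut score where a fitted line crosses a chosen experience level, not the actual IC3 procedure:

```python
# Toy sketch: fit a least-squares line from self-reported experience (1-5)
# to exam score, then read off the score predicted at a mastery threshold.
# All numbers are invented for illustration.

def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept for y = slope*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum(
        (x - mx) ** 2 for x in xs
    )
    return slope, my - slope * mx

experience = [1, 2, 2, 3, 4, 4, 5]    # survey: self-assessed experience level
score = [40, 55, 50, 65, 75, 80, 90]  # exam scores for the same candidates

slope, intercept = fit_line(experience, score)
mastery_level = 3.0  # assumed threshold on the survey scale (illustrative)
cut_score = slope * mastery_level + intercept
print(cut_score)  # 65.0 for this invented data
```

A real standard-setting study would also weigh the standard deviation, score means, and decision error named above before fixing the final cut score.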
Sample Result: An analysis of test scores vs. survey results on experience
level determined the cut score for each exam, as illustrated in the following chart:
Chart 1: Survey Score vs. Test Score - Correlation to Experience Level
CONCLUSION
The Internet & Computing Core Certification (IC3) program was created to offer a unique, validated,
global certification program that provides specific standards for the knowledge, skills, and abilities
required to be a broad-based, productive user of computer hardware, software, networks, and the Internet.
Through in-depth research and analysis into the world of digital literacy, it was determined that three
exams were needed to cover the range of subjects necessary for an individual to be IC3 certified.
Computing Fundamentals: a measure of basic knowledge of computer hardware, software, and
operating systems.
Key Applications: a measure of proficiency in common computer applications, including a word
processor and a spreadsheet.
Living Online: a measure of basic skills in using networks, electronic mail, the Internet, and Web
browsing software, as well as how computers and the Internet affect society.
IC3 certification exams were created to meet the highest standards-based development processes accepted
industry-wide. This process was guided by The Donath Group, an industry-recognized leader in exam
construction, measurement, and statistical analysis. The IC3 program took steps to accumulate ample
empirical, theoretical, statistical, and conceptual evidence to support its claims of achieving the highest
levels of exam validity. The quality and validity of the IC3 exams are recognized by other industry
organizations like the NSSB and CompTIA.
The final result of the IC3 program's validation efforts is a true certification program that accurately and
reliably measures an individual's knowledge, skills, and abilities to effectively live and work in our
increasingly digital world.
CONTACT INFORMATION
For more information about the Internet & Computing Core Certification (IC3), please visit
Certiport's IC3 web page, or contact Certiport via email, post, or telephone.
Web Page:
www.certiport.com/ic3
Email:
ic3@certiport.com
Phone:
888.999.9830
Address:
Certiport
Attn: IC3 Program
1276 South 820 East, Suite 200
American Fork, UT 84003
Within this paper, the term "technology" refers specifically to information and computing technology. See The
National Research Council's Tech Tally: Approaches to Assessing Technological Literacy (National Academies
Press, 2006) for a discussion of technology standards and assessments that include technologies related to
engineering and other non-IT disciplines.
government IT organizations and entering homes, schools and workplaces in the form of desktop
microcomputers. The choice of the term "Literacy," even in this early period of computing
technology intersecting with people's personal lives, recognized that the ability to understand
and use this new tool was not just a skill but a crucial and multi-faceted set of abilities as critical
to everyday life as literacies in language and mathematics (i.e., numeracy). At the same time, the
term "Computer Literacy" made it clear that it was the tool that people were being asked to
master (a specificity increasingly referred to as "tool literacy").
As the computer stopped being thought of primarily as a standalone box dedicated to increasing
personal productivity and instead became a focal point for gathering, organizing and
communicating information, new terminology supplanted "Computer Literacy" in describing the
skills needed to competently utilize converging computing and communications technology.
"Information Literacy," an area of study developed within the discipline of Library Sciences,2
and "Digital Literacy,"3 a term first popularized by Paul Gilster's book of the same name, both
recognized that the computer was becoming the place where individuals were interfacing with
large quantities of unfiltered data. Both Information Literacy and Digital Literacy focused on the
critical thinking and cognitive abilities needed for individuals to evaluate, organize and process
streams of information that would become torrents as the Internet age advanced.
Research over the last ten years has established more rigorous and holistic approaches to this
subject. Fluency with Information Technology (or FITness)4 was a model put forth by the
Computer Science and Telecommunications Board of the National Research Council (NRC) in
their 1999 work Being Fluent with Information Technology.5 This research established three broad
areas that together constituted technology literacy:
Current (or "contemporary") computer skills (i.e., the ability to use current hardware and
software to perform useful functions)
A set of foundational concepts that underpin computing and information technology
A set of higher-order thinking and reasoning skills required for understanding and solving
problems as they arise in modern technological systems
FITness acknowledges the importance of being able to use today's technology, yet it also posits
that two equally important technology literacy strands (foundational concepts and critical
thinking/problem-solving skills) prepare someone for inevitable changes in what constitutes
contemporary computer skill.
2. State University of New York (SUNY) Council of Library Directors. Information Literacy Initiative. 30
September 1997.
3. Gilster, Paul. 1997. Digital Literacy. John Wiley & Sons, New York.
4. The Committee on Information Technology Literacy, The Computer Science and Telecommunications Board,
The Commission on Physical Sciences, Mathematics, and Applications, and the National Research Council. 1999.
Being Fluent with Information Technology. The Computer Science and Telecommunications Board, Washington, DC.
5. National Academy Press, 1999.
The concept of an ICT literacy made up of multiple integrated knowledge, skills and abilities
was reinforced by the 2003 Framework for ICT Literacy6 developed by the International ICT
Literacy Panel and the OECD PISA ICT Literacy Feasibility Study of 2003,7 as well as by
educational standards, most notably the National Educational Technology Standards (NETS)8 for
students, teachers and school administrators created by the International Society for Technology
in Education (ISTE)9 in conjunction with the US Department of Education.
In the years since NRC published Being Fluent with Information Technology, there has emerged
what can be thought of as an informal global consensus regarding what constitutes ICTL. This
consensus reflects the FITness model of ICTL consisting of foundational knowledge, the ability
to use current technological systems, and critical thinking and problem solving skills. It also
addresses the fact that technology is the conduit for a wealth of information a user must manage,
and reinforces the notion that ICTL must be seen in the context of real-world use (for
example, ICT integrated into traditional academic disciplines such as language, math and social
studies, a major theme of the NETS standards). This consensus is reflected in standards
developed in the UK10, Japan,11 Australia,12 South Africa13 (to name just a few countries that
have established technology education standards), as well as standards developed by most of the
50 US states.14
From Standards to Measurable Objectives
Much of the standard setting that has taken place along common pathways in different parts of
the world has been performed at a very high level. Work by the NRC and OECD, for example,
created powerful frameworks for defining ICTL, but did not provide comprehensive teachable or
measurable objectives tied to those frameworks.
The ISTE NETS standards, which have been adopted as the framework for technology standards
in 45 of the 50 US states, provide an interesting case study on the work needed to turn high-level
educational standards into specific objectives that could be measured via traditional or modern
assessment techniques. As shown in Appendix A of this document, the ISTE standards articulate
high-level goals covering areas such as (1) Creativity and Innovation, (2) Communication and
Collaboration, (3) Research and Information Fluency, (4) Critical Thinking, Problem Solving
and Decision Making, (5) Digital Citizenship and (6) Technology Operations and Concepts.
NETS Standard 6 (Technology Operations and Concepts), an ISTE standard highly suited to
assessment by techniques that measure what the NRC FITness study defines as "contemporary
knowledge," consists of just the following goal-oriented statements:
6. http://www.ets.org/Media/Tests/Information_and_Communication_Technology_Literacy/ictreport.pdf
7. http://www.oecd.org/dataoecd/35/13/33699866.pdf
8. http://www.iste.org/AM/Template.cfm?Section=NETS
9. http://www.iste.org
10. http://www.e-skills.com/
11. http://www.mext.go.jp/english/news/2007/03/07022214.htm
12. http://www.caul.edu.au/caul-doc/InfoLitStandards2001.doc
13. http://www.info.gov.za/otherdocs/1998/prc98/chap6.htm
14. http://www.ccsso.org/Projects/State_Education_Indicators/Key_State_Education_Policies/3160.cfm
15. http://www.tea.state.tx.us/rules/tac/chapter126/index.html
16. http://www.doe.mass.edu/edtech/standards/itstand_draft.pdf
17. http://www.mceetya.edu.au/mceetya/default.asp?id=12183
18. http://www.novell.com/training/certinfo/cne/index.html
19. http://www.microsoft.com/learning/mcp/mcse/default.mspx
widely popular, spawning additional programs such as the successful A+ certification for general
hardware and software support specialists created by the industry consortium CompTIA.20
Unlike educational assessments that are still delivered primarily in paper-and-pencil format to
large numbers of students simultaneously, IT certifications made use of computer-delivered
testing technology and secure proctored facilities to create on-demand testing programs
supported by an infrastructure that was also being increasingly used to deliver proctored
licensing and non-IT certification exams.
The technology and business model around which the certification industry was built also
allowed IT certification exam developers to make use of advanced testing techniques that were
not being used in other mass testing environments (many of which still rely on paper-and-pencil
test delivery). These techniques included adaptive testing (and other dynamic testing
methodologies based on Item Response Theory models) and performance-based testing that asks
test takers to perform functions as if working with technology in the real world, rather than
asking test takers to select responses in traditional multiple-choice or other linear item formats.
The Microsoft Office Specialist21 program (a certification on Microsoft desktop applications that
has been given to over 10 million people internationally) is the most popular IT certification
based entirely on performance-based testing technology.
The Internet and Computing Core Certification
The Internet and Computing Core Certification (IC3) was developed to integrate the most recent
thinking in ICTL concepts and standard setting with the latest advances in automated assessment
utilized by the IT certification industry. The goal of the program was to create and deploy ICTL
assessments that were valid, scalable and covered the broadest range of ICTL knowledge, skills
and abilities.
The IC3 program was created in 2003 by Certiport, Inc.,22 Microsoft's partner in the creation
and management of the aforementioned Microsoft Office Specialist program (now the MCAS
program), and First Advantage23 (formerly SkillCheck, Inc.), a company with twenty years of
experience in developing valid, performance-based assessments for the employment industry
(including global staffing organizations such as Adecco, Manpower and Kelly Services).
Research on IC3 began with a literature review of current thinking in ICTL (much of it
summarized in the previous section of this paper) as well as a review of current ICT testing in
use by education and industry. As elaborated in detail in Haber and Kelly's 2005 book National
Educational Technology Standards for Students (NETS*S): Resources for Assessment24 (published
by the International Society for Technology in Education), different assessment techniques have
been used to measure various aspects of technology literacy. These include:
20. http://certification.comptia.org/a/
21. http://www.certiport.com/Portal/desktopdefault.aspx?page=common/pagelibrary/mbc_mcas.html
22. www.certiport.com
23. www.fadvassessments.com
24. International Society for Technology in Education, Eugene, OR, 2006
Surveys - Normally self-report surveys that allow respondents to gauge their own level
of technology experience and skill or provide input regarding student and teacher
attitudes towards technology
Linear tests - Tests including standard linear test items such as multiple-choice,
true-false, and matching items
Performance-based assessments - Assessments that ask test takers to perform real-world
tasks, either within actual technology environments or within high-fidelity simulations of
technology products
Hands-on assessments - Portfolios, observational assessments and other techniques that
measure competency by analyzing technology work products or observing work in
progress, normally using rubrics and other methods for standardized grading
Each assessment technique, as used in the real world, has strengths and weaknesses, summarized
in Table 1 below.
Table 1 - Strengths and Weaknesses of Different Assessment Techniques

Surveys
Strengths: An inexpensive way to gather information from large numbers of users. Well-designed,
validated surveys can provide accurate information, especially with regard to self-reported abilities
and attitudes. Can be delivered via paper, computer and online at low cost.
Weaknesses: Self-report surveys provide no way of confirming specific abilities. Surveys must be
carefully constructed to ensure self-report information elicits honest and consistent responses.

Linear Assessments
Strengths: Techniques for mass delivery of linear assessments are already in use in many
standardized educational testing programs.
Weaknesses: Linear test items are best used to measure knowledge, rather than skill.

Performance-Based Assessments
Strengths: Highly reliable and accurate way to measure specific skills (such as the ability to use
specific software products and features).
Weaknesses: While performance-based technologies (such as interactive simulations) can be
delivered over the Internet, such delivery requires more resources than survey or linear assessments,
which primarily deliver text-based testing. Performance-based assessments are best used to measure
specific skills, rather than complex processes such as collaboration or creativity.

Hands-On Assessments
Strengths: An analysis of work products or work in progress provides a way to measure complex
behaviors and abilities, including collaboration and creative uses of technology.
Weaknesses: As a process requiring manual scoring, hands-on assessments are difficult and costly
to scale. Scaling hands-on assessments presents challenges with regard to consistently scoring work
samples and observed behaviors across different observers/graders.
Any assessment that would meet the needs of the IT certification industry for validity and
scalability would fit somewhere into the continuum illustrated in Figure 1.
Figure 1. Scalability vs. Complexity of Tests Using Various Item Types
(The figure plots item types along a scalability-vs.-complexity continuum, including performance-based
assessments (interactive simulations and concurrent/live-application testing) and hands-on assessments
(portfolios and observations).)
Towards the goal of creating fully automated certification exams that could be delivered globally
(and given other practical considerations, such as time requirements of testing within an
educational setting), the following criteria were established for the IC3 program:
The white paper The Internet and Computing Core Certification (IC3): Building a
Dynamic Standard, summarizing the literature review and other research performed in
the creation of the standard
A test-design document (TDD) that specifies all aspects of the examinations not related
to content (number of exams, number of items per exam, time constraints, scoring
routine, delivery mechanism, reporting guidelines, etc.)
A set of test blueprints detailing all objectives covered by the standard.
The TDD specified that IC3 would consist of three separate exams (described above):
Computing Fundamentals
Key Applications
Living Online
A separate examination blueprint was prepared for each of the three exams. These blueprints
were developed by subject-matter experts under the guidance of The Donath Group,26 a
professional exam-development organization specializing in the creation and validation of IT
certifications.
The blueprints were designed as a hierarchy of high-level domains (such as Computer Hardware,
Software and Operating Systems for the Computing Fundamentals exam), sub-domains which
define one element of the domain, and objectives (each of which was designed to be measurable
using some type of automated test item). A sample from the Computing Fundamentals blueprint
appears on the following page.
25. Presentation software was added to this exam in a 2005 update to the standard.
26. http://www.donath.com/main.html
Identify types of computers, how they process information and how individual computers interact with
other computing systems and devices
Content may include the following:
1.1.1
Categorize types of computers based on their size, power and purpose, including:
Supercomputers
Mainframe computers
Minicomputers
Microcomputers
Laptop computers
Handheld computers/Personal Digital Assistants (PDAs)
1.1.2 – 1.1.6
Identify the role of types of memory and storage and the purpose of each, including:
Random Access Memory (RAM)
Read Only Memory (ROM)
Storage media (such as hard disks, floppy diskettes, and optical media like CD-ROMs)
A first draft of all standard documents was shared with the Global Digital Literacy Council (GDLC), a group of thought leaders in the fields of education and workplace development related to technology.27 Through a six-week period of online meetings and discussions, this group provided input that was incorporated into the standard, including a number of modifications to the specific objectives covered in the examination blueprints.
Once the modified standard documents were reviewed and approved by the GDLC, the blueprints were turned into an online survey that was distributed to over 200 professionals in IT education (including educators and trainers certified in IT standards such as Microsoft Office Specialist). 189 survey respondents ranked each sub-domain based on the following criteria:
This questionnaire is designed to solicit your review and analysis of individual test
objectives as they relate to (1) the IMPORTANCE of an objective in assessing the
literacy or competency of a candidate within the context of an entry level job, and (2) the
FREQUENCY with which a competency-based objective is performed within the
context of an entry level job. In order to successfully complete the questionnaire, you
must rate the IMPORTANCE and FREQUENCY of each test objective using the
following scales:
IMPORTANCE
1 = Not important
2 = Of little importance
3 = Of modest importance
4 = Very important
5 = Critically important
FREQUENCY
1 = Never
2 = Rarely
3 = Often
4 = Very often
5 = Always
An optional COMMENTS field was included for each blueprint entry to allow experts to
provide specific suggestions and to qualify and comment on their ratings. Each respondent was
also asked a number of questions regarding their experience with computers and various
computer applications.
This information was used to perform a content analysis of the blueprints. Comments from this
group of experts were also used to further refine the wording of exam objectives (with care taken
to ensure that any changes did not impact the data analysis portion of the project). This research
was used to create a content balance for each exam. An example of the results appears on the
following page.
27 http://www.gdlcouncil.org/
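The content-balancing arithmetic described above (combining each objective's mean Importance and Frequency ratings into a percentage weight, then allocating exam items in proportion to that weight) can be sketched as follows. This is a minimal illustration: the objective names and ratings are invented placeholders, not values from the actual survey.

```python
# Sketch of the content-balancing calculation described in the text.
# Objective names and ratings are invented, not actual IC3 survey data.

def content_balance(objectives, exam_length):
    """Allocate exam items to objectives in proportion to the
    combined mean Importance + Frequency rating of each objective."""
    combined = {name: imp + freq for name, (imp, freq) in objectives.items()}
    total = sum(combined.values())
    return {
        name: {
            "weight_pct": round(100 * score / total, 2),
            "items": round(exam_length * score / total),
        }
        for name, score in combined.items()
    }

# Hypothetical mean ratings (Importance, Frequency) on the 1-5 scales
ratings = {
    "Identify network concepts": (4.19, 3.40),
    "Use e-mail effectively":    (4.55, 3.63),
    "Evaluate online sources":   (4.50, 3.16),
}

balance = content_balance(ratings, exam_length=45)
for name, alloc in balance.items():
    print(name, alloc)
```

Note that independently rounding each objective's item count can make the totals drift by an item or two from the target exam length; in practice a test designer would adjust the final allocation by hand, as the 60-item beta and 45-item exam columns in the table suggest.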
[Table: content-balance analysis for the Living Online exam. For each objective, the table reports the mean and standard deviation of its Importance and Frequency ratings (Valid N = 189 throughout), a combined Importance + Frequency score, the resulting percentage weight (summing to 100.00%, with an exam-total combined score of 86.31), and the corresponding item allocations on the 60-item beta form and the 45-item exam form.]
Living Online Exam Reliability Analysis

Study | Value
IC3 Living Online Exam (2003 Standard) KR-20 (from original validation study) | .88 (n=260)
IC3 Living Online Exam (2005 Standard) KR-20 (from original validation study) | .93 (n=402)
IC3 Living Online Exam (2005 Standard) KR-20 (analysis of live exam data) | .90 (n=3943)
IC3 Living Online Exam (2005 Standard) test-retest reliability (analysis of live exam data) | .82* (n=136)
* May be influenced by practice effect
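KR-20 (Kuder-Richardson Formula 20), the reliability statistic reported above, can be computed directly from a matrix of dichotomous (right/wrong) item responses. A minimal sketch, using a tiny made-up response matrix rather than any IC3 data:

```python
# Minimal KR-20 sketch for dichotomous (0/1) item responses.
# The response matrix below is made-up illustrative data, not IC3 results.

def kr20(responses):
    """KR-20 = (k / (k - 1)) * (1 - sum(p*q) / var(total scores)),
    where k is the number of items, p the proportion correct per item,
    q = 1 - p, and var is the population variance of candidates'
    total scores."""
    n = len(responses)          # candidates
    k = len(responses[0])       # items
    p = [sum(r[i] for r in responses) / n for i in range(k)]
    pq = sum(pi * (1 - pi) for pi in p)
    totals = [sum(r) for r in responses]
    mean = sum(totals) / n
    var = sum((t - mean) ** 2 for t in totals) / n
    return (k / (k - 1)) * (1 - pq / var)

# Four candidates answering five items (1 = correct)
data = [
    [1, 1, 1, 1, 0],
    [1, 1, 0, 1, 0],
    [1, 0, 0, 0, 0],
    [0, 0, 0, 0, 0],
]
print(round(kr20(data), 3))  # prints 0.812
```

Values such as the .90 reported for the live 2005-standard exam indicate high internal consistency; the test-retest figure (.82) is computed differently, as a correlation between two administrations, which is why the table flags a possible practice effect.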
Deployment
Once the exams were finalized, the final tests were integrated into the Certiport iQsystem for test delivery within the organization's test-center network (the same network used to deploy the company's Microsoft Office Specialist exams). The exams were also translated into over ten languages for deployment worldwide within this test-center network.
Simultaneously, the test blueprint documents were used to create an educational and training
curriculum to be used as the basis of an ICTL learning program. In addition to specifying the
exam objectives, additional details and training suggestions were included in the curriculum in
order to allow book, CBT and eLearning publishers, as well as teachers and independent training
organizations, to create training materials that prepared people for IC3 certification. A sample
page from the IC3 curriculum appears in Appendix C.
Updating the Standard
The standard and test-development process described above was performed more than once. In
2002, the entire process was used to create the first iteration of the IC3 Standard (called the 2003
Standard) and a single exam form for each of the three IC3 modules (Computing Fundamentals,
Key Applications and Living Online). The beta exam process was repeated in 2004 to create a
second test form for each of the three modules.
In 2004, work began to update the standard itself to create the IC3 2005 Standard. This update took into account a number of changes since 2003, notably the many state-level and national ICTL standards adopted in the period since the original IC3 program was researched in 2002. Updated blueprints went through high-level expert review by the GDLC for the 2005 standard, followed by the same online survey/blueprint analysis used to create the 2003 content-balanced blueprints.
Two beta forms were created to provide content for the IC3 2005 exams. These beta items were
pooled with items from previous exams that still corresponded to 2005 blueprint objectives to
create four parallel forms that have been deployed over the last three years.
One of the key issues that comes up with any IT-related standard and exam-development process
relates to datedness of blueprint objectives and exam items, given rapid changes in technology.
Certification exams that are tied to specific products, such as the Microsoft Office Specialist or
MCSE program, require updating upon the release of new Microsoft products, such as the
Microsoft Office desktop productivity suite or Windows operating system. For such product-related certifications, updates are dictated by the release schedule of the product manufacturer.
General ICTL exams like IC3 are less sensitive to version-related changes in the software
marketplace. This is because certain IC3 domains (especially those that relate to what the NRC
FITness model would term foundational concepts and higher-order thinking skills) are less
subject to change than domains related to contemporary computer skills (which would include
the ability to use the specific, contemporary software applications).
Certain foundational concepts, such as the nature of the microprocessor and computer memory,
concepts related to digital citizenship (such as the proper citing of sources and avoidance of
plagiarism), or cognitive abilities (such as the ability to interpret graphical or tabular information
generated by a computer application), may not constitute eternal verities. However, they are
certainly subjects that remain stable, even in light of changes to specific technology. A review of
2003 exam items in these areas found very few issues of datedness that would impact item
performance. For example, a question asking a candidate to identify MHz as a unit of processor speed might have used 133 MHz as the correct answer in a 2003 exam and 500 MHz in a 2005 exam. The foundational concept (MHz as the unit for measuring processor speed), however, was not affected by this choice of response wording.
Even in areas subject to change, like application software, the fact that IC3 focuses on basic
features and functionality, rather than advanced features (which are generally what
manufacturers add when they upgrade software) means that the application features that are
included in the IC3 standard (such as opening files, cutting and pasting text, changing fonts or
printing documents) have not changed significantly in the course of over a decade of product
updates.
Generally, what has changed between updates of the standard are assumptions related to how technology is used. For example, the 2005 standard put more emphasis on the computer as an information-management tool than the 2003 standard, which placed more emphasis on the computer's role in increasing personal productivity. Both the 2005 standard and exams also assumed a world in which constant connection to the Internet was more of a given (vs. earlier assumptions that people were required to get onto an "onramp" like a dial-in service provider before they could be connected to the Net). For the 2009 standard, currently under development, the world of Web 2.0, in which the Web is no longer a place where content is merely found and consumed but one where individuals are now publishers and content providers, needs to be taken into account and used as a framework for certain program components, especially the objectives making up the Living Online exam.
Microsoft's release of Office 2007, the first release of its Windows application suite to significantly overhaul the product interface, has added an additional challenge to current exam-development efforts. An assumption built into IC3 is that the latest version of Microsoft Office can serve as a stand-in not just for previous versions of Office, but for all Windows-based desktop applications. Given that the features covered by IC3 ("File Open", "Edit Copy", etc.) have been accessed in the same way within Word for Windows and WordPerfect for Windows going back to their first releases, this was a reasonable assumption, backed up by consistent item statistics for the same item used in tests for different product versions. With the new version of Office (which replaces menus with a new Web-like "ribbon" interface), the assumption that the latest version of Office can serve as a stand-in for other Windows applications may no longer hold.
Other groups involved with ICTL assessment face similar issues. North Carolina, with its NCDesk assessment,28 has avoided the use of specific applications by simulating its own generic word processor, spreadsheet and other products as the basis for application assessment. Other test developers are looking to Open Source applications as the basis for creating generic tests. While a potential solution to versioning issues, these choices present their own problems: notably, the creation of standardized assessments based on applications few or no users have ever encountered.

28 http://www.ncpublicschools.org/accountability/testing/computerskills/
Developers of IC3 are currently reviewing additional options in this area, including the creation
of alternative forms based on specific Office versions (something already done in the creation of
Macintosh-specific exams) in preparation for exam creation based on the 2009 standard.
The following table shows changes in item statistics between the original beta development and
validation for one of the IC3 modules, and how those items have performed over time. This
analysis demonstrates no significant across-the-board drift in item statistics that would indicate
the overall test was getting easier due to changes in technology or environment.
[Table: item-level statistics for 45 questions (Q002 through Q481), comparing the Original Pilot Analysis (2005) with a 2008 Live Exam Analysis. For each item the table lists the P-value, its standard deviation, and the R-value. Total/average P-values were 34.4005/.7652 in the 2005 pilot and 34.8291/.7746 in the 2008 live data.]
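The statistics compared in the table (P-value, the proportion of candidates answering an item correctly; its standard deviation; and R-value, which for dichotomous items is conventionally an item-total point-biserial correlation, an assumption here since the paper does not define it) can be recomputed from raw response data along these lines. The mini response matrix is invented for illustration:

```python
# Sketch of the item statistics used in the drift analysis: P-value
# (difficulty), its standard deviation, and an item-total R-value.
# The response matrix is invented illustrative data, not IC3 results.
import math

def item_stats(responses, item):
    """Return (p_value, stdev, r_value) for one item.

    For a dichotomous item, stdev = sqrt(p * (1 - p)); the R-value is
    computed here as the Pearson correlation between the item score
    and each candidate's total score."""
    n = len(responses)
    scores = [r[item] for r in responses]
    totals = [sum(r) for r in responses]
    p = sum(scores) / n
    stdev = math.sqrt(p * (1 - p))
    mean_t = sum(totals) / n
    cov = sum((s - p) * (t - mean_t) for s, t in zip(scores, totals)) / n
    var_t = sum((t - mean_t) ** 2 for t in totals) / n
    r = cov / (stdev * math.sqrt(var_t)) if stdev and var_t else 0.0
    return p, stdev, r

# Four candidates answering three items (1 = correct)
data = [
    [1, 1, 1],
    [1, 1, 0],
    [1, 0, 0],
    [0, 0, 0],
]
p, sd, r = item_stats(data, item=1)
print(p, sd, round(r, 4))
```

Comparing the average P-value across all items between two administrations, as the table does (.7652 vs. .7746), is a simple way to check for across-the-board drift in test difficulty.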
Conclusion
The process of researching, developing and validating an ICTL examination program based on the latest thinking regarding technology literacy, and informed by best practices in both the education and certification industries, has led to the creation of an IC3 program that covers substantial components of the current global consensus regarding what it means to be ICT literate.
In addition, the process of creating and validating IC3 exams simultaneously with the creation of educational curricula has ensured that the standard maximized the number of objectives that were both teachable and measurable. Most standard-setting projects, including those at the national and state level on technology as well as on NCLB academic subjects (language, mathematics and now science and social studies), separate the standard-creation and exam-development processes, leading to a disconnect between the curricula teachers are asked to adopt and the high-stakes exams that may eventually measure student mastery of those curricula.
No standard can be absolutely comprehensive. For example, student ability to collaborate on technology projects (part of the NETS standards for students) may always need to be measured with some type of hands-on test, such as portfolio review or observational assessment. However, many academic and even industry standards face challenges when trying to balance pedagogical and assessment needs associated with programs like NCLB. The parallel creation of standards, exams and curricula represented by the experience creating IC3 may offer a pathway to minimizing conflict between learning and testing, even if such a conflict can never be eliminated entirely. Organizations interested in measuring other sorts of literacies, such as more broadly defined technology literacy,29 may find parallels with their own work in the experience creating ICTL literacy certification programs such as IC3.
29 Tech Tally: Approaches to Assessing Technological Literacy (National Academies Press, 2006)
Appendix A
National Educational Technology Standards for Students
1. Creativity and Innovation
Students demonstrate creative thinking, construct knowledge, and develop innovative products and processes
using technology. Students:
a. apply existing knowledge to generate new ideas, products, or processes.
b. create original works as a means of personal or group expression.
c. use models and simulations to explore complex systems and issues.
d. identify trends and forecast possibilities.
2. Communication and Collaboration
Students use digital media and environments to communicate and work collaboratively, including at a
distance, to support individual learning and contribute to the learning of others. Students:
a. interact, collaborate, and publish with peers, experts or others employing a variety of digital
environments and media.
b. communicate information and ideas effectively to multiple audiences using a variety of media and
formats.
c. develop cultural understanding and global awareness by engaging with learners of other cultures.
d. contribute to project teams to produce original works or solve problems.
3. Research and Information Fluency
Students apply digital tools to gather, evaluate, and use information. Students:
a. plan strategies to guide inquiry.
b. locate, organize, analyze, evaluate, synthesize, and ethically use information from a variety of sources
and media.
c. evaluate and select information sources and digital tools based on the appropriateness to specific tasks.
d. process data and report results.
4. Critical Thinking, Problem-Solving & Decision-Making
Students use critical thinking skills to plan and conduct research, manage projects, solve problems and make
informed decisions using appropriate digital tools and resources. Students:
a. identify and define authentic problems and significant questions for investigation.
b. plan and manage activities to develop a solution or complete a project.
c. collect and analyze data to identify solutions and/or make informed decisions.
d. use multiple processes and diverse perspectives to explore alternative solutions.
5. Digital Citizenship
Students understand human, cultural, and societal issues related to technology and practice legal and ethical
behavior. Students:
a. advocate and practice safe, legal, and responsible use of information and technology.
b. exhibit a positive attitude toward using technology that supports collaboration, learning, and
productivity.
c. demonstrate personal responsibility for lifelong learning.
d. exhibit leadership for digital citizenship.
6. Technology Operations and Concepts
Students demonstrate a sound understanding of technology concepts, systems and operations. Students:
a. understand and use technology systems.
b. select and use applications effectively and productively.
c. troubleshoot systems and applications.
d. transfer current knowledge to learning of new technologies.
Appendix B
National Educational Technology Standards for Students 6-8th Grade
Indicators
The following experiences with technology and digital resources are examples of learning activities in which
students might engage during Grades 6–8 (ages 11–14):
1. Describe and illustrate a content-related concept or process using a model, simulation, or concept-mapping
software. (1, 2)
2. Create original animations or videos documenting school, community, or local events.
(1, 2, 6)
3. Gather data, examine patterns, and apply information for decision making using digital tools and resources. (1, 4)
4. Participate in a cooperative learning project in an online learning community. (2)
5. Evaluate digital resources to determine the credibility of the author and publisher and the timeliness and accuracy
of the content. (3)
6. Employ data-collection technology such as probes, handheld devices, and geographic mapping systems to gather,
view, analyze, and report results for content-related problems.
(3, 4, 6)
7. Select and use the appropriate tools and digital resources to accomplish a variety of tasks and to solve problems.
(3, 4, 6)
8. Use collaborative electronic authoring tools to explore common curriculum content from multicultural
perspectives with other learners. (2, 3, 4, 5)
9. Integrate a variety of file types to create and illustrate a document or presentation. (1, 6)
10. Independently develop and apply strategies for identifying and solving routine hardware and software problems.
(4, 6)
The numbers in parentheses after each item identify the standards (1–6) most closely linked to the activity
described. Each activity may relate to one indicator, to multiple indicators, or to the overall standards referenced.
The categories are:
1. Creativity and Innovation
2. Communication and Collaboration
3. Research and Information Fluency
4. Critical Thinking, Problem Solving, and Decision Making
5. Digital Citizenship
6. Technology Operations and Concepts
National Educational Technology Standards for Students
© 2007 ISTE. All Rights Reserved.
Excerpted from NETS for Students Booklet
Appendix C
IC3 Curriculum Sample (objectives IC-1 1.1.1, IC-1 1.1.2 and IC-1 1.1.3)
1. Explain that electronic devices (such as calculators and cell phones) contain
a microprocessor and other computing devices designed for specific purposes,
such as calculation and communication.
Contact Information:

Scott Stoddart
National Director, Workforce Initiatives
Certiport
Tel: 888-999-9830 ext 184
Mobile: 801-554-3178
Fax: 801-492-4118
E-Mail: sstoddart@certiport.com
Web: www.certiport.com

Jon Haber
Senior Vice President
First Advantage Assessment Solutions
Tel: 781-221-4160
Fax: 781-229-8108
Mobile: 617-818-2262
E-Mail: jon.haber@fadv.com
Web: www.fadvassessments.com
September 2003

Executive Summary
Since the release of IC3 in February 2002, the Internet and Computing
Core Certification has enjoyed phenomenal growth and popularity,
and is now used to provide a globally-recognized, valid certification
on basic Internet and Computer Literacy to thousands of students,
teachers, employees and other learners and professionals in over sixty
countries.
Given the rapid pace of technological change, the IC3 standard is
updated every two years. This addendum to the IC3 White Paper, The Internet and Computing Core Certification: Building a Dynamic Standard, summarizes the research and development used to update the IC3 standard and IC3 exams.
Research Process for the Original IC3 Standard (continued)
The primary goals of the Internet and Computing Core Certification remain intact:

Standard Update Research
In updating the IC3 standard, the IC3 development group made use of the following resources:

IC3 Advisory Board
The IC3 Advisory Board was created to provide input into the update to the IC3 standard at all stages. The goal was to bring together a
group of distinguished professionals in various fields from around the
world to help determine direction and oversee the review and
validation process for the standard. The Advisory Board consists of
the following individuals:
Dr. Helen C. Barrett
Project Co-Director
The International Society for
Technology in Education (ISTE)
Mary Bennett
Assistant Director, Vocational
Assessment
OCR Examinations, UK
Jonathan P. Dalton
Sector Strategy Manager
The Learning and Skills Council
National Office, United Kingdom
John F. Ebersole
Associate Provost & Dean,
Extended Education
Boston University
Neill Hopkins
Vice President of Workforce
Development
CompTIA
Cosmas Yatzoglou
Greece
The first two phases of review by the Advisory Board outlined above
took place between April 2003 and July 2003. In September 2003,
the Advisory Board will convene in Stratford-upon-Avon in the UK
for a summit to finalize the standard in preparation for its release to
IC3 courseware publishers and other partners in October. The summit
event will also offer a forum where Advisory Board members can
discuss the future of the IC3 certification in the context of sharing
ideas on the direction of technology education around the world.
Review of Existing Standards
After being reviewed by the Advisory Board, the IC3 exam blueprints
which form the basis of the updated standard were provided as an
online survey to over 200 subject matter experts (SMEs) working in
several countries and numerous fields of expertise.
Each sub-domain in the exam blueprints was the subject of a review
by over 200 SMEs, who rated the objective on multiple scales, as
well as provided comments on the overall sub-domain and any
objectives within the sub-domain.
Several elements of the IC3 standard remain intact from the original version of the program. These include:

Changes to the IC3 Standard

Schedule
Once the exam blueprints have been revised based on the input from
the Subject Matter Expert (SME) panel, a final set of blueprints will
be provided to the IC3 Advisory Board in time for the IC3 Summit to
take place on September 13, 2003 in Stratford-upon-Avon in the UK.
At the end of that summit meeting, the blueprints will be finalized
and content validation of the blueprints completed.
The revised blueprints will be turned into a detailed curriculum to be
provided to Independent Courseware Vendors (ICVs) in October of
2003 to allow them ample time to prepare study materials based on
the new standard in time for the release of exams based on the new
standard in 2004.
The exams are scheduled to replace the current IC3 exams in July 2004 to coincide with the end of the 2003-2004 academic year. Exams based on the new standard (including the initial set of exams to be developed between September 2003 and July 2004 and any updates to the exams based on the 2003 standard update) will be considered the official IC3 examinations from July 2004 until the standard is revisited and revised again in 2005-2006.
Conclusions

Executive Summary
Teachers are also being asked why they should train and test for a
computing and Internet literacy standard when students will be
absorbing these basic skills by osmosis well before they even reach
high school. The widely-held notion that younger students who have
been exposed to computers and the Internet from a very early age will
come to school fully equipped to handle the latest technology may
actually slow progress in the development of Internet and computing
literacy standards. While early exposure to video games, e-mail and
the World Wide Web may build a young person's confidence with
technology and provide basic keyboarding skills, it does not
necessarily lead to his or her becoming proficient in the range of
knowledge and skills required to use computers productively in
school or at work.
Though it has become easier in recent years for schools to obtain
computer hardware, software and Internet connections, there remain
no accepted standards upon which educators can base training
courses and materials in Computing Fundamentals and Internet
literacy and no standard to help them evaluate the effectiveness of
such training.
Even as standards become more important in education in general,
and in technical education in particular (witness the number of
academics teaching to certification standards such as CompTIA's A+
or Microsoft's MCP), states and schools have been left on their own
in establishing standards for computing and Internet literacy skills.
And this is true even where these skills have been defined as critical
core competencies for graduating high-school students in order for
them to enter higher education or the workplace.
A review of programs being developed in different schools and
different states shows enough similarities among these programs to
support a broad-based computing and Internet literacy national and
even international standard. Such a standard offers the following
benefits:
Test Target
The target skill level for this certification is a person with computing
and Internet skills sufficient to enter current job markets or to begin a
program in higher education. While graduating high-school seniors
and first-time job seekers represent major universes of candidates
who fit this description, the skill level can also apply to a much
broader range of candidates, such as retooling older workers or
students, welfare-to-work candidates and others seeking education or
employment opportunities that require the use of modern computers
and the Internet.
Program Components

Principles and Goals
Principle/Goal: The certification will be built on a framework of current thinking and best practices in technology education and training.

Explanation: Over the last decade, new subject areas such as Information Literacy, Digital Literacy, Fluency in Information Technology, and Media and Visual Literacy have been added to the definition of computer literacy education. While the Internet and Computing Core Certification will not be so broad as to cover the domains of all of these diverse areas of study, these important schools of thought will inform the development of the program. An analysis of the theories underlying or informing the Internet and Computing Core Certification standard begins on page 8.

Principle/Goal: The certification will keep up to date with new technology and stay relevant to the latest technological trends.

Explanation: The Internet and Computing Core Certification exam will be created under the supervision of professional psychometricians and test developers with experience in the certification industry to ensure that the program fulfills all of the industry's highest standards for test integrity and validity. As members of the Association of Test Publishers (ATP), the developers will utilize ATP standards for test development as a basis for exam development. The exam will also make use of the latest testing technologies, notably the ability to integrate both performance-based and knowledge-based questions into a certification exam that provides for the fairest and most accurate testing experience possible.
Relevance to the IC3 Standard
While the Internet and Computing Core Certification standard is not so broad as to include all of the domains of the educational trends listed below, an understanding of all of these trends has been important to determine the best ways for this new certification to be conceived and developed. A description of the structure of the Internet and Computing Core Certification standard begins on page 14 and includes an explanation of how these different trends have influenced the design of the program.
Learning/Testing Approach: Computer Literacy
Description: Computer literacy is the mastery of knowledge, skills and abilities relating to the use of computer technology, including computer hardware, software, networks and the Internet. As an area of education or training that focuses on refining a particular set of technical skills, computer literacy can be thought of as a "snapshot" of capabilities relevant to the technology dominant at a particular point in time.
Learning/Testing Approach: Information Literacy
Description: As a discrete discipline, Information Literacy can be thought to include both traditional literacy and computer literacy, with language and higher-order thinking skills seen as critical to creating, understanding and evaluating information, and computer skills seen as needed to locate, retrieve, collect, process, and communicate information. For purposes of defining an Internet and Computing Certification standard, key elements of Information Literacy will be taken into account including:

Learning/Testing Approach: Digital Literacy
Description: This term is taken from Paul Gilster's 1997 book of the same name,3 a work targeted at a popular audience that addressed the consequences (positive and negative) of the Internet's low barrier to entry for information producers and consumers. The most important contribution of Gilster's book is its description of a set of critical thinking skills for evaluating the quality of information found on the Internet.
The portion of the Internet and Computing Core Certification standard having to do with "Living Online" takes into account the practical suggestions offered by Gilster and others who have embraced the concept of Digital Literacy. Most notable are suggestions about how to evaluate the quality of online information sources and how to make the best use of search engines and other resources for locating information.

1 State University of New York (SUNY) Council of Library Directors. Information Literacy Initiative. 30 September 1997.
2 Shapiro, Jeremy J. and Shelley K. Hughes. "Information Literacy as a Liberal Art". Educom Review 31.2. March/April 1996.
Gilster, Paul. 1997. Digital Literacy. John Wiley & Sons, New York.
Learning/Testing Approach: Fluency with Information Technology
Description: The 1999 book Being Fluent with Information Technology4 is a joint project of the Committee on Information Technology Literacy of the National Research Council of the National Academy of Sciences, the Computer Science and Telecommunications Board, and the Commission on Physical Sciences, Mathematics, and Applications.
This ambitious project attempts to define a set of required skills needed not just for mastery of current technology, but for ongoing readiness in handling technological advancement and change. The book identifies three critically linked sets of skills required for "FITness" (a term used throughout the study to describe individuals who pass standards in Fluency in Information Technology).

4. The Committee on Information Technology Literacy, the Computer Science and Telecommunications Board, the Commission on Physical Sciences, Mathematics, and Applications, and the National Research Council. 1999. Being Fluent with Information Technology. National Academy Press, Washington, DC.
The Internet and Computing Core Certification consists of three examinations:
IC - Computing Fundamentals
IC - Key Applications
IC - Living Online
Program: International Technology Education Association (ITEA)
Description: With support from the National Science Foundation and NASA, the International Technology Education Association (ITEA) developed guidelines describing what it means to be technologically literate, with descriptions of content and guidelines for building technology training into K-12 educational curricula. Their findings were compiled in the 1999 volume Technology for All Americans: A Rationale and Structure for the Study of Technology. Phase III of the project, scheduled for 2000-2003, involves creating professional development standards for technological literacy in schools.

Program: National Educational Technology Standards for Students
Description: This project of ISTE, the International Society for Technology in Education, encourages educational leaders to provide learning opportunities that produce technology-capable students. The goal of the project is to enable stakeholders in K-12 education to develop national standards for educational uses of technology.
For More Information: http://cnets.iste.org

Program: National Research Council
Description: Since 1916, the National Academy of Sciences and its principal operating agency, the National Research Council, have existed to further knowledge and advise the federal government on scientific and technical issues. NRC commissions publish numerous projects and studies, including the 1999 publication Being Fluent with Information Technology. This work proposes a "FITness" model for someone "fluent" in information technology. These guidelines include foundational knowledge of information technology and an understanding of current technology.
For More Information: http://www.nas.edu/nrc

Program: State Initiative: Alaska
Description: Content Standards for Alaska Students - Technology is a listing of objectives students should be able to achieve using technology, including the use of technology-based tools and the development of information-management and problem-solving skills.
For More Information: http://www.educ.state.ak.us/ContentStandards/Technology.html

Program: State Initiative: Arizona
Description: The Arizona Department of Education publishes information on Arizona's Academic Standards and Accountability project, including Technology Education Standards for grades K-12. (Arizona has comprehensive technology standards for kindergarten, grades 1-3, 4-8 and 9-12.)
For More Information: http://www.ade.state.az.us/standards/technology

Program: Saskatchewan Education
For More Information: http://www.sasked.gov.sk.ca/k/p_e/eval/tl_overview/rubric.html
Program: State Initiative: Colorado
Description: The Educational Telecommunications Unit of the Colorado Department of Education published Competency Guidelines for Classroom Teachers and School Library Media Specialists in January 1999. While focused on technical competencies required by educators and library staff, these guidelines provide details of the essential skills needed in specific areas such as word processing, spreadsheets, databases, networking, and the Internet.
For More Information: www.cde.state.co.us

Program: State Initiative: Massachusetts
Description: Massachusetts has detailed technology competency standards along with useful examples for grades pre-K-4, 5-8, 9-10 and 11-12. The Information Literacy Project, an initiative of the University of Massachusetts library system, has produced Information Literacy Competencies for K-12 education. These competencies are built on the concept of "Information Literacy" as developed within the library sciences.
For More Information: http://www.doe.mass.edu/frameworks

Program: State Initiative: Michigan
Description: The Michigan Department of Education's Overview of Technology Content Standards describes content standards and benchmarks for early elementary, later elementary, middle school and high-school students. These standards and benchmarks are built on real-world models of producers and consumers of technology and information (family member, consumer, citizen, worker, and life-long learner).
For More Information: http://cdp.mde.state.mi.us/MCF/ContentStandards/Technology/default.html

Program: State Initiative: North Carolina
Description: North Carolina's State Board of Education's K-12 Computer Technology Skills Standard Course of Study integrates technology initiatives into multiple courses of study (English, Foreign Languages, Mathematics, Sciences, etc.) and provides curriculum guidelines and a matrix for Computer/Technology Skills for grades 1-8 and high school.
For More Information: http://www.dpi.state.nc.us/curriculum/computer.skills/index.html

Program: State Initiative: Ohio
Description: Ohio's Information Technology Competency Profile represents one of the most advanced approaches to integrating computer technology and information literacy subjects into all aspects of a school's curriculum. The Profile includes specific descriptions of different curriculum areas (Writing, Oral Communications, Scientific Inquiry, etc.) and the technical and non-technical competencies required to succeed in each area.
For More Information: http://itworksohio.org/ITCMP.html and http://www.ohioschoolnet.k12.oh.us
Program: University Initiative: Illinois Association of College and Research Libraries
Description: The Information Literacy Standards Implementation Taskforce of the Illinois ACRL Standards Committee has proposed a set of standards for Information Literacy and its applications to higher education.
For More Information: http://www.ala.org/acrl/ilcomstan.html

Program: University Initiative: University of Texas at Austin
Description: The Information Literacy Challenge: Addressing the Changing Needs of Our Students through Our Programs (1996) provides guidelines for educators and librarians involved with integrating information literacy concepts into different areas of education.
For More Information: http://staff.lib.utexas.edu/~beth/IRSQ/skills.html

Program: State Initiative: Utah
Description: The Utah State Board of Education has published Elementary and Secondary Core Curriculum Standards for Educational Technology, levels K-12. These guidelines describe specific competencies students should have at each grade level, including knowledge and skills in basic operations and concepts of technology; social, ethical and human issues; technology productivity tools; technology communications tools; technology research tools; and technology problem-solving and decision-making tools.
For More Information: http://www.usoe.k12.ut.us/curr/EdTech/newcore.html

Program: State Initiative: Wisconsin
Description: Wisconsin provides detailed information technology standards for students to meet by the end of grades 4, 8 and 12. In addition, the Wisconsin Association of Academic Librarians has adopted standards for Information Literacy Competencies and Criteria for Academic Libraries in Wisconsin, specifying knowledge, skills and abilities required for Information Literacy.
For More Information: http://facstaff.uww.edu/WAAL/infolit/ilcc.html and http://www.dpi.state.wi.us/dpi/standards

Program: State Initiative: West Virginia
Description: In 1990-1991, the state of West Virginia approved a comprehensive Basic Skills/Computer Education (BS/CE) program that began by providing computer hardware and educational software in kindergarten classes, moving the program up through the grades over the next decade. The 1999 paper West Virginia Story: Achievement Gains from a Statewide Comprehensive Instructional Technology Program (written in conjunction with the Milken Exchange on Education Technology) outlines the program and the results of state-wide testing of achievement in technology education.
For More Information: Not available online. ERIC Accession Number: ED429575
Program: University Initiative: Williams College
Description: As part of Williams College's Curricular Innovation strategic plan, the 2000 document Competencies and Requirements for Technological Competence and Digital Literacy describes tool-based skills and critical-thinking skills required to be considered digitally literate.
For More Information: http://www.williams.edu/go/strategicplanning/archive/2-08.html

Program: University Initiative: University of Washington
Description: The Health Services Department of the University of Washington has published Desktop Competencies and Internet Competencies required for Digital Literacy education.
For More Information: http://depts.washington.edu/hserv/teaching/diglit/diglit.htm
Influence of Educational Methodologies
Whenever possible, test questions (particularly performance-based test questions) will be based on examples that are consistent with one another and relate to real-world experiences (such as a joint classroom project that requires the use of multiple applications and methods of communication).
Competitive Advantage: Internet and Computing Core Certification and ECDL/ICDL
Topics: Overview, IT Career Roadmap, Vendor-independent, Candidate Verification, Localized Exams, Sample Questions

Program Basics
Name
IC: IC
I/ECDL: European Computer Driving Licence, branded International Computer Driving Licence outside Europe.

Overview
IC: IC is a global, validated, standards-based certification program for basic computing and Internet literacy. It provides specific guidelines for the knowledge and skills required to be a productive user of computer hardware, software, networks, and the Internet.
I/ECDL: ICDL is a structured training program, based on a publicly published syllabus, that demonstrates an individual has mastered the fundamental concepts of Information Technology (IT) and is able to use a personal computer and computer software applications at a fundamental level of competence.

Computer Vendor Relationships
IC: Vendor-independent.
I/ECDL: Vendor-independent.
Exam Content

Number of Exams
IC: Three
I/ECDL: Seven

Exam Topics
IC: 1. Computing Fundamentals; 2. Key Applications; 3. Living Online
I/ECDL: Basic Concepts of IT; Word Processing; Spreadsheets; Databases; Presentations; Information and Communication

Exam Methodology
I/ECDL: No standard set of exams available. Training and testing vendors determine and design tests for their locations using various methodologies. All exams must be approved by the ECDL Foundation.

Exam Objectives
IC: Standard exam objectives used worldwide, updated on a regular basis. Last updated 8/08.

Localized Versions
IC: IC is currently available in more than 120 countries.
I/ECDL: ICDL is currently available in more than 130 countries.
Additional Certifications
IC: IC can be a starting point for additional certifications, such as MOS, MCP, A+, i-Net+, Adobe Certified Associate, Microsoft Certified Application Specialist, etc.

Industry Support
IC: Computing Technology Industry Association (CompTIA); International Society for Technology in Education (ISTE); American Council on Education (ACE).
I/ECDL: Member organizations of the Council of European Professional Informatics Societies (CEPIS); Oxford and Cambridge (OCR); Scottish Qualification Authority (SQA).
Exam Creation

Research
IC: Government initiatives, existing literacy programs, Digital Divide research.

Objectives Updates
I/ECDL: Syllabus updated periodically by the ICDL Foundation. A three-year standard revision cycle has been followed in the past.

Psychometric Validation
IC: Third-party psychometric validation conducted with supporting documentation.
I/ECDL: Psychometric validation of tests is conducted by ICDL Foundation staff.
Exam Delivery

Exam Administration
IC: Exams are administered via computer for performance-based and knowledge-based exam questions.
I/ECDL: Variable technology based on multiplicity of vendors. Paper- and computer-based testing dependent on testing vendor.

Availability/Coverage
IC: IC is available through Certiport's 12,000 iQcenters in 120 countries worldwide.

Scoring
IC: Scores are provided to an individual upon completion of each exam and printed on an exam results report. Exam scores are also included in the global results database for future reference.
I/ECDL: Results are manually stamped by the test proctor on the candidate's skills card. Scores are not recorded or tracked in a global database.

Environment
IC: Proctored testing environment.
I/ECDL: Proctored testing environment.
Security
IC: A single electronic repository of information provides monitoring and reporting capabilities to detect examination fraud.
Candidate Services

Certification
IC: The IC Certificate is delivered worldwide upon successful completion of all three exams. The certificate can also be viewed online in multiple languages.

Candidate Verification
IC: Employers can electronically verify candidate certifications through Certiport.
I/ECDL: Verification of candidate completion is not currently available.

Candidate Transcripts
I/ECDL: Varies depending on each training vendor.
The tests used at the end of the training vary by vendor and by location. Not only do the test questions vary, but the very manner of testing varies: some tests are computer-based, and some are paper-based. Often candidates themselves can choose how they would like to take the test. In some countries the same candidate may score 80% with one test vendor and 30% with another on the same material.
Over the last couple of years, ICDL has begun using the term "certification" as it has witnessed the growth of certification programs worldwide. Deciding to ride this wave, ICDL has tried to pass itself off as an industry certification, using the terminology in hopes that those who hear it will not question the assumption.
Why is this a problem? Why should a school district, state, or government agency care about implementing a certification rather than ICDL? Because these groups need a digital literacy measurement standard, not a training syllabus. It is critical that when people under their stewardship complete the program, their skills are measured in the same way, no matter how they obtained the knowledge. Using ICDL is no different from what they currently have: a list of objectives they would like their constituents to master.
An example: the State of Texas has a list of standard computer skills that it would like to ensure all of its students have. This list is similar to what ICDL puts out to training vendors. The State of Texas can give this list to training vendors (or its schools), and students could be trained in a number of ways. However, what the State of Texas really cares about is that all of the students are measured against those standards in the same way. With ICDL, each school could test differently; this is what they are already doing. ICDL is no improvement; it just adds to their costs. With IC, the State can be assured that each student will be measured in exactly the same way: same objectives, same process. A student can take their results anywhere and prove the same outcome. An employer hearing from a candidate that they are IC certified knows exactly what they are getting.
A true certification implements a standardized measurement process that does not vary by location.
A true certification utilizes professional exam development.
A true certification requires psychometric validation.
A true certification is portable: it is accepted by local and global educational and industry authorities.
In most cases, the IC team should not focus on the downsides of ICDL, but rather on the strengths of IC. By implication, audiences can be left to conclude that ICDL does not match those strengths. By focusing on the positives of IC and why each factor is important to the group or organization reviewing IC, the IC team equips reviewers to apply these points in their own comparison and review of ICDL.