1 Center for Research in Ubiquitous Computing, National University of Computer and Emerging Sciences, Karachi, Pakistan
2 ALADIN Solutions, Karachi, Pakistan
{noman.islam, zubair.shaikh}@nu.edu.pk
1 Introduction
Usability evaluation is not a new topic. A substantial body of research has been reported on the usability evaluation of common graphical user interfaces, i.e. desktop applications, websites, mobile apps etc. This section starts with a brief discussion of common techniques for the evaluation of graphical user interfaces. During the literature review, we found some research works that performed qualitative, feature-based comparisons of ontology editors. This section also discusses these research efforts.
The usability of a system can be defined as the ease of use and learnability of its interface. Various approaches for usability evaluation have been proposed in the literature. Fig. 1 provides a summary of these techniques, which can be broadly classified into usability testing, usability inspection and inquiry methods [2].
Usability testing is performed with the users of a product to identify any serious problems in a user interface. For example, one usability testing technique is the coaching method, which involves a set of representative users and an expert called the coach. The users ask the expert questions about the system, and the interaction is observed to identify usability problems. Another example of usability testing is the co-discovery method, where two test users are asked to perform a task together using the GUI, helping each other along the way. Remote usability testing can also be performed when the testers and participants are at different locations. In the shadowing method, the expert sits next to the tester to explain the test user's behavior. Finally, in the teaching method, the test user first familiarizes himself or herself with the system and then educates a novice user about its working.
Table 1: Summary of feature-based comparisons of ontology editors

| Approach | Basic idea / motivation | Parameters compared | Tools compared | Any framework / consolidation |
|---|---|---|---|---|
| Kapoor and Sharma [3] | To evaluate tools based on interoperability, openness, easiness to update and maintain, market status and penetration | Extensibility, storage, import/export, merging, inference engine, exception handling, consistency checking, collaboration, library, visualization, versioning | Apollo, Protégé 3.4, IsaViz, SWOOP | No |
| Mizoguchi [4] | Focus on dynamic aspects of ontology development | Methodological support, collaboration, theory awareness, architecture, interoperability, ontology/model instance, instance definition, inference | OntoEdit, Hozo, WebODE, Protégé | No |
| Norta et al. [5] | Comparison is done in the context of electronic B2B collaboration | Functional requirements: collaboration, development in different human languages, multi-ontology support, verification, visual abstraction, natural language support, automated information extraction. Non-functional requirements: modifiability, inerrability, portability, interoperability, performance, security, usability, flexibility, feasibility, scalability, applicability, completeness etc. | NeOn, Protégé, CmapTools, TopBraid Composer, HOZO, OntoBroker | Yes |
| Alatrish [6] | Comparison of tools for convenience and flexibility | Extensibility, backup management, storage, imports/exports, methodological support, inferencing, consistency checking, usability etc. | Apollo, OntoStudio, Protégé, Swoop, TopBraid Composer Free Edition | No |
| Islam et al. [7] | Comparison of ontology editing tools in the context of the semantic web | Extensibility, collaboration, architecture, import/export languages, supporting tools etc. | Protégé, OntoEdit (Free), DOE, IsaViz, Ontolingua, Altova SemanticWorks, OilEd, WebODE, pOWL, SWOOP, TopBraid Composer, NeOn Toolkit, Morla, OBO-Edit, Hozo, OntoBuilder, WSMO Studio | No |
Following are some of the usability studies reported for the comparison of ontology editors:
Duineveld et al. (2000). The earliest work on the evaluation of ontology development tools is reported by Duineveld et al. [8]. A framework is proposed that evaluates the tools along three dimensions. The general dimension investigates the tool for features that are also found in other programs. The ontology dimension analyzes the tool for features specific to ontology development. The third dimension, cooperation, evaluates the tool for features required to support collaborative ontology development.
Khondoker et al. (2010). Khondoker et al. performed an online survey in which users were asked to fill in a questionnaire [10]. The questionnaire contained questions about four usability components, i.e. tool, task, environment and user. The participants included beginners as well as experienced ontology developers.
Following are some of the studies that have used usability evaluation for the validation of their proposed tools.
Bernstein et al. (2006). Bernstein et al. proposed an ontology editor called GINO, i.e. guided input natural language ontology editor. The usability evaluation of the proposed tool involved six participants with no experience in the semantic web and ontologies. The participants were first introduced to the concepts and then asked to create ontologies. Using a key logger, various observations were recorded during the evaluation session, e.g. common mistakes made by users, frequency of usage of the backspace key, and incorrect data entries. Based on this information, a usability score was computed. At the end of the session, participants were asked a few questions to obtain feedback about the usability of the tool.
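As Table 2 notes, Bernstein et al.'s quantification is based on the System Usability Scale (SUS) [13]. As a purely illustrative sketch, the following Python snippet computes the standard SUS score from the ten Likert-scale questionnaire items defined by Brooke [13]; the sample responses are hypothetical.

    # System Usability Scale (SUS) scoring, following Brooke [13].
    def sus_score(responses):
        # responses: ten Likert answers, 1 (strongly disagree) .. 5 (strongly agree)
        if len(responses) != 10 or any(not 1 <= r <= 5 for r in responses):
            raise ValueError("expected ten responses in the range 1..5")
        # Odd-numbered items contribute (answer - 1); even-numbered items
        # contribute (5 - answer). The sum is scaled by 2.5 to give 0..100.
        total = sum((r - 1) if i % 2 == 0 else (5 - r)
                    for i, r in enumerate(responses))
        return total * 2.5

    # Hypothetical responses from one participant:
    print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # -> 85.0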
Fortuna et al. (2007). OntoGen, an ontology editor, was presented by Fortuna et al. [6]. For the usability evaluation of the tool, a study was performed with two groups of users. The first group had a good knowledge of computer science, while the second group had more knowledge about cognitive processes. The evaluation session lasted 90 minutes and comprised three phases. The first phase introduced the participants to the purpose of the tasks. In the second phase, users were first asked to fill in an initial questionnaire; the participants were then asked to test the system and create sample ontologies, and a final questionnaire was filled in by the participants to report their experiences with the system. In the concluding phase, an informal discussion was held to address any final thoughts or questions that arose during the usability evaluation session.
We close this section with a brief discussion of usability evaluation work on tools related to ontology development. Funk et al. presented a controlled language for ontology editing and implementation software based on NLP tools [11]. The usability evaluation process comprised a pre-test questionnaire, reading a manual, carrying out two ontology development tasks and filling out a post-test questionnaire. The pre-test questionnaire asked participants about their previous knowledge of ontologies and controlled languages. The post-test questionnaire comprised questions to measure the usability score of the tool. There were 15 evaluators with different levels of expertise and experience.
García et al. performed a usability evaluation of OWL-VisMod, a visual modeling tool for OWL ontologies [12]. The evaluation comprised 26 questions, categorized into 22 closed-ended and 4 open-ended questions. Users were asked to download the tool and then answer questions about its effectiveness, i.e. whether the visualization achieves the desired objective in the opinion of the user.
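Closed-ended questions of this kind are commonly scored on a Likert scale and summarized per question. The following sketch is purely illustrative; the question labels and responses are hypothetical and not taken from the OWL-VisMod study.

    from statistics import mean

    # Hypothetical 5-point Likert responses (1..5) per closed-ended question.
    responses = {
        "The visualization achieves its objective": [4, 5, 3, 4, 4],
        "The tool is easy to learn":                [3, 4, 4, 2, 3],
    }

    for question, scores in responses.items():
        # Agreement rate: share of participants answering 4 or 5.
        agree = sum(s >= 4 for s in scores) / len(scores)
        print(f"{question}: mean={mean(scores):.2f}, agreement={agree:.0%}")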
Table 2: Summary of usability evaluation proposals

| Approach | Summary | Tools compared | Pre-session briefing | Post-session questionnaire | Usability evaluation procedure | Participants' ontology development experience | Usability quantification |
|---|---|---|---|---|---|---|---|
| Duineveld et al. (2000) | To evaluate a tool from different dimensions, i.e. general user interface, ontology support and cooperation | Ontolingua, WebOnto, ProtégéWin, OntoSaurus, ODE, KADS22 | n/a | yes | Evaluate tools based on the three dimensions | yes | no |
| García-Barriocanal et al. (2005) | Evaluation of ontology editors based on heuristic assessment and user testing | Protégé 2000, OntoEdit, OILEd, KSL Ontology Editor, WebODE, WebOnto, KADS22 | yes | yes | Create a small ontology, load a large ontology, search a class and property, perform annotation, update classes and properties | nil | no |
| Khondoker et al. (2010) | Using an online survey, obtain feedback of users | Protégé, SWOOP, TopBraid Composer, Internet Business Logic, OntoTrack, IHMC Cmap Ontology editor | no | yes | Get feedback from users based on their experience of using the tools | yes | no |
| Bernstein et al. (2006) | To evaluate an ontology editor GINO, a natural language ontology editor | GINO | yes | yes | Create class, create sub-class, add data and object property, add instance and values, change the value | nil | yes, based on SUS [13] |
| Fortuna et al. (2007) | Integrates machine learning and text mining algorithms in the user interface | OntoGen | yes | yes | Create an ontology, and create two ontologies based on created ontologies | minimal | no |
4 Discussions
References
[1] Cardoso, J., The Semantic Web Vision: Where are We? IEEE Intelligent Systems, 2007: p. 22-26.
[2] Usability Evaluation. [cited June 2013]; Available from: http://www.usabilityhome.com.
[3] Kapoor, B. and S. Sharma, A Comparative Study of Ontology Building Tools for Semantic Web Applications. International Journal of Web & Semantic Technology, 2010. 1(3).
[4] Mizoguchi, R., Ontology Engineering Environments, in Handbook on Ontologies, S. Staab and R. Studer, Editors. 2003. p. 275-295.
[5] Norta, A., L. Carlson, and R. Yangarber. Utility Survey of Ontology Tools. in International Enterprise Distributed Object Computing Conference Workshops. 2010.
[6] Fortuna, B., M. Grobelnik, and D. Mladenic. OntoGen: Semi-automatic Ontology Editor. in 2007 Conference on Human Interface. 2007. Beijing, China.
[7] Islam, N., A.Z. Abbasi, and Z.A. Shaikh. Semantic Web: Choosing the right methodologies, tools and standards. in International Conference on Information and Emerging Technologies (ICIET). 2010. Karachi, Pakistan.
[8] Duineveld, A.J., R. Stoter, M.R. Weiden, B. Kenepa, and V.R. Benjamins, A comparative study of ontological engineering tools. International Journal of Human-Computer Studies, 2000. 52(6).
[9] García-Barriocanal, E., M.A. Sicilia, and S. Sánchez-Alonso, Usability Evaluation of Ontology Editors. Knowledge Organization, 2005. 32(1): p. 1-9.
[10] Khondoker, M.R. and P.M. Comparing Ontology Development Tools Based on an Online Survey. in World Congress on Engineering. 2010. London, UK.
[11] Funk, A., V. Tablan, K. Bontcheva, H. Cunningham, B. Davis, and S. Handschuh. CLOnE: Controlled Language for Ontology Editing. in 6th International Semantic Web Conference and 2nd Asian Semantic Web Conference (ISWC/ASWC). 2007. Busan, Korea.
[12] García, J., F.J. García-Peñalvo, R. Therón, and P.O. de Pablos, Usability Evaluation of a Visual Modelling Tool for OWL Ontologies. Journal of Universal Computer Science, 2011. 17(9): p. 1299-1313.
[13] Brooke, J., SUS: A quick and dirty usability scale, in Usability Evaluation in Industry, P.W. Jordan, et al., Editors. 1996.