Evaluation is a vital research interest in the digital library domain, as evidenced by the growing literature in the field’s main conferences and journals. However, it is difficult to navigate this extensive corpus. For these reasons the DiLEO ontology has been developed to assist the exploration of important concepts and the discovery of trends in the evaluation of digital libraries. DiLEO is a domain ontology that conceptualizes the DL evaluation domain by correlating its key entities and providing reasoning paths that support the design of evaluation experiments.
2. structure
- Introduction
- On the evaluation of digital libraries
- Modeling the evaluation of digital libraries
- An ontological representation of the digital library evaluation domain
- Ontologies
- DiLEO presentation
3. in short
- an ontology
- for comparing evaluation instances
- for supporting digital library evaluation planning
4. motive
- The development of a schema for the description and comparison of evaluation instances.
- The utmost aim is to bridge the disagreement among evaluation models through a structured and formal meta-model.
“the lack of globally accepted abstract evaluation models and methodologies can be counterbalanced by collecting, publishing and analyzing current research activities” [Fuhr et al., 2007]
- At the same time, to develop a digital library evaluation planning tool.
“Every evaluation should begin with a well-crafted plan. Writing an evaluation plan is not just common sense. It is an essential roadmap to successful evaluation!” [Reeves et al., 2003]
6. modeling evaluation
- We refer not to digital library evaluation models, but to the modeling of the evaluation process itself.
- Five main works:
- Tefko Saracevic’s classification [2004]
- the Evaluation Computer [Kovacs & Micsik, 2004]
- the PRET A Rapporter framework [Blandford et al., 2007]
- the 5SQual model [Gonçalves et al., 2007]
- the Zachman Framework
7. Saracevic’s Classification
- A classification of evaluation studies according to:
- what elements were evaluated (Constructs)
- what the goals, the perspectives and so on were (Context)
- which aspects were of interest to us (Criteria)
- how the evaluation was conducted (Methodology)
8. Saracevic’s Classification
- Divided the evaluation studies into those proposing evaluation models and those reporting the results of evaluation initiatives.
- Saracevic formed the concept of Context to encapsulate all high-level questions, such as why one evaluates, what his/her target is, etc.
- At the same time he developed a category to classify studies according to what was evaluated (Constructs) and two categories (Criteria, Methodology) to classify studies according to how they are conducted.
9. the evaluation computer
- A faceted classification of different views, which synthesize an instance of an evaluation, or an ‘evaluation atom’.
- A calculation of the distance between two ‘atoms’ in a space.
11. PRET A Rapporter
- An evaluation framework that emphasizes the context of work.
- According to the authors, the framework holds features that assist the planning of an evaluation.
- The framework structures evaluations according to:
- the purpose of the evaluation
- the resources and the constraints
- the ethical considerations
- the data gathering
- the analysis of data
- the reporting of findings
12. PRET A Rapporter
- PRET A Rapporter proceeds case study-wise: in practice it focuses on particular dimensions in each of the three indicative studies it presents.
- a formative evaluation of a system
- a comparative evaluation of two interfaces of the same database
- a qualitative study of a system in actual use.
13. 5SQual
- The model is based on 5S (Streams, Structures, Spaces, Scenarios, & Societies), the well-known framework for the description of digital libraries.
- The model defines a set of dimensions (criteria) that correspond to the constituent elements of digital libraries.
- The authors refer to a series of studies where these criteria are applied to digital libraries such as the ACM DL, CITIDEL and NDLTD.
15. the Zachman framework
- The Zachman Framework is a framework for enterprise architecture, developed by John Zachman at IBM in the early 1980s.
- The framework reflects a formal and high-level structured view of an organization: a taxonomy for organizing the structural elements of the organization under the lens of specific perspectives.
- It classifies and organizes in a two-dimensional space all the needed concepts, so that they are homogeneous and express different planning perspectives:
- according to the participants (alternative perspectives)
- according to processes (questions)
16. the Zachman framework
| | What (Data) | How (Process) | Where (Location) | Who (Worker) | When (Timing) | Why (Motivation) |
|---|---|---|---|---|---|---|
| Scope [Planner] | Core Business Concepts | Major Business Transformations | Business Locations | Principal Business Actors | Business Events | Mission & Goals |
| Business Model [Owner] | Fact Model | Workflow Models | Business Connectivity Map | Business Tasks | Milestones | Policy Charter |
| System Model [Evaluator] | Data Model | BRScripts | Platform & Communications Map | Behavior Allocation | State Transition Diagrams | Rule Book |
| Technology Model [Evaluator] | Relational Database Design | Program Specifications | Technical Platform & Communications Design | Procedure & Interface Specifications | Work Queue & Scheduling Designs | Rule Specifications |
| Detail representation [Evaluator] | Database Schema | Source Code | Network | Procedures & Interfaces | Work Queues & Schedules | Rule Base |
| Functioning Business [Evaluator] | Operational Database | Object Code | Operational Network | Operational Procedures & Interfaces | Operational Work Queues & Schedules | Operational Rules |
18. why an ontology?
- Formal models that help us:
- understand a domain of knowledge; in this case the domain of digital library evaluation.
- structure a knowledge base to collate different instances; in this case instances portraying evaluations of digital libraries.
- draw logical inferences; in this case to assist digital library evaluation planning.
19. why an ontology?
- The previous schemas are located vertically in specific research areas. For example, the PRET A Rapporter framework takes an HCI view of things, while 5SQual examines the dimension of quality.
- They define concepts (constituents), either of the digital libraries or of the evaluation, but not the relationships between them.
- The purpose is to use the ontology relationships to highlight the links between the concepts and to strengthen them semantically.
- It has the potential to express paths which will reveal alternative or complementary concepts and threads.
20. ontologies
- We use elements such as:
– classes (representing concepts, entities, etc.)
– relationships (linking the concepts together)
– functions (constraining the relationships in particular ways)
– axioms (stating true facts)
– instances (reflecting examples of reality)
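As a rough sketch, these elements can be expressed in OWL’s Turtle syntax. The namespace and exact term spellings below are assumptions for illustration, not DiLEO’s published vocabulary:

```turtle
@prefix owl:  <http://www.w3.org/2002/07/owl#> .
@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
@prefix :     <http://example.org/dileo#> .   # hypothetical namespace

# classes: concepts of the domain
:Activity a owl:Class .
:Means    a owl:Class .

# axiom: an activity is never a means
:Activity owl:disjointWith :Means .

# relationship: links activities to the means they are performed in
:isPerformedIn a owl:ObjectProperty ;
    rdfs:domain :Activity ;
    rdfs:range  :Means .

# instance: an example of reality
:analyze a :Activity ;
    :isPerformedIn :logging_studies .
:logging_studies a :Means .
```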
21. engineering process
- DiLEO is the result of a systematic process:
- literature review and study
- selecting the proper concepts
- continuously exploring the proper relationships
- Expressed in OWL
- Validation:
- through discussion and practice in the “Exploring perspectives on the evaluation of digital libraries” tutorial at ECDL 2010.
- through a focus group with field researchers.
22. a typical presentation of an evaluation
- Developed in OWL with the Protégé Ontology Editor
- http://protege.stanford.edu/
27. use of ontology
- We use threads of the ontology (paths) to express explicitly a process or a requirement. For example:
- Activities/analyze - isPerformedIn - Means/logging studies - hasMeansType - Means Type/quantitative
Activities = {record, measure, analyze, compare, interpret, report, recommend}
Activities -isPerformedIn-> Means = {comparison studies, expert studies, laboratory studies, field studies, logging studies, surveys}
Means -hasMeansType-> Means Types = {qualitative, quantitative}
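The same path can be written as triples. A minimal sketch in Turtle, assuming a hypothetical namespace for the DiLEO terms:

```turtle
@prefix : <http://example.org/dileo#> .   # hypothetical namespace

# Activities/analyze - isPerformedIn - Means/logging studies
:analyze a :Activity ;
    :isPerformedIn :logging_studies .

# Means/logging studies - hasMeansType - Means Type/quantitative
:logging_studies a :Means ;
    :hasMeansType :quantitative .

:quantitative a :Means_Type .
```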
28. use of ontology
- Levels/individual level - isAffectedBy - Dimensions/performance measurement - isFocusingOn - Objects/usage of content/usage of data - isOperatedBy - Subjects/human agents - isCharacterizedby - Characteristics/experience
- ... isCharacterizedby - Characteristics/discipline
- ... isCharacterizedby - Characteristics/age
Levels = {content level, processing level, engineering level, interface level, individual level, institutional level, social level}
Levels -isAffectedBy-> Dimensions = {effectiveness, performance measurement, service quality, technical excellence, outcomes assessment}
Dimensions -isFocusingOn-> Objects = {usage of content: usage of data, usage of metadata}
Objects -isOperatedBy-> Subjects = {system agents, human agents}
Subjects -isCharacterizedby-> Characteristics = {age, count, discipline, experience, profession}
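A path like this could also be followed with a query. The sketch below mirrors the bare-IRI style of the SPARQL query shown later in the deck; the class and property spellings are assumptions based on the figure:

```sparql
SELECT DISTINCT ?characteristic
WHERE {
  ?level a <Individual_Level> .
  ?level <isAffectedBy> ?dim .
  ?dim a <Performance_Measurement> .
  ?dim <isFocusingOn> ?object .
  ?object <isOperatedBy> ?subject .
  ?subject a <Human_Agents> .
  ?subject <isCharacterizedby> ?characteristic .
}
```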
30. query examples
- We query the knowledge base by issuing SPARQL queries.
- Assume that we want to plan an evaluation with log files.
- During the evaluation planning we are interested in knowing what the research questions of relevant studies were.
- To mine this information from the knowledge base we need to submit a SPARQL query.
31. query examples
- the query and the answers will have this form:

SPARQL query:

SELECT DISTINCT ?Research_QuestionsInst ?Means
WHERE {
  ?Research_QuestionsInst a <Research_Questions> .
  ?Dimensions a <Technical_Excellence> .
  ?Activity a <Record> .
  ?Means a <Logs> .
  ?Research_QuestionsInst <isBelongingTo> ?Dimensions .
  ?Dimensions <hasConstituent> ?Activity .
  ?Activity <isPerformedIn> ?Means .
}

Answers: the research questions (in the first column) from two studies (wm2008c and nzdl2000) that used log files (in the second column).
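For illustration, here is a minimal Turtle sketch of instance data that would satisfy this query’s graph pattern. The instance names are hypothetical; only the class and property names come from the query above:

```turtle
# a research question of the wm2008c study, tied to technical excellence,
# whose record activity was performed in log analysis
<rq_wm2008c> a <Research_Questions> ;
    <isBelongingTo> <dim_te_wm2008c> .

<dim_te_wm2008c> a <Technical_Excellence> ;
    <hasConstituent> <act_record_wm2008c> .

<act_record_wm2008c> a <Record> ;
    <isPerformedIn> <means_logs_wm2008c> .

<means_logs_wm2008c> a <Logs> .
```

Against such data, the query would bind ?Research_QuestionsInst and ?Means to the research-question and log-file instances of each matching study.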
32. sources
- more on DiLEO:
- G. Tsakonas & C. Papatheodorou (2011). “An ontological representation of the digital library evaluation domain”. Journal of the American Society for Information Science and Technology, 62(8), 1577–1593.
- related readings are located in:
- http://www.mendeley.com/groups/731821/dileo/