
Evaluation Framework

Arghavan Akbarieh edited this page Nov 16, 2025 · 1 revision

This framework explains how the ontologies in the project are assessed. It is divided into five main axes: Alignment, Accessibility, Health and Quality, Quantity, and Robustness. Each axis contains a set of points that describe what is evaluated and why it matters.

1. Connectivity Axis

Linkage to upper ontologies:

Checks whether the ontology is linked to widely used upper or cross-domain vocabularies. Typical examples include FOAF, VANN, and similar schema-level vocabularies, in addition to the base languages RDF, RDFS, OWL and XML. Linking to upper ontologies improves semantic coherence and reuse.

Linkage to existing AECO ontologies:

Assesses whether the ontology connects to other ontologies in the Architecture, Engineering, Construction and Operations (AECO) domain. Examples include DOT and BROT. This shows how well the ontology fits into sector-specific semantic ecosystems.

Linkage to domain meta-schema ontologies:

Examines whether the ontology uses or aligns with domain meta-schemas. Examples include Brick, RealEstateCore, BOT, Haystack, SAREF4Building and SSN/SOSA. These schemas support consistent modelling practices across digital built environment systems.

2. Accessibility Axis

Conceptual data model available:

Checks whether the relationships between classes are mapped in a conceptual data model. This is the minimum requirement for any ontology. For example, the MPO ontology satisfies this point but does not meet some of the subsequent accessibility points. Ontologies published at a URI usually also provide a conceptual model.

Accessible as a serialisation:

Assesses whether the ontology is available in at least one machine-readable serialisation format such as Turtle (.ttl), RDF/XML (.xml), JSON-LD (.json) or N-Triples (.nt). This goes beyond simply having a conceptual model.

Accessible as a URI:

Determines whether the ontology is published at a persistent URI so that others can access it. A serialisation file alone is not enough: it must be hosted at a stable location to count as accessible at this level.
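Whether a URI actually dereferences to an RDF serialisation can be probed with a content-negotiated HTTP request. A sketch using only the standard library; the accepted media types and the example URI are assumptions for illustration:

```python
import urllib.request

# RDF media types requested via content negotiation (illustrative selection).
ACCEPT = "text/turtle, application/rdf+xml;q=0.9, application/ld+json;q=0.8"

def build_request(uri):
    """Build a content-negotiated request for an RDF serialisation at the URI."""
    return urllib.request.Request(uri, headers={"Accept": ACCEPT})

def dereferences(uri, timeout=10):
    """True if the URI answers with an RDF media type (performs a network call)."""
    try:
        with urllib.request.urlopen(build_request(uri), timeout=timeout) as resp:
            ctype = resp.headers.get("Content-Type", "")
            return resp.status == 200 and any(
                t in ctype for t in ("turtle", "rdf+xml", "ld+json"))
    except OSError:
        return False

# Inspect the negotiated request without touching the network.
req = build_request("https://example.org/onto")
print(req.get_header("Accept"))
```

Checking the returned media type, not just the status code, distinguishes a genuinely dereferenceable ontology from a URI that merely serves an HTML landing page.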

3. Documentation & Reuse

Clearly documented in natural language:

Assesses whether the ontology provides human-readable explanations for its classes and properties. Rate as No when no documentation is provided. Rate as Yes when any of the following applies:

  • Documentation is provided in a publication even if no URI exists, as in UNO or UPO.

  • Documentation is provided at the ontology URI even if there is no paper, as in DICE or DOT.

  • Documentation is embedded directly inside the serialised file, as in BCAO.

Use of annotations in machine-readable form:

Checks whether the ontology uses annotations such as rdfs:comment or dcterms:description. This ensures that natural language explanations are captured in the serialised version. Rate as Yes if these annotations appear either at the URI or in any downloadable serialisation.

Reused or extended:

Assesses whether the ontology has been reused or extended by others. This can often be observed through its outgoing links to other ontologies and through citations.

Please read more in our publication.
