This repository was archived by the owner on Oct 8, 2020. It is now read-only.

Commit 81ab834

Merge branch 'release/0.2.0'
2 parents adce8f8 + 68a7e6e

141 files changed: +6239 additions, -2362 deletions


LICENSE

Lines changed: 201 additions & 674 deletions
Large diffs are not rendered by default.

README.md

Lines changed: 86 additions & 16 deletions
@@ -1,6 +1,30 @@
+
+
 # SANSA Inference Layer
+[![Maven Central](https://maven-badges.herokuapp.com/maven-central/net.sansa-stack/sansa-inference-parent_2.11/badge.svg)](https://maven-badges.herokuapp.com/maven-central/net.sansa-stack/sansa-inference-parent_2.11)
 [![Build Status](https://ci.aksw.org/jenkins/job/SANSA%20Inference%20Layer/job/develop/badge/icon)](https://ci.aksw.org/jenkins/job/SANSA%20Inference%20Layer/job/develop/)
 
+**Table of Contents**
+
+- [SANSA Inference Layer](#)
+  - [Structure](#structure)
+    - [sansa-inference-common](#sansa-inference-common)
+    - [sansa-inference-spark](#sansa-inference-spark)
+    - [sansa-inference-flink](#sansa-inference-flink)
+    - [sansa-inference-tests](#sansa-inference-tests)
+  - [Setup](#setup)
+    - [Prerequisites](#prerequisites)
+    - [From source](#from-source)
+    - [Using Maven pre-built artifacts](#)
+    - [Using SBT](#using-sbt)
+  - [Usage](#usage)
+    - [Example](#example)
+  - [Supported Reasoning Profiles](#)
+    - [RDFS](#rdfs)
+      - [RDFS Simple](#rdfs-simple)
+    - [OWL Horst](#owl-horst)
+
+
 ## Structure
 ### sansa-inference-common
 * common datastructures
@@ -81,7 +105,7 @@ with `VERSION` being the released version you want to use.
   </snapshots>
 </repository>
 ```
-'2'. Add dependency to your pom.xml
+2\. Add dependency to your pom.xml
 
 For Apache Spark
 ```xml
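
(The hunk is cut off at the opening `xml` fence, so the Maven coordinates themselves are not visible in this diff. For the [Using SBT](#using-sbt) route mentioned in the table of contents, the equivalent declaration would presumably look like the sketch below; the artifact id `sansa-inference-spark_2.11` is an assumption inferred from the `sansa-inference-parent_2.11` badge above, and `VERSION` stands for the released version you want to use.)

```scala
// build.sbt -- hypothetical sketch; the artifact id is inferred from the
// sansa-inference-parent_2.11 Maven Central badge and may differ in practice.
libraryDependencies += "net.sansa-stack" % "sansa-inference-spark_2.11" % "VERSION"
```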
@@ -123,24 +147,70 @@ and for Apache Flink add
 where `VERSION` is the released version you want to use.
 
 ## Usage
+Besides using the Inference API in your application code, we also provide a command line interface with various options that allow for a convenient way to use the core reasoning algorithms:
 ```
 RDFGraphMaterializer 0.1.0
 Usage: RDFGraphMaterializer [options]
-
-
-  -i <file> | --input <file>
-        the input file in N-Triple format
-  -o <directory> | --out <directory>
-        the output directory
-  --single-file
-        write the output to a single file in the output directory
-  --sorted
-        sorted output of the triples (per file)
-  -p {rdfs | owl-horst} | --profile {rdfs | owl-horst}
-        the reasoning profile
-  --help
-        prints this usage text
+
+
+  -i, --input <path1>,<path2>,...
+        path to file or directory that contains the input files (in N-Triples format)
+  -o, --out <directory>   the output directory
+  --properties <property1>,<property2>,...
+        list of properties for which the transitive closure will be computed (used only for profile 'transitive')
+  -p, --profile {rdfs | rdfs-simple | owl-horst | transitive}
+        the reasoning profile
+  --single-file           write the output to a single file in the output directory
+  --sorted                sorted output of the triples (per file)
+  --parallelism <value>   the degree of parallelism, i.e. the number of Spark partitions used in the Spark operations
+  --help                  prints this usage text
+```
+This can easily be used when submitting the job to Spark (resp. Flink), e.g. for Spark
+
+```bash
+/PATH/TO/SPARK/sbin/spark-submit [spark-options] /PATH/TO/INFERENCE-SPARK-DISTRIBUTION/FILE.jar [inference-api-arguments]
+```
+
+and for Flink
+
+```bash
+/PATH/TO/FLINK/bin/flink run [flink-options] /PATH/TO/INFERENCE-FLINK-DISTRIBUTION/FILE.jar [inference-api-arguments]
+```
+
+In addition, we also provide shell scripts that wrap the Spark (resp. Flink) deployment and can be used by first
+setting the environment variable `SPARK_HOME` (resp. `FLINK_HOME`) and then calling
+```bash
+/PATH/TO/INFERENCE-DISTRIBUTION/bin/cli [inference-api-arguments]
 ```
+(Note that setting Spark (resp. Flink) options isn't supported here and has to be done via the corresponding config files.)
+
 ### Example
 
-`RDFGraphMaterializer -i /PATH/TO/FILE/test.nt -o /PATH/TO/TEST_OUTPUT_DIRECTORY/ -p rdfs` will compute the RDFS materialization on the data contained in `test.nt` and write the inferred RDF graph to the given directory `TEST_OUTPUT_DIRECTORY`.
+```bash
+RDFGraphMaterializer -i /PATH/TO/FILE/test.nt -o /PATH/TO/TEST_OUTPUT_DIRECTORY/ -p rdfs
+```
+will compute the RDFS materialization on the data contained in `test.nt` and write the inferred RDF graph to the given directory `TEST_OUTPUT_DIRECTORY`.
+
+## Supported Reasoning Profiles
+
+Currently, the following reasoning profiles are supported:
+
+##### RDFS
+
+###### RDFS Simple
+
+A fragment of RDFS that covers the most relevant vocabulary and provably
+preserves the original RDFS semantics, while avoiding vocabulary and axiomatic
+information that only serves to reason about the structure of the language
+itself and not about the data it describes.
+It is composed of the reserved vocabulary
+`rdfs:subClassOf`, `rdfs:subPropertyOf`, `rdf:type`, `rdfs:domain` and `rdfs:range`.
+
+More details can be found in
+
+Sergio Muñoz, Jorge Pérez, Claudio Gutierrez:
+*Simple and Efficient Minimal RDFS.* J. Web Sem. 7(3): 220-234 (2009)
+##### OWL Horst
+OWL Horst is a fragment of OWL proposed by Herman ter Horst [1], defining an "intensional" version of OWL sometimes also referred to as pD\*. It can be materialized using a set of rules that extends the set of RDFS rules. OWL Horst is considered one of the most common OWL flavours for scalable OWL reasoning, bridging the gap between the infeasible OWL Full and the low expressiveness of RDFS.
+
+[1] Herman J. ter Horst:
+*Completeness, decidability and complexity of entailment for RDF Schema and a semantic extension involving the OWL vocabulary.* J. Web Sem. 3(2-3): 79-115 (2005)
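
The profiles above are rule sets that get materialized by forward chaining, i.e. the rules are applied to the triples until a fixpoint is reached. As a rough, self-contained illustration of that idea (explicitly not SANSA's actual API; all class and object names below are made up for the example), the following Spark sketch applies two RDFS rules: the transitive closure of `rdfs:subClassOf` (rdfs11) and type propagation along it (rdfs9).

```scala
import org.apache.spark.rdd.RDD
import org.apache.spark.sql.SparkSession

// Minimal triple representation for this sketch; SANSA uses its own RDF data model.
case class Triple(s: String, p: String, o: String)

object RdfsClosureSketch {

  val SubClassOf = "http://www.w3.org/2000/01/rdf-schema#subClassOf"
  val RdfType    = "http://www.w3.org/1999/02/22-rdf-syntax-ns#type"

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("rdfs-closure-sketch")
      .master("local[*]")
      .getOrCreate()
    val sc = spark.sparkContext

    // Toy input graph.
    val triples: RDD[Triple] = sc.parallelize(Seq(
      Triple("ex:Student", SubClassOf, "ex:Person"),
      Triple("ex:Person",  SubClassOf, "ex:Agent"),
      Triple("ex:alice",   RdfType,    "ex:Student")
    ))

    // Rule rdfs11: compute the transitive closure of rdfs:subClassOf by
    // joining the relation with itself until no new pairs appear (fixpoint).
    var subClassOf: RDD[(String, String)] =
      triples.filter(_.p == SubClassOf).map(t => (t.s, t.o)).cache()
    var changed = true
    while (changed) {
      val derived = subClassOf
        .map(_.swap)      // (mid, sub)
        .join(subClassOf) // mid is itself a subclass of sup: (mid, (sub, sup))
        .map { case (_, (sub, sup)) => (sub, sup) }
      val next = subClassOf.union(derived).distinct().cache()
      changed = next.count() > subClassOf.count()
      subClassOf = next
    }

    // Rule rdfs9: (x rdf:type C) and (C rdfs:subClassOf D) => (x rdf:type D)
    val typeOf = triples.filter(_.p == RdfType).map(t => (t.o, t.s)) // (class, instance)
    val inferred = typeOf.join(subClassOf)
      .map { case (_, (x, d)) => Triple(x, RdfType, d) }

    // The materialized graph is the union of asserted and inferred triples.
    val materialized = triples.union(inferred).distinct()
    materialized.collect().foreach(println) // e.g. Triple(ex:alice, rdf:type, ex:Agent)

    spark.stop()
  }
}
```

A production implementation would of course parse real N-Triples input, cover the full rule set of the chosen profile, and avoid recomputing counts on every iteration.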
