This repository was archived by the owner on Oct 8, 2020. It is now read-only.

Commit 6c78cb2

Update README.md
updated version numbers
1 parent: 8de2064 · commit: 6c78cb2


README.md

Lines changed: 10 additions & 5 deletions
@@ -88,15 +88,15 @@ For Apache Spark
 <dependency>
   <groupId>net.sansa-stack</groupId>
   <artifactId>sansa-inference-spark_2.11</artifactId>
-  <version>0.4.0</version>
+  <version>0.6.0</version>
 </dependency>
 ```
 and for Apache Flink
 ```xml
 <dependency>
   <groupId>net.sansa-stack</groupId>
   <artifactId>sansa-inference-flink_2.11</artifactId>
-  <version>0.4.0</version>
+  <version>0.6.0</version>
 </dependency>
 ```
 
@@ -106,12 +106,12 @@ Add the following lines to your SBT file:
 
 For Apache Spark add
 ```scala
-libraryDependencies += "net.sansa-stack" % "sansa-inference-spark_2.11" % "0.4.0"
+libraryDependencies += "net.sansa-stack" % "sansa-inference-spark_2.11" % "0.6.0"
 ```
 
 and for Apache Flink add
 ```scala
-libraryDependencies += "net.sansa-stack" % "sansa-inference-flink_2.11" % "0.4.0"
+libraryDependencies += "net.sansa-stack" % "sansa-inference-flink_2.11" % "0.6.0"
 ```
 ### Using Snapshots
 
@@ -120,7 +120,7 @@ Snapshot version are only avalibale via our custom Maven repository located at h
 ## Usage
 Besides using the Inference API in your application code, we also provide a command line interface with various options that allow for a convenient way to use the core reasoning algorithms:
 ```
-RDFGraphMaterializer 0.4.0
+RDFGraphMaterializer 0.6.0
 Usage: RDFGraphMaterializer [options]
 
   -i, --input <path1>,<path2>,...
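For the Inference API route mentioned in the Usage section, a minimal sketch of an RDFS materialization job on Spark is shown below. It is illustrative only: the class names (`RDFGraphLoader`, `ForwardRuleReasonerRDFS`, `RDFGraphWriter`), their package paths, and the method signatures are assumptions based on the `sansa-inference-spark_2.11` module and may differ in the 0.6.0 release; the input and output paths are placeholders.

```scala
// Illustrative sketch only: API names, packages, and signatures below are
// assumptions and may not match sansa-inference-spark 0.6.0 exactly.
import java.net.URI

import org.apache.spark.sql.SparkSession

// Assumed SANSA Inference classes (verify against the 0.6.0 API docs).
import net.sansa_stack.inference.spark.data.loader.RDFGraphLoader
import net.sansa_stack.inference.spark.data.writer.RDFGraphWriter
import net.sansa_stack.inference.spark.forwardchaining.ForwardRuleReasonerRDFS

object RDFSMaterializationSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("RDFS materialization sketch")
      .master("local[4]")
      .getOrCreate()

    // Load triples from an N-Triples file (placeholder path).
    val graph = RDFGraphLoader.loadFromDisk(spark, URI.create("test.nt"), 4)

    // Forward-chaining RDFS reasoner; computes the materialization of the graph.
    val reasoner = new ForwardRuleReasonerRDFS(spark.sparkContext)
    val inferredGraph = reasoner.apply(graph)

    // Write the inferred graph back to disk (placeholder path).
    RDFGraphWriter.writeToDisk(inferredGraph, "test-inferred.nt")

    spark.stop()
  }
}
```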
@@ -166,6 +166,10 @@ will compute the RDFS materialization on the data contained in `test.nt` and wri
 Currently, the following reasoning profiles are supported:
 
 ##### RDFS
+The RDFS reasoner can be configured to work at two different compliance levels:
+
+###### RDFS (Default)
+This implements all of the [RDFS closure rules](https://www.w3.org/TR/rdf11-mt/#patterns-of-rdfs-entailment-informative) with the exception of bNode entailments and datatypes (**rdfD 1**). RDFS axiomatic triples are also omitted. This is an expensive mode because all statements in the data graph need to be checked for possible use of container membership properties. It also generates type assertions for all resources and properties mentioned in the data (**rdf1**, **rdfs4a**, **rdfs4b**).
 
 ###### RDFS Simple
 
@@ -175,6 +179,7 @@ information that only serves to reason about the structure of the language
 itself and not about the data it describes.
 It is composed of the reserved vocabulary
 `rdfs:subClassOf`, `rdfs:subPropertyOf`, `rdf:type`, `rdfs:domain` and `rdfs:range`.
+This implements just the transitive closure of `rdfs:subClassOf` and `rdfs:subPropertyOf` relations, the `rdfs:domain` and `rdfs:range` entailments and the implications of `rdfs:subPropertyOf` and `rdfs:subClassOf` in combination with instance data. It omits all of the axiomatic triples. This is probably the most useful mode but it is a less complete implementation of the standard.
 
 More details can be found in

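As a concrete illustration of the RDFS Simple rule set described in the diff above, the following self-contained sketch computes the same fixed point over a small in-memory graph. It is not the SANSA implementation (which evaluates these rules as distributed Spark/Flink programs); the rule names in the comments refer to the standard RDFS entailment patterns, and all resource names are made up for the example.

```scala
// Self-contained illustration of the RDFS Simple rule set: transitive closure of
// rdfs:subClassOf / rdfs:subPropertyOf plus rdfs:domain / rdfs:range entailments.
// Not the SANSA implementation, which evaluates these rules on Spark/Flink.
object RdfsSimpleSketch {

  final case class Triple(s: String, p: String, o: String)

  val SubClassOf    = "rdfs:subClassOf"
  val SubPropertyOf = "rdfs:subPropertyOf"
  val Domain        = "rdfs:domain"
  val Range         = "rdfs:range"
  val Type          = "rdf:type"

  /** One round of rule application; returns the graph extended with new inferences. */
  def applyRules(graph: Set[Triple]): Set[Triple] = {
    val subClass = graph.collect { case Triple(a, SubClassOf, b) => (a, b) }
    val subProp  = graph.collect { case Triple(a, SubPropertyOf, b) => (a, b) }
    val domain   = graph.collect { case Triple(p, Domain, c) => (p, c) }
    val range    = graph.collect { case Triple(p, Range, c) => (p, c) }

    val rdfs11 = for ((a, b1) <- subClass; (b2, c) <- subClass if b1 == b2) yield Triple(a, SubClassOf, c)
    val rdfs5  = for ((a, b1) <- subProp; (b2, c) <- subProp if b1 == b2) yield Triple(a, SubPropertyOf, c)
    val rdfs7  = for (t <- graph; (p, q) <- subProp if t.p == p) yield Triple(t.s, q, t.o)
    val rdfs2  = for (t <- graph; (p, c) <- domain if t.p == p) yield Triple(t.s, Type, c)
    val rdfs3  = for (t <- graph; (p, c) <- range if t.p == p) yield Triple(t.o, Type, c)
    val rdfs9  = for (t <- graph if t.p == Type; (c, d) <- subClass if t.o == c) yield Triple(t.s, Type, d)

    graph ++ rdfs11 ++ rdfs5 ++ rdfs7 ++ rdfs2 ++ rdfs3 ++ rdfs9
  }

  /** Apply the rules until a fixed point is reached (the materialization). */
  @annotation.tailrec
  def materialize(graph: Set[Triple]): Set[Triple] = {
    val next = applyRules(graph)
    if (next == graph) graph else materialize(next)
  }

  def main(args: Array[String]): Unit = {
    val data = Set(
      Triple(":Student", SubClassOf, ":Person"),
      Triple(":Person", SubClassOf, ":Agent"),
      Triple(":hasAdvisor", Domain, ":Student"),
      Triple(":alice", ":hasAdvisor", ":bob")
    )
    // Newly inferred triples:
    //   :Student rdfs:subClassOf :Agent    (rdfs11)
    //   :alice   rdf:type        :Student  (rdfs2)
    //   :alice   rdf:type        :Person   (rdfs9)
    //   :alice   rdf:type        :Agent    (rdfs9)
    (materialize(data) -- data).foreach(println)
  }
}
```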