
Commit 57aabce

conker84 authored and jexp committed
fixes #90: Create a separate directory for the procedures documentation
1 parent 57922d0 commit 57aabce

5 files changed: +52 -41 lines changed

doc/asciidoc/index.adoc

Lines changed: 3 additions & 3 deletions
@@ -13,17 +13,17 @@ The project is composed by 2 parts:
 == Installation
 
 Build locally
-// todo release
 
 ----
 mvn clean install
 ----
 
-1. Copy `<producer/consumer dir>/target/neo4j-kafka-*.jar` into `$NEO4J_HOME/plugins`
+1. Copy `<project_dir>/target/neo4j-streams-<VERSION>.jar` into `$NEO4J_HOME/plugins`
 2. Restart Neo4j
 
 
 include::producer/index.adoc[]
 
+include::consumer/index.adoc[]
 
-include::consumer/index.adoc[]
+include::procedures/index.adoc[]
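
For orientation, here is a minimal sketch of the full install flow that the updated step describes, run from the project root. The `cp` target, the `$NEO4J_HOME` variable, and the `neo4j restart` invocation are assumptions for illustration, not commands taken from this commit:

----
# Build the plugin (produces target/neo4j-streams-<VERSION>.jar)
mvn clean install

# Copy the artifact into the Neo4j plugins directory
# ($NEO4J_HOME is assumed to point at a local Neo4j installation)
cp target/neo4j-streams-*.jar "$NEO4J_HOME/plugins/"

# Restart Neo4j so the plugin is picked up
"$NEO4J_HOME/bin/neo4j" restart
----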

doc/asciidoc/procedures/index.adoc

Lines changed: 45 additions & 0 deletions
@@ -0,0 +1,45 @@
+== Procedures
+
+The Streams project comes with a set of procedures.
+
+=== General configuration
+
+You can enable/disable the procedures by changing this variable inside `neo4j.conf`:
+
+.neo4j.conf
+----
+streams.procedures.enable=<true/false, default=true>
+----
+
+=== streams.publish
+
+This procedure allows custom message streaming from Neo4j to the configured environment by using the underlying configured Producer.
+
+Usage:
+
+`CALL streams.publish('my-topic', 'Hello World from Neo4j!')`
+
+The message retrieved from the Consumer is the following:
+
+`{"payload":"Hello World from Neo4j!"}`
+
+Input Parameters:
+
+[cols="3*",options="header"]
+|===
+|Variable Name
+|Type
+|Description
+|`topic`
+|String
+|The topic where you want to publish the data
+
+|`payload`
+|Object
+|The data that you want to stream
+
+|===
+
+You can send any kind of data in the payload: nodes, relationships, paths, lists, maps, scalar values, and nested versions thereof.
+
+In case of nodes or relationships, if the topic is defined in the patterns provided by the configuration, their properties will be filtered according to the configuration.
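
The new page only shows a string payload. As a hedged illustration of the "any kind of data" claim, the Cypher sketch below publishes a map and a node; the topic name `my-topic` and the `Person` node are hypothetical, and the JSON in the comments assumes the `{"payload": ...}` envelope described above:

----
// Publish a map payload; the consumer would receive
// {"payload": {"name": "Alice", "age": 42}}
CALL streams.publish('my-topic', {name: 'Alice', age: 42});

// Publish a node; if 'my-topic' matches a pattern in the producer
// configuration, the node's properties are filtered accordingly
MATCH (p:Person {name: 'Alice'})
CALL streams.publish('my-topic', p)
RETURN p;
----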

doc/asciidoc/producer/configuration.adoc

Lines changed: 0 additions & 1 deletion
@@ -19,7 +19,6 @@ kafka.transaction.id=
 streams.source.topic.nodes.<TOPIC_NAME>=<PATTERN>
 streams.source.topic.relationships.<TOPIC_NAME>=<PATTERN>
 streams.source.enable=<true/false, default=true>
-streams.procedures.enable=<true/false, default=true>
 ----
 
 Note: To use the Kafka transactions please set `kafka.transaction.id` and `kafka.acks` properly
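
For context, a sketch of how the two settings might sit together in `neo4j.conf` after this split; the topic name and node pattern are illustrative placeholders, not values from the commit:

----
# Producer (source) settings, documented in producer/configuration.adoc
streams.source.topic.nodes.my-topic=Person{name,surname}
streams.source.enable=true

# Procedures flag, now documented in procedures/index.adoc
streams.procedures.enable=true
----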

doc/asciidoc/producer/index.adoc

Lines changed: 0 additions & 37 deletions
@@ -10,43 +10,6 @@ include::configuration.adoc[]
 
 include::patterns.adoc[]
 
-=== Procedures
-
-The producer comes out with a list of procedures.
-
-==== streams.publish
-
-This procedure allows custom message streaming from Neo4j to the configured environment
-
-Uses:
-
-`CALL streams.publish('my-topic', 'Hello World from Neo4j!')`
-
-The message retrieved from the Consumer is the following:
-
-`{"payload":"Hello world from Neo4j!"}`
-
-Input Parameters:
-
-[cols="3*",options="header"]
-|===
-|Variable Name
-|Type
-|Description
-|`topic`
-|String
-|The topic where you want to publish the data
-
-|`payload`
-|Object
-|The data that you want to stream
-
-|===
-
-You can send any kind of data in the payload, nodes, relationships, paths, lists, maps, scalar values and nested versions thereof.
-
-In case of nodes or relationships if the topic is defined in the patterns provided by the configuration their properties will be filtered in according with the configuration.
-
 === Transaction Event Handler
 
 The transaction event handler is the core of the Stream Producer and allows to stream database changes.

readme.adoc

Lines changed: 4 additions & 0 deletions
@@ -28,6 +28,10 @@ mvn clean install
 
 ### link:doc/asciidoc/consumer/configuration.adoc[Configuration]
 
+== Streams Procedures
+
+### link:doc/asciidoc/procedures/index.adoc[Procedures]
+
 == Docker
 
 ### link:doc/asciidoc/docker/index.adoc[Docker]
