
Commit 4442020

fixes #282: Documentation mistakes on Sink and FAQ section (#283)
1 parent 68fc16e

File tree: 4 files changed, +7 −5 lines

doc/asciidoc/consumer/configuration.adoc (1 addition, 1 deletion)

@@ -17,7 +17,7 @@ kafka.value.deserializer=org.apache.kafka.common.serialization.ByteArrayDeserializer
 {environment}.topic.cud=<LIST_OF_TOPICS_SEPARATED_BY_SEMICOLON>
 {environment}.topic.pattern.node.<TOPIC_NAME>=<NODE_EXTRACTION_PATTERN>
 {environment}.topic.pattern.relationship.<TOPIC_NAME>=<RELATIONSHIP_EXTRACTION_PATTERN>
-{environment}.enabled=<true/false, default=true>
+{environment}.enabled=<true/false, default=false>
 ----
 
 See the https://kafka.apache.org/documentation/#brokerconfigs[Apache Kafka documentation] for details on these settings.
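The corrected line documents that the consumer is disabled by default. A minimal sketch of enabling it in `neo4j.conf`, assuming the `{environment}` placeholder resolves to `streams.sink` as in the rest of the consumer docs; the broker address, the topic name `my-topic`, and the Cypher template are illustrative:

[source,properties]
----
# Broker connection for the consumer (illustrative address)
kafka.bootstrap.servers=localhost:9092

# The sink defaults to false, so it must be switched on explicitly
streams.sink.enabled=true

# Hypothetical topic bound to a Cypher ingestion template;
# the plugin binds each consumed record to `event`
streams.sink.topic.cypher.my-topic=MERGE (p:Person {id: event.id}) SET p.name = event.name
----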

doc/asciidoc/docker/data/docker-compose-source-sink.yml (1 addition, 1 deletion)

@@ -45,7 +45,7 @@ services:
       NEO4J_dbms_logs_debug_level: DEBUG
       NEO4J_streams_sink_enabled: "true"
       NEO4J_streams_sink_topic_neo4j:
-        "WITH event.value.payload AS payload, event.value.meta AS meta
+        "WITH event.payload AS payload, event.value.meta AS meta
       FOREACH (ignoreMe IN CASE WHEN payload.type = 'node' AND meta.operation <> 'deleted' and payload.after.labels[0] = 'Question' THEN [1] ELSE [] END |
       MERGE (n:Question{neo_id: toInteger(payload.id)}) ON CREATE
       SET n += payload.after.properties
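For readability, the changed opening of the sink's Cypher template for the `neo4j` topic, as plain Cypher with notes on what the plugin provides (a reading aid, not part of the commit):

[source,cypher]
----
// `event` is the record consumed from the `neo4j` topic, as bound by the plugin;
// after this fix the event envelope's payload is read from `event.payload`
WITH event.payload AS payload, event.value.meta AS meta
// `payload` and `meta` then drive the conditional MERGE in the FOREACH above
----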

doc/asciidoc/docker/index.adoc (2 additions, 1 deletion)

@@ -74,6 +74,7 @@ From the same directory where the compose file is, you can launch this command:
 docker-compose up -d
 ----
 
+[#streams_docker_source_module]
 ==== Source module
 
 Following a compose file that allows you to spin-up Neo4j, Kafka and Zookeeper in order to test the application.

@@ -147,7 +148,7 @@ one configured as `Source` and one as `Sink`, allowing you to share any data fro
       environment:
         NEO4J_streams_sink_enabled: "true"
         NEO4J_streams_sink_topic_neo4j:
-          "WITH event.value.payload AS payload, event.value.meta AS meta
+          "WITH event.payload AS payload, event.value.meta AS meta
       FOREACH (ignoreMe IN CASE WHEN payload.type = 'node' AND meta.operation <> 'deleted' and payload.after.labels[0] = 'Question' THEN [1] ELSE [] END |
       MERGE (n:Question{neo_id: toInteger(payload.id)}) ON CREATE
       SET n += payload.after.properties
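The added `[#streams_docker_source_module]` line is an AsciiDoc anchor on the `Source module` heading; the FAQ change below relies on it for its cross-reference. The two sides of that pairing look like this:

[source,asciidoc]
----
[#streams_docker_source_module]
==== Source module

// ...referenced from another page with:
See the <<streams_docker_source_module, Source module>> section for more details.
----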

doc/asciidoc/faq/index.adoc (3 additions, 2 deletions)

@@ -85,8 +85,9 @@ Other references to how to configure it to connect to the Confluent Cloud can be
 
 === Kafka output events description
 
-If you configure the Neo4j Streams plugin as Sink, using a Cypher query in order to ingest data from Kafka into Neo4j,
-watching the Kafka console consumer output you will see JSON events which describes nodes and relationships creation.
+If you configure the Neo4j Streams plugin as Source, using a Cypher query in order to send data from Neo4j to Kafka
+(i.e. see <<streams_docker_source_module, Source module>> section for more details), watching the Kafka console consumer
+output you will see JSON events, which describes nodes and relationships creation.
 They looks like as following:
 
 [source, json]
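For orientation, a sketch of the JSON event shape that the sink templates above navigate (`meta.operation`, `payload.type`, `payload.id`, `payload.after.labels`, `payload.after.properties`). Field names follow the plugin's documented transaction-event schema; every value here is illustrative:

[source,json]
----
{
  "meta": {
    "timestamp": 1532597182604,
    "username": "neo4j",
    "txId": 3,
    "txEventId": 0,
    "txEventsCount": 1,
    "operation": "created",
    "source": { "hostname": "neo4j.example.com" }
  },
  "payload": {
    "id": "1004",
    "type": "node",
    "before": null,
    "after": {
      "labels": ["Question"],
      "properties": { "title": "How do I enable the Sink?" }
    }
  }
}
----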
