:sectid:
:sectlinks:

== Motivation

Many users and customers want to integrate Kafka and other streaming solutions with Neo4j, either to ingest data into the graph from other sources, or to send update events (change data capture, CDC) to an event log for later consumption.

This extension was developed to satisfy these use cases, and more to come.

The project is composed of several parts:

* Neo4j Streams Procedure: a procedure to send a payload to a topic
* Neo4j Streams Producer: a transaction event handler that sends data to a Kafka topic
* Neo4j Streams Consumer: a Neo4j application that ingests data from Kafka topics into Neo4j via templated Cypher statements
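
As a minimal sketch of the procedure from Cypher (the procedure name `streams.publish` and the topic name `my-topic` are assumptions; check the procedure section for the exact signature):

----
CALL streams.publish('my-topic', 'Hello from Neo4j')
----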

In upcoming releases we will add Kafka Connect connectors as well.

== Installation

Download the latest release jar from https://github.com/neo4j-contrib/neo4j-streams/releases/latest.

Copy it into `$NEO4J_HOME/plugins` and configure the relevant connections.

The minimal setup in your `neo4j.conf` is:

----
kafka.zookeeper.connect=localhost:2181
kafka.bootstrap.servers=localhost:9092
----

For each module there are additional configuration options, explained in the individual sections.
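
For example, a consumer topic can be mapped to a templated Cypher statement in `neo4j.conf` (the key format `streams.sink.topic.cypher.<topic>` and the `event` variable are assumptions based on the consumer module; `my-topic` is a placeholder):

----
streams.sink.topic.cypher.my-topic=MERGE (p:Person {id: event.id}) SET p.name = event.name
----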

== Build locally

----
mvn clean install