Commit 3a32d45 (parent 96fa6ed): Update readme.adoc

readme.adoc: 60 additions & 12 deletions
= Neo4j Streaming Data Integrations
:docs: https://neo4j-contrib.github.io/neo4j-streams/

image::https://github.com/neo4j-contrib/neo4j-streams/raw/gh-pages/3.5/images/neo4j-loves-confluent.png[]

This project integrates Neo4j with streaming data solutions.

Currently it provides an integration with *Apache Kafka and the Confluent Platform*.

The project contains these components:

== Neo4j Kafka Connect Plugin

A https://www.confluent.io/connector/kafka-connect-neo4j-sink/[Kafka Connect Sink plugin] that lets you ingest events from Kafka into Neo4j via templated Cypher statements. (link:{docs}#_kafka_connect[docs], https://www.confluent.io/blog/kafka-connect-neo4j-sink-plugin[article])

image::https://www.confluent.io/wp-content/uploads/Kafka_Connect_Neo4j_Sink.png[width=300,link=https://www.confluent.io/connector/kafka-connect-neo4j-sink/]
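A sink instance is configured with a JSON payload submitted to the Kafka Connect REST API. The sketch below is illustrative only: the property names follow the plugin's documented `neo4j.*` prefix, but the topic name, credentials, and Cypher template are placeholders; consult the linked docs for the authoritative property list.

[source,json]
----
{
  "name": "Neo4jSinkConnector",
  "config": {
    "connector.class": "streams.kafka.connect.sink.Neo4jSinkConnector",
    "topics": "my-topic",
    "neo4j.server.uri": "bolt://localhost:7687",
    "neo4j.authentication.basic.username": "neo4j",
    "neo4j.authentication.basic.password": "<password>",
    "neo4j.topic.cypher.my-topic": "MERGE (p:Person {id: event.id}) SET p.name = event.name"
  }
}
----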
== Neo4j Server Extension

* Source: a Change-Data-Capture (CDC) implementation that sends change data to Kafka topics (link:{docs}#_neo4j_streams_producer[docs])
* Sink: a Neo4j extension that ingests data from Kafka topics into Neo4j via templated Cypher statements (link:{docs}#_neo4j_streams_consumer[docs])
* Neo4j Streams Procedures (Read & Write): procedures to write to and read from topics interactively/programmatically (link:{docs}#_procedures[docs])
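The sink components above ingest records via templated Cypher statements, where each record's payload is exposed to the template as an `event` variable. The following Python sketch illustrates that idea client-side; it is not the extension's actual code, and the `build_statement` helper and the `UNWIND`-based batching are this sketch's own choices.

```python
import json

# Hypothetical illustration of "templated Cypher": a fixed Cypher template
# references fields of an `event` variable that the sink binds per record.
CYPHER_TEMPLATE = "MERGE (p:Person {id: event.id}) SET p.name = event.name"

def build_statement(record_values):
    """Wrap the template so a batch of JSON record values binds to `event`."""
    statement = "UNWIND $events AS event " + CYPHER_TEMPLATE
    params = {"events": [json.loads(v) for v in record_values]}
    return statement, params

# Usage: turn two Kafka record values into one parameterized statement.
statement, params = build_statement(
    ['{"id": 1, "name": "Alice"}', '{"id": 2, "name": "Bob"}']
)
```

The statement plus parameter map could then be run with any Neo4j driver; batching many events into a single `UNWIND` keeps the number of round-trips per poll low.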
== Documentation & Articles

Read more at http://r.neo4j.com/kafka

Here are articles introducing the https://medium.com/neo4j/a-new-neo4j-integration-with-apache-kafka-6099c14851d2[Neo4j Extension] and the https://www.confluent.io/blog/kafka-connect-neo4j-sink-plugin[Kafka Connect Plugin].

And practical applications of the extension for https://medium.freecodecamp.org/how-to-leverage-neo4j-streams-and-build-a-just-in-time-data-warehouse-64adf290f093[Building Data Pipelines with Kafka, Spark, Neo4j & Zeppelin] (https://medium.freecodecamp.org/how-to-ingest-data-into-neo4j-from-a-kafka-stream-a34f574f5655[part 2]).

And for exchanging results of https://medium.freecodecamp.org/how-to-embrace-event-driven-graph-analytics-using-neo4j-and-apache-kafka-474c9f405e06[Neo4j Graph Algorithms within a Neo4j Cluster].

== Feedback & Suggestions

Please raise https://github.com/neo4j-contrib/neo4j-streams/issues[issues on GitHub]. We also love contributions, so don't be shy about sending a Pull Request.

We would also love you to https://goo.gl/forms/VLwvqwsIvdfdm9fL2[**fill out our survey**] to learn more about your Kafka + Neo4j use-cases and deployments.
== Installation: Server Extension

You can run/test the extension link:{docs}#docker[locally with Docker], or install it manually into your existing Neo4j server:

1. Download the jar file from the https://github.com/neo4j-contrib/neo4j-streams/releases/latest[latest release]
2. Copy `neo4j-streams-<VERSION>.jar` into `$NEO4J_HOME/plugins`
3. Update `$NEO4J_HOME/conf/neo4j.conf` with the necessary configuration
4. Restart Neo4j
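For step 3, a minimal `neo4j.conf` addition might look like the sketch below. The property names come from the project's documentation, but the broker address, the topic name (`my-topic`), and the Cypher template are placeholders you must adapt to your deployment; see the linked docs for the full configuration reference.

[source,properties]
----
kafka.bootstrap.servers=localhost:9092
streams.source.enabled=true
streams.sink.enabled=true
streams.sink.topic.cypher.my-topic=MERGE (p:Person {id: event.id}) SET p.name = event.name
----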
== Development & Contributions

=== Build locally

----
mvn clean install
----

You'll find the build artifact in `<project_dir>/target/neo4j-streams-<VERSION>.jar`.

You can also test the link:{docs}#_docker_compose_file[Kafka Connect Plugin locally with Docker].
////
== Documentation Links

=== Kafka Connect Plugin

### link:doc/asciidoc/kafka-connect/index.adoc[Kafka Connect Plugin]

=== Streams Producer

### link:doc/asciidoc/producer/configuration.adoc[Configuration]
### link:doc/asciidoc/producer/patterns.adoc[Patterns]

=== Streams Consumer

### link:doc/asciidoc/consumer/configuration.adoc[Configuration]

=== Streams Procedures

### link:doc/asciidoc/procedures/index.adoc[Procedures]

=== Docker

### link:doc/asciidoc/docker/index.adoc[Docker]
////
