= Neo4j Streaming Data Integrations
:docs: https://neo4j-contrib.github.io/neo4j-streams/

image::https://github.com/neo4j-contrib/neo4j-streams/raw/gh-pages/3.4/images/neo4j-loves-confluent.png[]

This project integrates Neo4j with streaming data solutions.

Currently it provides an integration with *Apache Kafka and the Confluent Platform*.

The project contains these components:

== Neo4j Kafka Connect Plugin

A https://www.confluent.io/connector/kafka-connect-neo4j-sink/[Kafka Connect Sink plugin] that ingests events from Kafka into Neo4j via templated Cypher statements (link:{docs}#_kafka_connect[docs], https://www.confluent.io/blog/kafka-connect-neo4j-sink-plugin[article]).

image::https://www.confluent.io/wp-content/uploads/Kafka_Connect_Neo4j_Sink.png[width=300,link=https://www.confluent.io/connector/kafka-connect-neo4j-sink/]

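
The sink side of the plugin maps each Kafka topic to a templated Cypher statement that is run for every incoming event. As a rough sketch only (the topic name, credentials, and Cypher template below are invented for illustration; see the docs linked above for the exact property names supported by your version), a Connect worker configuration could look like this:

[source,properties]
----
name=Neo4jSinkConnector
connector.class=streams.kafka.connect.sink.Neo4jSinkConnector
topics=my-topic
# Neo4j connection settings (adjust to your environment)
neo4j.server.uri=bolt://localhost:7687
neo4j.authentication.basic.username=neo4j
neo4j.authentication.basic.password=<password>
# Templated Cypher applied to every event arriving on "my-topic";
# the message payload is exposed to the statement as the `event` map
neo4j.topic.cypher.my-topic=MERGE (p:Person {id: event.id}) SET p.name = event.name
----
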
== Neo4j Server Extension

* Source: a Change-Data-Capture (CDC) implementation that sends change data to Kafka topics (link:{docs}#_neo4j_streams_producer[docs])
* Sink: a Neo4j extension that ingests data from Kafka topics into Neo4j via templated Cypher statements (link:{docs}#_neo4j_streams_consumer[docs]); a configuration sketch follows below
* Neo4j Streams Procedures (Read & Write): procedures to write to and read from topics interactively/programmatically (link:{docs}#_procedures[docs])

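
Both sides of the extension are driven by settings in `neo4j.conf`. The lines below are only a hedged sketch, with invented topic names, node pattern, and Cypher template; the linked documentation lists the actual properties and their defaults:

[source,properties]
----
# Kafka connection used by the extension
kafka.bootstrap.servers=localhost:9092

# Source (CDC): publish changes to Person nodes as events on the "people" topic
streams.source.topic.nodes.people=Person{*}

# Sink: consume the "people" topic and run a templated Cypher statement per event
streams.sink.enabled=true
streams.sink.topic.cypher.people=MERGE (p:Person {id: event.id}) SET p.name = event.name
----

The procedures are then callable from Cypher; again, this is just a sketch of the call shape:

[source,cypher]
----
// write a payload to a topic
CALL streams.publish('people', {id: 42, name: 'Alice'})

// read events from a topic and return them to the caller
CALL streams.consume('people', {timeout: 5000}) YIELD event
RETURN event
----
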
== Documentation & Articles

Read more at http://r.neo4j.com/kafka

There are articles introducing the https://medium.com/neo4j/a-new-neo4j-integration-with-apache-kafka-6099c14851d2[Neo4j Extension] and the https://www.confluent.io/blog/kafka-connect-neo4j-sink-plugin[Kafka Connect Plugin].

There are also articles on practical applications of the extension for https://medium.freecodecamp.org/how-to-leverage-neo4j-streams-and-build-a-just-in-time-data-warehouse-64adf290f093[Building Data Pipelines with Kafka, Spark, Neo4j & Zeppelin] (https://medium.freecodecamp.org/how-to-ingest-data-into-neo4j-from-a-kafka-stream-a34f574f5655[part 2]), and for exchanging results of https://medium.freecodecamp.org/how-to-embrace-event-driven-graph-analytics-using-neo4j-and-apache-kafka-474c9f405e06[Neo4j Graph Algorithms within a Neo4j Cluster].

== Feedback & Suggestions

Please raise https://github.com/neo4j-contrib/neo4j-streams/issues[issues on GitHub]. We also love contributions, so don't be shy about sending a Pull Request.

We would also love you to https://goo.gl/forms/VLwvqwsIvdfdm9fL2[**fill out our survey**] to learn more about your Kafka + Neo4j use-cases and deployments.

== Installing the Server Extension

You can run and test the extension link:{docs}#docker[locally with Docker], or install it manually into your existing Neo4j server:

1. Download the jar file from the https://github.com/neo4j-contrib/neo4j-streams/releases/latest[latest release]
2. Copy `neo4j-streams-<VERSION>.jar` into `$NEO4J_HOME/plugins`
3. Update `$NEO4J_HOME/conf/neo4j.conf` with the necessary configuration (a minimal sketch follows below)
4. Restart Neo4j

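
The configuration needed in step 3 depends on whether you enable the source, the sink, or both (see the sketch in the Server Extension section above); at a minimum the extension has to know how to reach Kafka. A hedged example, assuming a local broker and ZooKeeper:

[source,properties]
----
kafka.zookeeper.connect=localhost:2181
kafka.bootstrap.servers=localhost:9092
----
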
== Development & Contributions

=== Build locally

----
mvn clean install
----

You'll find the build artifact in `<project_dir>/target/neo4j-streams-<VERSION>.jar`.

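
To try a locally built jar against your own server, something like the following should work (assuming `$NEO4J_HOME` points at your Neo4j installation):

[source,bash]
----
# copy the freshly built plugin jar and restart Neo4j to pick it up
cp target/neo4j-streams-*.jar $NEO4J_HOME/plugins/
$NEO4J_HOME/bin/neo4j restart
----
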
You can also test the link:{docs}#_docker_compose_file[Kafka Connect Plugin locally with Docker].


////
== Documentation Links

=== Kafka Connect Plugin

### link:doc/asciidoc/kafka-connect/index.adoc[Kafka Connect Plugin]

=== Streams Producer

### link:doc/asciidoc/producer/configuration.adoc[Configuration]

### link:doc/asciidoc/producer/patterns.adoc[Patterns]

=== Streams Consumer

### link:doc/asciidoc/consumer/configuration.adoc[Configuration]

=== Streams Procedures

### link:doc/asciidoc/procedures/index.adoc[Procedures]

=== Docker

### link:doc/asciidoc/docker/index.adoc[Docker]
////