
Commit fe25c83: chunked docs

1 parent ef04c3b

24 files changed: +2075 -4 lines changed

doc/asciidoc/consumer/index.adoc

Lines changed: 1 addition & 0 deletions
@@ -1,3 +1,4 @@
+[[consumer]]
 == Neo4j Streams Consumer
 
 The Kafka Sink that ingests data directly into Neo4j

doc/asciidoc/index.adoc

Lines changed: 26 additions & 4 deletions
@@ -5,7 +5,24 @@
 :sectid:
 :sectlinks:
 
-== Motivation
+
+[abstract]
+--
+This is the user guide for Neo4j Streams version {docs-version}, authored by the Neo4j Labs Team.
+--
+
+The guide covers the following areas:
+
+* <<introduction>> -- An introduction to Neo4j Streams
+* <<producer>> -- Sends transaction event handler events to a Kafka topic
+* <<consumer>> -- Ingests events from a Kafka topic into Neo4j
+* <<procedures>> -- Procedures for consuming and producing Kafka events
+* <<docker>> -- Docker Compose files for local testing
+* <<kafka-connect>> -- Kafka Connect Sink plugin
+
+
+[[introduction]]
+== Introduction
 
 Many users and customers want to integrate Kafka and other streaming solutions with Neo4j.
 Either to ingest data into the graph from other sources.
@@ -21,7 +38,8 @@ The project is composed of several parts:
 
 In the next releases we will add Kafka-Connect connectors as well.
 
-== Installation
+[[installation]]
+=== Installation
 
 Download the latest release jar from https://github.com/neo4j-contrib/neo4j-streams/releases/latest
 
@@ -37,7 +55,8 @@ kafka.bootstrap.servers=localhost:9092
 For each module there are additional configs that are explained in the individual sections.
 
 
-== Build locally
+[[build-locally]]
+==== Build locally
 
 ----
 mvn clean install
@@ -53,4 +72,7 @@ include::consumer/index.adoc[]
 
 include::procedures/index.adoc[]
 
-include::docker/index.adoc[]
+
+include::docker/index.adoc[]
+
+include::kafka-connect/index.adoc[]

doc/asciidoc/kafka-connect/index.adoc

Lines changed: 92 additions & 0 deletions
@@ -0,0 +1,92 @@
+[[kafka-connect]]
+== Kafka Connect
+
+image::neo4j-loves-confluent.png[Neo4j Loves Confluent]
+
+
+Kafka Connect, an open source component of Apache Kafka, is a framework for connecting Kafka with external systems such as databases, key-value stores, search indexes, and file systems.
+
+The Neo4j Streams project provides a Kafka Connect plugin that can be installed into the Confluent Platform, enabling you to:
+
+- Ingest data from Kafka topics directly into Neo4j via templated Cypher queries;
+- Stream Neo4j transaction events (*coming soon*).
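As a sketch of how a templated-Cypher ingestion might be wired up, the snippet below writes a hypothetical sink configuration and shows how it could be registered through the Kafka Connect REST API. The connector name, topic, credentials, and Cypher statement are illustrative assumptions; consult the plugin documentation for the authoritative property names.

```shell
# Hypothetical sink configuration: the connector name, topic ("my-topic"),
# credentials, and Cypher statement are illustrative placeholders.
cat > sink.neo4j.json <<'EOF'
{
  "name": "neo4j-sink-example",
  "config": {
    "connector.class": "streams.kafka.connect.sink.Neo4jSinkConnector",
    "topics": "my-topic",
    "neo4j.server.uri": "bolt://neo4j:7687",
    "neo4j.authentication.basic.username": "neo4j",
    "neo4j.authentication.basic.password": "password",
    "neo4j.topic.cypher.my-topic": "MERGE (p:Person {id: event.id}) SET p.name = event.name"
  }
}
EOF

# Register the connector through the Kafka Connect REST API
# (assumes the Connect worker listens on localhost:8083):
# curl -X POST -H "Content-Type: application/json" \
#      --data @sink.neo4j.json http://localhost:8083/connectors
```

With this configuration, every event arriving on `my-topic` would be passed as `event` to the templated Cypher statement.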
+
+=== Plugin installation
+
+You can install the plugin in any of the following ways.
+
+==== Download and install the plugin via Confluent Hub client
+
+If you are using the provided compose file, you can easily install the plugin with the Confluent Hub client.
+
+Once the compose file is up and running, you can install the plugin by executing the following command:
+
+[source,bash]
+----
+<confluent_platform_dir>/bin/confluent-hub install neo4j/kafka-connect-neo4j:1.0.0
+----
+
+When the installer asks:
+
+[source,bash]
+----
+The component can be installed in any of the following Confluent Platform installations:
+----
+
+choose the option `(where this tool is installed)` and then accept the default options.
+
+Here is an example:
+
+image::confluent-hub-client.jpg[Installation via Confluent Hub Client]
+
+At the end of the process the plugin is installed automatically.
+
+==== Download the zip from the Confluent Hub
+
+Go to the Confluent Hub page of the plugin:
+
+https://www.confluent.io/connector/kafka-connect-neo4j-sink/
+
+and click the **Download Connector** button.
+
+Once you have downloaded the file, place it into your Kafka Connect `plugins` dir.
+
+==== Build it locally
+
+Download the project from GitHub:
+
+[source,bash]
+----
+git clone https://github.com/neo4j-contrib/neo4j-streams.git
+----
+
+Go into the `neo4j-streams` directory:
+
+[source,bash]
+----
+cd neo4j-streams
+----
+
+Build the project by running the following command:
+
+[source,bash]
+----
+mvn clean install
+----
+
+Inside the directory `<neo4j-streams>/kafka-connect-neo4j/target/component/packages` you'll find a file named `neo4j-kafka-connect-neo4j-<VERSION>.zip`; place it into your Kafka Connect `plugins` dir.
69+
70+
=== Docker compose file
71+
72+
Inside the directory `/kafka-connect-neo4j/docker` you'll find a compose file that allows you to start the whole testing environment:
73+
74+
.docker-compose.yml
75+
[source,yaml]
76+
----
77+
include::../../../kafka-connect-neo4j/docker/docker-compose.yml[]
78+
----
79+
80+
include::../../../kafka-connect-neo4j/docker/readme.adoc[]
+
+=== Monitor via Confluent Platform UI
+
+The Kafka Monitoring UI can be found at http://<localhost>:9021/management/connect
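Connector health can also be checked from the command line via the Kafka Connect REST API. This is a sketch under the assumptions that the Connect worker's REST endpoint is exposed on port 8083 and that the connector was registered under the hypothetical name `neo4j-sink-example`:

```shell
# List all registered connectors (assumes the Kafka Connect REST API
# is reachable on localhost:8083):
curl -s http://localhost:8083/connectors

# Inspect the state of a single connector and its tasks;
# "neo4j-sink-example" is a placeholder, substitute your connector name:
curl -s http://localhost:8083/connectors/neo4j-sink-example/status
```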
+
+image::confluent-metrics.jpg[Confluent Importing Metrics]
+
+The events show up in the topic and are then added to Neo4j via the sink.
+
+Below you can see the data that has been ingested into Neo4j. During testing, more than 2M events were ingested.
+
+image::confluent-imported-data.jpg[Confluent Platform Management]

doc/asciidoc/procedures/index.adoc

Lines changed: 1 addition & 0 deletions
@@ -1,3 +1,4 @@
+[[procedures]]
 == Procedures
 
 The Streams project comes with a set of procedures.

doc/asciidoc/producer/index.adoc

Lines changed: 1 addition & 0 deletions
@@ -1,3 +1,4 @@
+[[producer]]
 == Neo4j Streams Producer
 
 The transaction event handler that sends events to a Kafka topic
