Jackson serializer registrations, including new ones for the Neo4j Java driver types (`org.neo4j.driver.types.Point`, `Node`, `Relationship`):

StreamsUtils.ignoreExceptions({ module.addSerializer(Point::class.java, PointSerializer()) }, NoClassDefFoundError::class.java) // skipped if the class is not on the classpath
StreamsUtils.ignoreExceptions({ module.addSerializer(PointValue::class.java, PointValueSerializer()) }, NoClassDefFoundError::class.java) // skipped if the class is not on the classpath
StreamsUtils.ignoreExceptions({ module.addSerializer(org.neo4j.driver.types.Point::class.java, DriverPointSerializer()) }, NoClassDefFoundError::class.java) // skipped if the class is not on the classpath
StreamsUtils.ignoreExceptions({ module.addSerializer(org.neo4j.driver.types.Node::class.java, DriverNodeSerializer()) }, NoClassDefFoundError::class.java) // skipped if the class is not on the classpath
StreamsUtils.ignoreExceptions({ module.addSerializer(org.neo4j.driver.types.Relationship::class.java, DriverRelationshipSerializer()) }, NoClassDefFoundError::class.java) // skipped if the class is not on the classpath
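
Each serializer is a Jackson `JsonSerializer` registered on a `SimpleModule`, and wrapping the registration in `StreamsUtils.ignoreExceptions(..., NoClassDefFoundError::class.java)` means it is simply skipped when the target class is not on the classpath. The sketch below shows what such a serializer for the driver's `Point` type could look like; the JSON field names and the NaN check for 2D points are assumptions for illustration, not the project's actual implementation.

[source,kotlin]
----
import com.fasterxml.jackson.core.JsonGenerator
import com.fasterxml.jackson.databind.JsonSerializer
import com.fasterxml.jackson.databind.ObjectMapper
import com.fasterxml.jackson.databind.SerializerProvider
import com.fasterxml.jackson.databind.module.SimpleModule
import org.neo4j.driver.Values
import org.neo4j.driver.types.Point

// Illustration only: serialize a driver Point as { "srid": ..., "x": ..., "y": ..., "z": ... }.
class DriverPointSerializer : JsonSerializer<Point>() {
    override fun serialize(value: Point, gen: JsonGenerator, provider: SerializerProvider) {
        gen.writeStartObject()
        gen.writeNumberField("srid", value.srid())
        gen.writeNumberField("x", value.x())
        gen.writeNumberField("y", value.y())
        // Assumption: the driver reports NaN for z on 2D points, so only write z when present.
        if (!value.z().isNaN()) gen.writeNumberField("z", value.z())
        gen.writeEndObject()
    }
}

fun main() {
    // Same registration pattern as in the module above, shown here without the classpath guard.
    val module = SimpleModule()
    module.addSerializer(Point::class.java, DriverPointSerializer())
    val mapper = ObjectMapper().registerModule(module)

    val point = Values.point(4326, 12.5, 55.7).asPoint()
    println(mapper.writeValueAsString(point)) // {"srid":4326,"x":12.5,"y":55.7}
}
----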

= When to use Kafka Connect Neo4j Connector vs. Neo4j Streams as a Plugin

[abstract]
This section covers how to decide whether to run as a Kafka Connect worker or as a Neo4j Plugin.

=== Pros

* Processing is outside of Neo4j, so memory & CPU load doesn't impact Neo4j. You don't need to size the database with Kafka utilization in mind.
* Much easier for Kafka pros to manage; they benefit from the Confluent ecosystem, such as using the REST API to manipulate connectors and the Control Center to administer & monitor them (see the sketch after this list).
* By restarting the worker, you can restart your sink/source strategy without having downtime for Neo4j.
* Upgrade Neo4j-Streams without restarting the cluster.
* Strictly an external Bolt client, so better overall security management of its actions.
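
A minimal sketch of the REST calls mentioned above: it assumes a Connect worker listening on `localhost:8083` (the default REST port) and a hypothetical connector named `neo4j-sink`.

[source,kotlin]
----
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse

fun main() {
    // Assumptions: a Kafka Connect worker is reachable on localhost:8083 (the default REST port)
    // and a connector named "neo4j-sink" has been deployed; both values are examples.
    val client = HttpClient.newHttpClient()
    val baseUrl = "http://localhost:8083"

    // List the connectors currently deployed on this worker.
    val list = HttpRequest.newBuilder(URI.create("$baseUrl/connectors")).GET().build()
    println(client.send(list, HttpResponse.BodyHandlers.ofString()).body())

    // Restart a single connector; the Neo4j server itself is not touched by this call.
    val restart = HttpRequest.newBuilder(URI.create("$baseUrl/connectors/neo4j-sink/restart"))
        .POST(HttpRequest.BodyPublishers.noBody())
        .build()
    println(client.send(restart, HttpResponse.BodyHandlers.ofString()).statusCode())
}
----

Because connector lifecycle operations are plain HTTP calls against the worker, restarting or reconfiguring the sink never requires touching the Neo4j server itself.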

=== Cons

* If you're using Confluent Cloud, you can't host the connector in the cloud (yet). So this requires a third piece of architecture: Confluent Cloud, Neo4j, and the Connect worker (usually a separate VM).
* Possibly worse throughput due to Bolt latency & overhead, and a separate network hop.

== Neo4j-Streams Plugin

=== Cons

* Memory & CPU consumption on your Neo4j server.
* Need to keep the configuration identical across all members of the cluster.
* Lesser ability to manage the plugin, because it runs inside the database and not under a particular user account.

`doc/docs/modules/ROOT/pages/configuration.adoc` (5 additions, 0 deletions):

[#neo4j_configuration_system]

[NOTE]
The Neo4j Streams Plugin running inside the Neo4j database is deprecated and will not be supported after version 4.3 of Neo4j.
We recommend that users do not adopt this plugin for new implementations and consider migrating to the Kafka Connect Neo4j Connector as a replacement.

`doc/docs/modules/ROOT/pages/consumer.adoc` (5 additions, 0 deletions):

Use this section to configure Neo4j to ingest the data from Kafka into Neo4j.
--

endif::env-docs[]

[NOTE]
The Neo4j Streams Plugin running inside the Neo4j database is deprecated and will not be supported after version 4.3 of Neo4j.
We recommend that users do not adopt this plugin for new implementations and consider migrating to the Kafka Connect Neo4j Connector as a replacement.

[[neo4j_streams_sink]]
This is the Kafka Sink that ingests the data directly into Neo4j.
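
As a rough illustration of the kind of `neo4j.conf` settings involved, the snippet below enables the sink and binds a Cypher template to a topic. The topic name and the Cypher statement are placeholders, and the property names should be checked against the Sink configuration reference before use.

[source,properties]
----
# Illustrative values only; check the Sink configuration reference for the authoritative names.
streams.sink.enabled=true

# For every record arriving on the topic "my.topic", run this Cypher template.
# The record payload is exposed to the query as `event`.
streams.sink.topic.cypher.my.topic=MERGE (p:Person {id: event.id}) SET p.name = event.name

# Kafka client settings are passed through with the `kafka.` prefix.
kafka.bootstrap.servers=localhost:9092
----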

`doc/docs/modules/ROOT/pages/docker.adoc` (5 additions, 1 deletion):

= Run with Docker

[NOTE]
The Neo4j Streams Plugin running inside the Neo4j database is deprecated and will not be supported after version 4.3 of Neo4j.
We recommend that users do not adopt this plugin for new implementations and consider migrating to the Kafka Connect Neo4j Connector as a replacement.

ifdef::env-docs[]
[abstract]
--

[[docker_kafka_connect]]
== Kafka Connect Neo4j Connector

Inside the directory `/kafka-connect-neo4j/docker` you'll find a compose file that allows you to start the whole testing environment:
== Which way should I run the Neo4j Connector for Apache Kafka: As a database plugin, or using the Kafka Connect Framework?

If you have already implemented the database plugin and are running Neo4j <= 4.2, there is no need to change. All other users, new users, and Neo4j Aura users should implement only the Kafka Connect Neo4j Connector.
0 commit comments