Commit 29c445a

Authored by ali-incenvitucci and Nicola Vitucci
Add RAW_JSON_STRING payload mode (#83)
* Add RAW_JSON_STRING payload mode
* Apply suggestions from code review
* Set released version

Co-authored-by: Nicola Vitucci <nicola.vitucci@gmail.com>
1 parent e7dfad7 commit 29c445a

File tree

4 files changed (+41 −9 lines changed)


antora.yml

Lines changed: 1 addition & 1 deletion
@@ -12,7 +12,7 @@ asciidoc:
     page-product: Neo4j Connector for Kafka
     kafka-connect-version: 3.0
     connector-version: '5.1'
-    exact-connector-version: '5.1.14'
+    exact-connector-version: '5.1.15'
     page-pagination: true
     product-name: Neo4j Connector for Kafka
     url-common-license-page: https://neo4j.com/docs/license/

modules/ROOT/pages/changelog.adoc

Lines changed: 15 additions & 0 deletions
@@ -2,6 +2,20 @@
 
 This page lists changes to the {product-name}.
 
+== Version 5.1.15
+
+[cols="1,2", options="header"]
+|===
+| Feature | Details
+
+a|
+label:bug[]
+label:fixed[]
+
+Introduce `RAW_JSON_STRING` payload mode.
+| For xref:source/query.adoc[], users can now use a new payload mode named `RAW_JSON_STRING` which will generate the messages as raw JSON strings similar to the 5.0.x versions of the connector.
+|===
+
 == Version 5.1.14
 
 This is a maintenance release which provides updated dependencies.

@@ -19,6 +33,7 @@ label:fixed[]
 Introduce `neo4j.query.force-maps-as-struct`
 | For the source query strategy, users can now control whether maps with homogeneous value types are encoded as structs or maps.
 |===
+
 == Version 5.1.12
 
 This is a maintenance release which provides updated dependencies.

modules/ROOT/pages/source/payload-mode.adoc

Lines changed: 22 additions & 8 deletions
@@ -1,22 +1,32 @@
 = Kafka Source Connector: Payload Mode Configuration
 :page-role: new-5.1.5
 
-The Kafka Source Connector for Neo4j supports two payload modes to control the format of data serialized and published to Kafka topics: `EXTENDED` and `COMPACT`. This feature is configurable through the `neo4j.payload-mode` property, allowing users to select the preferred serialization format based on data requirements.
+The Kafka Source Connector for Neo4j supports three payload modes to control the format of data serialized and published to Kafka topics: `EXTENDED`, `COMPACT` and `RAW_JSON_STRING`.
+This feature is configurable through the `neo4j.payload-mode` property, allowing users to select the preferred serialization format based on data requirements.
 
 == Payload Modes
 
 The `neo4j.payload-mode` configuration offers the following options:
 
-* **`EXTENDED` (Default)**: Provides a detailed structure for each property, supporting schema compatibility and consistency. This format is especially useful in cases where schema changes (such as property type changes) or temporal types are present, ensuring data consistency across changes.
+* **`EXTENDED` (Default)**: Provides a detailed structure for each property, supporting schema compatibility and consistency.
+This format is especially useful in cases where schema changes (such as property type changes) or temporal types are present, ensuring data consistency across changes.
 
-* **`COMPACT`**: Produces a simpler format that only includes the essential fields. This format is lighter and may be preferable when schema compatibility or complex data types are not required.
+* **`COMPACT`**: Produces a simpler format that only includes the essential fields.
+This format is lighter and may be preferable when schema compatibility or complex data types are not required.
+
+* **`RAW_JSON_STRING`**: Produces a raw JSON string representation of the data.
+This mode is useful for scenarios where you generate complex data structures using xref:source/query.adoc[] which are not easily represented in the `EXTENDED` or `COMPACT` modes.
+*Only available when using xref:source/query.adoc[] strategy.*
 
 [WARNING]
 ====
 *Limitations of `COMPACT` Mode*
 
-* **Property Type Changes**: `COMPACT` mode does not support changes in property types. If a property type changes in Neo4j (e.g., from integer to string), it can break the schema.
-* **Protobuf Compatibility**: `COMPACT` mode is not supported with Protobuf. It does not support serialization of temporal types (e.g., `LocalDate`, `LocalDateTime`).
+* **Property Type Changes**: `COMPACT` mode does not support changes in property types.
+If a property type changes in Neo4j (e.g., from integer to string), it can break the schema.
+
+* **Protobuf Compatibility**: `COMPACT` mode is not supported with Protobuf.
+It does not support serialization of temporal types (e.g., `LocalDate`, `LocalDateTime`).
 ====

@@ -141,7 +151,8 @@ This mode is especially beneficial for data with complex schema requirements, as
 
 == Understanding the `EXTENDED` Payload Structure
 
-In `EXTENDED` mode, each property includes fields for every supported Neo4j type. Only the field corresponding to the actual property type will contain a non-null value, while all others are set to null. This structure ensures that any change in the type of a property does not cause schema enforcement errors at either the source or sink connector.
+In `EXTENDED` mode, each property includes fields for every supported Neo4j type. Only the field corresponding to the actual property type will contain a non-null value, while all others are set to null.
+This structure ensures that any change in the type of a property does not cause schema enforcement errors at either the source or sink connector.
 
 [cols="1,2"]
 |===

@@ -179,8 +190,11 @@ For example, a string field will be represented as:
 
 == Configuration Recommendations
 
-`COMPACT` mode is useful and easier to work with when generated messages are consumed with other connectors or applications, and you can relax your schema compatibility mode on target topics. If your environment requires schema compatibility, temporal data types, or you have strong type safety requirements with different converters (`AVRO`, `JSON Schema`, `PROTOBUF` or `JSON Embedded`), `EXTENDED` mode should be preferred.
+`COMPACT` mode is useful and easier to work with when generated messages are consumed with other connectors or applications, and you can relax your schema compatibility mode on target topics.
+If your environment requires schema compatibility, temporal data types, or you have strong type safety requirements with different converters (`AVRO`, `JSON Schema`, `PROTOBUF` or `JSON Embedded`), `EXTENDED` mode should be preferred.
 
 == Compatibility with Sink Connectors
 
-The `EXTENDED` format was introduced in connector version 5.1.0 to ensure that all data published to Kafka topics adheres to a consistent schema. This prevents issues when a property changes type on the Neo4j side (e.g., a name property changes from integer to string), enabling smooth data processing across connectors and Kafka consumers. When a Neo4j sink connector is fed by a Neo4j source connector, it’s recommended to use `EXTENDED` mode, as the Neo4j sink connector can seamlessly handle the `EXTENDED` data type.
+The `EXTENDED` format was introduced in connector version 5.1.0 to ensure that all data published to Kafka topics adheres to a consistent schema.
+This prevents issues when a property changes type on the Neo4j side (for example, a name property changes from integer to string), enabling smooth data processing across connectors and Kafka consumers.
+Use the `EXTENDED` mode when a Neo4j sink connector is fed by a Neo4j source connector, as the Neo4j sink connector can seamlessly handle the `EXTENDED` data type.
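For context, the new mode is selected the same way as the existing ones, via the `neo4j.payload-mode` property. The following is a minimal sketch of a Kafka Connect source connector configuration: only `neo4j.payload-mode` comes from this commit, while the connector name, class, connection settings, query, and converter choice are illustrative assumptions based on typical 5.x setups, not taken from this diff.

```
{
  "name": "neo4j-source-example",
  "config": {
    "connector.class": "org.neo4j.connectors.kafka.source.Neo4jConnector",
    "neo4j.uri": "neo4j://localhost:7687",
    "neo4j.source-strategy": "QUERY",
    "neo4j.query": "MATCH (n:Person) RETURN n.name AS name",
    "neo4j.payload-mode": "RAW_JSON_STRING",
    "value.converter": "org.apache.kafka.connect.storage.StringConverter"
  }
}
```

Note that `RAW_JSON_STRING` is documented as available only with the query strategy, hence the `QUERY` strategy above; pairing it with a string converter reflects the fact that the value is emitted as a `STRING` type.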

modules/ROOT/pages/source/query.adoc

Lines changed: 3 additions & 0 deletions
@@ -116,4 +116,7 @@ In this case, you can configure `neo4j.payload-mode` setting as `COMPACT` so tha
 include::example$producer-data/query.compact.json[]
 ----
 
+In case you generate complex data structures as part of your Cypher query, you can also use the `RAW_JSON_STRING` payload mode which will produce a raw JSON string representation of the data without any schema compatibility guarantees.
+Although the generated message will look exactly the same as the `COMPACT` mode, it will be encoded as a `STRING` type.
+
 Refer to the xref:source/payload-mode.adoc[payload mode] page for more information about the `neo4j.payload-mode` setting, including its limitations.
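Because a `RAW_JSON_STRING` value is a plain `STRING` rather than a structured Connect record, a downstream consumer can parse it with any JSON library. A minimal sketch of the consumer side, with an illustrative payload (the field names are hypothetical, not taken from the connector docs):

```python
import json

# A value as it might arrive from a topic fed in RAW_JSON_STRING mode:
# the entire payload is one JSON-encoded string (illustrative content).
raw_value = '{"name": "Alice", "scores": [1, 2, 3], "address": {"city": "London"}}'

# Since the message is a plain string, json.loads is all that is needed;
# there is no schema to validate against (no compatibility guarantees).
event = json.loads(raw_value)

# Nested structures come back as ordinary dicts and lists.
print(event["name"])
print(event["address"]["city"])
```

This is the trade-off the docs describe: maximum flexibility for complex query output, at the cost of any schema enforcement between producer and consumer.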
