Commit 499d884 (1 parent: 5b1eddf)

Fixes to page titles in generated docs (#353)

* Fixes to page titles in generated docs
* Removed streams.adoc

Co-authored-by: Adam Cowley <[email protected]>

28 files changed, +1211 -271 lines

doc/docs.yml (6 additions, 4 deletions)

@@ -3,14 +3,16 @@ site:
   url: /neo4j-streams-docs
 content:
   sources:
-  - url: https://github.com/neo4j-contrib/neo4j-streams
-    branches: '4.0'
+  - url: ../
+    branches: HEAD
     start_path: doc/docs
 ui:
   bundle:
     url: https://github.com/neo4j-documentation/docs-refresh/raw/master/ui/build/ui-bundle.zip
     snapshot: true
+urls:
+  html_extension_style: indexify
 asciidoc:
   attributes:
-    page-theme: docs
-    page-cdn: /_/
+    page-theme: labs
+    page-disabletracking: true

doc/docs/antora.yml (2 additions, 0 deletions)

@@ -12,3 +12,5 @@ asciidoc:
     copyright: Neo4j Inc.
     common-license-page-uri: https://neo4j.com/docs/license/
     page-product: Neo4j Streams
+    environment: streams.sink
+    page-pagination: true

doc/docs/modules/ROOT/nav.adoc (52 additions, 52 deletions)

@@ -1,56 +1,56 @@
 * xref::index.adoc[Neo4j Streams Integrations User Guide]
 
 * xref::overview.adoc[Project overview]
-** xref::overview.adoc#neo4j_streams_plugin_overview[Neo4j Streams plugin]
-** xref::overview.adoc#kafka_connect_plugin_overview[Kafka Connect plugin]
+// ** xref::overview.adoc#neo4j_streams_plugin_overview[Neo4j Streams plugin]
+// ** xref::overview.adoc#kafka_connect_plugin_overview[Kafka Connect plugin]
 
 * xref::quickstart.adoc[Quick Start]
-** xref::quickstart.adoc#neo4j_streams_plugin_quickstart[Neo4j Streams plugin]
-** xref::quickstart.adoc#kafka_connect_plugin_quickstart[Kafka Connect plugin]
+// ** xref::quickstart.adoc#neo4j_streams_plugin_quickstart[Neo4j Streams plugin]
+// ** xref::quickstart.adoc#kafka_connect_plugin_quickstart[Kafka Connect plugin]
 
-* xref::producer.adoc[Neo4j Streams - Source: Neo4j -> Kafka]
-** xref::producer.adoc#neo4j_streams_producer_config[Configuration]
-** xref::producer.adoc#source-patterns[Patterns]
-** xref::producer.adoc#_transaction_event_handler[Transaction Event Handler]
+* xref::producer.adoc[Source: Neo4j -> Kafka]
+// ** xref::producer.adoc#neo4j_streams_producer_config[Configuration]
+// ** xref::producer.adoc#source-patterns[Patterns]
+// ** xref::producer.adoc#_transaction_event_handler[Transaction Event Handler]
 
-* xref::consumer.adoc[Neo4j Streams - Sink: Neo4j -> Kafka]
-** xref::consumer.adoc#neo4j_streams_sink_howitworks[How it works]
-** xref::consumer.adoc#_sink_ingestion_strategies[Sink ingestion strategies]
-** xref::consumer.adoc#neo4j_streams_dlq[How deal with bad data]
-** xref::consumer.adoc#neo4j_streams_supported_deserializers[Supported Kafka deserializers]
-** xref::consumer.adoc#neo4j_streams_config_summary[Configuration summary]
+* xref::consumer.adoc[Sink: Neo4j -> Kafka]
+// ** xref::consumer.adoc#neo4j_streams_sink_howitworks[How it works]
+// ** xref::consumer.adoc#_sink_ingestion_strategies[Sink ingestion strategies]
+// ** xref::consumer.adoc#neo4j_streams_dlq[How deal with bad data]
+// ** xref::consumer.adoc#neo4j_streams_supported_deserializers[Supported Kafka deserializers]
+// ** xref::consumer.adoc#neo4j_streams_config_summary[Configuration summary]
 
-* xref::procedures.adoc[Neo4j Streams - Procedures]
-** xref::procedures.adoc#neo4j_streams_procedures_config[Configuration]
-** xref::procedures.adoc#neo4j_streams_procedure_publish[streams.publish]
-** xref::procedures.adoc#neo4j_streams_procedure_consume[streams.consume]
+* xref::procedures.adoc[Procedures]
+// ** xref::procedures.adoc#_configuration[Configuration]
+// ** xref::procedures.adoc#_streams_publish[streams.publish]
+// ** xref::procedures.adoc#_streams_consume[streams.consume]
 
 * xref::kafka-connect.adoc[Kafka Connect Plugin]
-** xref::kafka-connect.adoc#kafka_connect_plugin_install[Plugin installation]
-** xref::kafka-connect.adoc#kafka-connect-sink-instance[Create the Sink Instance]
-** xref::kafka-connect.adoc#kafka-connect-sink-strategies[Sink ingestion strategies]
-** xref::kafka-connect.adoc#kafka-connect-cud-file-format[How deal with bad data]
-** xref::kafka-connect.adoc#kafka_connect_monitor[Monitor via Confluent Pltaform UI]
-** xref::kafka-connect.adoc#kafka_connect_config_policy[Kafka Connect Client Config Override Policy]
-** xref::kafka-connect.adoc#_configuration_summary[Configuration Summary]
+// ** xref::kafka-connect.adoc#kafka_connect_plugin_install[Plugin installation]
+// ** xref::kafka-connect.adoc#kafka-connect-sink-instance[Create the Sink Instance]
+// ** xref::kafka-connect.adoc#kafka-connect-sink-strategies[Sink ingestion strategies]
+// ** xref::kafka-connect.adoc#kafka-connect-cud-file-format[How deal with bad data]
+// ** xref::kafka-connect.adoc#kafka_connect_monitor[Monitor via Confluent Pltaform UI]
+// ** xref::kafka-connect.adoc#kafka_connect_config_policy[Kafka Connect Client Config Override Policy]
+// ** xref::kafka-connect.adoc#_configuration_summary[Configuration Summary]
 
 * xref::neo4j-cluster.adoc[Using with Neo4j Causal Cluster]
-** xref::neo4j-cluster.adoc#cluster_overview[Overview]
-** xref::neo4j-cluster.adoc#cluster_kafka_connect[Kafka Connect]
-** xref::neo4j-cluster.adoc#cluster_neo4j_plugin[Neo4j Plugin]
-** xref::neo4j-cluster.adoc#cluster_remote_clients[Remote Clients]
+// ** xref::neo4j-cluster.adoc#cluster_overview[Overview]
+// ** xref::neo4j-cluster.adoc#cluster_kafka_connect[Kafka Connect]
+// ** xref::neo4j-cluster.adoc#cluster_neo4j_plugin[Neo4j Plugin]
+// ** xref::neo4j-cluster.adoc#cluster_remote_clients[Remote Clients]
 
 * xref::docker.adoc[Run with Docker]
-** xref::docker.adoc#neo4j_streams_docker[Neo4j Streams plugin]
-** xref::docker.adoc#docker_kafka_connect[Kafka Connect Plugin]
-** xref::docker.adoc#docker_streams_cluster[Neo4j Streams with Neo4j Cluster and Kafka Cluster]
+// ** xref::docker.adoc#neo4j_streams_docker[Neo4j Streams plugin]
+// ** xref::docker.adoc#docker_kafka_connect[Kafka Connect Plugin]
+// ** xref::docker.adoc#docker_streams_cluster[Neo4j Streams with Neo4j Cluster and Kafka Cluster]
 
 * xref::kafka-ssl.adoc[Configure with Kafka over SSL]
-** xref::kafka-ssl.adoc#kafka_ssl_self_signed[Self Signed Certificates]
-** xref::kafka-ssl.adoc#kafka_ssl_config[Kafka Configuration]
-** xref::kafka-ssl.adoc#kafka_ssl_neo4j_config[Neo4j Configuration]
-** xref::kafka-ssl.adoc#kafka_ssl_testing[Testing]
-** xref::kafka-ssl.adoc#_authentication_with_sasl[Authentication with SASL]
+// ** xref::kafka-ssl.adoc#kafka_ssl_self_signed[Self Signed Certificates]
+// ** xref::kafka-ssl.adoc#kafka_ssl_config[Kafka Configuration]
+// ** xref::kafka-ssl.adoc#kafka_ssl_neo4j_config[Neo4j Configuration]
+// ** xref::kafka-ssl.adoc#kafka_ssl_testing[Testing]
+// ** xref::kafka-ssl.adoc#_authentication_with_sasl[Authentication with SASL]
 
 * xref::cloud.adoc[Confluent Cloud]
 
@@ -65,22 +65,22 @@
 ** xref::architecture/optimize.adoc[Optimizing Kafka]
 
 * xref::examples.adoc[Examples with Confluent Platform and Kafka Connect Datagen]
-** xref::examples.adoc#examples_binary_format[Confluent and Neo4j in binary format]
-** xref::examples.adoc#confluent_docker_example[Confluent with Docker, Neo4j in binary format]
+// ** xref::examples.adoc#examples_binary_format[Confluent and Neo4j in binary format]
+// ** xref::examples.adoc#confluent_docker_example[Confluent with Docker, Neo4j in binary format]
 
 * xref::developing.adoc[Developing Neo4j Streams]
-** xref::developing.adoc#dev_build_locally[Build locally]
-** xref::developing.adoc#dev_gen_docs[Generating this Documentation]
-** xref::developing.adoc#dev_gen_docs_antora[Generating this Documentation with Antora]
+// ** xref::developing.adoc#dev_build_locally[Build locally]
+// ** xref::developing.adoc#dev_gen_docs[Generating this Documentation]
+// ** xref::developing.adoc#dev_gen_docs_antora[Generating this Documentation with Antora]
 
 * xref::faq.adoc[Neo4j Streams FAQ]
-** xref::faq.adoc#_source_code_license[Source Code License]
-** xref::faq.adoc#_how_to_integrate_neo4j_and_kafka[How to integrate Neo4j and Kafka]
-** xref::faq.adoc#_about_cud_file_format[About CUD file format]
-** xref::faq.adoc#_how_to_ingest_events_using_cdc_schema_strategy[How to ingest events using CDC Schema strategy]
-** xref::faq.adoc#_is_neo4j_streams_supported_by_confluent_cloud[Is Neo4j Streams supported by Confluent Cloud?]
-** xref::faq.adoc#_kafka_output_events_description[Kafka output events description]
-** xref::faq.adoc#_how_to_configure_kafka_over_ssl[How to configure Kafka over SSL?]
-** xref::faq.adoc#_enabling_dlq_functionality[Enabling DLQ functionality]
-** xref::faq.adoc#_supported_kafka_deserializers[Supported Kafka deserializers]
-** xref::faq.adoc#_kafka_cluster_and_topic_with_multiple_partition_setup[Kafka cluster and topic with multiple partition setup]
+// ** xref::faq.adoc#_source_code_license[Source Code License]
+// ** xref::faq.adoc#_how_to_integrate_neo4j_and_kafka[How to integrate Neo4j and Kafka]
+// ** xref::faq.adoc#_about_cud_file_format[About CUD file format]
+// ** xref::faq.adoc#_how_to_ingest_events_using_cdc_schema_strategy[How to ingest events using CDC Schema strategy]
+// ** xref::faq.adoc#_is_neo4j_streams_supported_by_confluent_cloud[Is Neo4j Streams supported by Confluent Cloud?]
+// ** xref::faq.adoc#_kafka_output_events_description[Kafka output events description]
+// ** xref::faq.adoc#_how_to_configure_kafka_over_ssl[How to configure Kafka over SSL?]
+// ** xref::faq.adoc#_enabling_dlq_functionality[Enabling DLQ functionality]
+// ** xref::faq.adoc#_supported_kafka_deserializers[Supported Kafka deserializers]
+// ** xref::faq.adoc#_kafka_cluster_and_topic_with_multiple_partition_setup[Kafka cluster and topic with multiple partition setup]

doc/docs/modules/ROOT/pages/architecture.adoc (2 additions, 2 deletions)

@@ -1,6 +1,6 @@
-[[architecture]]
-== Architectural Guidance
+= Architectural Guidance
 
+[[architecture]]
 The purpose of this section is to:
 
 * Describe how this integration works

doc/docs/modules/ROOT/pages/cloud.adoc (2 additions, 2 deletions)

@@ -1,7 +1,7 @@
 
-[[confluent_cloud]]
-== Confluent Cloud
+= Confluent Cloud
 
+[[confluent_cloud]]
 Configuring a connection to a Confluent Cloud instance should follow
 link:https://docs.confluent.io/current/cloud/using/config-client.html#java-client[Confluent's Java Client]
 configuration advice, and the advice in <<_kafka_settings, Kafka Settings>> section.
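(As an aside, not part of this commit: the Confluent Java Client advice referenced in this page typically amounts to SASL_SSL settings. A minimal sketch of what that could look like in `neo4j.conf`, using the plugin's `kafka.` prefix shown elsewhere in these docs; the broker address and credentials below are placeholders:)

```properties
kafka.bootstrap.servers=<BOOTSTRAP_SERVER_URL>:9092
kafka.security.protocol=SASL_SSL
kafka.sasl.mechanism=PLAIN
kafka.sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="<API_KEY>" password="<API_SECRET>";
```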

doc/docs/modules/ROOT/pages/config-override-policy.adoc (1 addition, 1 deletion)

@@ -1,5 +1,5 @@
 
-[[kafka_connect_config_policy]]
+[#kafka_connect_config_policy]
 === Kafka Connect Client Config Override Policy
 
 In Apache Kafka 2.3.0 was introduced the ability for each source and sink connector to inherit their client configurations
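(For context on the feature this page documents — the Kafka 2.3.0 override policy comes from KIP-458. A sketch of the standard worker and connector settings, not part of this commit:)

```properties
# Worker configuration (e.g. connect-distributed.properties)
connector.client.config.override.policy=All

# A connector configuration may then override client settings via prefixed keys:
producer.override.linger.ms=500
consumer.override.max.poll.records=100
```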

doc/docs/modules/ROOT/pages/consumer-configuration.adoc (10 additions, 4 deletions)

@@ -1,11 +1,9 @@
-
-[[neo4j_streams_config_summary]]
 === Configuration summary
 
 You can set the following Kafka configuration values in your `neo4j.conf`, here are the defaults.
 
 .neo4j.conf
-[subs="verbatim,attributes"]
+[source,subs="verbatim,attributes"]
 ----
 kafka.zookeeper.connect=localhost:2181
 kafka.bootstrap.servers=localhost:9092
@@ -34,6 +32,7 @@ in order to support this feature you can set for each database instance a config
 
 Following the list of new properties that allows to support multi-tenancy:
 
+[source]
 ----
 streams.sink.topic.cypher.<TOPIC_NAME>.to.<DB_NAME>=<CYPHER_QUERY>
 streams.sink.topic.cdc.sourceId.to.<DB_NAME>=<LIST_OF_TOPICS_SEPARATE_BY_SEMICOLON>
@@ -49,6 +48,7 @@ This means that for each db instance you can specify if:
 
 So if you have a instance name `foo` you can specify a configuration in this way:
 
+[source]
 ----
 streams.sink.topic.cypher.<TOPIC_NAME>.to.foo=<CYPHER_QUERY>
 streams.sink.topic.cdc.sourceId.to.foo=<LIST_OF_TOPICS_SEPARATE_BY_SEMICOLON>
@@ -60,6 +60,7 @@ streams.sink.enabled.to.foo=<true/false, default=true>
 
 The old properties:
 
+[source]
 ----
 streams.sink.topic.cypher.<TOPIC_NAME>=<CYPHER_QUERY>
 streams.sink.topic.cdc.sourceId=<LIST_OF_TOPICS_SEPARATE_BY_SEMICOLON>
@@ -83,6 +84,7 @@ Database names are case-insensitive and normalized to lowercase, and must follow
 In particular the following property will be used as default values
 for non-default db instances, in case of the specific configuration params is not provided:
 
+[source]
 ----
 streams.sink.enabled=<true/false, default=true>
 ----
@@ -96,6 +98,7 @@ This means that if you have Neo4j with 3 db instances:
 and you want to enable the Sink plugin on all instance
 you can simply omit any configuration about enabling it, you just need to provide the routing configuration for each instance:
 
+[source]
 ----
 streams.sink.topic.cypher.customersTopic.to.customers=MERGE (c:Customer{customerId: event.customerId}) SET c += event.properties
 streams.sink.topic.cypher.productsTopic.to.products=MERGE (c:Product{productId: event.productId}) SET c += event.properties
@@ -105,6 +108,7 @@ streams.sink.topic.cypher.productsTopic.to.neo4j=MERGE (c:MyLabel{myId: event.my
 Otherwise if you want to enable the Sink plugin only on `customers` and `products` instances
 you can do it in this way:
 
+[source]
 ----
 streams.sink.enabled=false
 streams.sink.enabled.to.customers=true
@@ -115,8 +119,10 @@ streams.sink.topic.cypher.productsTopic.to.products=MERGE (c:Product{productId:
 
 So in general if you have:
 
+[source]
 ----
 streams.sink.enabled=true
 streams.sink.enabled.to.foo=false
 ----
-Then sink is enabled on all databases EXCEPT foo (local overrides global)
+
+Then sink is enabled on all databases EXCEPT foo (local overrides global)
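(The "local overrides global" resolution described in the hunk above can be sketched as follows. This helper is illustrative only — the names and the actual plugin implementation are assumptions, not part of this commit:)

```python
# Sketch of the "local overrides global" rule for streams.sink.enabled.
# `sink_enabled` is a hypothetical helper, not the plugin's real code.

def sink_enabled(config: dict, db_name: str) -> bool:
    """A per-database `streams.sink.enabled.to.<db>` key wins over the global one."""
    local_key = "streams.sink.enabled.to." + db_name
    if local_key in config:
        return config[local_key] == "true"
    # Fall back to the global flag, which defaults to true.
    return config.get("streams.sink.enabled", "true") == "true"

config = {
    "streams.sink.enabled": "true",
    "streams.sink.enabled.to.foo": "false",
}

print(sink_enabled(config, "foo"))        # foo is explicitly disabled
print(sink_enabled(config, "customers"))  # falls back to the global flag
```

With the configuration from the hunk above, every database resolves to enabled except `foo`.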

doc/docs/modules/ROOT/pages/consumer.adoc (8 additions, 9 deletions)

@@ -1,6 +1,4 @@
-
-[[neo4j_streams_sink]]
-== Neo4j Streams - Sink: Kafka -> Neo4j
+= Neo4j Streams - Sink: Kafka -> Neo4j
 :environment: streams.sink
 :id: streams_sink
 
@@ -10,12 +8,13 @@ ifdef::env-docs[]
 This chapter describes the Neo4j Streams Sink in the Neo4j Streams Library.
 Use this section to configure Neo4j to ingest the data from Kafka into Neo4j.
 --
-endif::env-docs[]
 
+endif::env-docs[]
+[[neo4j_streams_sink]]
 Is the Kafka Sink that ingest the data directly into Neo4j
 
 [[neo4j_streams_sink_howitworks]]
-=== How it works
+== How it works
 
 It works in several ways:
 
@@ -24,7 +23,7 @@ It works in several ways:
 * by providing a pattern extraction to a JSON or AVRO file
 * by managing a CUD file format
 
-==== Cypher Template
+=== Cypher Template
 
 It works with template Cypher queries stored into properties with the following format:
 
@@ -67,7 +66,7 @@ MERGE (n:Label {id: event.id})
 ON CREATE SET n += event.properties
 ----
 
-Where `{events}` is a json list, so continuing with the example above a possible full representation could be:
+Where `\{events}` is a json list, so continuing with the example above a possible full representation could be:
 
 [source,cypher]
 ----
@@ -98,7 +97,7 @@ include::sink-strategies.adoc[]
 include::cud-file-format.adoc[]
 
 [[neo4j_streams_dlq]]
-=== How deal with bad data
+== How deal with bad data
 
 The Neo4j Streams Plugin provides several means to handle processing errors.
 
@@ -194,7 +193,7 @@ Every published record in the `Dead Letter Queue` contains the original record `
 |===
 
 [[neo4j_streams_supported_deserializers]]
-=== Supported Kafka deserializers
+== Supported Kafka deserializers
 
 The Neo4j Streams plugin supports 2 deserializers:
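(The Cypher template mechanism diffed above — a configured query expanded over a `{events}` batch — can be sketched like this. The wrapping function and variable names are illustrative assumptions, not the plugin's actual code:)

```python
# Sketch of expanding a configured Cypher template into the batched query
# the sink would run: each polled Kafka message becomes one `event` binding.

template = "MERGE (n:Label {id: event.id}) ON CREATE SET n += event.properties"

def build_batch_query(template: str) -> str:
    # Hypothetical helper: prepend an UNWIND over the batch parameter.
    return "UNWIND $events AS event " + template

events = [
    {"id": 1, "properties": {"name": "Alice"}},
    {"id": 2, "properties": {"name": "Bob"}},
]

query = build_batch_query(template)
print(query)
# A Neo4j driver call would then look roughly like:
#   session.run(query, events=events)
```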