Commit 8aaed31

Merge pull request #70 from marklogic-community/feature/readme-docs
Updating README with a user guide
2 parents bfc8bbe + 7cb5fca commit 8aaed31

File tree

10 files changed: +414 additions, -371 deletions
Lines changed: 1 addition & 66 deletions
@@ -1,73 +1,8 @@
-# Licensed to the Apache Software Foundation (ASF) under one or more
-# contributor license agreements. See the NOTICE file distributed with
-# this work for additional information regarding copyright ownership.
-# The ASF licenses this file to You under the Apache License, Version 2.0
-# (the "License"); you may not use this file except in compliance with
-# the License. You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-# These are defaults. This file just demonstrates how to override some settings.
+# Change the host as needed
 bootstrap.servers=172.31.48.44:9092
-# (Optional) The group id provides a way to logically group connectors to distribute the load across multiple instances.
-group.id=marklogic
-
-# The next two sections are necessary to establish an SSL connection to the Kafka servers.
-# To enable SSL, uncomment and customize the lines that start with "#* " in those two sections.
-
-# SSL connection properties
-# For more information, see https://docs.confluent.io/current/kafka/encryption.html#encryption-ssl-connect
-# These top-level settings are used by the Connect worker for group coordination and to read and write to the internal
-# topics which are used to track the cluster's state (e.g. configs and offsets).
-#* security.protocol=SSL
-# You must create a truststore that contains either the server certificate or a trusted CA that signed the server cert.
-# This is how I did it with keytools:
-# keytool -keystore kafka.truststore.jks -alias caroot -import -file ca-cert -storepass "XXXXX" -keypass "XXXXX" -noprompt
-#* ssl.truststore.location=/secure/path/to/certs/kafka.client.truststore.jks
-#* ssl.truststore.password=truststorePassphrase
-# For now, turn off hostname verification since we're using self-signed certificates
-# This might also be fixable by fixing the "Subject Alternative Name (SAN)", but I'm not a cert expert.
-#* ssl.endpoint.identification.algorithm=
-
-# Yes, both of these sections are required.
-# Connect workers manage the producers used by source connectors and the consumers used by sink connectors.
-# So, for the connectors to leverage security, you also have to override the default producer/consumer
-# configuration that the worker uses.
-#* consumer.bootstrap.servers=localhost:9093
-#* consumer.security.protocol=SSL
-#* consumer.ssl.truststore.location=/secure/path/to/certs/kafka.client.truststore.jks
-#* consumer.ssl.truststore.password=truststorePassphrase
-#* consumer.ssl.endpoint.identification.algorithm=
 
-
-# The converters specify the format of data in Kafka and how to translate it into Connect data. Every Connect user will
-# need to configure these based on the format they want their data in when loaded from or stored into Kafka
 key.converter=org.apache.kafka.connect.storage.StringConverter
 value.converter=org.apache.kafka.connect.storage.StringConverter
 
-
-# Converter-specific settings can be passed in by prefixing the Converter's setting with the converter we want to apply
-# it to
-key.converter.schemas.enable=false
-value.converter.schemas.enable=false
-
 offset.storage.file.filename=/tmp/connect.offsets
-# Flush much faster than normal, which is useful for testing/debugging
 offset.flush.interval.ms=10000
-
-# Set to a list of filesystem paths separated by commas (,) to enable class loading isolation for plugins
-# (connectors, converters, transformations). The list should consist of top level directories that include
-# any combination of:
-# a) directories immediately containing jars with plugins and their dependencies
-# b) uber-jars with plugins and their dependencies
-# c) directories immediately containing the package directory structure of classes of plugins and their dependencies
-# Note: symlinks will be followed to discover dependencies or plugins.
-# Examples:
-# plugin.path=/usr/local/share/java,/usr/local/share/kafka/plugins,/opt/connectors,
-#plugin.path=
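The surviving configuration amounts to a minimal worker properties file: a broker address, String converters for keys and values, and file-based offset storage with a fast flush interval. A minimal sketch of writing that file out and checking it, assuming the file name `worker.properties` and the standalone-mode launch command (neither is specified by this commit):

```shell
# Recreate the pared-down worker config left behind by this commit.
# The file name "worker.properties" is an assumption for illustration.
cat > worker.properties <<'EOF'
# Change the host as needed
bootstrap.servers=172.31.48.44:9092

key.converter=org.apache.kafka.connect.storage.StringConverter
value.converter=org.apache.kafka.connect.storage.StringConverter

offset.storage.file.filename=/tmp/connect.offsets
offset.flush.interval.ms=10000
EOF

# Sanity check: five key=value properties survive the cleanup
grep -c '=' worker.properties   # → 5

# With $KAFKA_HOME pointing at a Kafka installation, the worker could then be
# started in standalone mode (offset.storage.file.filename is a standalone-mode
# setting) alongside a connector properties file:
#   $KAFKA_HOME/bin/connect-standalone.sh worker.properties <connector>.properties
```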

CHANGELOG.md

Lines changed: 0 additions & 79 deletions
This file was deleted.
-870 KB: binary file not shown
-1.14 MB: binary file not shown
-870 KB: binary file not shown

0 commit comments
