**CONTRIBUTING.md** (20 additions, 2 deletions)
```diff
@@ -105,18 +105,31 @@ You can also manually configure an instance of the MarkLogic Kafka connector as
 In the list of connectors in Control Center, the connector will initially have a status of "Failed" while it starts up.
 After it starts successfully, it will have a status of "Running".
 
+## Debugging the MarkLogic Kafka connector
+
+The main mechanism for debugging an instance of the MarkLogic Kafka connector is by examining logs from the
+connector. You can access those, along with logging from Kafka Connect and all other connectors, by running the
+following:
+
+    confluent local services connect log -f
+
+See [the log command docs](https://docs.confluent.io/confluent-cli/current/command-reference/local/services/connect/confluent_local_services_connect_log.html)
+for more information.
+
 ## Destroying and setting up the Confluent Platform instance
 
 While developing and testing the MarkLogic Kafka connector, it is common that the "local" instance of Confluent
 Platform will become unstable and no longer work. The [Confluent local docs](https://docs.confluent.io/confluent-cli/current/command-reference/local/confluent_local_current.html)
 make reference to this - "The data that are produced are transient and are intended to be temporary".
 
 It is thus advisable that after you copy a new instance of the MarkLogic Kafka connector into Confluent Platform (i.e.
-by running `./gradlew copyConnectorToConfluent`), you should destroy your local Confluent Platform instance:
+by running `./gradlew copyConnectorToConfluent`), you should destroy your local Confluent Platform instance (this
+will usually finish in around 15s):
 
     ./gradlew destroyLocalConfluent
 
-After doing that, you can quickly automate starting Confluent Platform and loading the two connectors via the following:
+After doing that, you can quickly automate starting Confluent Platform and loading the two connectors via the
+following (this will usually finish in around 1m):
 
     ./gradlew setupLocalConfluent
 
@@ -125,6 +138,11 @@ Remember that if you've modified the connector code, you'll first need to run `.
 Doing the above will provide the most reliable way to get a new and working instance of Confluent Platform with the
 MarkLogic Kafka connector installed.
 
+For brevity, you may prefer this (Gradle will figure out the tasks as long as only one task starts with "destroy"
+and one task starts with "setup"):
+
+    ./gradlew destroy setup
+
 You may have luck with simply doing `confluent local services stop`, `./gradlew copyConnectorToConfluent`, and
 `confluent local services start`, but this has so far not worked reliably - i.e. one of the Confluent Platform
 services (sometimes Schema Registry, sometimes Control Center) usually stops working.
```
**src/main/java/com/marklogic/kafka/connect/sink/MarkLogicSinkConfig.java** (4 additions, 3 deletions)
```diff
@@ -58,12 +58,13 @@ public class MarkLogicSinkConfig extends AbstractConfig {
         .define(CONNECTION_PORT, Type.INT, Importance.HIGH, "The REST app server port to connect to")
         .define(CONNECTION_DATABASE, Type.STRING, "", Importance.LOW, "Database to connect, if different from the one associated with the port")
         .define(CONNECTION_SECURITY_CONTEXT_TYPE, Type.STRING, "NONE", Importance.HIGH, "Type of MarkLogic security context to create - either digest, basic, kerberos, certificate, or none")
-        .define(CONNECTION_USERNAME, Type.STRING, Importance.HIGH, "Name of MarkLogic user to authenticate as")
-        .define(CONNECTION_PASSWORD, Type.STRING, Importance.HIGH, "Password for the MarkLogic user")
+        .define(CONNECTION_USERNAME, Type.STRING, null, Importance.HIGH, "Name of MarkLogic user to " +
+            "authenticate as")
+        .define(CONNECTION_PASSWORD, Type.PASSWORD, null, Importance.HIGH, "Password for the MarkLogic user")
         .define(CONNECTION_TYPE, Type.STRING, "DIRECT", Importance.LOW, "Connection type; DIRECT or GATEWAY")
         .define(CONNECTION_SIMPLE_SSL, Type.BOOLEAN, false, Importance.LOW, "Set to true to use a trust-everything SSL connection")
         .define(CONNECTION_CERT_FILE, Type.STRING, "", Importance.LOW, "Path to a certificate file")
-        .define(CONNECTION_CERT_PASSWORD, Type.STRING, "", Importance.LOW, "Password for the certificate file")
+        .define(CONNECTION_CERT_PASSWORD, Type.PASSWORD, null, Importance.LOW, "Password for the certificate file")
         .define(CONNECTION_EXTERNAL_NAME, Type.STRING, "", Importance.LOW, "External name for Kerberos authentication")
         .define(DATAHUB_FLOW_NAME, Type.STRING, null, Importance.MEDIUM, "Name of a Data Hub flow to run")
         .define(DATAHUB_FLOW_STEPS, Type.STRING, null, Importance.MEDIUM, "Comma-delimited names of steps to run")
```
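The switch to `Type.PASSWORD` and the explicit `null` defaults change how these options behave at runtime: password values are wrapped so they are not printed in clear text when the parsed config is logged, and the username/password options become optional (presumably so they are only required for digest or basic authentication). As a rough illustration, here is a minimal, hypothetical sketch of reading the redefined options through Kafka's standard `AbstractConfig` API; the class and method names below are invented for the example, while `getString`, `getPassword`, and the `Password` wrapper are standard Kafka config APIs and the constants come from `MarkLogicSinkConfig` itself.

```java
import com.marklogic.kafka.connect.sink.MarkLogicSinkConfig;
import org.apache.kafka.common.config.types.Password;

// Hypothetical helper, not part of the connector; it only sketches how the
// options redefined above would typically be consumed at runtime.
public class ConnectionCredentialsSketch {

    static void applyCredentials(MarkLogicSinkConfig config) {
        // CONNECTION_USERNAME now has an explicit null default, so it can be absent
        // (for example with certificate or Kerberos authentication) and must be null-checked.
        String username = config.getString(MarkLogicSinkConfig.CONNECTION_USERNAME);

        // Because CONNECTION_PASSWORD and CONNECTION_CERT_PASSWORD are now Type.PASSWORD,
        // getPassword(...) returns a Password wrapper whose toString() prints "[hidden]",
        // which keeps the secrets out of logged configuration values.
        Password password = config.getPassword(MarkLogicSinkConfig.CONNECTION_PASSWORD);
        Password certPassword = config.getPassword(MarkLogicSinkConfig.CONNECTION_CERT_PASSWORD);

        System.out.println("Connecting as " + username + " with password " + password); // prints "[hidden]"

        if (username != null && password != null) {
            String clearText = password.value(); // the real secret, extracted only where it is needed
            // ... pass username/clearText to the MarkLogic connection setup here ...
        }
        if (certPassword != null) {
            // ... use certPassword.value() when building a certificate-based security context ...
        }
    }
}
```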