
Commit b2e86cb

Rounak and claude committed

Docs: Fix Copilot-identified issues in pipeline connectors

- Fix Dagster port typo (300 → 3000)
- Fix KafkaConnect troubleshooting links
- Fix Data Factory typo (DData → Data)
- Fix grammar errors (should be perform → should be performed)

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>

1 parent 2d71d8c commit b2e86cb

File tree

10 files changed (+12 -12 lines changed)

connectors/pipeline/dagster/hybrid-runner.mdx (1 addition & 1 deletion)

@@ -32,7 +32,7 @@ The ingestion framework uses [dagster graphql python client](https://docs.dagste
 ## Connection Details
 <Steps>
 <Step title="Connection Details">
-- **Host**: Host of the dagster eg.`https://localhost:300` or `https://127.0.0.1:3000` or `https://<yourorghere>.dagster.cloud/prod`
+- **Host**: Host of the dagster eg.`https://localhost:3000` or `https://127.0.0.1:3000` or `https://<yourorghere>.dagster.cloud/prod`
 - **Token** : Need pass token if connecting to `dagster cloud` instance
 - Log in to your Dagster account.
 - Click on the "Settings" link in the top navigation bar.
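The corrected Host values above feed a GraphQL client that needs a hostname, a port, and a scheme. A minimal sketch of splitting that URL, assuming the helper name and the default-port fallback (Dagster's local webserver listens on 3000; an https URL with no explicit port implies 443) — neither is part of the ingestion framework itself:

```python
from urllib.parse import urlparse

def dagster_client_args(host: str) -> dict:
    """Split a Dagster Host value (e.g. https://localhost:3000 or
    https://<yourorghere>.dagster.cloud/prod) into the pieces a
    GraphQL client typically needs. Hypothetical helper."""
    parsed = urlparse(host)
    return {
        "hostname": parsed.hostname,
        # Explicit port wins; otherwise 443 for https, 3000 for a
        # default local Dagster webserver.
        "port_number": parsed.port or (443 if parsed.scheme == "https" else 3000),
        "use_https": parsed.scheme == "https",
    }
```

Note that `https://localhost:300` (the typo this commit fixes) would parse fine but point at the wrong port, which is why the docs fix matters.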

connectors/pipeline/dagster/index.mdx (1 addition & 1 deletion)

@@ -32,7 +32,7 @@ The ingestion framework uses [dagster graphql python client](https://docs.dagste
 ## Connection Details
 <Steps>
 <Step title="Connection Details">
-- **Host**: Host of the dagster eg.`https://localhost:300` or `https://127.0.0.1:3000` or `https://<yourorghere>.dagster.cloud/prod`
+- **Host**: Host of the dagster eg.`https://localhost:3000` or `https://127.0.0.1:3000` or `https://<yourorghere>.dagster.cloud/prod`
 - **Token** : Need pass token if connecting to `dagster cloud` instance
 - Log in to your Dagster account.
 - Click on the "Settings" link in the top navigation bar.

connectors/pipeline/datafactory/hybrid-runner.mdx (1 addition & 1 deletion)

@@ -72,7 +72,7 @@ Ensure that the service principal or managed identity you’re using has the nec
 ## Displaying Lineage Information
 Steps to retrieve and display the lineage information for a Data Factory service.
 1. Ingest Source and Sink Database Metadata: Identify both the source and sink database used by the Azure Data Factory service for example Redshift. Ingest metadata for these database.
-2. Ingest Data Factory Service Metadata: Finally, Ingest your DData Factory service.
+2. Ingest Data Factory Service Metadata: Finally, Ingest your Data Factory service.
 By successfully completing these steps, the lineage information for the service will be displayed.
 ### Missing Lineage
 If lineage information is not displayed for a Data Factory service, follow these steps to diagnose the issue.
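The two numbered steps in the hunk above encode an ordering constraint: the source and sink databases must be ingested before the Data Factory service for lineage to resolve. A tiny sketch of that prerequisite check (the function and service names are illustrative, not part of any real API):

```python
def missing_lineage_prereqs(ingested_services: set, source_sink_dbs: set) -> set:
    """Return the source/sink databases (e.g. a Redshift service)
    whose metadata still needs ingestion before Data Factory
    lineage can be displayed. Purely illustrative helper."""
    return source_sink_dbs - ingested_services

# Step 1 incomplete: the sink database was never ingested,
# so lineage would come up empty (the "Missing Lineage" case).
missing = missing_lineage_prereqs({"redshift_prod"}, {"redshift_prod", "redshift_staging"})
```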

connectors/pipeline/datafactory/index.mdx (1 addition & 1 deletion)

@@ -72,7 +72,7 @@ Ensure that the service principal or managed identity you’re using has the nec
 ## Displaying Lineage Information
 Steps to retrieve and display the lineage information for a Data Factory service.
 1. Ingest Source and Sink Database Metadata: Identify both the source and sink database used by the Azure Data Factory service for example Redshift. Ingest metadata for these database.
-2. Ingest Data Factory Service Metadata: Finally, Ingest your DData Factory service.
+2. Ingest Data Factory Service Metadata: Finally, Ingest your Data Factory service.
 By successfully completing these steps, the lineage information for the service will be displayed.
 ### Missing Lineage
 If lineage information is not displayed for a Data Factory service, follow these steps to diagnose the issue.

connectors/pipeline/flink/hybrid-runner.mdx (1 addition & 1 deletion)

@@ -39,7 +39,7 @@ The ingestion framework uses flink REST APIs to connect to the instance and perf
 - caCertificate: Authorized certificate for ssl configured server.
 - sslCertificate: SSL certificate for the server.
 - sslKey: Server root key for the connection.
-- **verifySSL** : Whether SSL verification should be perform when authenticating.
+- **verifySSL** : Whether SSL verification should be performed when authenticating.
 </Step>
 <TestConnection />
 <ConfigureIngestion />

connectors/pipeline/flink/index.mdx (1 addition & 1 deletion)

@@ -39,7 +39,7 @@ The ingestion framework uses flink REST APIs to connect to the instance and perf
 - caCertificate: Authorized certificate for ssl configured server.
 - sslCertificate: SSL certificate for the server.
 - sslKey: Server root key for the connection.
-- **verifySSL** : Whether SSL verification should be perform when authenticating.
+- **verifySSL** : Whether SSL verification should be performed when authenticating.
 </Step>
 <TestConnection />
 <ConfigureIngestion />

connectors/pipeline/kafkaconnect/hybrid-runner.mdx (2 additions & 2 deletions)

@@ -23,7 +23,7 @@ Configure and schedule KafkaConnect metadata workflows from the Collate UI:
 - [Metadata Ingestion](#metadata-ingestion)
 - [Connection Details](#connection-details)
 - [Metadata Ingestion Options](#metadata-ingestion-options)
-- [Troubleshooting](/connectors/pipeline/glue-pipeline/troubleshooting)
+- [Troubleshooting](/connectors/pipeline/kafkaconnect/troubleshooting)
 ## Requirements
 ### KafkaConnect Versions
 OpenMetadata is integrated with kafkaconnect up to version [3.6.1](https://docs.kafkaconnect.io/getting-started) and will continue to work for future kafkaconnect versions.
@@ -38,7 +38,7 @@ The ingestion framework uses [kafkaconnect python client](https://libraries.io/p
 1. Basic Authentication
 - Username: Username to connect to Kafka Connect. This user should be able to send request to the Kafka Connect API and access the [Rest API](https://docs.confluent.io/platform/current/connect/references/restapi.html) GET endpoints.
 - Password: Password to connect to Kafka Connect.
-- **verifySSL** : Whether SSL verification should be perform when authenticating.
+- **verifySSL** : Whether SSL verification should be performed when authenticating.
 - **Kafka Service Name** : The Service Name of the Ingested [Kafka](/connectors/messaging/kafka#4.-name-and-describe-your-service) instance associated with this KafkaConnect instance.
 </Step>
 <TestConnection />
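The Basic Authentication option in the second hunk boils down to a standard HTTP `Authorization` header on requests to the Kafka Connect REST API GET endpoints. A self-contained sketch of building it (the helper name is ours; the header format is standard HTTP Basic auth):

```python
import base64

def basic_auth_header(username: str, password: str) -> dict:
    """Build the Authorization header a client sends to the Kafka
    Connect REST API when Basic Authentication is configured."""
    # HTTP Basic auth: base64-encode "username:password".
    token = base64.b64encode(f"{username}:{password}".encode()).decode()
    return {"Authorization": f"Basic {token}"}
```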

connectors/pipeline/kafkaconnect/index.mdx (2 additions & 2 deletions)

@@ -23,7 +23,7 @@ Configure and schedule KafkaConnect metadata workflows from the Collate UI:
 - [Metadata Ingestion](#metadata-ingestion)
 - [Connection Details](#connection-details)
 - [Metadata Ingestion Options](#metadata-ingestion-options)
-- [Troubleshooting](/connectors/pipeline/glue-pipeline/troubleshooting)
+- [Troubleshooting](/connectors/pipeline/kafkaconnect/troubleshooting)
 ## Requirements
 ### KafkaConnect Versions
 OpenMetadata is integrated with kafkaconnect up to version [3.6.1](https://docs.kafkaconnect.io/getting-started) and will continue to work for future kafkaconnect versions.
@@ -38,7 +38,7 @@ The ingestion framework uses [kafkaconnect python client](https://libraries.io/p
 1. Basic Authentication
 - Username: Username to connect to Kafka Connect. This user should be able to send request to the Kafka Connect API and access the [Rest API](https://docs.confluent.io/platform/current/connect/references/restapi.html) GET endpoints.
 - Password: Password to connect to Kafka Connect.
-- **verifySSL** : Whether SSL verification should be perform when authenticating.
+- **verifySSL** : Whether SSL verification should be performed when authenticating.
 - **Kafka Service Name** : The Service Name of the Ingested [Kafka](/connectors/messaging/kafka#4.-name-and-describe-your-service) instance associated with this KafkaConnect instance.
 </Step>
 <TestConnection />

connectors/pipeline/nifi/index.mdx (1 addition & 1 deletion)

@@ -45,7 +45,7 @@ For a complete guide on managing secrets in hybrid setups, see the [Hybrid Inges
 1. Basic Authentication
 - Username: Username to connect to NiFi. This user should be able to send request to the Nifi API and access the `Resources` endpoint.
 - Password: Password to connect to NiFi.
-- Verify SSL: Whether SSL verification should be perform when authenticating.
+- Verify SSL: Whether SSL verification should be performed when authenticating.
 2. Client Certificate Authentication
 - Certificate Authority Path: Path to the certificate authority (CA) file. This is the certificate used to store and issue your digital certificate. This is an optional parameter. If omitted SSL verification will be skipped; this can present some sever security issue.
 **important**: This file should be accessible from where the ingestion workflow is running. For example, if you are using OpenMetadata Ingestion Docker container, this file should be in this container.

connectors/pipeline/nifi/yaml.mdx (1 addition & 1 deletion)

@@ -54,7 +54,7 @@ This is a sample config for NiFi:
 **1.** Using Basic authentication
 - **username**: Username to connect to NiFi. This user should be able to send request to the NiFi API and access the `Resources` endpoint.
 - **password**: Password to connect to NiFi.
-- **verifySSL**: Whether SSL verification should be perform when authenticating.
+- **verifySSL**: Whether SSL verification should be performed when authenticating.
 **2.** Using client certificate authentication
 - **certificateAuthorityPath**: Path to the certificate authority (CA) file. This is the certificate used to store and issue your digital certificate. This is an optional parameter. If omitted SSL verification will be skipped; this can present some sever security issue.
 **important**: This file should be accessible from where the ingestion workflow is running. For example, if you are using OpenMetadata Ingestion Docker container, this file should be in this container.
