
Commit bb84623

Merge branch 'main' into hybridrunnerdocupd
2 parents df5e852 + 1d8b9d2

File tree

39 files changed: +101 −249 lines changed

collate-ai/sql-agent.mdx

Lines changed: 22 additions & 0 deletions

@@ -51,3 +51,25 @@ After clicking the icon, you will be redirected to the Ask Collate interface.
 In Ask Collate, type your question or request.
 
 You can ask anything related to SQL query creation, optimization, explanations, or filters.
+
+<img noZoom src="/public/images/collate-ai/collate-ai-sql-agent-feature.png" alt="update or suggest tiers" />
+
+## Setup Instructions
+
+1. **Navigate to Applications**: Go to `Settings > Applications`.
+
+<img noZoom src="/public/images/collate-ai/collate-ai-agent.png" alt="setting up Collate AI" />
+
+2. **Install the Agent**: Click on "Add Apps" to access the marketplace and install the Collate AI Tier Agent.
+
+<img noZoom src="/public/images/collate-ai/collate-ai-sql-agent.png" alt="Installation" />
+
+3. **Configure the Agent**:
+   - **Filter**: Use the UI Query Filter builder to select assets for tier classification.
+   - **Patch Tier If Empty**: Enable this option to automatically assign a tier to assets that currently lack one.
+
+<img noZoom src="/public/images/collate-ai/collate-ai-tier-agent1.png" alt="Configuration" />
+
+4. **Scheduling**: Set up regular intervals for the agent to run and update metadata.
+
+<img noZoom src="/public/images/collate-ai/collate-ai-tier-agent2.png" alt="Scheduling" />

collate-menu.mdx

Lines changed: 2 additions & 2 deletions

@@ -1964,9 +1964,9 @@ site_menu:
   url: /main-concepts/metadata-standard/schemas/security/credentials/gcpvalues
 - category: Main Concepts / Metadata Standard / Schemas / Security / Credentials / GitCredentials
   url: /main-concepts/metadata-standard/schemas/security/credentials/gitcredentials
-- category: Main Concepts / Metadata Standard / Schemas / Security / Credentials / GithubCredentials
+- category: Main Concepts / Metadata Standard / Schemas / Security / Credentials / GitHubCredentials
   url: /main-concepts/metadata-standard/schemas/security/credentials/githubcredentials
-- category: Main Concepts / Metadata Standard / Schemas / Security / Credentials / GitlabCredentials
+- category: Main Concepts / Metadata Standard / Schemas / Security / Credentials / GitLabCredentials
   url: /main-concepts/metadata-standard/schemas/security/credentials/gitlabcredentials
 - category: Main Concepts / Metadata Standard / Schemas / Security / Credentials
   url: /main-concepts/metadata-standard/schemas/security/credentials

connectors/api/rest/index.mdx

Lines changed: 1 addition & 1 deletion

@@ -23,7 +23,7 @@ Configure and schedule REST metadata workflows from the OpenMetadata UI:
 - [Troubleshooting](/connectors/api/rest/troubleshooting)
 ## Requirements
 ### Generate OpenAPI Schema URL
-- Generate OpenAPI schema url for your service[OpenAPI spec](https://swagger.io/specification/#openapi-document)
+- Generate OpenAPI schema URL for your service[OpenAPI spec](https://swagger.io/specification/#openapi-document)
 ## Metadata Ingestion
 <MetadataIngestionUi connector={"REST"} selectServicePath={"/public/images/connectors/rest/select-service.png"} addNewServicePath={"/public/images/connectors/rest/add-new-service.png"} serviceConnectionPath={"/public/images/connectors/rest/service-connection.png"} />
 #### Connection Options
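The changed bullet above asks you to generate an OpenAPI schema URL for your service. As a hedged illustration only (this helper is not part of the connector), a loose sanity check that a fetched document is an OpenAPI 3.x root object, per the linked spec, could look like:

```python
# Illustrative only: an OpenAPI 3.x document is an object whose root
# carries an `openapi` version string and an `info` object
# (see https://swagger.io/specification/#openapi-document).
def looks_like_openapi(doc: dict) -> bool:
    """Loose sanity check for an OpenAPI 3.x root document."""
    return isinstance(doc.get("openapi"), str) and isinstance(doc.get("info"), dict)

print(looks_like_openapi({"openapi": "3.0.3", "info": {"title": "svc", "version": "1"}}))  # True
print(looks_like_openapi({"swagger": "2.0", "info": {}}))  # False
```

Pointing the connector at a URL that fails a check like this usually means the service exposes a Swagger 2.0 document or plain HTML rather than an OpenAPI 3.x schema.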

connectors/api/rest/yaml.mdx

Lines changed: 1 addition & 1 deletion

@@ -30,7 +30,7 @@ Configure and schedule REST metadata workflows from the OpenMetadata UI:
 ### Python Requirements
 <PythonRequirements />
 ### Generate OpenAPI Schema URL
-- Generate OpenAPI schema url for your service[OpenAPI spec](https://swagger.io/specification/#openapi-document)
+- Generate OpenAPI schema URL for your service[OpenAPI spec](https://swagger.io/specification/#openapi-document)
 ## Metadata Ingestion
 ### 1. Define the YAML Config

connectors/dashboard/powerbi-report-server/troubleshooting.mdx renamed to connectors/dashboard/powerbireportserver/troubleshooting.mdx

File renamed without changes.

connectors/database/databricks/index.mdx

Lines changed: 2 additions & 9 deletions

@@ -6,7 +6,6 @@ sidebarTitle: Overview
 ---
 import ReverseMetadata from '/snippets/connectors/database/databricks/reverse-metadata.mdx'
 import { ConnectorDetailsHeader } from '/snippets/components/ConnectorDetailsHeader/ConnectorDetailsHeader.jsx'
-import PythonRequirements from '/snippets/connectors/python-requirements.mdx'
 import ExternalIngestionDeployment from '/snippets/connectors/external-ingestion-deployment.mdx'
 import TestConnection from '/snippets/connectors/test-connection.mdx'
 import IngestionScheduleAndDeploy from '/snippets/connectors/ingestion-schedule-and-deploy.mdx'
@@ -22,7 +21,7 @@ stage="PROD"
 availableFeatures={["Metadata", "Query Usage", "Lineage", "Column-level Lineage", "Data Profiler", "Data Quality", "dbt", "Sample Data", "Reverse Metadata (Collate Only)", "Auto-Classification"]}
 unavailableFeatures={["Stored Procedures", "Tags", "Owners"]} />
 <Tip>
-As per the [documentation](https://docs.databricks.com/en/data-governance/unity-catalog/tags.html#manage-tags-with-sql-commands) here, note that we only support metadata `tag` extraction for databricks version 13.3 version and higher.
+As per the [documentation](https://docs.databricks.com/en/data-governance/unity-catalog/tags.html#manage-tags-with-sql-commands) here, note that we only support metadata `tag` extraction for Databricks version 13.3 version and higher.
 </Tip>
 In this section, we provide guides and references to use the Databricks connector.
 Configure and schedule Databricks metadata and profiler workflows from the OpenMetadata UI:
@@ -38,12 +37,6 @@ Configure and schedule Databricks metadata and profiler workflows from the OpenM
 - [Reverse Metadata](#reverse-metadata)
 <ExternalIngestionDeployment />
 ## Requirements
-### Python Requirements
-<PythonRequirements />
-To run the Databricks ingestion, you will need to install:
-```bash
-pip3 install "openmetadata-ingestion[databricks]"
-```
 ### Permission Requirement
 To enable full functionality of metadata extraction, profiling, usage, and lineage features in OpenMetadata, the following permissions must be granted to the relevant users in your Databricks environment.
 ### Metadata and Profiling Permissions
@@ -65,7 +58,7 @@ These permissions allow access to Databricks system tables that track query acti
 Adjust <user>, <catalog_name>, <schema_name>, and <table_name> according to your specific deployment and security requirements.
 </Tip>
 ## Unity Catalog
-If you are using unity catalog in Databricks, then checkout the [Unity Catalog](/connectors/database/unity-catalog) connector.
+If you are using Unity Catalog in Databricks, then checkout the [Unity Catalog](/connectors/database/unity-catalog) connector.
 ## Metadata Ingestion
 <MetadataIngestionUi connector={"Databricks"} selectServicePath={"/public/images/connectors/databricks/select-service.png"} addNewServicePath={"/public/images/connectors/databricks/add-new-service.png"} serviceConnectionPath={"/public/images/connectors/databricks/service-connection.png"} />
 # Connection Details
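The Tip retained in the hunks above states that tag extraction is only supported for Databricks 13.3 and higher. A hypothetical pre-flight guard (the helper below is purely illustrative and not part of the OpenMetadata connector) could compare runtime versions like this:

```python
# Hypothetical guard, not part of the connector: only rely on Unity
# Catalog tag extraction when the Databricks runtime is >= 13.3,
# per the tip in the docs above.
def supports_tag_extraction(runtime_version: str, minimum: str = "13.3") -> bool:
    """Compare dotted version strings numerically, e.g. '13.3' vs '12.2'."""
    parse = lambda v: tuple(int(part) for part in v.split("."))
    return parse(runtime_version) >= parse(minimum)

print(supports_tag_extraction("13.3"))  # True
print(supports_tag_extraction("12.2"))  # False
```

The tuple comparison avoids the classic string-comparison pitfall where "13.10" would sort before "13.3".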

connectors/database/databricks/yaml.mdx

Lines changed: 1 addition & 1 deletion

@@ -29,7 +29,7 @@ stage="PROD"
 availableFeatures={["Metadata", "Query Usage", "Lineage", "Column-level Lineage", "Data Profiler", "Data Quality", "dbt", "Sample Data", "Reverse Metadata (Collate Only)", "Auto-Classification"]}
 unavailableFeatures={["Stored Procedures", "Tags", "Owners"]} />
 <Tip>
-As per the [documentation](https://docs.databricks.com/en/data-governance/unity-catalog/tags.html#manage-tags-with-sql-commands) here, note that we only support metadata `tag` extraction for databricks version 13.3 version and higher.
+As per the [documentation](https://docs.databricks.com/en/data-governance/unity-catalog/tags.html#manage-tags-with-sql-commands) here, note that we only support metadata `tag` extraction for Databricks version 13.3 version and higher.
 </Tip>
 In this section, we provide guides and references to use the Databricks connector.
 Configure and schedule Databricks metadata and profiler workflows from the OpenMetadata UI:

connectors/database/sap-erp/index.mdx

Lines changed: 1 addition & 1 deletion

@@ -41,7 +41,7 @@ This applies **only to fields marked as secrets** in the connection form (these
 For a complete guide on managing secrets in hybrid setups, see the [Hybrid Ingestion Runner Secret Management Guide](https://docs.getcollate.io/getting-started/day-1/hybrid-saas/hybrid-ingestion-runner#3.-manage-secrets-securely).
 </Tip>
 - **Host and Port**: This parameter specifies the host and port of the SAP ERP instance. This should be specified as a string in the format `https://hostname.com`.
-- **API Key**: Api Key to authenticate the SAP ERP Apis.
+- **API Key**: API Key to authenticate the SAP ERP APIs.
 - **database**: Optional name to give to the database in OpenMetadata. If left blank, we will use `default` as the database name.
 - **databaseSchema**: Optional name to give to the database schema in OpenMetadata. If left blank, we will use `default` as the database schema name.
 - **paginationLimit**: Pagination limit used while querying the SAP ERP APIs for fetching the entities.
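The `paginationLimit` bullet above caps how many entities each SAP ERP API request returns. A minimal sketch of that pattern under stated assumptions (`fetch_entities` and the `fetch_page` callable are stand-ins for illustration, not the connector's actual interface):

```python
# Illustrative sketch: request pages of at most `limit` entities until
# the source returns a short (final) page. `fetch_page(offset, limit)`
# stands in for a real SAP ERP API call.
from typing import Callable, Iterator, List

def fetch_entities(fetch_page: Callable[[int, int], List[dict]],
                   limit: int = 100) -> Iterator[dict]:
    offset = 0
    while True:
        page = fetch_page(offset, limit)
        yield from page
        if len(page) < limit:  # short page => no more entities
            return
        offset += limit

# Fake backend with 5 entities, paged 2 at a time:
data = [{"id": i} for i in range(5)]
print(list(fetch_entities(lambda off, lim: data[off:off + lim], limit=2)))
```

A larger limit means fewer round trips per sync; a smaller one keeps individual responses light on the SAP ERP side.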
