Commit 31a6504

Feature/remove secrets from readme (#31)

* remove secrets from readme
* minor changes to readme

1 parent: b34c21c

File tree: 1 file changed (+1, -23 lines)

README.md

Lines changed: 1 addition & 23 deletions
@@ -15,7 +15,7 @@
 # DataWorkbench
 
 ## What is it?
-Veracity DataWorkbench is a Python SDK designed to bridge your Python environment with Veracity DataWorkbench services. It simplifies access to data cataloging, lineage tracking, and APIs — supporting efficient data workflows across local and cloud environments such as Databricks
+Veracity DataWorkbench is a Python SDK designed to bridge your Databricks environment with Veracity Data Workbench. It simplifies access to data cataloging, lineage tracking, and APIs.
 
 
 ## Table of Contents
@@ -62,28 +62,6 @@ datacatalogue = DataCatalogue() # Naming subject to change
 datacatalogue.save(df, "Dataset Name", "Description", tags={"environment": ["test"]})
 ```
 
-## Configuration
-
-When using Dataworkbench locally, you need to configure the following environment variables:
-
-```python
-# Required for local machine setup
-import os
-
-os.environ["ApimClientId"] = "your-apim-client-id"
-os.environ["ApimClientSecret"] = "your-apim-client-secret"
-os.environ["ApimScope"] = "your-apim-scope"
-```
-
-Alternatively, create a `.env` file or use a configuration file:
-
-```
-# .env file example
-ApimClientId=your-apim-client-id
-ApimClientSecret=your-apim-client-secret
-ApimScope=your-apim-scope
-```
-
 ## Examples
 
 ### Saving a Spark DataFrame to the Data Catalogue
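The deleted README section told users to hardcode `ApimClientId`, `ApimClientSecret`, and `ApimScope` via `os.environ` or a `.env` file. In line with the commit's intent (keeping secrets out of the README and out of source control), a minimal sketch of a safer pattern is to read those variables from the environment at startup and fail fast if any are missing. Only the three variable names come from the removed text; the helper function below is hypothetical, not part of the SDK:

```python
import os

# Variable names taken from the removed README section; the values themselves
# should come from the runtime environment, never from committed files.
REQUIRED_VARS = ("ApimClientId", "ApimClientSecret", "ApimScope")

def load_apim_config(environ=os.environ):
    """Collect the required APIM settings, raising if any are unset."""
    missing = [name for name in REQUIRED_VARS if not environ.get(name)]
    if missing:
        raise RuntimeError(
            "Missing environment variables: " + ", ".join(missing)
        )
    return {name: environ[name] for name in REQUIRED_VARS}

# Example with a fake environment dict; in practice the values would be
# injected by a .env loader (e.g. python-dotenv) or the platform's secret store.
config = load_apim_config({
    "ApimClientId": "your-apim-client-id",
    "ApimClientSecret": "your-apim-client-secret",
    "ApimScope": "your-apim-scope",
})
print(sorted(config))  # ['ApimClientId', 'ApimClientSecret', 'ApimScope']
```

Failing fast on a missing variable surfaces misconfiguration immediately instead of at the first API call, which is why the helper checks all three names before returning.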
