diff --git a/README.md b/README.md index 0b65641..2049348 100644 --- a/README.md +++ b/README.md @@ -1,187 +1,278 @@ -# Databus Client Python +# Databus Python Client -## Quickstart Example -Commands to download the DBpedia Knowledge Graphs generated by Live Fusion. -DBpedia Live Fusion publishes two different kinds of KGs: +Command-line and Python client for downloading and deploying datasets on DBpedia Databus. -1. Open Core Knowledge Graphs under CC-BY-SA license, open with copyleft/share-alike, no registration needed -2. Industry Knowledge Graphs under BUSL 1.1 license, unrestricted for research and experimentation, commercial license for productive use, free registration needed. - -### Registration (Access Token) +## Table of Contents +- [Quickstart](#quickstart) + - [Python](#python) + - [Docker](#docker) +- [DBpedia](#dbpedia) + - [Registration (Access Token)](#registration-access-token) + - [DBpedia Knowledge Graphs](#dbpedia-knowledge-graphs) + - [Download Live Fusion KG Snapshot (BUSL 1.1, registration needed)](#download-live-fusion-kg-snapshot-busl-11-registration-needed) + - [Download Enriched Knowledge Graphs (BUSL 1.1, registration needed)](#download-enriched-knowledge-graphs-busl-11-registration-needed) + - [Download DBpedia Wikipedia Knowledge Graphs (CC-BY-SA, no registration needed)](#download-dbpedia-wikipedia-knowledge-graphs-cc-by-sa-no-registration-needed) + - [Download DBpedia Wikidata Knowledge Graphs (CC-BY-SA, no registration needed)](#download-dbpedia-wikidata-knowledge-graphs-cc-by-sa-no-registration-needed) +- [CLI Usage](#cli-usage) + - [Download](#cli-download) + - [Deploy](#cli-deploy) +- [Module Usage](#module-usage) + - [Deploy](#module-deploy) + + +## Quickstart + +The client supports two main workflows: downloading datasets from the Databus and deploying datasets to the Databus. 
Below you can choose how to run it (Python or Docker), then follow the sections on [DBpedia downloads](#dbpedia-knowledge-graphs), [CLI usage](#cli-usage), or [module usage](#module-usage). + +You can use either **Python** or **Docker**. Both methods support all client features. The Docker image is available at [dbpedia/databus-python-client](https://hub.docker.com/r/dbpedia/databus-python-client). -1. If you do not have a DBpedia Account yet (Forum/Databus), please register at https://account.dbpedia.org -2. Login at https://account.dbpedia.org and create your token. -3. Save the token to a file `vault-token.dat`. +### Python + +Requirements: [Python](https://www.python.org/downloads/) and [pip](https://pip.pypa.io/en/stable/installation/) + +Before using the client, install it via pip: -### Docker vs. Python -The databus-python-client comes as **docker** or **python** with these patterns. -`$DOWNLOADTARGET` can be any Databus URI including collections OR SPARQL query (or several thereof). Details are documented below. ```bash -# Docker -docker run --rm -v $(pwd):/data dbpedia/databus-python-client download $DOWNLOADTARGET --token vault-token.dat -# Python python3 -m pip install databusclient -databusclient download $DOWNLOADTARGET --token vault-token.dat ``` -### Download Live Fusion KG Snapshot (BUSL 1.1, registration needed) -TODO One slogan sentence. [More information](https://databus.dbpedia.org/dbpedia-enterprise/live-fusion-kg-snapshot) +You can then use the client in the command line: + ```bash -docker run --rm -v $(pwd):/data dbpedia/databus-python-client download https://databus.dbpedia.org/dbpedia-enterprise/live-fusion-kg-snapshot --token vault-token.dat +databusclient --help +databusclient deploy --help +databusclient download --help ``` -### Download Enriched Knowledge Graphs (BUSL 1.1, registration needed) -**DBpedia Wikipedia Extraction Enriched** -TODO One slogan sentence and link -Currently EN DBpedia only. 
+### Docker + +Requirements: [Docker](https://docs.docker.com/get-docker/) ```bash -docker run --rm -v $(pwd):/data dbpedia/databus-python-client download https://databus.dbpedia.org/dbpedia-enterprise/dbpedia-wikipedia-kg-enriched-snapshot --token vault-token.dat +docker run --rm -v $(pwd):/data dbpedia/databus-python-client --help +docker run --rm -v $(pwd):/data dbpedia/databus-python-client deploy --help +docker run --rm -v $(pwd):/data dbpedia/databus-python-client download --help ``` -**DBpedia Wikidata Extraction Enriched** -TODO One slogan sentence and link +## DBpedia + +Commands to download the [DBpedia Knowledge Graphs](#dbpedia-knowledge-graphs) generated by Live Fusion. DBpedia Live Fusion publishes two kinds of KGs: + +1. Open Core Knowledge Graphs under CC-BY-SA license, open with copyleft/share-alike, no registration needed. +2. Industry Knowledge Graphs under BUSL 1.1 license, unrestricted for research and experimentation, commercial license for productive use, free [registration](#registration-access-token) needed. + +### Registration (Access Token) + +To download BUSL 1.1 licensed datasets, you need to register and get an access token. + +1. If you do not have a DBpedia Account yet (Forum/Databus), please register at https://account.dbpedia.org +2. Log in at https://account.dbpedia.org and create your token. +3. Save the token to a file, e.g. `vault-token.dat`. + +### DBpedia Knowledge Graphs + +#### Download Live Fusion KG Snapshot (BUSL 1.1, registration needed) +High-frequency, conflict-resolved knowledge graph that merges Live Wikipedia and Wikidata signals into a single, queryable snapshot for enterprise consumption. 
[More information](https://databus.dev.dbpedia.link/fhofer/live-fusion-kg-dump) ```bash -docker run --rm -v $(pwd):/data dbpedia/databus-python-client download https://databus.dbpedia.org/dbpedia-enterprise/dbpedia-wikidata-kg-enriched-snapshot --token vault-token.dat +# Python +databusclient download https://databus.dev.dbpedia.link/fhofer/live-fusion-kg-dump --vault-token vault-token.dat +# Docker +docker run --rm -v $(pwd):/data dbpedia/databus-python-client download https://databus.dev.dbpedia.link/fhofer/live-fusion-kg-dump --vault-token vault-token.dat ``` -### Download DBpedia Wikipedia Knowledge Graphs (CC-BY-SA, no registration needed) -TODO One slogan sentence and link +#### Download Enriched Knowledge Graphs (BUSL 1.1, registration needed) + +**DBpedia Wikipedia Extraction Enriched** + +DBpedia-based enrichment of structured Wikipedia extractions (currently EN DBpedia only). [More information](https://databus.dev.dbpedia.link/fhofer/dbpedia-wikipedia-kg-enriched-dump) ```bash -docker run --rm -v $(pwd):/data dbpedia/databus-python-client download https://databus.dbpedia.org/dbpedia/dbpedia-wikipedia-kg-snapshot +# Python +databusclient download https://databus.dev.dbpedia.link/fhofer/dbpedia-wikipedia-kg-enriched-dump --vault-token vault-token.dat +# Docker +docker run --rm -v $(pwd):/data dbpedia/databus-python-client download https://databus.dev.dbpedia.link/fhofer/dbpedia-wikipedia-kg-enriched-dump --vault-token vault-token.dat ``` -### Download DBpedia Wikidata Knowledge Graphs (CC-BY-SA, no registration needed) -TODO One slogan sentence and link + +#### Download DBpedia Wikipedia Knowledge Graphs (CC-BY-SA, no registration needed) + +Original extraction of structured Wikipedia data before enrichment. 
[More information](https://databus.dev.dbpedia.link/fhofer/dbpedia-wikipedia-kg-dump) ```bash -docker run --rm -v $(pwd):/data dbpedia/databus-python-client download https://databus.dbpedia.org/dbpedia/dbpedia-wikidata-kg-snapshot +# Python +databusclient download https://databus.dev.dbpedia.link/fhofer/dbpedia-wikipedia-kg-dump +# Docker +docker run --rm -v $(pwd):/data dbpedia/databus-python-client download https://databus.dev.dbpedia.link/fhofer/dbpedia-wikipedia-kg-dump ``` -## Docker Image Usage +#### Download DBpedia Wikidata Knowledge Graphs (CC-BY-SA, no registration needed) -A docker image is available at [dbpedia/databus-python-client](https://hub.docker.com/r/dbpedia/databus-python-client). See [download section](#usage-of-docker-image) for details. +Original extraction of structured Wikidata data before enrichment. [More information](https://databus.dev.dbpedia.link/fhofer/dbpedia-wikidata-kg-dump) +```bash +# Python +databusclient download https://databus.dev.dbpedia.link/fhofer/dbpedia-wikidata-kg-dump +# Docker +docker run --rm -v $(pwd):/data dbpedia/databus-python-client download https://databus.dev.dbpedia.link/fhofer/dbpedia-wikidata-kg-dump +``` ## CLI Usage -**Installation** -```bash -python3 -m pip install databusclient -``` +To get started with the command-line interface (CLI) of the databus-python-client, you can use either the Python installation or the Docker image. The examples below show both methods. + +**Help and further general information:** -**Running** ```bash +# Python databusclient --help -``` +# Docker +docker run --rm -v $(pwd):/data dbpedia/databus-python-client --help -```man +# Output: Usage: databusclient [OPTIONS] COMMAND [ARGS]... + Databus Client CLI + Options: - --install-completion [bash|zsh|fish|powershell|pwsh] - Install completion for the specified shell. - --show-completion [bash|zsh|fish|powershell|pwsh] - Show completion for the specified shell, to - copy it or customize the installation. 
- --help Show this message and exit. + --help Show this message and exit. Commands: - deploy - download + deploy Flexible deploy to Databus command supporting three modes: + download Download datasets from databus, optionally using vault access... ``` + +### Download +With the download command, you can download datasets or parts thereof from the Databus. The download command expects one or more Databus URIs or a SPARQL query as arguments. The URIs can point to files, versions, artifacts, groups, or collections. If a SPARQL query is provided, it must return download URLs from the Databus, which will then be downloaded. -### Download command +```bash +# Python +databusclient download $DOWNLOADTARGET +# Docker +docker run --rm -v $(pwd):/data dbpedia/databus-python-client download $DOWNLOADTARGET ``` + +- `$DOWNLOADTARGET` + - Can be any Databus URI (including collections) or a SPARQL query, or several thereof. +- `--localdir` + - If no `--localdir` is provided, the current working directory is used as base directory. The downloaded files will be stored in the working directory in a folder structure according to the Databus layout, i.e. `./$ACCOUNT/$GROUP/$ARTIFACT/$VERSION/`. +- `--vault-token` + - If the dataset/files to be downloaded require vault authentication, you need to provide a vault token with `--vault-token /path/to/vault-token.dat`. See [Registration (Access Token)](#registration-access-token) for details on how to get a vault token. +- `--databus-key` + - If the databus is protected and needs API key authentication, you can provide the API key with `--databus-key YOUR_API_KEY`. + +**Help and further information on download command:** +```bash +# Python databusclient download --help -``` +# Docker +docker run --rm -v $(pwd):/data dbpedia/databus-python-client download --help -``` +# Output: Usage: databusclient download [OPTIONS] DATABUSURIS... -Arguments: - DATABUSURIS...
databus uris to download from https://databus.dbpedia.org, - or a query statement that returns databus uris from https://databus.dbpedia.org/sparql - to be downloaded [required] - Download datasets from databus, optionally using vault access if vault options are provided. Options: - --localdir TEXT Local databus folder (if not given, databus folder - structure is created in current working directory) - --databus TEXT Databus URL (if not given, inferred from databusuri, e.g. - https://databus.dbpedia.org/sparql) - --token TEXT Path to Vault refresh token file - --authurl TEXT Keycloak token endpoint URL [default: - https://auth.dbpedia.org/realms/dbpedia/protocol/openid- - connect/token] - --clientid TEXT Client ID for token exchange [default: vault-token- - exchange] - --help Show this message and exit. Show this message and exit. + --localdir TEXT Local databus folder (if not given, databus folder + structure is created in current working directory) + --databus TEXT Databus URL (if not given, inferred from databusuri, + e.g. https://databus.dbpedia.org/sparql) + --vault-token TEXT Path to Vault refresh token file + --databus-key TEXT Databus API key to download from protected databus + --authurl TEXT Keycloak token endpoint URL [default: + https://auth.dbpedia.org/realms/dbpedia/protocol/openid- + connect/token] + --clientid TEXT Client ID for token exchange [default: vault-token- + exchange] + --help Show this message and exit.
``` -Examples of using download command +### Examples of using the download command -**File**: download of a single file -``` +**Download File**: download of a single file +```bash +# Python databusclient download https://databus.dbpedia.org/dbpedia/mappings/mappingbased-literals/2022.12.01/mappingbased-literals_lang=az.ttl.bz2 +# Docker +docker run --rm -v $(pwd):/data dbpedia/databus-python-client download https://databus.dbpedia.org/dbpedia/mappings/mappingbased-literals/2022.12.01/mappingbased-literals_lang=az.ttl.bz2 ``` -**Version**: download of all files of a specific version -``` +**Download Version**: download of all files of a specific version +```bash +# Python databusclient download https://databus.dbpedia.org/dbpedia/mappings/mappingbased-literals/2022.12.01 +# Docker +docker run --rm -v $(pwd):/data dbpedia/databus-python-client download https://databus.dbpedia.org/dbpedia/mappings/mappingbased-literals/2022.12.01 ``` -**Artifact**: download of all files with latest version of an artifact -``` +**Download Artifact**: download of all files with the latest version of an artifact +```bash +# Python databusclient download https://databus.dbpedia.org/dbpedia/mappings/mappingbased-literals -``` - -**Group**: download of all files with lates version of all artifacts of a group +# Docker +docker run --rm -v $(pwd):/data dbpedia/databus-python-client download https://databus.dbpedia.org/dbpedia/mappings/mappingbased-literals ``` + +**Download Group**: download of all files with the latest version of all artifacts of a group +```bash +# Python databusclient download https://databus.dbpedia.org/dbpedia/mappings +# Docker +docker run --rm -v $(pwd):/data dbpedia/databus-python-client download https://databus.dbpedia.org/dbpedia/mappings ``` -If no `--localdir` is provided, the current working directory is used as base directory. The downloaded files will be stored in the working directory in a folder structure according to the databus structure, i.e. 
`./$ACCOUNT/$GROUP/$ARTIFACT/$VERSION/`. - -**Collection**: download of all files within a collection -``` +**Download Collection**: download of all files within a collection +```bash +# Python databusclient download https://databus.dbpedia.org/dbpedia/collections/dbpedia-snapshot-2022-12 +# Docker +docker run --rm -v $(pwd):/data dbpedia/databus-python-client download https://databus.dbpedia.org/dbpedia/collections/dbpedia-snapshot-2022-12 ``` -**Query**: download of all files returned by a query (sparql endpoint must be provided with `--databus`) -``` +**Download Query**: download of all files returned by a query (SPARQL endpoint must be provided with `--databus`) +```bash +# Python databusclient download 'PREFIX dcat: <http://www.w3.org/ns/dcat#> SELECT ?x WHERE { ?sub dcat:downloadURL ?x . } LIMIT 10' --databus https://databus.dbpedia.org/sparql +# Docker +docker run --rm -v $(pwd):/data dbpedia/databus-python-client download 'PREFIX dcat: <http://www.w3.org/ns/dcat#> SELECT ?x WHERE { ?sub dcat:downloadURL ?x . } LIMIT 10' --databus https://databus.dbpedia.org/sparql ``` -### Deploy command + +### Deploy + +With the deploy command, you can deploy datasets to the Databus. The deploy command supports three modes: +1. Classic dataset deployment via list of distributions +2. Metadata-based deployment via metadata JSON file +3. Upload & deploy via Nextcloud/WebDAV + +```bash +# Python +databusclient deploy [OPTIONS] [DISTRIBUTIONS]... +# Docker +docker run --rm -v $(pwd):/data dbpedia/databus-python-client deploy [OPTIONS] [DISTRIBUTIONS]... ``` + +**Help and further information on deploy command:** +```bash +# Python databusclient deploy --help -``` -``` +# Docker +docker run --rm -v $(pwd):/data dbpedia/databus-python-client deploy --help + +# Output: Usage: databusclient deploy [OPTIONS] [DISTRIBUTIONS]...
- Flexible deploy to databus command: + Flexible deploy to Databus command supporting three modes: - - Classic dataset deployment + - Classic deploy (distributions as arguments) - - Metadata-based deployment + - Metadata-based deploy (--metadata ) - - Upload & deploy via Nextcloud + - Upload & deploy via Nextcloud (--webdav-url, --remote, --path) -Arguments: - DISTRIBUTIONS... Depending on mode: - - Classic mode: List of distributions in the form - URL|CV|fileext|compression|sha256sum:contentlength - (where URL is the download URL and CV the key=value pairs, - separated by underscores) - - Upload mode: List of local file or folder paths (must exist) - - Metdata mode: None - Options: --version-id TEXT Target databus version/dataset identifier of the form +### Deploy -## Module Usage -### Step 1: Create lists of distributions for the dataset +#### Step 1: Create lists of distributions for the dataset ```python from databusclient import create_distribution @@ -316,10 +420,10 @@ distributions.append( # will just place parameters correctly, nothing will be downloaded or inferred distributions.append( create_distribution( - url="https://example.org/some/random/file.csv.bz2", - cvs={"type": "example", "realfile": "false"}, - file_format="csv", - compression="bz2", + url="https://example.org/some/random/file.csv.bz2", + cvs={"type": "example", "realfile": "false"}, + file_format="csv", + compression="bz2", sha256_length_tuple=("7a751b6dd5eb8d73d97793c3c564c71ab7b565fa4ba619e4a8fd05a6f80ff653", 367116) ) ) @@ -330,7 +434,7 @@ A few notes: * The dict for content variants can be empty ONLY IF there is just one distribution * There can be no compression if there is no file format -### Step 2: Create dataset +#### Step 2: Create dataset ```python from databusclient import create_dataset @@ -359,14 +463,14 @@ dataset = create_dataset( ) ``` -NOTE: To be used you need to set all group parameters, or it will be ignored +NOTE: Group metadata is applied only if all group parameters 
are set. -### Step 3: Deploy to databus +#### Step 3: Deploy to Databus ```python from databusclient import deploy -# to deploy something you just need the dataset from the previous step and an APIO key +# to deploy something you just need the dataset from the previous step and an API key # API key can be found (or generated) at https://$$DATABUS_BASE$$/$$USER$$#settings -deploy(dataset, "mysterious api key") -``` \ No newline at end of file +deploy(dataset, "mysterious API key") +``` diff --git a/databusclient/cli.py b/databusclient/cli.py index 4e97470..3209008 100644 --- a/databusclient/cli.py +++ b/databusclient/cli.py @@ -94,10 +94,11 @@ def deploy(version_id, title, abstract, description, license_url, apikey, @click.argument("databusuris", nargs=-1, required=True) @click.option("--localdir", help="Local databus folder (if not given, databus folder structure is created in current working directory)") @click.option("--databus", help="Databus URL (if not given, inferred from databusuri, e.g. https://databus.dbpedia.org/sparql)") -@click.option("--token", help="Path to Vault refresh token file") +@click.option("--vault-token", help="Path to Vault refresh token file") +@click.option("--databus-key", help="Databus API key to download from protected databus") @click.option("--authurl", default="https://auth.dbpedia.org/realms/dbpedia/protocol/openid-connect/token", show_default=True, help="Keycloak token endpoint URL") @click.option("--clientid", default="vault-token-exchange", show_default=True, help="Client ID for token exchange") -def download(databusuris: List[str], localdir, databus, token, authurl, clientid): +def download(databusuris: List[str], localdir, databus, vault_token, databus_key, authurl, clientid): """ Download datasets from databus, optionally using vault access if vault options are provided.
""" @@ -105,7 +106,8 @@ def download(databusuris: List[str], localdir, databus, token, authurl, clientid localDir=localdir, endpoint=databus, databusURIs=databusuris, - token=token, + token=vault_token, + databus_key=databus_key, auth_url=authurl, client_id=clientid, ) diff --git a/databusclient/client.py b/databusclient/client.py index 358f1a6..8138a84 100644 --- a/databusclient/client.py +++ b/databusclient/client.py @@ -491,7 +491,7 @@ def deploy_from_metadata( print(f" - {entry['url']}") -def __download_file__(url, filename, vault_token_file=None, auth_url=None, client_id=None) -> None: +def __download_file__(url, filename, vault_token_file=None, databus_key=None, auth_url=None, client_id=None) -> None: """ Download a file from the internet with a progress bar using tqdm. @@ -520,10 +520,11 @@ def __download_file__(url, filename, vault_token_file=None, auth_url=None, clien print("Redirects url: ", url) # --- 2. Try direct GET --- - response = requests.get(url, stream=True, allow_redirects=False) # no redirects here, we want to see if auth is required + response = requests.get(url, stream=True, allow_redirects=True) www = response.headers.get('WWW-Authenticate', '') # get WWW-Authenticate header if present to check for Bearer auth - if (response.status_code == 401 or "bearer" in www.lower()): + # Vault token required if 401 Unauthorized with Bearer challenge + if (response.status_code == 401 and "bearer" in www.lower()): print(f"Authentication required for {url}") if not (vault_token_file): raise ValueError("Vault token file not given for protected download") @@ -534,6 +535,15 @@ def __download_file__(url, filename, vault_token_file=None, auth_url=None, clien # --- 4. 
Retry with token --- response = requests.get(url, headers=headers, stream=True) + + # Databus API key required if only 401 Unauthorized + elif response.status_code == 401: + print(f"API key required for {url}") + if not databus_key: + raise ValueError("Databus API key not given for protected download") + + headers = {"X-API-KEY": databus_key} + response = requests.get(url, headers=headers, stream=True) try: response.raise_for_status() # Raise if still failing @@ -554,8 +564,10 @@ def __download_file__(url, filename, vault_token_file=None, auth_url=None, clien file.write(data) progress_bar.close() + # TODO: could be a problem of GitHub raw / OpenFaaS if total_size_in_bytes != 0 and progress_bar.n != total_size_in_bytes: - raise IOError("Downloaded size does not match Content-Length header") + # raise IOError("Downloaded size does not match Content-Length header") + print(f"Warning: Downloaded size does not match Content-Length header:\nExpected {total_size_in_bytes}, got {progress_bar.n}") def __get_vault_access__(download_url: str,
__get_databus_id_parts__(url) - localDir = os.path.join(os.getcwd(), account, group, artifact, version if version is not None else "latest") - print(f"Local directory not given, using {localDir}") + fileLocalDir = os.path.join(os.getcwd(), account, group, artifact, version if version is not None else "latest") + print(f"Local directory not given, using {fileLocalDir}") file = url.split("/")[-1] - filename = os.path.join(localDir, file) + filename = os.path.join(fileLocalDir, file) print("\n") - __download_file__(url=url, filename=filename, vault_token_file=vault_token_file, auth_url=auth_url, client_id=client_id) + __download_file__(url=url, filename=filename, vault_token_file=vault_token_file, databus_key=databus_key, auth_url=auth_url, client_id=client_id) print("\n") @@ -742,6 +761,7 @@ def download( endpoint: str, databusURIs: List[str], token=None, + databus_key=None, auth_url=None, client_id=None ) -> None: @@ -771,15 +791,15 @@ def download( if "/collections/" in databusURI: # TODO "in" is not safe! 
there could be an artifact named collections, need to check for the correct part position in the URI query = __handle_databus_collection__(databusURI) res = __handle_databus_file_query__(endpoint, query) - __download_list__(res, localDir, vault_token_file=token, auth_url=auth_url, client_id=client_id) + __download_list__(res, localDir, vault_token_file=token, databus_key=databus_key, auth_url=auth_url, client_id=client_id) # databus file elif file is not None: - __download_list__([databusURI], localDir, vault_token_file=token, auth_url=auth_url, client_id=client_id) + __download_list__([databusURI], localDir, vault_token_file=token, databus_key=databus_key, auth_url=auth_url, client_id=client_id) # databus artifact version elif version is not None: json_str = __get_json_ld_from_databus__(databusURI) res = __handle_databus_artifact_version__(json_str) - __download_list__(res, localDir, vault_token_file=token, auth_url=auth_url, client_id=client_id) + __download_list__(res, localDir, vault_token_file=token, databus_key=databus_key, auth_url=auth_url, client_id=client_id) # databus artifact elif artifact is not None: json_str = __get_json_ld_from_databus__(databusURI) @@ -787,7 +807,7 @@ def download( print(f"No version given, using latest version: {latest}") json_str = __get_json_ld_from_databus__(latest) res = __handle_databus_artifact_version__(json_str) - __download_list__(res, localDir, vault_token_file=token, auth_url=auth_url, client_id=client_id) + __download_list__(res, localDir, vault_token_file=token, databus_key=databus_key, auth_url=auth_url, client_id=client_id) # databus group elif group is not None: @@ -800,7 +820,7 @@ def download( print(f"No version given, using latest version: {latest}") json_str = __get_json_ld_from_databus__(latest) res = __handle_databus_artifact_version__(json_str) - __download_list__(res, localDir, vault_token_file=token, auth_url=auth_url, client_id=client_id) + __download_list__(res, localDir, vault_token_file=token, 
databus_key=databus_key, auth_url=auth_url, client_id=client_id) # databus account elif account is not None: @@ -816,4 +836,4 @@ def download( if endpoint is None: # endpoint is required for queries (--databus) raise ValueError("No endpoint given for query") res = __handle_databus_file_query__(endpoint, databusURI) - __download_list__(res, localDir, vault_token_file=token, auth_url=auth_url, client_id=client_id) + __download_list__(res, localDir, vault_token_file=token, databus_key=databus_key, auth_url=auth_url, client_id=client_id)
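The 401 handling added to `__download_file__` above distinguishes two protection schemes: a 401 carrying a `Bearer` challenge in the `WWW-Authenticate` header means a Vault token is needed, while a plain 401 means the Databus API key is needed. That decision can be sketched as a small pure function; `auth_mode` is a hypothetical name used only for illustration, not part of `databusclient`:

```python
def auth_mode(status_code: int, www_authenticate: str = "") -> str:
    """Sketch of the fallback order in __download_file__ (illustrative only)."""
    # 401 with a Bearer challenge: retry with "Authorization: Bearer <vault token>"
    if status_code == 401 and "bearer" in www_authenticate.lower():
        return "vault-token"
    # plain 401: retry with the "X-API-KEY: <databus key>" header
    if status_code == 401:
        return "databus-api-key"
    # any other status: no extra authentication needed
    return "none"
```

Checking the Bearer challenge first matters: a Vault-protected file also answers 401, so testing `status_code == 401` alone would wrongly send the API key there.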
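When `--localdir` is not given, the README and `__download_list__` above place each file under `./$ACCOUNT/$GROUP/$ARTIFACT/$VERSION/`. A rough sketch of that mapping, assuming the standard five-part Databus file URI path; `local_path_for` is an illustrative helper, not an actual `databusclient` function:

```python
import os
from urllib.parse import urlparse

def local_path_for(databus_file_uri: str, base_dir: str = ".") -> str:
    """Map https://HOST/ACCOUNT/GROUP/ARTIFACT/VERSION/FILE to the
    local layout BASE/ACCOUNT/GROUP/ARTIFACT/VERSION/FILE (sketch)."""
    # split the URI path into its five Databus components
    account, group, artifact, version, file = (
        urlparse(databus_file_uri).path.strip("/").split("/")
    )
    return os.path.join(base_dir, account, group, artifact, version, file)
```

Given the single-file example URI from the download section, this yields `./dbpedia/mappings/mappingbased-literals/2022.12.01/mappingbased-literals_lang=az.ttl.bz2`, matching where the client would store the download.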