@@ -64,44 +64,6 @@ docker run --rm -v $(pwd):/data dbpedia/databus-python-client download https://d
A docker image is available at [dbpedia/databus-python-client](https://hub.docker.com/r/dbpedia/databus-python-client). See the [download section](#usage-of-docker-image) for details.

-## Deploy to Databus
-Please add databus API_KEY to .env file
-Use metadata.json file to list all files which should be added to the databus
-
-The script registers all files on the databus.
-### Example Call
-```bash
-python -m databusclient.deploy \
-  --no-upload \
-  --metadata ./metadata.json \
-  --version-id https://databus.org/user/dataset/version/1.0 \
-  --title "Test Dataset" \
-  --abstract "This is a short abstract of the test dataset." \
-  --description "This dataset was uploaded for testing the Nextcloud → Databus deployment pipeline." \
-  --license https://dalicc.net/licenselibrary/Apache-2.0
-
-```
-
-## Upload to Nextcloud and Deploy to Databus
-Please add databus API_KEY to .env file
-
-The script uploads all given files and all files in the given folders to the given remote.
-Then registers them on the databus.
-### Example Call
-```bash
-python -m databusclient.deploy \
-  --webdav-url https://cloud.scadsai.uni-leipzig.de/remote.php/webdav \
-  --remote scads-nextcloud \
-  --path test \
-  --version-id https://databus.dbpedia.org/gg46ixav/test_group/test_artifact/2023-07-03 \
-  --title "Test Dataset" \
-  --abstract "This is a short abstract of the test dataset." \
-  --description "This dataset was uploaded for testing the Nextcloud → Databus deployment pipeline." \
-  --license https://dalicc.net/licenselibrary/Apache-2.0 \
-  /home/CSVTest/newtestoutputfolder \
-  /home/CSVTest/output.csv.bz2
-
-```
## CLI Usage

**Installation**
@@ -259,6 +221,98 @@ If using vault authentication, make sure the token file is available in the cont
docker run --rm -v $(pwd):/data dbpedia/databus-python-client download https://databus.dbpedia.org/dbpedia-enterprise/live-fusion-snapshots/fusion/2025-08-23/fusion_props=all_subjectns=commons-wikimedia-org_vocab=all.ttl.gz --token vault-token.dat
```

+### Upload-and-deploy command
+```
+databusclient upload-and-deploy --help
+```
+```
+Usage: databusclient upload-and-deploy [OPTIONS] [FILES]...
+
+  Upload files to Nextcloud and deploy to DBpedia Databus.
+
+Arguments:
+  FILES...  files in the form of List[path], where every path must exist
+            locally, which will be uploaded and deployed
+
+Options:
+  --webdav-url TEXT   WebDAV URL (e.g.,
+                      https://cloud.example.com/remote.php/webdav)
+  --remote TEXT       rclone remote name (e.g., 'nextcloud')
+  --path TEXT         Remote path on Nextcloud (e.g., 'datasets/mydataset')
+  --no-upload         Skip file upload and use existing metadata
+  --metadata PATH     Path to metadata JSON file (required if --no-upload is
+                      used)
+  --version-id TEXT   Target databus version/dataset identifier of the form <h
+                      ttps://databus.dbpedia.org/$ACCOUNT/$GROUP/$ARTIFACT/$VE
+                      RSION>  [required]
+  --title TEXT        Dataset title  [required]
+  --abstract TEXT     Dataset abstract max 200 chars  [required]
+  --description TEXT  Dataset description  [required]
+  --license TEXT      License (see dalicc.net)  [required]
+  --apikey TEXT       API key  [required]
+  --help              Show this message and exit.
+```
+The command uploads all given files, and all files inside the given folders, to the given remote, and then registers them on the Databus.
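+
+Uploading assumes a working rclone remote whose name matches the `--remote` option. A minimal sketch of registering such a remote (assuming rclone is installed; the remote name, URL, and credentials below are placeholders):
+
+```bash
+# Sketch: register a Nextcloud WebDAV remote with rclone.
+# The remote name ('scads-nextcloud' here) must match --remote.
+rclone config create scads-nextcloud webdav \
+  url=https://cloud.scadsai.uni-leipzig.de/remote.php/webdav \
+  vendor=nextcloud \
+  user=YOUR_USERNAME \
+  pass=YOUR_APP_PASSWORD
+```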
+
+
+#### Example of using the upload-and-deploy command
+
+```bash
+databusclient upload-and-deploy \
+  --webdav-url https://cloud.scadsai.uni-leipzig.de/remote.php/webdav \
+  --remote scads-nextcloud \
+  --path test \
+  --version-id https://databus.dbpedia.org/gg46ixav/test_group/test_artifact/2023-07-03 \
+  --title "Test Dataset" \
+  --abstract "This is a short abstract of the test dataset." \
+  --description "This dataset was uploaded for testing the Nextcloud → Databus deployment pipeline." \
+  --license https://dalicc.net/licenselibrary/Apache-2.0 \
+  --apikey "API-KEY" \
+  /home/CSVTest/newtestoutputfolder \
+  /home/CSVTest/output.csv.bz2
+```
+
+
+### Deploy-with-metadata command
+```
+databusclient deploy-with-metadata --help
+```
+```
+Usage: databusclient deploy-with-metadata [OPTIONS]
+
+  Deploy to DBpedia Databus using metadata json file.
+
+Options:
+  --metadata PATH     Path to metadata JSON file  [required]
+  --version-id TEXT   Target databus version/dataset identifier of the form <h
+                      ttps://databus.dbpedia.org/$ACCOUNT/$GROUP/$ARTIFACT/$VE
+                      RSION>  [required]
+  --title TEXT        Dataset title  [required]
+  --abstract TEXT     Dataset abstract max 200 chars  [required]
+  --description TEXT  Dataset description  [required]
+  --license TEXT      License (see dalicc.net)  [required]
+  --apikey TEXT       API key  [required]
+  --help              Show this message and exit.
+```
+
+Use the metadata.json file to list all files that should be added to the Databus.
+The command then registers every listed file on the Databus; a sketch of such a file follows.
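+
+The exact metadata.json schema is defined by the client and not reproduced here; the sketch below is only an illustration, assuming each entry carries a download URL together with the sha256 checksum and byte size that the Databus requires. The field names are hypothetical.
+
+```bash
+# Hypothetical metadata.json sketch -- the field names are assumptions,
+# not the authoritative schema. Each entry describes one file that is
+# already reachable under a public URL.
+cat > metadata.json <<'EOF'
+[
+  {
+    "url": "https://cloud.example.com/remote.php/webdav/test/output.csv.bz2",
+    "sha256sum": "<sha256 hex digest of the file>",
+    "byteSize": 123456
+  }
+]
+EOF
+```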
+
+
+#### Example of using the deploy-with-metadata command
+
+```bash
+databusclient deploy-with-metadata \
+  --metadata ./metadata.json \
+  --version-id https://databus.org/user/dataset/version/1.0 \
+  --title "Test Dataset" \
+  --abstract "This is a short abstract of the test dataset." \
+  --description "This dataset was uploaded for testing the Nextcloud → Databus deployment pipeline." \
+  --license https://dalicc.net/licenselibrary/Apache-2.0 \
+  --apikey "API-KEY"
+```
+
+
## Module Usage

### Step 1: Create lists of distributions for the dataset