@@ -163,13 +163,25 @@ databusclient download 'PREFIX dcat: <http://www.w3.org/ns/dcat#> SELECT ?x WHER
163163databusclient deploy --help
164164```
165165```
166- Usage: databusclient deploy [OPTIONS] DISTRIBUTIONS ...
166+ Usage: databusclient deploy [OPTIONS] [INPUTS] ...
167167
168- Arguments:
169- DISTRIBUTIONS... distributions in the form of List[URL|CV|fileext|compression|sha256sum:contentlength] where URL is the
170- download URL and CV the key=value pairs (_ separted)
171- content variants of a distribution, fileExt and Compression can be set, if not they are inferred from the path [required]
168+ Flexible deploy to databus command:
169+
170+ - Classic dataset deployment
171+
172+ - Metadata-based deployment
172173
174+ - Upload & deploy via Nextcloud
175+
176+ Arguments:
177+ INPUTS... Depending on mode:
178+ - Classic mode: List of distributions in the form
179+ URL|CV|fileext|compression|sha256sum:contentlength
180+ (where URL is the download URL and CV the key=value pairs,
181+ separated by underscores)
182+ - Upload mode: List of local file or folder paths (must exist)
183+ - Metadata mode: None
184+
173185Options:
174186 --version-id TEXT Target databus version/dataset identifier of the form <h
175187 ttps://databus.dbpedia.org/$ACCOUNT/$GROUP/$ARTIFACT/$VE
@@ -179,24 +191,85 @@ Options:
179191 --description TEXT Dataset description [required]
180192 --license TEXT License (see dalicc.net) [required]
181193 --apikey TEXT API key [required]
194+ --metadata PATH Path to metadata JSON file (for metadata mode)
195+ --webdav-url TEXT WebDAV URL (e.g.,
196+ https://cloud.example.com/remote.php/webdav)
197+ --remote TEXT rclone remote name (e.g., 'nextcloud')
198+ --path TEXT Remote path on Nextcloud (e.g., 'datasets/mydataset')
182199 --help Show this message and exit.
200+
183201```
184- Examples of using deploy command
202+ #### Examples of using deploy command
203+ ##### Mode 1: Classic Deploy (Distributions)
185204```
186205databusclient deploy --version-id https://databus.dbpedia.org/user1/group1/artifact1/2022-05-18 --title title1 --abstract abstract1 --description description1 --license http://dalicc.net/licenselibrary/AdaptivePublicLicense10 --apikey MYSTERIOUS 'https://raw.githubusercontent.com/dbpedia/databus/master/server/app/api/swagger.yml|type=swagger'
187206```
188207
189208```
190209databusclient deploy --version-id https://dev.databus.dbpedia.org/denis/group1/artifact1/2022-05-18 --title "Client Testing" --abstract "Testing the client...." --description "Testing the client...." --license http://dalicc.net/licenselibrary/AdaptivePublicLicense10 --apikey MYSTERIOUS 'https://raw.githubusercontent.com/dbpedia/databus/master/server/app/api/swagger.yml|type=swagger'
191210```
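If a version contains more than one distribution, content variants are needed to tell the files apart. A hypothetical example with two language variants (the URLs are placeholders; file extension and compression are inferred from the paths):

``` bash
databusclient deploy \
  --version-id https://databus.dbpedia.org/user1/group1/artifact1/2022-05-18 \
  --title title1 --abstract abstract1 --description description1 \
  --license http://dalicc.net/licenselibrary/AdaptivePublicLicense10 \
  --apikey MYSTERIOUS \
  'https://example.org/data/labels_en.ttl.bz2|lang=en' \
  'https://example.org/data/labels_de.ttl.bz2|lang=de'
```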
192-
193211A few more notes for CLI usage:
194212
195213* The content variants can be left out ONLY IF there is just one distribution
196214  * To have everything inferred, just pass the bare URL: `https://raw.githubusercontent.com/dbpedia/databus/master/server/app/api/swagger.yml`
197215  * If other parameters are given, missing ones must be left empty, e.g. `https://raw.githubusercontent.com/dbpedia/databus/master/server/app/api/swagger.yml||yml|7a751b6dd5eb8d73d97793c3c564c71ab7b565fa4ba619e4a8fd05a6f80ff653:367116` (the checksum and content length can be computed locally, see the sketch below)
198216
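The `sha256sum:contentlength` pair of a fully specified distribution can be computed locally. A minimal sketch, assuming GNU coreutils (`sha256sum`, `stat`) and a local copy of the file; `FILE` is a placeholder:

``` bash
# Compute checksum and byte length of a local copy of the file,
# then assemble the fully specified distribution string shown above.
FILE=swagger.yml
URL=https://raw.githubusercontent.com/dbpedia/databus/master/server/app/api/swagger.yml
SHA256=$(sha256sum "$FILE" | cut -d' ' -f1)
LENGTH=$(stat -c%s "$FILE")
echo "${URL}||yml|${SHA256}:${LENGTH}"
```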
199217
218+ ##### Mode 2: Deploy with Metadata File
219+
220+ Use a JSON metadata file to define the distributions.
221+ The metadata file lists every file together with its checksum, size, and download URL.
222+ All files referenced there are registered on the Databus.
223+ ``` bash
224+ databusclient deploy \
225+ --metadata /home/metadata.json \
226+ --version-id https://databus.org/user/dataset/version/1.0 \
227+ --title "Metadata Deploy Example" \
228+ --abstract "This is a short abstract of the dataset." \
229+ --description "This dataset was uploaded using metadata.json." \
230+ --license https://dalicc.net/licenselibrary/Apache-2.0 \
231+ --apikey "API-KEY"
232+ ```
233+ Metadata file structure:
234+ ``` json
235+ [
236+   {
237+     "filename": "example.ttl",
238+     "checksum": "0929436d44bba110fc7578c138ed770ae9f548e195d19c2f00d813cca24b9f39",
239+     "size": 12345,
240+     "url": "https://cloud.example.com/remote.php/webdav/datasets/mydataset/example.ttl"
241+   },
242+   {
243+     "filename": "example.csv.gz",
244+     "checksum": "2238acdd7cf6bc8d9c9963a9f6014051c754bf8a04aacc5cb10448e2da72c537",
245+     "size": 54321,
246+     "url": "https://cloud.example.com/remote.php/webdav/datasets/mydataset/example.csv.gz"
247+   }
248+ ]
249+
250+ ```
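For local files, a list of such entries can be generated with standard tools. A sketch assuming GNU coreutils and `jq`, and assuming the `checksum` field is the file's SHA-256 hex digest (as in the classic distribution format); the base URL and file names are placeholders:

``` bash
# Build metadata.json entries for files that will be reachable under BASE_URL.
BASE_URL="https://cloud.example.com/remote.php/webdav/datasets/mydataset"
for f in example.ttl example.csv.gz; do
  jq -n \
    --arg filename "$f" \
    --arg checksum "$(sha256sum "$f" | cut -d' ' -f1)" \
    --argjson size "$(stat -c%s "$f")" \
    --arg url "$BASE_URL/$f" \
    '{filename: $filename, checksum: $checksum, size: $size, url: $url}'
done | jq -s '.' > metadata.json
```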
251+
252+
253+ ##### Mode 3: Upload & Deploy via Nextcloud
254+
255+ Upload local files or folders to a WebDAV/Nextcloud instance and automatically deploy them to the DBpedia Databus.
256+ rclone is required for the upload (see the remote setup sketch below).
257+
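The `--remote` name must refer to an rclone remote that already points at the WebDAV endpoint. A sketch of how such a remote could be set up (remote name, URL, and credentials are placeholders; the exact `rclone config` syntax may vary between rclone versions):

``` bash
# Register a WebDAV remote named "nextcloud" and store the account password
# (rclone keeps it in obscured form in its config file).
rclone config create nextcloud webdav \
  url https://cloud.example.com/remote.php/webdav \
  vendor nextcloud \
  user myuser
rclone config password nextcloud pass 'my-nextcloud-password'
# Quick check that the remote is reachable:
rclone lsd nextcloud:
```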
258+ ``` bash
259+ databusclient deploy \
260+ --webdav-url https://cloud.example.com/remote.php/webdav \
261+ --remote nextcloud \
262+ --path datasets/mydataset \
263+ --version-id https://databus.org/user/dataset/version/1.0 \
265+ --title "Test Dataset" \
266+ --abstract "Short abstract of dataset" \
267+ --description "This dataset was uploaded for testing the Nextcloud → Databus pipeline." \
268+ --license https://dalicc.net/licenselibrary/Apache-2.0 \
269+ --apikey "API-KEY" \
269+ ./localfile1.ttl \
270+ ./data_folder
271+ ```
272+
200273
201274#### Authentication with vault
202275
@@ -222,98 +295,6 @@ docker run --rm -v $(pwd):/data dbpedia/databus-python-client download https://d
222295```
223296
224297
225- ### Upload-and-deploy command
226- ``` bash
227- databusclient upload-and-deploy --help
228- ```
229- ``` text
230- Usage: databusclient upload-and-deploy [OPTIONS] [FILES]...
231-
232- Upload files to Nextcloud and deploy to DBpedia Databus.
233-
234- Arguments:
235- FILES... files in the form of List[path], where every path must exist locally, which will be uploaded and deployed
236-
237- Options:
238- --webdav-url TEXT WebDAV URL (e.g.,
239- https://cloud.example.com/remote.php/webdav)
240- --remote TEXT rclone remote name (e.g., 'nextcloud')
241- --path TEXT Remote path on Nextcloud (e.g., 'datasets/mydataset')
242- --no-upload Skip file upload and use existing metadata
243- --metadata PATH Path to metadata JSON file (required if --no-upload is
244- used)
245- --version-id TEXT Target databus version/dataset identifier of the form <h
246- ttps://databus.dbpedia.org/$ACCOUNT/$GROUP/$ARTIFACT/$VE
247- RSION> [required]
248- --title TEXT Dataset title [required]
249- --abstract TEXT Dataset abstract max 200 chars [required]
250- --description TEXT Dataset description [required]
251- --license TEXT License (see dalicc.net) [required]
252- --apikey TEXT API key [required]
253- --help Show this message and exit.
254- ```
255- The script uploads all given files and all files in the given folders to the given remote.
256- Then registers them on the databus.
257-
258-
259- #### Example of using upload-and-deploy command
260-
261- ``` bash
262- databusclient upload-and-deploy \
263- --webdav-url https://cloud.scadsai.uni-leipzig.de/remote.php/webdav \
264- --remote scads-nextcloud \
265- --path test \
266- --version-id https://databus.org/user/dataset/version/1.0 \
267- --title " Test Dataset" \
268- --abstract " This is a short abstract of the test dataset." \
269- --description " This dataset was uploaded for testing the Nextcloud → Databus deployment pipeline." \
270- --license https://dalicc.net/licenselibrary/Apache-2.0 \
271- --apikey " API-KEY" \
272- /home/test \
273- /home/test_folder/test
274- ```
275-
276-
277- ### deploy-with-metadata command
278- ``` bash
279- databusclient deploy-with-metadata --help
280- ```
281- ``` text
282- Usage: databusclient deploy-with-metadata [OPTIONS]
283-
284- Deploy to DBpedia Databus using metadata json file.
285-
286- Options:
287- --metadata PATH Path to metadata JSON file [required]
288- --version-id TEXT Target databus version/dataset identifier of the form <h
289- ttps://databus.dbpedia.org/$ACCOUNT/$GROUP/$ARTIFACT/$VE
290- RSION> [required]
291- --title TEXT Dataset title [required]
292- --abstract TEXT Dataset abstract max 200 chars [required]
293- --description TEXT Dataset description [required]
294- --license TEXT License (see dalicc.net) [required]
295- --apikey TEXT API key [required]
296- --help Show this message and exit.
297- ```
298-
299- Use the metadata.json file (see [ databusclient/metadata.json] ( databusclient/metadata.json ) ) to list all files which should be added to the databus.
300- The script registers all files on the databus.
301-
302-
303- #### Examples of using deploy-with-metadata command
304-
305- ``` bash
306- databusclient deploy-with-metadata \
307- --metadata /home/metadata.json \
308- --version-id https://databus.org/user/dataset/version/1.0 \
309- --title " Test Dataset" \
310- --abstract " This is a short abstract of the test dataset." \
311- --description " This dataset was uploaded for testing the Nextcloud → Databus deployment pipeline." \
312- --license https://dalicc.net/licenselibrary/Apache-2.0 \
313- --apikey " API-KEY"
314- ```
315-
316-
317298## Module Usage
318299### Step 1: Create lists of distributions for the dataset
319300