
Commit 586cac3

Merge branch 'develop' into 11996-fix-settings
2 parents 984cd27 + da4f7f8

27 files changed (+765, -98 lines)
Lines changed: 12 additions & 0 deletions
@@ -0,0 +1,12 @@
## Get Dataset/Dataverse Storage Driver API

### Changed JSON response - breaking change!

The API for getting the Storage Driver info has been changed/extended:

/api/datasets/{identifier}/storageDriver
/api/admin/dataverse/{dataverse-alias}/storageDriver

Rather than returning just the name/id of the driver (with the key "message"), the API call now returns a JSON object with the driver's "name", "type", and "label", and booleans indicating whether the driver has "directUpload", "directDownload", and/or "uploadOutOfBand" enabled.

This change also affects the /api/admin/dataverse/{dataverse-alias}/storageDriver API call. In addition, this call now supports an optional ?getEffective=true parameter to find the effective storageDriver (the driver that will be used for new datasets in the collection).

See also [the guides](https://dataverse-guide--11664.org.readthedocs.build/en/11664/api/native-api.html#configure-a-dataset-to-store-all-new-files-in-a-specific-file-store), #11695, and #11664.
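For illustration, a response from one of these calls might now look like the sketch below. The field names come from the description above; the `status`/`data` wrapper mirrors the `data.name`/`data.message` naming used in the API changelog, and the specific values (including the `type`) are placeholders rather than actual server output:

```jsonc
// Illustrative sketch only: field names come from the release note above; values are made up.
{
  "status": "OK",
  "data": {
    "name": "store1",
    "label": "MyLabel",
    "type": "s3",
    "directUpload": true,
    "directDownload": true,
    "uploadOutOfBand": false
  }
}
```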

doc/release-notes/11695-change-api-get-storage-driver.md

Lines changed: 0 additions & 12 deletions
This file was deleted.
Lines changed: 5 additions & 0 deletions
@@ -0,0 +1,5 @@
### Bug Fix

Editing a controlled vocabulary field (i.e. one with values specified in the field's metadatablock) that only allows a single selection would also update the value in the prior published version if (and only if) the edit was made starting from the published version (versus an existing draft). This is now fixed.

The bug appears to be 11+ years old and previously unreported. As the value in the database was overwritten, there is no simple way to detect if/when this occurred without looking at backups or archival file copies.

doc/sphinx-guides/source/admin/dataverses-datasets.rst

Lines changed: 9 additions & 7 deletions
@@ -56,17 +56,19 @@ To direct new files (uploaded when datasets are created or edited) for all datas
 (Note that for ``dataverse.files.store1.label=MyLabel``, you should pass ``MyLabel``.)

-The current driver can be seen using::
+A store assigned directly to a collection can be seen using::

   curl -H "X-Dataverse-key: $API_TOKEN" http://$SERVER/api/admin/dataverse/$dataverse-alias/storageDriver

-Or to recurse the chain of parents to find the effective storageDriver::
+This may be null. To get the effective storageDriver for a collection, which may be inherited from a parent collection or be the installation default, you can use::

   curl -H "X-Dataverse-key: $API_TOKEN" http://$SERVER/api/admin/dataverse/$dataverse-alias/storageDriver?getEffective=true
+
+This will never be null.

-(Note that for ``dataverse.files.store1.label=MyLabel``, ``store1`` will be returned.)
+(Note that for ``dataverse.files.store1.label=MyLabel``, the JSON response will include "name":"store1" and "label":"MyLabel".)

-and can be reset to the default store with::
+To delete a store assigned directly to a collection (so that the collection's effective store is inherited from its parent or is the global default), use::

   curl -H "X-Dataverse-key: $API_TOKEN" -X DELETE http://$SERVER/api/admin/dataverse/$dataverse-alias/storageDriver
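For illustration, the two GET calls above might return something along these lines (a sketch only: the envelope and values are assumptions based on the keys described in the release note, not verbatim server output)::

    // collection with no store assigned directly -- the driver info may be null
    {"status": "OK", "data": null}

    // the same collection queried with ?getEffective=true -- values are placeholders
    {"status": "OK", "data": {"name": "store1", "label": "MyLabel", "type": "s3",
                              "directUpload": true, "directDownload": false, "uploadOutOfBand": false}}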
@@ -261,15 +263,15 @@ To identify invalid data values in specific datasets (if, for example, an attemp
 Configure a Dataset to Store All New Files in a Specific File Store
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

-Configure a dataset to use a specific file store (this API can only be used by a superuser) ::
+Configure an individual dataset to use a specific file store (this API can only be used by a superuser) ::

   curl -H "X-Dataverse-key: $API_TOKEN" -X PUT -d $storageDriverLabel http://$SERVER/api/datasets/$dataset-id/storageDriver

-The current driver can be seen using::
+The effective store can be seen using::

   curl http://$SERVER/api/datasets/$dataset-id/storageDriver

-It can be reset to the default store as follows (only a superuser can do this) ::
+To remove an assigned store, and allow the dataset to inherit the store from its parent collection, use the following (only a superuser can do this) ::

   curl -H "X-Dataverse-key: $API_TOKEN" -X DELETE http://$SERVER/api/datasets/$dataset-id/storageDriver

doc/sphinx-guides/source/api/changelog.rst

Lines changed: 1 addition & 1 deletion
@@ -14,14 +14,14 @@ v6.9
 - The way to set per-format size limits for tabular ingest has changed. JSON input is now used. See :ref:`:TabularIngestSizeLimit`.
 - In the past, the settings API would accept any key and value. This is no longer the case because validation has been added. See :ref:`settings_put_single`, for example.
 - For GET /api/notifications/all the JSON response has changed breaking the backward compatibility of the API.
+- For GET /api/admin/dataverse/{dataverse-alias}/storageDriver and /api/datasets/{identifier}/storageDriver the driver name is no longer returned in data.message. Instead, it is returned as data.name (along with other information about the storageDriver).

 v6.8
 ----

 - For POST /api/files/{id}/metadata passing an empty string ("description":"") or array ("categories":[]) will no longer be ignored. Empty fields will now clear out the values in the file's metadata. To ignore the fields simply do not include them in the JSON string.
 - For PUT /api/datasets/{id}/editMetadata the query parameter "sourceInternalVersionNumber" has been removed and replaced with "sourceLastUpdateTime" to verify that the data being edited hasn't been modified and isn't stale.
 - For GET /api/dataverses/$dataverse-alias/links the JSON response has changed breaking the backward compatibility of the API.
-- For GET /api/admin/dataverse/{dataverse-alias}/storageDriver and /api/datasets/{identifier}/storageDriver the driver name is no longer returned in data.message. This value is now returned in data.name.
 - For PUT /api/dataverses/$dataverse-alias/inputLevels custom input levels that had been previously set will no longer be deleted. To delete input levels send an empty list (deletes all), then send the new/modified list.
 - For GET /api/externalTools and /api/externalTools/{id} the responses are now formatted as JSON (previously the toolParameters and allowedApiCalls were a JSON object and array (respectively) that were serialized as JSON strings) and any configured "requirements" are included.

doc/sphinx-guides/source/developers/big-data-support.rst

Lines changed: 1 addition & 1 deletion
@@ -165,7 +165,7 @@ Globus File Transfer
 Note: Globus file transfer is still experimental but feedback is welcome! See :ref:`support`.

 Users can transfer files via `Globus <https://www.globus.org>`_ into and out of datasets, or reference files on a remote Globus endpoint, when their Dataverse installation is configured to use a Globus accessible store(s)
-and a community-developed `dataverse-globus <https://github.com/scholarsportal/dataverse-globus>`_ app has been properly installed and configured.
+and a community-developed `dataverse-globus <https://github.com/gdcc/dataverse-globus>`_ app has been properly installed and configured.

 Globus endpoints can be in a variety of places, from data centers to personal computers.
 This means that from within the Dataverse software, a Globus transfer can feel like an upload or a download (with Globus Personal Connect running on your laptop, for example) or it can feel like a true transfer from one server to another (from a cluster in a data center into a Dataverse dataset or vice versa).

doc/sphinx-guides/source/index.rst

Lines changed: 3 additions & 0 deletions
@@ -6,13 +6,16 @@
 Dataverse Documentation v. |version|
 ====================================

+Check out the new :doc:`/quickstart/index`!
+
 These documentation guides are for the |version| version of Dataverse. To find guides belonging to previous or future versions, :ref:`guides_versions` has a list of all available versions.

 .. toctree::
    :glob:
    :titlesonly:
    :maxdepth: 2

+   quickstart/index
    user/index
    admin/index
    ai/index
Lines changed: 9 additions & 0 deletions
@@ -0,0 +1,9 @@
# Quickstart Guide

```{toctree}
:caption: "Contents:"
:maxdepth: 1
what-is-dataverse.md
publish-a-dataset.md
publish-a-collection.md
```
Lines changed: 63 additions & 0 deletions
@@ -0,0 +1,63 @@
# Publish a Collection

## 🔐 Step 1: Log In & Create a Draft

- {ref}`Log in <account-log-in-options>`.
- (Optional) Navigate to {doc}`a collection </user/dataverse-management>`.
- Click "Add Data" → "New Dataverse".

Note: If you don’t see the "Add Data" button, contact your repository support team.

## 📝 Step 2: Enter Basic Metadata & Settings

- Fill in the required {ref}`metadata <create-dataverse>` fields.
- Select metadata settings.
- Click "Create Dataverse" at the bottom to save your draft collection.

## 🚀 Step 3: Publish Your Collection

Note: Once published, easy deletion of a collection is no longer possible.

- Click "Publish" (top right).

---

## ✅ Choose Look & Feel (optional)

- Click "Edit" → "Theme \+ Widgets".
- Select {ref}`theme settings <theme>`.
- Click "Save Changes".

## ✅ Set Permissions (optional)

- Click "Edit" → "Permissions".
- Under "Permissions", click "Edit Access" to {ref}`set general permissions <dataverse-permissions>`.
- (Optional) Add users or groups with specific permissions under "Users/Groups" by clicking "Assign Roles to Users/Groups".

## ✅ Create Groups (optional)

- Click "Edit" → "Groups".
- Click "Create Group".
- Enter a Group Name and Group Identifier.
- Add users to this group using autofill in "Users/Groups".
- Click "Create Group".

## ✅ Set Dataset Templates (optional)

- Click "Edit" → "Dataset Templates".
- Click {ref}`Create Dataset Template <dataset-templates>`.
- Enter a template name.
- Add any template metadata in the metadata fields.
- Click "Save \+ Add Terms".
- {ref}`Choose a license <choosing-license>` from the dropdown or select {ref}`Custom Dataset Terms <custom-terms>`.
- Provide other relevant terms information.
- Click "Save Dataset Template".

## ✅ Set Dataset Guestbooks (optional)

- Click "Edit" → "Dataset Guestbooks".
- Click {ref}`Create Dataset Guestbook <dataset-guestbooks>`.
- Enter a guestbook name.
- Add information to collect.
- Click "Create Dataset Guestbook".
Lines changed: 61 additions & 0 deletions
@@ -0,0 +1,61 @@
# Publish a Dataset

## 🔐 Step 1: Log In & Create a Draft

- {ref}`Log in <account-log-in-options>`.
- (Optional) Navigate to {doc}`a collection </user/dataverse-management>`.
- Click "Add Data" → "{ref}`New Dataset <adding-new-dataset>`".

Note: If you don’t see the "Add Data" button, contact your repository support team.

## 📝 Step 2: Enter Basic Metadata

- Fill in the required {ref}`metadata <metadata-supported>` fields.
- Click "Save" at the bottom to save your draft dataset.

## 📁 Step 3: Upload or Edit Files

- In the draft dataset, scroll down to the "Files" tab.
- Click "{ref}`Upload Files <dataset-file-upload>`".
- Choose "Select Files to Add" or drag and drop files.
- (Optional) Use "{ref}`Upload Folder <folder-upload>`" if available.
- Click "Done" when the upload is completed.

To edit files:

- Select files on the left → Click "{ref}`Edit Files <edit-files>`".
- To {ref}`restrict/embargo files <edit-file-metadata>`, choose the relevant option.

## 📜 Step 4: Set Terms of Use

- Go to the terms tab.
- Click "{ref}`Edit Terms Requirements <license-terms>`".
- {ref}`Choose a license <choosing-license>` from the dropdown or select {ref}`Custom Dataset Terms <custom-terms>`.
- Click "Save changes".

## 🧾 Step 5: Add or Edit Metadata

- Go to the metadata tab.
- Click "Add + Edit Metadata".
- Click "Save" after making changes.

## 🚀 Step 6: Publish Your Dataset

Note: Once published, easy deletion of a dataset is no longer possible.

- Click "Publish Dataset" or "Submit for Review" (top right).

- {ref}`Publish Dataset <publish-dataset>`: Immediately publishes the dataset. This option is only available on repositories without a review phase.

- {ref}`Submit for Review <submit-for-review>`: Locks the draft and sends it to support staff. If changes are needed, you’ll be notified via email and you can resubmit. This option is only available on repositories with a review phase.

## 🔄 Optional: Update a Published Dataset

- Edit the dataset and publish {ref}`a new version <dataset-versions>`.
- The DOI remains the same.
- Dataset versioning tracks all changes:

  - Metadata changes = minor version.
  - Data changes = major version.
