Commit 2088849

Merge branch 'develop' into container-naming-fix

2 parents a025d9e + c325662

28 files changed: +786 −266 lines

.github/workflows/deploy_beta_testing.yml (2 additions, 2 deletions)

@@ -36,7 +36,7 @@ jobs:
       run: echo "war_file=$(ls *.war | head -1)">> $GITHUB_ENV

     - name: Upload war artifact
-      uses: actions/upload-artifact@v4
+      uses: actions/upload-artifact@v5
       with:
         name: built-app
        path: ./target/${{ env.war_file }}
@@ -50,7 +50,7 @@ jobs:
     - uses: actions/checkout@v5

     - name: Download war artifact
-      uses: actions/download-artifact@v5
+      uses: actions/download-artifact@v6
       with:
         name: built-app
        path: ./

.github/workflows/maven_unit_test.yml (5 additions, 5 deletions)

@@ -62,7 +62,7 @@ jobs:

     # Upload the built war file. For download, it will be wrapped in a ZIP by GitHub.
     # See also https://github.com/actions/upload-artifact#zipped-artifact-downloads
-    - uses: actions/upload-artifact@v4
+    - uses: actions/upload-artifact@v5
       with:
         name: dataverse-java${{ matrix.jdk }}.war
        path: target/dataverse*.war
@@ -72,7 +72,7 @@ jobs:
     - run: |
         tar -cvf java-builddir.tar target
        tar -cvf java-m2-selection.tar ~/.m2/repository/io/gdcc/dataverse-*
-    - uses: actions/upload-artifact@v4
+    - uses: actions/upload-artifact@v5
       with:
         name: java-artifacts
        path: |
@@ -112,7 +112,7 @@ jobs:
        cache: maven

     # Get the build output from the unit test job
-    - uses: actions/download-artifact@v5
+    - uses: actions/download-artifact@v6
       with:
         name: java-artifacts
    - run: |
@@ -124,7 +124,7 @@ jobs:

     # Wrap up and send to coverage job
     - run: tar -cvf java-reportdir.tar target/site
-    - uses: actions/upload-artifact@v4
+    - uses: actions/upload-artifact@v5
       with:
         name: java-reportdir
        path: java-reportdir.tar
@@ -145,7 +145,7 @@ jobs:
        cache: maven

     # Get the build output from the integration test job
-    - uses: actions/download-artifact@v5
+    - uses: actions/download-artifact@v6
       with:
         name: java-reportdir
    - run: tar -xvf java-reportdir.tar
(new file, 13 additions)

## New Endpoint: `/datasets/{id}/license`

A new endpoint has been implemented to manage dataset licenses.

### Functionality

- Updates the license of a dataset by applying it to the draft version.
- If no draft exists, a new one is automatically created.

### Usage

This endpoint supports two ways of defining a license:

1. **Predefined License** – Provide the license name (e.g., `CC BY 4.0`).
2. **Custom Terms of Use and Access** – Provide a JSON body with the `customTerms` object.
   - All fields are optional **except** `termsOfUse`, which is required.
(new file, 3 additions)

### Suppression of the Host Dataverse field

When creating a dataset, the _host dataverse_ field is not shown when the user can only add datasets to one collection.

doc/sphinx-guides/source/admin/integrations.rst (0 additions, 5 deletions)

@@ -240,11 +240,6 @@ Discoverability

 A number of builtin features related to data discovery are listed under :doc:`discoverability` but you can further increase the discoverability of your data by setting up integrations.

-SHARE
-+++++
-
-`SHARE <http://www.share-research.org>`_ is building a free, open, data set about research and scholarly activities across their life cycle. It's possible to add a Dataverse installation as one of the `sources <https://share.osf.io/sources>`_ they include if you contact the SHARE team.
-
 Geodisy
 +++++++
doc/sphinx-guides/source/api/native-api.rst (51 additions, 4 deletions)

@@ -4208,24 +4208,24 @@ Delete files from a dataset. This API call allows you to delete multiple files f

     curl -H "X-Dataverse-key:$API_TOKEN" -X PUT "$SERVER_URL/api/datasets/:persistentId/deleteFiles?persistentId=$PERSISTENT_IDENTIFIER" \
       -H "Content-Type: application/json" \
-      -d '{"fileIds": [1, 2, 3]}'
+      -d '[1, 2, 3]'

 The fully expanded example above (without environment variables) looks like this:

 .. code-block:: bash

     curl -H "X-Dataverse-key:xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx" -X PUT "https://demo.dataverse.org/api/datasets/:persistentId/deleteFiles?persistentId=doi:10.5072/FK2ABCDEF" \
       -H "Content-Type: application/json" \
-      -d '{"fileIds": [1, 2, 3]}'
+      -d '[1, 2, 3]'

-The ``fileIds`` in the JSON payload should be an array of file IDs that you want to delete from the dataset.
+The JSON payload should be an array of file IDs that you want to delete from the dataset.

 You must have the appropriate permissions to delete files from the dataset.

 Upon success, the API will return a JSON response with a success message and the number of files deleted.

 The API call will report a 400 (BAD REQUEST) error if any of the files specified do not exist or are not in the latest version of the specified dataset.
-The ``fileIds`` in the JSON payload should be an array of file IDs that you want to delete from the dataset.
+The JSON payload should be an array of file IDs that you want to delete from the dataset.
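This change is breaking for API clients, since the request body switches from an object to a bare array. A minimal Python sketch of building the new payload (the ``delete_files_payload`` helper name is ours for illustration, not part of Dataverse):

```python
import json

def delete_files_payload(file_ids):
    """Serialize file IDs for PUT /api/datasets/:persistentId/deleteFiles.

    After this change the body is a bare JSON array of file IDs,
    not an object like {"fileIds": [...]}.
    """
    return json.dumps(list(file_ids))

print(delete_files_payload([1, 2, 3]))  # → [1, 2, 3]
```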
42304230
 .. _api-dataset-role-assignment-history:

@@ -4357,6 +4357,53 @@ The CSV response for this call is the same as for the /api/datasets/{id}/assignm

 Note: This feature requires the "role-assignment-history" feature flag to be enabled (see :ref:`feature-flags`).

+Update Dataset License
+~~~~~~~~~~~~~~~~~~~~~~
+
+Updates the license of a dataset by applying it to the draft version, or by creating a draft if none exists.
+
+The JSON representation of a license can take two forms, depending on whether you want to specify a predefined license or define custom terms of use and access.
+
+To set a predefined license (e.g., CC BY 4.0), provide a JSON body with the license name:
+
+.. code-block:: json
+
+    {
+      "name": "CC BY 4.0"
+    }
+
+To define custom terms of use and access, provide a JSON body with the following properties. All fields within ``customTerms`` are optional, except for the ``termsOfUse`` field, which is required:
+
+.. code-block:: json
+
+    {
+      "customTerms": {
+        "termsOfUse": "Your terms of use",
+        "confidentialityDeclaration": "Your confidentiality declaration",
+        "specialPermissions": "Your special permissions",
+        "restrictions": "Your restrictions",
+        "citationRequirements": "Your citation requirements",
+        "depositorRequirements": "Your depositor requirements",
+        "conditions": "Your conditions",
+        "disclaimer": "Your disclaimer"
+      }
+    }
+
+.. code-block:: bash
+
+    export API_TOKEN=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
+    export SERVER_URL=https://demo.dataverse.org
+    export ID=3
+    export FILE_PATH=license.json
+
+    curl -H "X-Dataverse-key:$API_TOKEN" -X PUT "$SERVER_URL/api/datasets/$ID/license" -H "Content-type:application/json" --upload-file $FILE_PATH
+
+The fully expanded example above (without environment variables) looks like this:
+
+.. code-block:: bash
+
+    curl -H "X-Dataverse-key:xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx" -X PUT "https://demo.dataverse.org/api/datasets/3/license" -H "Content-type:application/json" --upload-file license.json
+
 Files
 -----
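The documentation above defines two mutually exclusive payload shapes with one required field. A small Python helper (illustrative only, not part of Dataverse) makes the rules concrete:

```python
import json

def license_payload(name=None, custom_terms=None):
    """Build the JSON body for PUT /api/datasets/{id}/license.

    Illustrative helper: pass either a predefined license name
    (e.g. "CC BY 4.0") or a customTerms dict, which must include
    "termsOfUse"; all other customTerms fields are optional.
    """
    if (name is None) == (custom_terms is None):
        raise ValueError("provide exactly one of: name, custom_terms")
    if name is not None:
        return json.dumps({"name": name})
    if "termsOfUse" not in custom_terms:
        raise ValueError("customTerms requires termsOfUse")
    return json.dumps({"customTerms": custom_terms})

print(license_payload(name="CC BY 4.0"))  # → {"name": "CC BY 4.0"}
```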

doc/sphinx-guides/source/container/dev-usage.rst (14 additions, 2 deletions)

@@ -233,6 +233,13 @@ Hotswapping methods requires using JDWP (Debug Mode), but does not allow switchi

 **Requires IntelliJ Ultimate!**
 (Note that `free educational licenses <https://www.jetbrains.com/community/education/>`_ are available)

+Go to settings, then plugins. Install "Payara Ultimate Tools". For more information:
+
+- `plugin homepage <https://plugins.jetbrains.com/plugin/15114-payara-ultimate-tools>`_
+- `docs <https://docs.payara.fish/community/docs/Technical%20Documentation/Ecosystem/IDE%20Integration/IntelliJ%20Plugin/Overview.html>`_
+- `source <https://github.com/payara/ecosystem-intellij-plugin>`_
+- `issues <https://github.com/payara/ecosystem-support>`_
+
 .. image:: img/intellij-payara-plugin-install.png

 #. Configure a connection to Payara:

@@ -284,6 +291,7 @@ Hotswapping methods requires using JDWP (Debug Mode), but does not allow switchi

 You might want to tweak the hot deploy behavior in the "Server" tab now.
 "Update action" can be found in the run window (see below).
+By default it is "Hot Swap classes", which works fine, but as the screenshot shows you can also change it to "Redeploy".
 "Frame deactivation" means switching from IntelliJ window to something else, e.g. your browser.
 *Note: static resources like properties, XHTML etc will only update when redeploying!*

@@ -305,7 +313,11 @@ Hotswapping methods requires using JDWP (Debug Mode), but does not allow switchi
    See cheat sheet above for more options.
    Note that this command either assumes you built the :doc:`app-image` first or will download it from Docker Hub.
 .. group-tab:: IntelliJ
-   You can create a service configuration to automatically start services for you.
+   Note that you can skip this step if you're ok running the command under the "Maven" tab, which is this:
+
+   ``mvn -Pct docker:run -Dapp.skipDeploy``
+
+   In IntelliJ you can create a service configuration to automatically start services for you.

    **IMPORTANT**: This requires installation of the `Docker plugin <https://plugins.jetbrains.com/plugin/7724-docker>`_.

@@ -362,7 +374,7 @@ Hotswapping methods requires using JDWP (Debug Mode), but does not allow switchi

 .. image:: img/intellij-payara-run-output.png

-Manually hotswap classes in "Debug" mode via "Run" > "Debugging Actions" > "Reload Changed Classes".
+Manually hotswap classes in "Debug" mode via "Run" > "Debugging Actions" > "Compile and Reload Modified Files".

 .. image:: img/intellij-payara-run-menu-reload.png
doc/sphinx-guides/source/developers/making-library-releases.rst (10 additions, 50 deletions)

@@ -13,7 +13,9 @@ Note: See :doc:`making-releases` for Dataverse itself.

 We release Java libraries to Maven Central that are used by Dataverse (and perhaps `other <https://github.com/gdcc/xoai/issues/141>`_ `software <https://github.com/gdcc/xoai/issues/170>`_!):

 - https://central.sonatype.com/namespace/org.dataverse
+- https://central.sonatype.com/namespace/org.dataverse.test
 - https://central.sonatype.com/namespace/io.gdcc
+- https://central.sonatype.com/namespace/io.gdcc.export

 We release JavaScript/TypeScript libraries to npm:

@@ -109,60 +111,18 @@ Releasing a New Library to Maven Central

 At a high level:

 - Start with a snapshot release.
-- Use an existing pom.xml as a starting point.
-- Use existing GitHub Actions workflows as a starting point.
-- Create secrets in the new library's GitHub repo used by the workflow.
-- If you need an entire new namespace, look at previous issues such as https://issues.sonatype.org/browse/OSSRH-94575 and https://issues.sonatype.org/browse/OSSRH-94577
-
-Updating pom.xml for a Snapshot Release
-~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
-
-Before publishing a final version to Maven Central, you should publish a snapshot release or two. For each snapshot release you publish, the jar name will be unique each time (e.g. ``foobar-0.0.1-20240430.175110-3.jar``), so you can safely publish over and over with the same version number.
-
-We use the `Nexus Staging Maven Plugin <https://github.com/sonatype/nexus-maven-plugins/blob/main/staging/maven-plugin/README.md>`_ to push snapshot releases to https://s01.oss.sonatype.org/content/groups/staging/io/gdcc/ and https://s01.oss.sonatype.org/content/groups/staging/org/dataverse/
-
-Add the following to your pom.xml:
-
-.. code-block:: xml
-
-    <version>0.0.1-SNAPSHOT</version>
-
-    <distributionManagement>
-      <snapshotRepository>
-        <id>ossrh</id>
-        <url>https://s01.oss.sonatype.org/content/repositories/snapshots</url>
-      </snapshotRepository>
-      <repository>
-        <id>ossrh</id>
-        <url>https://s01.oss.sonatype.org/service/local/staging/deploy/maven2/</url>
-      </repository>
-    </distributionManagement>
-
-    <plugin>
-      <groupId>org.sonatype.plugins</groupId>
-      <artifactId>nexus-staging-maven-plugin</artifactId>
-      <version>${nexus-staging.version}</version>
-      <extensions>true</extensions>
-      <configuration>
-        <serverId>ossrh</serverId>
-        <nexusUrl>https://s01.oss.sonatype.org</nexusUrl>
-        <autoReleaseAfterClose>true</autoReleaseAfterClose>
-      </configuration>
-    </plugin>
-
-Configuring Secrets
-~~~~~~~~~~~~~~~~~~~
-
-In GitHub, you will likely need to configure the following secrets:
-
-- DATAVERSEBOT_GPG_KEY
-- DATAVERSEBOT_GPG_PASSWORD
-- DATAVERSEBOT_SONATYPE_TOKEN
-- DATAVERSEBOT_SONATYPE_USERNAME
-
-Note that some of these secrets might be configured at the org level (e.g. gdcc or IQSS).
-
-Many of the automated tasks are performed by the dataversebot account on GitHub: https://github.com/dataversebot
+- Use an existing pom.xml as a starting point, such as from `Croissant <https://github.com/gdcc/exporter-croissant>`_, that inherits from the common Maven parent (https://github.com/gdcc/maven-parent). You can also play around with the "hello" project (https://github.com/gdcc/hello) and even make releases from it since it is designed to be a sandbox for publishing to Maven Central.
+- Use existing GitHub Actions workflows as a starting point, such as from `Croissant <https://github.com/gdcc/exporter-croissant>`_. As of this writing we have separate actions for ``maven-snapshot.yml`` and ``maven-release.yml``.
+- For repos under https://github.com/IQSS, create secrets in the new library's GitHub repo used by the workflow. This is necessary for the IQSS org because "organization secrets are not available for organizations on legacy per-repository billing plans." For repos under https://github.com/gdcc you can make use of shared secrets at the org level. These are the environment variables we use:
+
+  - DATAVERSEBOT_GPG_KEY
+  - DATAVERSEBOT_GPG_PASSWORD
+  - DATAVERSEBOT_SONATYPE_TOKEN
+  - DATAVERSEBOT_SONATYPE_USERNAME
+- If you need an entire new namespace, look at previous issues such as https://issues.sonatype.org/browse/OSSRH-94575 and https://issues.sonatype.org/browse/OSSRH-94577
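For orientation, the secrets listed above are typically wired into a release workflow along these lines. This is a rough sketch of the shape only: the actual ``maven-release.yml`` in repos such as gdcc/exporter-croissant is authoritative, and details here (trigger, Java version, profile name) are assumptions:

```yaml
# Illustrative sketch; see maven-release.yml in an existing gdcc repo
# for the authoritative version.
name: Maven Release
on:
  release:
    types: [published]
jobs:
  publish:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v5
      - uses: actions/setup-java@v4
        with:
          java-version: '17'
          distribution: 'temurin'
          # server-id must match the <id> of the distribution repo in pom.xml
          server-id: ossrh
          server-username: MAVEN_USERNAME
          server-password: MAVEN_PASSWORD
          gpg-private-key: ${{ secrets.DATAVERSEBOT_GPG_KEY }}
          gpg-passphrase: MAVEN_GPG_PASSPHRASE
      - run: mvn -B deploy
        env:
          MAVEN_USERNAME: ${{ secrets.DATAVERSEBOT_SONATYPE_USERNAME }}
          MAVEN_PASSWORD: ${{ secrets.DATAVERSEBOT_SONATYPE_TOKEN }}
          MAVEN_GPG_PASSPHRASE: ${{ secrets.DATAVERSEBOT_GPG_PASSWORD }}
```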

 npm (JavaScript/TypeScript)
 ---------------------------

doc/sphinx-guides/source/installation/config.rst (19 additions, 0 deletions)

@@ -115,6 +115,23 @@ See the :ref:`payara` section of :doc:`prerequisites` for details and init scrip

 Related to this is that you should remove ``/root/.payara/pass`` to ensure that Payara isn't ever accidentally started as root. Without the password, Payara won't be able to start as root, which is a good thing.

+.. _payara-ports-localhost-only:
+
+Restricting Payara's Ports to localhost
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+In the recommended setup of Dataverse, you do not expose Payara's ports directly to the Internet. Rather, you front Payara with a proxy such as Apache.
+
+If you are running Payara and your proxy on the same server, we recommend having Payara listen only to localhost, which is how your proxy talks to it, with the following command:
+
+``./asadmin set server-config.network-config.network-listeners.network-listener.http-listener-1.address=127.0.0.1``
+
+(You should **NOT** use the configuration option above if you are running in a load-balanced environment, or otherwise have your proxy on a different host than Payara.)
+
+To test that Payara is now only listening on localhost, try hitting port 8080 from the Internet. Payara should not respond.
+
+See also :ref:`network-ports`.
+
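The "try hitting port 8080" check above can be scripted from any machine; a minimal Python sketch (the hostnames are placeholders and the helper is not part of the guide's tooling):

```python
import socket

def is_listening(host, port, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(timeout)
        return s.connect_ex((host, port)) == 0

# On the server itself, the loopback listener should answer:
#   is_listening("127.0.0.1", 8080)             -> expected True
# From any other machine, the same port should be unreachable:
#   is_listening("dataverse.example.edu", 8080) -> expected False
```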
 .. _secure-password-storage:

 Secure Password Storage

@@ -246,6 +263,8 @@ If you are running an installation with Apache and Payara on the same server, an

 You should **NOT** use the configuration option above if you are running in a load-balanced environment, or otherwise have the web server on a different host than the application server.

+This security tip is also mentioned at :ref:`payara-ports-localhost-only`.
+
 .. _root-collection-permissions:

 Root Dataverse Collection Permissions

doc/sphinx-guides/source/user/dataset-management.rst (3 additions, 1 deletion)

@@ -704,7 +704,7 @@ If you have a Contributor role (can edit metadata, upload files, and edit files,

 Preview URL to Review Unpublished Dataset
 =========================================

-Creating a Preview URL for a draft version of your dataset allows you to share your dataset (for viewing and downloading of files) before it is published to a wide group of individuals who may not have a user account on the Dataverse installation. Anyone you send the Preview URL to will not have to log into the Dataverse installation to view the unpublished dataset. Once a dataset has been published you may create new General Preview URLs for subsequent draft versions, but the Anonymous Preview URL will no longer be available.
+Creating a Preview URL for a draft version of your dataset allows you to share your dataset (for viewing and downloading files, including :ref:`restricted <restricted-files>` and :ref:`embargoed <embargoes>` files) before it is published to a wide group of people who might not have a user account on the Dataverse installation. Anyone you send the Preview URL to will not have to log in to the Dataverse installation to view the unpublished dataset. Once a dataset has been published, you may create new General Preview URLs for subsequent draft versions, but the Anonymous Preview URL will no longer be available.

 **Note:** To create a Preview URL, you must have the *ManageDatasetPermissions* permission for your draft dataset, usually given by the :ref:`roles <permissions>` *Curator* or *Administrator*.

@@ -726,6 +726,8 @@ To disable a Preview URL and to revoke access, follow the same steps as above un

 Note that only one Preview URL (normal or with anonymized access) can be configured per dataset at a time.

+.. _embargoes:
+
 Embargoes
 =========