
Commit ad2b1e7

Merge branch 'develop' into 7618-file-level-permissions-restricted-draft
2 parents: 0c30e97 + 9d47bb2

File tree

13 files changed: +602 -105 lines changed

Lines changed: 13 additions & 0 deletions
@@ -0,0 +1,13 @@
+## New Endpoint: `/datasets/{id}/license`
+
+A new endpoint has been implemented to manage dataset licenses.
+
+### Functionality
+- Updates the license of a dataset by applying it to the draft version.
+- If no draft exists, a new one is automatically created.
+
+### Usage
+This endpoint supports two ways of defining a license:
+1. **Predefined License** – Provide the license name (e.g., `CC BY 4.0`).
+2. **Custom Terms of Use and Access** – Provide a JSON body with the `customTerms` object.
+   - All fields are optional **except** `termsOfUse`, which is required.
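For quick reference, a minimal call using the predefined-license form could look like the sketch below. The server URL and dataset id are the same placeholder values used in the API guide changes further down; passing the JSON inline with `-d` (rather than `--upload-file`, as the guide examples do) is an assumption here.

```bash
# Minimal sketch: apply the predefined license "CC BY 4.0" to the draft of dataset 3.
# Inline -d body is assumed to be accepted; the documented examples use --upload-file license.json.
export API_TOKEN=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
export SERVER_URL=https://demo.dataverse.org

curl -H "X-Dataverse-key:$API_TOKEN" -X PUT "$SERVER_URL/api/datasets/3/license" \
  -H "Content-type:application/json" \
  -d '{"name": "CC BY 4.0"}'
```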

doc/sphinx-guides/source/api/native-api.rst

Lines changed: 51 additions & 4 deletions
@@ -4208,24 +4208,24 @@ Delete files from a dataset. This API call allows you to delete multiple files f
 
     curl -H "X-Dataverse-key:$API_TOKEN" -X PUT "$SERVER_URL/api/datasets/:persistentId/deleteFiles?persistentId=$PERSISTENT_IDENTIFIER" \
       -H "Content-Type: application/json" \
-      -d '{"fileIds": [1, 2, 3]}'
+      -d '[1, 2, 3]'
 
 The fully expanded example above (without environment variables) looks like this:
 
 .. code-block:: bash
 
     curl -H "X-Dataverse-key:xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx" -X PUT "https://demo.dataverse.org/api/datasets/:persistentId/deleteFiles?persistentId=doi:10.5072/FK2ABCDEF" \
       -H "Content-Type: application/json" \
-      -d '{"fileIds": [1, 2, 3]}'
+      -d '[1, 2, 3]'
 
-The ``fileIds`` in the JSON payload should be an array of file IDs that you want to delete from the dataset.
+The JSON payload should be an array of file IDs that you want to delete from the dataset.
 
 You must have the appropriate permissions to delete files from the dataset.
 
 Upon success, the API will return a JSON response with a success message and the number of files deleted.
 
 The API call will report a 400 (BAD REQUEST) error if any of the files specified do not exist or are not in the latest version of the specified dataset.
-The ``fileIds`` in the JSON payload should be an array of file IDs that you want to delete from the dataset.
+The JSON payload should be an array of file IDs that you want to delete from the dataset.
 
 .. _api-dataset-role-assignment-history:
 
@@ -4357,6 +4357,53 @@ The CSV response for this call is the same as for the /api/datasets/{id}/assignm
 
 Note: This feature requires the "role-assignment-history" feature flag to be enabled (see :ref:`feature-flags`).
 
+Update Dataset License
+~~~~~~~~~~~~~~~~~~~~~~
+
+Updates the license of a dataset by applying it to the draft version, or by creating a draft if none exists.
+
+The JSON representation of a license can take two forms, depending on whether you want to specify a predefined license or define custom terms of use and access.
+
+To set a predefined license (e.g., CC BY 4.0), provide a JSON body with the license name:
+
+.. code-block:: json
+
+    {
+      "name": "CC BY 4.0"
+    }
+
+To define custom terms of use and access, provide a JSON body with the following properties. All fields within ``customTerms`` are optional, except for the ``termsOfUse`` field, which is required:
+
+.. code-block:: json
+
+    {
+      "customTerms": {
+        "termsOfUse": "Your terms of use",
+        "confidentialityDeclaration": "Your confidentiality declaration",
+        "specialPermissions": "Your special permissions",
+        "restrictions": "Your restrictions",
+        "citationRequirements": "Your citation requirements",
+        "depositorRequirements": "Your depositor requirements",
+        "conditions": "Your conditions",
+        "disclaimer": "Your disclaimer"
+      }
+    }
+
+.. code-block:: bash
+
+    export API_TOKEN=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
+    export SERVER_URL=https://demo.dataverse.org
+    export ID=3
+    export FILE_PATH=license.json
+
+    curl -H "X-Dataverse-key:$API_TOKEN" -X PUT "$SERVER_URL/api/datasets/$ID/license" -H "Content-type:application/json" --upload-file $FILE_PATH
+
+The fully expanded example above (without environment variables) looks like this:
+
+.. code-block:: bash
+
+    curl -H "X-Dataverse-key:xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx" -X PUT "https://demo.dataverse.org/api/datasets/3/license" -H "Content-type:application/json" --upload-file license.json
+
 Files
 -----
 
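To tie the two documented snippets together, a minimal end-to-end sketch for the custom-terms form writes the JSON body to `license.json` and then uploads it exactly as the added guide section shows. Only `termsOfUse` is required, so the other `customTerms` fields are omitted here; the server URL and dataset id are the guide's placeholder values.

```bash
# Minimal sketch: apply custom terms of use to the draft of dataset 3.
cat > license.json <<'EOF'
{
  "customTerms": {
    "termsOfUse": "Your terms of use"
  }
}
EOF

curl -H "X-Dataverse-key:xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx" -X PUT \
  "https://demo.dataverse.org/api/datasets/3/license" \
  -H "Content-type:application/json" --upload-file license.json
```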

modules/dataverse-spi/src/main/java/io/gdcc/spi/export/ExportDataOption.java

Lines changed: 0 additions & 51 deletions
This file was deleted.

src/main/java/edu/harvard/iq/dataverse/api/AbstractApiBean.java

Lines changed: 4 additions & 0 deletions
@@ -24,6 +24,7 @@
 import edu.harvard.iq.dataverse.engine.command.impl.GetSpecificPublishedDatasetVersionCommand;
 import edu.harvard.iq.dataverse.externaltools.ExternalToolServiceBean;
 import edu.harvard.iq.dataverse.license.LicenseServiceBean;
+import edu.harvard.iq.dataverse.makedatacount.DatasetMetricsServiceBean;
 import edu.harvard.iq.dataverse.pidproviders.FailedPIDResolutionLoggingServiceBean;
 import edu.harvard.iq.dataverse.pidproviders.PidUtil;
 import edu.harvard.iq.dataverse.pidproviders.FailedPIDResolutionLoggingServiceBean.FailedPIDResolutionEntry;
@@ -224,6 +225,9 @@ String getWrappedMessageWhenJson() {
     @EJB
     protected ExternalToolServiceBean externalToolService;
 
+    @EJB
+    protected DatasetMetricsServiceBean datasetMetricsService;
+
     @EJB
     DataFileServiceBean fileSvc;
 
src/main/java/edu/harvard/iq/dataverse/api/Datasets.java

Lines changed: 38 additions & 24 deletions
@@ -2,11 +2,10 @@
 
 import edu.harvard.iq.dataverse.*;
 import edu.harvard.iq.dataverse.DatasetLock.Reason;
-import edu.harvard.iq.dataverse.DatasetVersion.VersionState;
-import edu.harvard.iq.dataverse.DataverseRoleServiceBean.RoleAssignmentHistoryConsolidatedEntry;
 import edu.harvard.iq.dataverse.actionlogging.ActionLogRecord;
-import edu.harvard.iq.dataverse.api.AbstractApiBean.WrappedResponse;
 import edu.harvard.iq.dataverse.api.auth.AuthRequired;
+import edu.harvard.iq.dataverse.api.dto.CustomTermsDTO;
+import edu.harvard.iq.dataverse.api.dto.LicenseUpdateRequest;
 import edu.harvard.iq.dataverse.api.dto.RoleAssignmentDTO;
 import edu.harvard.iq.dataverse.authorization.AuthenticationServiceBean;
 import edu.harvard.iq.dataverse.authorization.DataverseRole;
@@ -31,7 +30,6 @@
 import edu.harvard.iq.dataverse.engine.command.DataverseRequest;
 import edu.harvard.iq.dataverse.engine.command.exception.*;
 import edu.harvard.iq.dataverse.engine.command.impl.*;
-import edu.harvard.iq.dataverse.export.DDIExportServiceBean;
 import edu.harvard.iq.dataverse.export.ExportService;
 import edu.harvard.iq.dataverse.externaltools.ExternalTool;
 import edu.harvard.iq.dataverse.externaltools.ExternalToolHandler;
@@ -142,9 +140,6 @@ public class Datasets extends AbstractApiBean {
     @EJB
     AuthenticationServiceBean authenticationServiceBean;
 
-    @EJB
-    DDIExportServiceBean ddiExportService;
-
     @EJB
     MetadataBlockServiceBean metadataBlockService;
 
@@ -166,10 +161,6 @@ public class Datasets extends AbstractApiBean {
     @EJB
     SettingsServiceBean settingsService;
 
-    // TODO: Move to AbstractApiBean
-    @EJB
-    DatasetMetricsServiceBean datasetMetricsSvc;
-
     @EJB
     DatasetExternalCitationsServiceBean datasetExternalCitationsService;
 
@@ -203,9 +194,6 @@ public class Datasets extends AbstractApiBean {
     @Inject
     DatasetTypeServiceBean datasetTypeSvc;
 
-    @Inject
-    DatasetFieldsValidator datasetFieldsValidator;
-
     @Inject
     DataFileCategoryServiceBean dataFileCategoryService;
 
@@ -1155,16 +1143,16 @@ public Response editVersionMetadata(@Context ContainerRequestContext crc, String
             return ex.getResponse();
         }
     }
-
+
     @PUT
     @AuthRequired
     @Path("{id}/access")
     public Response editVersionTermsOfAccess(@Context ContainerRequestContext crc, String jsonBody, @PathParam("id") String id,
             @QueryParam("sourceLastUpdateTime") String sourceLastUpdateTime) {
         try {
-
+
             boolean publicInstall = settingsSvc.isTrueForKey(SettingsServiceBean.Key.PublicInstall, false);
-
+
             Dataset dataset = findDatasetOrDie(id);
 
             if (sourceLastUpdateTime != null) {
@@ -1174,11 +1162,11 @@ public Response editVersionTermsOfAccess(@Context ContainerRequestContext crc, S
             JsonObject json = JsonUtil.getJsonObject(jsonBody);
 
             TermsOfUseAndAccess toua = jsonParser().parseTermsOfAccess(json);
-
+
             if (publicInstall && (toua.isFileAccessRequest() || !toua.getTermsOfAccess().isEmpty())){
                 return error(BAD_REQUEST, "Setting File Access Request or Terms of Access is not permitted on a public installation.");
             }
-
+
             DatasetVersion updatedVersion = execCommand(new UpdateDatasetTermsOfAccessCommand(dataset, toua, createDataverseRequest(getRequestUser(crc)))).getLatestVersion();
 
             return ok(json(updatedVersion, true));
@@ -3575,7 +3563,7 @@ public Response getMakeDataCountMetric(@PathParam("id") String idSupplied, @Path
                 return error(Response.Status.BAD_REQUEST, "Country must be one of the ISO 1366 Country Codes");
             }
         }
-        DatasetMetrics datasetMetrics = datasetMetricsSvc.getDatasetMetricsByDatasetForDisplay(dataset, monthYear, country);
+        DatasetMetrics datasetMetrics = datasetMetricsService.getDatasetMetricsByDatasetForDisplay(dataset, monthYear, country);
         if (datasetMetrics == null) {
             return ok("No metrics available for dataset " + dataset.getId() + " for " + yyyymm + " for country code " + country + ".");
         } else if (datasetMetrics.getDownloadsTotal() + datasetMetrics.getViewsTotal() == 0) {
@@ -6041,7 +6029,7 @@ public Response deleteDatasetFiles(@Context ContainerRequestContext crc, @PathPa
         }, getRequestUser(crc));
     }
 
-   @GET
+    @GET
     @AuthRequired
     @Path("{id}/versions/{versionId}/versionNote")
     public Response getVersionCreationNote(@Context ContainerRequestContext crc, @PathParam("id") String datasetId, @PathParam("versionId") String versionId, @Context UriInfo uriInfo, @Context HttpHeaders headers) throws WrappedResponse {
@@ -6110,15 +6098,15 @@ public Response deleteVersionNote(@Context ContainerRequestContext crc, @PathPar
             return ok("Note deleted");
         }, getRequestUser(crc));
     }
-
+
     @GET
     @AuthRequired
     @Path("{identifier}/assignments/history")
     @Produces({ MediaType.APPLICATION_JSON, "text/csv" })
     public Response getRoleAssignmentHistory(@Context ContainerRequestContext crc, @PathParam("identifier") String id, @Context HttpHeaders headers) {
         return response(req -> {
             Dataset dataset = findDatasetOrDie(id);
-
+
             // user is authenticated
             AuthenticatedUser authenticatedUser = getRequestAuthenticatedUserOrDie(crc);
 
@@ -6135,11 +6123,37 @@ public Response getFilesRoleAssignmentHistory(@Context ContainerRequestContext c
             @Context HttpHeaders headers) {
         return response(req -> {
             Dataset dataset = findDatasetOrDie(id);
-
+
             // user is authenticated
             AuthenticatedUser authenticatedUser = getRequestAuthenticatedUserOrDie(crc);
 
             return getRoleAssignmentHistoryResponse(dataset, authenticatedUser, true, headers);
         }, getRequestUser(crc));
     }
+
+    @PUT
+    @AuthRequired
+    @Path("{id}/license")
+    public Response updateLicense(@Context ContainerRequestContext crc,
+                                  @PathParam("id") String datasetId,
+                                  LicenseUpdateRequest requestBody) {
+        return response(req -> {
+            Dataset dataset = findDatasetOrDie(datasetId);
+            if (requestBody.getName() != null && !requestBody.getName().isEmpty()) {
+                String licenseName = requestBody.getName();
+                License license = licenseSvc.getByNameOrUri(licenseName);
+                if (license == null) {
+                    return notFound(BundleUtil.getStringFromBundle("datasets.api.updateLicense.licenseNotFound", List.of(licenseName)));
+                }
+                execCommand(new UpdateDatasetLicenseCommand(req, dataset, license));
+                return ok(BundleUtil.getStringFromBundle("datasets.api.updateLicense.success"));
+            } else if (requestBody.getCustomTerms() != null) {
+                CustomTermsDTO customTerms = requestBody.getCustomTerms();
+                execCommand(new UpdateDatasetLicenseCommand(req, dataset, customTerms.toTermsOfUseAndAccess()));
+                return ok(BundleUtil.getStringFromBundle("datasets.api.updateLicense.success"));
+            } else {
+                return badRequest(BundleUtil.getStringFromBundle("datasets.api.updateLicense.licenseNameIsEmpty"));
+            }
+        }, getRequestUser(crc));
+    }
 }
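The new `updateLicense` handler has three outcomes: success when a known license name or a `customTerms` object is supplied, not-found when the name does not match a registered license, and bad-request when neither field is present. A hedged sketch of exercising the two error branches follows; the exact response messages come from the `datasets.api.updateLicense.*` bundle keys, which are not shown in this diff, and the 404/400 status codes assume the standard `notFound`/`badRequest` helpers in `AbstractApiBean`.

```bash
# Sketch: an unknown license name should hit the notFound branch
# (licenseSvc.getByNameOrUri returns null).
curl -s -H "X-Dataverse-key:$API_TOKEN" -X PUT "$SERVER_URL/api/datasets/3/license" \
  -H "Content-type:application/json" -d '{"name": "No Such License 1.0"}'

# Sketch: a body with neither "name" nor "customTerms" should hit the badRequest branch.
curl -s -H "X-Dataverse-key:$API_TOKEN" -X PUT "$SERVER_URL/api/datasets/3/license" \
  -H "Content-type:application/json" -d '{}'
```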

src/main/java/edu/harvard/iq/dataverse/api/MakeDataCountApi.java

Lines changed: 1 addition & 14 deletions
@@ -1,28 +1,21 @@
 package edu.harvard.iq.dataverse.api;
 
 import edu.harvard.iq.dataverse.Dataset;
-import edu.harvard.iq.dataverse.DatasetServiceBean;
 import edu.harvard.iq.dataverse.GlobalId;
 import edu.harvard.iq.dataverse.makedatacount.DatasetExternalCitations;
 import edu.harvard.iq.dataverse.makedatacount.DatasetExternalCitationsServiceBean;
 import edu.harvard.iq.dataverse.makedatacount.DatasetMetrics;
-import edu.harvard.iq.dataverse.makedatacount.DatasetMetricsServiceBean;
 import edu.harvard.iq.dataverse.makedatacount.MakeDataCountProcessState;
 import edu.harvard.iq.dataverse.makedatacount.MakeDataCountProcessStateServiceBean;
 import edu.harvard.iq.dataverse.pidproviders.PidProvider;
 import edu.harvard.iq.dataverse.pidproviders.PidUtil;
 import edu.harvard.iq.dataverse.pidproviders.doi.datacite.DataCiteDOIProvider;
 import edu.harvard.iq.dataverse.settings.JvmSettings;
-import edu.harvard.iq.dataverse.util.SystemConfig;
 import edu.harvard.iq.dataverse.util.json.JsonUtil;
 
 import java.io.IOException;
 import java.io.InputStream;
-import java.net.HttpURLConnection;
-import java.net.MalformedURLException;
-import java.net.URI;
-import java.net.URISyntaxException;
-import java.net.URL;
+import java.net.*;
 import java.util.Iterator;
 import java.util.List;
 import java.util.concurrent.Future;
@@ -57,16 +50,10 @@ public class MakeDataCountApi extends AbstractApiBean {
 
     private static final Logger logger = Logger.getLogger(MakeDataCountApi.class.getCanonicalName());
 
-    @EJB
-    DatasetMetricsServiceBean datasetMetricsService;
     @EJB
     MakeDataCountProcessStateServiceBean makeDataCountProcessStateService;
     @EJB
     DatasetExternalCitationsServiceBean datasetExternalCitationsService;
-    @EJB
-    DatasetServiceBean datasetService;
-    @EJB
-    SystemConfig systemConfig;
 
     // Inject the managed executor service provided by the container
     @Resource(name = "concurrent/CitationUpdateExecutor")
