Merged

Changes from all commits (30 commits)
593c31a
api calls to search for projects/get a project
qqmyers Feb 24, 2025
b573a45
initial tests
qqmyers Feb 24, 2025
3724a1d
updated tests - for failure cases
qqmyers Feb 24, 2025
67c9441
docs
qqmyers Feb 24, 2025
7da4bd7
example image
qqmyers Feb 24, 2025
bf8e354
release note
qqmyers Feb 24, 2025
cdd32d2
remove links
qqmyers Feb 24, 2025
8e7f2eb
Merge remote-tracking branch 'IQSS/develop' into DANS-CSL
qqmyers Feb 28, 2025
e6f1169
path fix
qqmyers Feb 28, 2025
80c66af
further path fix
qqmyers Feb 28, 2025
b80856e
link fix
qqmyers Feb 28, 2025
c7c5af6
missing colon
qqmyers Feb 28, 2025
ecb9c79
Merge remote-tracking branch 'IQSS/develop' into DANS-CSL
qqmyers Mar 5, 2025
9be84b2
Merge remote-tracking branch 'IQSS/develop' into TKLabels
qqmyers Mar 10, 2025
f3fffe8
fix query per Ashley
qqmyers Mar 10, 2025
5292848
Merge remote-tracking branch 'IQSS/develop' into TKLabels
qqmyers Mar 13, 2025
da3f314
Merge remote-tracking branch 'IQSS/develop' into TKLabels
qqmyers Mar 19, 2025
c86e49f
Merge remote-tracking branch 'IQSS/develop' into AWSv2
qqmyers Apr 10, 2025
387a500
Merge remote-tracking branch 'IQSS/develop' into ApacheHTTPUpdate
qqmyers May 30, 2025
8948d60
add reference to LC mdb
qqmyers Jun 23, 2025
229fbbc
remove unused imports, info->fine per review
qqmyers Jun 23, 2025
0d2ed6d
Merge remote-tracking branch 'IQSS/develop' into TKLabels
qqmyers Jun 23, 2025
7cbd078
Merge branch 'TKLabels' of https://github.com/GlobalDataverseCommunit…
qqmyers Jun 23, 2025
2fdcafd
Update doc/release-notes/TKLabels.md
qqmyers Jun 24, 2025
3cdc3f9
Update doc/release-notes/TKLabels.md
qqmyers Jun 24, 2025
f9d9efe
Merge remote-tracking branch 'IQSS/develop' into TKLabels
qqmyers Jun 26, 2025
0bd20e4
Merge branch 'TKLabels' of https://github.com/GlobalDataverseCommunit…
qqmyers Jun 26, 2025
29cbca9
Merge remote-tracking branch 'IQSS/develop' into TKLabels
qqmyers Jun 26, 2025
b075488
Merge remote-tracking branch 'IQSS/develop' into TKLabels
qqmyers Jun 26, 2025
61181ba
Add a feature flag to optionally enable permission check in LC api
qqmyers Jun 27, 2025
19 changes: 19 additions & 0 deletions doc/release-notes/TKLabels.md
@@ -0,0 +1,19 @@
New API calls to find projects at https://localcontextshub.org that are associated with a dataset have been added. This supports integration via
an external vocabulary script that allows users to associate such a project with their dataset and display the associated Notices and Traditional Knowledge (TK) Labels.


Connecting to Local Contexts requires a Local Contexts API key. Both the production and sandbox (test) Local Contexts servers are supported.

See also [the guides](https://dataverse-guide--11294.org.readthedocs.build/en/11294/installation/localcontexts.html) and #11294.
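
For reference, here is a minimal sketch of calling the new search endpoint from a Java client. The `/api/localcontexts/datasets/{id}` path comes from this PR; the server URL, dataset id, and the use of an API token via the standard `X-Dataverse-key` header are illustrative assumptions.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class LocalContextsSearchExample {

    public static void main(String[] args) throws Exception {
        // Illustrative placeholders: your installation's URL, a dataset database id, and an API token.
        String serverUrl = "https://dataverse.example.edu";
        String datasetId = "42";
        String apiToken = "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx";

        // GET /api/localcontexts/datasets/{id} asks Dataverse to search the Local Contexts Hub
        // for projects whose publication_doi matches the dataset's DOI.
        HttpRequest request = HttpRequest.newBuilder()
                .uri(new URI(serverUrl + "/api/localcontexts/datasets/" + datasetId))
                .header("X-Dataverse-key", apiToken)
                .GET()
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());

        // 200 with the Hub's JSON (wrapped in the usual Dataverse response envelope) on success;
        // 404 if the Local Contexts settings are missing, 503 if the Hub cannot be reached.
        System.out.println(response.statusCode());
        System.out.println(response.body());
    }
}
```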

## Settings Added

The following settings have been added:

dataverse.localcontexts.url
dataverse.localcontexts.api-key





25 changes: 25 additions & 0 deletions doc/sphinx-guides/source/installation/config.rst
@@ -3594,6 +3594,31 @@ This setting allows admins to highlight a few of the 1000+ CSL citation styles a
These will be listed above the alphabetical list of all styles in the "View Styled Citations" pop-up.
The default value when not set is "chicago-author-date, ieee".

.. _localcontexts:

localcontexts.url
+++++++++++++++++

.. note::
For more information about LocalContexts integration, see :doc:`/installation/localcontexts`.

The URL for the Local Contexts Hub API.

| Example: ``https://localcontextshub.org/``
| The sandbox URL ``https://sandbox.localcontextshub.org/`` can be used for testing.

Can also be set via *MicroProfile Config API* sources, e.g. the environment variable ``DATAVERSE_LOCALCONTEXTS_URL``.

localcontexts.api-key
+++++++++++++++++++++

The API key for accessing the Local Contexts Hub.

| Example: ``your_api_key_here``
| It's recommended to use a password alias for this setting, as described in the :ref:`secure-password-storage` section.

Can also be set via *MicroProfile Config API* sources, e.g. the environment variable ``DATAVERSE_LOCALCONTEXTS_API_KEY``.
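
A minimal sketch of reading these two settings through the MicroProfile Config API (the property names are the ones documented above; the surrounding class and printout are illustrative only):

.. code-block:: java

    import org.eclipse.microprofile.config.Config;
    import org.eclipse.microprofile.config.ConfigProvider;

    public class LocalContextsConfigCheck {
        public static void main(String[] args) {
            Config config = ConfigProvider.getConfig();
            // Both values can come from any MicroProfile Config source, e.g. JVM options,
            // microprofile-config.properties, or the DATAVERSE_LOCALCONTEXTS_* environment variables.
            String url = config.getOptionalValue("dataverse.localcontexts.url", String.class)
                    .orElse("(not set)");
            boolean hasApiKey = config.getOptionalValue("dataverse.localcontexts.api-key", String.class)
                    .isPresent();
            System.out.println("Local Contexts URL: " + url + ", API key configured: " + hasApiKey);
        }
    }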

.. _dataverse.cors:

CORS Settings
doc/sphinx-guides/source/installation/img/LCDemo.png (binary image file; not rendered in the diff view)
1 change: 1 addition & 0 deletions doc/sphinx-guides/source/installation/index.rst
@@ -22,4 +22,5 @@ Installation Guide
oidc
orcid
external-tools
localcontexts
advanced
35 changes: 35 additions & 0 deletions doc/sphinx-guides/source/installation/localcontexts.rst
@@ -0,0 +1,35 @@
LocalContexts Integration
=========================

.. contents:: |toctitle|
:local:

`Local Contexts <https://localcontexts.org/>`_ is a global initiative that supports Indigenous communities in the management and sharing of their cultural heritage and data.
The `Local Contexts Hub <https://localcontextshub.org/>`_ is a platform that enables the creation and application of Traditional Knowledge (TK) and Biocultural (BC) Labels and Notices.
These labels and notices help to communicate the cultural context and appropriate use of Indigenous data and cultural heritage materials.

Dataverse supports integration with the Local Contexts Hub so that Labels and Notices associated with a dataset can be displayed on the dataset page:

.. figure:: ./img/LCDemo.png
:alt: Dataset Page showing Local Contexts integration with Dataverse Software

Configuration
-------------

There are several steps to LocalContexts integration.

First, configure the ``dataverse.localcontexts.url`` and ``dataverse.localcontexts.api-key`` settings as described in the :ref:`localcontexts` section of the Installation Guide.
API keys are available to Local Contexts Integration Partners - see https://localcontexts.org/hub-agreements/integration-partners/ for details.

Next, you should add the Local Contexts metadatablock and configure the associated external vocabulary script.
The metadatablock contains one field allowing Dataverse to store the URL of an associated Local Contexts Hub project.
The external vocabulary script interacts with the Local Contexts Hub (via the Dataverse server) to display the Labels and Notices associated with the project and to provide a link to it.
The script also supports adding such a link to, and removing it from, the dataset's metadata. Note that only a project that references the dataset in its ``publication_doi`` field can be linked to a dataset.
See https://github.com/gdcc/dataverse-external-vocab-support/blob/main/packages/local_contexts/README.md for details on these steps.
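
Under the hood, the script can use the API endpoints added alongside this documentation. As a minimal sketch, the project-specific call below (the ``/api/localcontexts/datasets/{id}/{projectId}`` path comes from the new ``LocalContexts`` API class; the server URL and ids are illustrative placeholders) returns the Hub project's JSON only when its ``publication_doi`` matches the dataset's DOI:

.. code-block:: java

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    public class LocalContextsProjectLookup {
        public static void main(String[] args) throws Exception {
            // Illustrative placeholders: your server URL, a dataset database id, and a Hub project id.
            String serverUrl = "https://dataverse.example.edu";
            String datasetId = "42";
            String projectId = "example-project-id";

            // Dataverse proxies the request to the Hub and answers 404 unless the project's
            // external_ids.publication_doi matches the dataset's DOI.
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(new URI(serverUrl + "/api/localcontexts/datasets/" + datasetId + "/" + projectId))
                    .GET()
                    .build();

            HttpResponse<String> response = HttpClient.newHttpClient()
                    .send(request, HttpResponse.BodyHandlers.ofString());
            System.out.println(response.statusCode() + " " + response.body());
        }
    }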

Lastly, if you wish the Local Contexts information to be shown in the summary section of the dataset page, as shown in the image above, add ``LCProjectUrl`` to the list of custom summary fields via the :ref:`:CustomDatasetSummaryFields` setting.

Optionally, you can also set the ``dataverse.feature.add-local-contexts-permission-check`` feature flag to true. This ensures that only users who can edit a dataset are able to use the Local Contexts search functionality for it.
However, because this currently also requires enabling the ``dataverse.feature.api-session-auth`` feature flag, whose security implications have not been fully explored, it is not recommended unless problematic use is observed.
(Once API access via OpenID Connect is available, ``api-session-auth`` will no longer be required.)

1 change: 1 addition & 0 deletions doc/sphinx-guides/source/user/appendix.rst
@@ -40,6 +40,7 @@ Unlike supported metadata, experimental metadata is not enabled by default in a
- `CodeMeta Software Metadata <https://docs.google.com/spreadsheets/d/e/2PACX-1vTE-aSW0J7UQ0prYq8rP_P_AWVtqhyv46aJu9uPszpa9_UuOWRsyFjbWFDnCd7us7PSIpW7Qg2KwZ8v/pub>`__: based on the `CodeMeta Software Metadata Schema, version 2.0 <https://codemeta.github.io/terms/>`__ (`see .tsv version <https://github.com/IQSS/dataverse/blob/master/scripts/api/data/metadatablocks/codemeta.tsv>`__)
- Computational Workflow Metadata (`see .tsv <https://github.com/IQSS/dataverse/blob/master/scripts/api/data/metadatablocks/computational_workflow.tsv>`__): adapted from `Bioschemas Computational Workflow Profile, version 1.0 <https://bioschemas.org/profiles/ComputationalWorkflow/1.0-RELEASE>`__ and `Codemeta <https://codemeta.github.io/terms/>`__.
- Archival Metadata (`see .tsv <https://github.com/IQSS/dataverse/blob/master/scripts/api/data/metadatablocks/archival.tsv>`__): Enables repositories to register metadata relating to the potential archiving of the dataset at a depositor archive, whether that be your own institutional archive or an external archive, i.e. a historical archive.
- Local Contexts Metadata (`see .tsv <https://github.com/gdcc/dataverse-external-vocab-support/blob/main/packages/local_contexts/cvocLocalContexts.tsv>`__): Supports integration with the `Local Contexts <https://localcontexts.org/>`__ platform, enabling the use of Traditional Knowledge (TK) and Biocultural (BC) Labels and Notices. For more information on setup and configuration, see :doc:`../installation/localcontexts`.

Please note: these custom metadata schemas are not included in the Solr schema for indexing by default, you will need
to add them as necessary for your custom metadata blocks. See "Update the Solr Schema" in :doc:`../admin/metadatacustomization`.
172 changes: 172 additions & 0 deletions src/main/java/edu/harvard/iq/dataverse/api/LocalContexts.java
@@ -0,0 +1,172 @@
package edu.harvard.iq.dataverse.api;

import java.io.IOException;
import java.net.URI;
import java.net.URISyntaxException;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.logging.Logger;

import edu.harvard.iq.dataverse.Dataset;
import edu.harvard.iq.dataverse.DatasetServiceBean;
import edu.harvard.iq.dataverse.DataverseRequestServiceBean;
import edu.harvard.iq.dataverse.PermissionServiceBean;
import edu.harvard.iq.dataverse.api.auth.AuthRequired;
import edu.harvard.iq.dataverse.authorization.Permission;
import edu.harvard.iq.dataverse.engine.command.DataverseRequest;
import edu.harvard.iq.dataverse.settings.FeatureFlags;
import edu.harvard.iq.dataverse.settings.JvmSettings;
import edu.harvard.iq.dataverse.util.json.JsonUtil;
import jakarta.ejb.EJB;
import jakarta.inject.Inject;
import jakarta.json.JsonObject;
import jakarta.ws.rs.*;
import jakarta.ws.rs.core.Context;
import jakarta.ws.rs.core.MediaType;
import jakarta.ws.rs.core.Response;
import jakarta.ws.rs.container.ContainerRequestContext;

@Path("localcontexts")
public class LocalContexts extends AbstractApiBean {

protected static final Logger logger = Logger.getLogger(LocalContexts.class.getName());

@EJB
DatasetServiceBean datasetService;

@Inject
DataverseRequestServiceBean dvRequestService;

@EJB
PermissionServiceBean permissionService;

@GET
@Path("/datasets/{id}")
@Produces(MediaType.APPLICATION_JSON)
@AuthRequired
public Response getDatasetLocalContexts(@Context ContainerRequestContext crc, @PathParam("id") String id) {
try {
Dataset dataset = findDatasetOrDie(id);
DataverseRequest req = createDataverseRequest(getRequestUser(crc));

// Check whether the user has permission to edit the dataset.
/* This check is only performed when the add-local-contexts-permission-check feature flag is enabled.
 * When that flag is raised, the api-session-auth feature flag is currently also needed so that the
 * external vocabulary script (which relies on the user's login session) can still reach this endpoint.
 */
if (FeatureFlags.ADD_LOCAL_CONTEXTS_PERMISSION_CHECK.enabled() && !permissionService.userOn(req.getUser(), dataset).has(Permission.EditDataset)) {
return error(Response.Status.FORBIDDEN,
"You do not have permission to query LocalContexts about this dataset.");
}

String localContextsUrl = JvmSettings.LOCALCONTEXTS_URL.lookupOptional().orElse(null);
String localContextsApiKey = JvmSettings.LOCALCONTEXTS_API_KEY.lookupOptional().orElse(null);

if (localContextsUrl == null || localContextsApiKey == null) {
return error(Response.Status.NOT_FOUND, "LocalContexts API configuration is missing.");
}

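// Search the Local Contexts Hub for projects that list this dataset's DOI as their publication_doi.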
String datasetDoi = dataset.getGlobalId().asString();
String apiUrl = localContextsUrl + "api/v2/projects/?publication_doi=" + datasetDoi;
logger.fine("URL used: " + apiUrl);
try {
HttpClient client = HttpClient.newHttpClient();
HttpRequest request = HttpRequest.newBuilder().uri(new URI(apiUrl))
.header("X-Api-Key", localContextsApiKey).GET().build();

HttpResponse<String> response;

response = client.send(request, HttpResponse.BodyHandlers.ofString());

if (response.statusCode() == 200) {
// Assuming the response is already in JSON format
logger.fine("Response from search: " + response.body());
JsonObject jsonObject = JsonUtil.getJsonObject(response.body());
return ok(jsonObject);
} else {
return error(Response.Status.SERVICE_UNAVAILABLE,
"Error from LocalContexts API: " + response.statusCode());
}
} catch (URISyntaxException e) {
logger.warning(e.getMessage());
return error(Response.Status.SERVICE_UNAVAILABLE, "LocalContexts connection misconfigured.");
} catch (IOException | InterruptedException e) {
logger.warning(e.getMessage());
e.printStackTrace();
return error(Response.Status.SERVICE_UNAVAILABLE, "Error contacting LocalContexts");

}
} catch (WrappedResponse ex) {
return ex.getResponse();
}
}

@GET
@Path("/datasets/{id}/{projectId}")
@Produces(MediaType.APPLICATION_JSON)
public Response searchLocalContexts(@PathParam("id") String datasetId, @PathParam("projectId") String projectId) {
try {
Dataset dataset = findDatasetOrDie(datasetId);
String localContextsUrl = JvmSettings.LOCALCONTEXTS_URL.lookupOptional().orElse(null);
String localContextsApiKey = JvmSettings.LOCALCONTEXTS_API_KEY.lookupOptional().orElse(null);

if (localContextsUrl == null || localContextsApiKey == null) {
return error(Response.Status.NOT_FOUND, "LocalContexts API configuration is missing.");
}

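// Fetch the specified project from the Local Contexts Hub; it is only returned to the caller below if its publication_doi matches this dataset's DOI.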
String apiUrl = localContextsUrl + "api/v2/projects/" + projectId + "/";
logger.fine("URL used: " + apiUrl);
try {
HttpClient client = HttpClient.newHttpClient();
HttpRequest request = HttpRequest.newBuilder().uri(new URI(apiUrl))
.header("X-Api-Key", localContextsApiKey).GET().build();

HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());

if (response.statusCode() == 200) {
// Parse the JSON response
JsonObject jsonResponse = JsonUtil.getJsonObject(response.body());
logger.fine("Response from get: " + JsonUtil.prettyPrint(jsonResponse));

// Check that the response's external_ids contain a publication_doi matching this dataset's DOI
if (jsonResponse.containsKey("external_ids")) {
JsonObject externalIds = jsonResponse.getJsonObject("external_ids");
if (externalIds.containsKey("publication_doi")) {
String responseDoi = externalIds.getString("publication_doi");
String datasetDoi = dataset.getGlobalId().asString();
// Compare the DOI from the response with the dataset's DOI
if (responseDoi.equals(datasetDoi)) {
// Return the JSON response as-is
return ok(jsonResponse);
} else {
// DOI mismatch, return 404
return error(Response.Status.NOT_FOUND,
"LocalContexts information not found for this dataset.");
}
} else {
// "publication_doi" key not found in the response, return 404
return error(Response.Status.NOT_FOUND, "Invalid response from Local Contexts API.");
}
} else {
// "external_ids" key not found in the response, return 404
return error(Response.Status.NOT_FOUND, "Invalid response from Local Contexts API.");
}

} else {
return error(Response.Status.SERVICE_UNAVAILABLE,
"Error from Local Contexts API: " + response.statusCode());
}
} catch (URISyntaxException e) {
logger.warning(e.getMessage());
return error(Response.Status.SERVICE_UNAVAILABLE, "LocalContexts connection misconfigured.");
} catch (IOException | InterruptedException e) {
logger.warning(e.getMessage());
e.printStackTrace();
return error(Response.Status.SERVICE_UNAVAILABLE, "Error contacting LocalContexts");
}
} catch (WrappedResponse ex) {
return ex.getResponse();
}
}
}
src/main/java/edu/harvard/iq/dataverse/settings/FeatureFlags.java
@@ -25,7 +25,7 @@
public enum FeatureFlags {

/**
* Enables API authentication via session cookie (JSESSIONID). Caution: Enabling this feature flag exposes the installation to CSRF risks
* Enables API authentication via session cookie (JSESSIONID). Caution: Enabling this feature flag may expose the installation to CSRF risks
* @apiNote Raise flag by setting "dataverse.feature.api-session-auth"
* @since Dataverse 5.14
*/
@@ -151,6 +151,21 @@ public enum FeatureFlags {
* @since Dataverse 6.5
*/
VERSION_NOTE("enable-version-note"),
/**
* This flag adds a permission check to ensure that the user calling
* /api/localcontexts/datasets/{id} can edit the dataset with that id. This is
* currently the only use case - see
* https://github.com/gdcc/dataverse-external-vocab-support/tree/main/packages/local_contexts.
* The flag adds additional security by blocking other uses, but it currently
* has to be used in conjunction with the api-session-auth feature flag (the
* security implications of which have not been fully investigated) so that
* adding Local Contexts metadata to a dataset remains possible.
*
* @apiNote Raise flag by setting
* "dataverse.feature.add-local-contexts-permission-check"
* @since Dataverse 6.5
*/
ADD_LOCAL_CONTEXTS_PERMISSION_CHECK("add-local-contexts-permission-check"),

;

src/main/java/edu/harvard/iq/dataverse/settings/JvmSettings.java
@@ -281,6 +281,11 @@ public enum JvmSettings {
SCOPE_CORS_HEADERS(SCOPE_CORS, "headers"),
CORS_ALLOW_HEADERS(SCOPE_CORS_HEADERS, "allow"),
CORS_EXPOSE_HEADERS(SCOPE_CORS_HEADERS, "expose"),

// LOCALCONTEXTS
SCOPE_LOCALCONTEXTS(PREFIX, "localcontexts"),
LOCALCONTEXTS_URL(SCOPE_LOCALCONTEXTS, "url"),
LOCALCONTEXTS_API_KEY(SCOPE_LOCALCONTEXTS, "api-key"),
;

private static final String SCOPE_SEPARATOR = ".";