Commit 629382e — Merge pull request #268038 from shellyhaverkamp/dicom-datalakeGA

DICOM + Azure Data Lake GA

2 parents: c6f192e + d6cf1cd
13 files changed: +47 −124 lines
articles/healthcare-apis/.openpublishing.redirection.healthcare-apis.json

Lines changed: 5 additions & 0 deletions

````diff
@@ -496,6 +496,11 @@
       "source_path_from_root": "/articles/healthcare-apis/dicom/dicom-change-feed-overview.md",
       "redirect_url": "/azure/healthcare-apis/dicom/change-feed-overview",
       "redirect_document_id": true
+    },
+    {
+      "source_path_from_root": "/articles/healthcare-apis/dicom/get-started-with-dicom.md",
+      "redirect_url": "/azure/healthcare-apis/dicom/dicom-data-lake",
+      "redirect_document_id": true
     },
     {
       "source_path_from_root": "/articles/healthcare-apis/fhir/configure-azure-rbac-for-fhir.md",
````
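The entry added above redirects the retired get-started article to the new data lake page. As a quick illustration (not part of this commit), a minimal Python sketch of the shape these redirect entries take — the validation rules here are assumptions for illustration, not the real build tooling:

```python
# Hypothetical validation sketch for .openpublishing redirect entries.
# The entry below mirrors the one added in this diff.
redirects = [
    {
        "source_path_from_root": "/articles/healthcare-apis/dicom/get-started-with-dicom.md",
        "redirect_url": "/azure/healthcare-apis/dicom/dicom-data-lake",
        "redirect_document_id": True,
    },
]

def validate(entry: dict) -> bool:
    """Check the fields each redirect entry is expected to carry."""
    return (
        entry.get("source_path_from_root", "").startswith("/articles/")
        and entry.get("source_path_from_root", "").endswith(".md")
        and entry.get("redirect_url", "").startswith("/")
        and isinstance(entry.get("redirect_document_id"), bool)
    )

assert all(validate(e) for e in redirects)
```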

articles/healthcare-apis/dicom/deploy-dicom-services-in-azure-data-lake.md

Lines changed: 4 additions & 4 deletions

````diff
@@ -10,7 +10,7 @@ ms.author: mmitrik
 ms.custom: mode-api, devx-track-arm-template
 ---

-# Deploy the DICOM service with Data Lake Storage (Preview)
+# Deploy the DICOM service with Azure Data Lake Storage

 Deploying the [DICOM® service with Azure Data Lake Storage](dicom-data-lake.md) enables organizations to store and process imaging data in a standardized, secure, and scalable way.

@@ -25,7 +25,7 @@ After deployment completes, you can use the Azure portal to see the details abou
 > [!NOTE]
 > The Azure Data Lake Storage option is only available for new instances of the DICOM service. After the option becomes generally available, we plan to offer a migration path for existing DICOM service instances.

-## Deploy the DICOM service with Data Lake Storage using the Azure portal
+## Deploy the DICOM service with Azure Data Lake Storage by using the Azure portal

 1. On the **Resource group** page of the Azure portal, select the name of the **Azure Health Data Services workspace**.

@@ -41,9 +41,9 @@ After deployment completes, you can use the Azure portal to see the details abou

 1. Enter a name for the DICOM service.

-1. Select **External (preview)** for the Storage Location.
+1. Select **Data Lake Storage (default)** for the storage location.

-   :::image type="content" source="media/deploy-data-lake/dicom-deploy-options.png" alt-text="Screenshot showing the options in the Create DICOM service view." lightbox="media/deploy-data-lake/dicom-deploy-options.png":::
+   :::image type="content" source="media/deploy-data-lake/create-dicom-service-data-lake-sml.png" alt-text="Screenshot showing the storage location option." lightbox="media/deploy-data-lake/create-dicom-service-data-lake-lrg.png":::

 1. Select the **subscription** and **resource group** that contains the storage account.
````
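The portal steps above now default to Data Lake Storage. For readers scripting the same deployment, a hedged sketch of what an ARM request body for a `Microsoft.HealthcareApis/workspaces/dicomservices` resource might look like — the `storageConfiguration` property names are assumptions, not confirmed by this commit, and should be checked against the current ARM template reference:

```python
# Hypothetical sketch of the ARM resource body behind the portal steps above.
# The storageConfiguration property names are ASSUMPTIONS for illustration.
def dicom_service_body(storage_account_id: str, file_system: str) -> dict:
    """Build an example request body for a DICOM service with data lake storage."""
    return {
        "location": "eastus2",  # example region
        "properties": {
            "storageConfiguration": {
                "storageResourceId": storage_account_id,  # assumed property name
                "fileSystemName": file_system,            # assumed property name
            }
        },
    }

body = dicom_service_body(
    "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<account>",
    "dicom",
)
```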

articles/healthcare-apis/dicom/deploy-dicom-services-in-azure.md

Lines changed: 13 additions & 9 deletions

````diff
@@ -4,19 +4,19 @@ description: This article describes how to deploy the DICOM service in the Azure
 author: mmitrik
 ms.service: healthcare-apis
 ms.topic: how-to
-ms.date: 10/06/2023
+ms.date: 03/11/2024
 ms.author: mmitrik
 ms.custom: mode-api
 ---

-# Deploy the DICOM service
+# Deploy the DICOM service by using the Azure portal

 In this quickstart, you learn how to deploy the DICOM® service by using the Azure portal.

 After deployment completes, you can use the Azure portal to see the details about the DICOM service, including the service URL. The service URL to access your DICOM service is ```https://<workspacename-dicomservicename>.dicom.azurehealthcareapis.com```. Make sure to specify the API version as part of the URL when you make requests. For more information, see [API versioning for the DICOM service](api-versioning-dicom-service.md).

 > [!NOTE]
-> A public preview of the DICOM service with Data Lake Storage is now available. This capability provides greater flexibility and control over your imaging data. Learn more: [Deploy the DICOM service with Data Lake Storage (Preview)](deploy-dicom-services-in-azure-data-lake.md)
+> The DICOM service with Azure Data Lake Storage is generally available. This capability provides greater flexibility and control over your imaging data. Learn more: [Deploy the DICOM service with Azure Data Lake Storage](deploy-dicom-services-in-azure-data-lake.md)

 ## Prerequisites

@@ -26,19 +26,23 @@ To deploy the DICOM service, you need a workspace created in the Azure portal. F

 1. On the **Resource group** page of the Azure portal, select the name of your **Azure Health Data Services workspace**.

-   [![Screenshot that shows selecting a workspace resource group.](media/select-workspace-resource-group.png) ](media/select-workspace-resource-group.png#lightbox)
+   :::image type="content" source="media/select-workspace-resource-group.png" alt-text="Screenshot showing selecting a workspace resource group." lightbox="media/select-workspace-resource-group.png":::
+

 1. Select **Deploy DICOM service**.

-   [![Screenshot that shows deploying the DICOM service.](media/workspace-deploy-dicom-services.png) ](media/workspace-deploy-dicom-services.png#lightbox)
+   :::image type="content" source="media/workspace-deploy-dicom-services.png" alt-text="Screenshot showing deployment of the DICOM service." lightbox="media/workspace-deploy-dicom-services.png":::
+

 1. Select **Add DICOM service**.

-   [![Screenshot that shows adding the DICOM service.](media/add-dicom-service.png) ](media/add-dicom-service.png#lightbox)
+   :::image type="content" source="media/add-dicom-service.png" alt-text="Screenshot showing how to add the DICOM service." lightbox="media/add-dicom-service.png":::
+

 1. Enter a name for the DICOM service, and then select **Review + create**.

-   [![Screenshot that shows the DICOM service name.](media/enter-dicom-service-name.png) ](media/enter-dicom-service-name.png#lightbox)
+   :::image type="content" source="media/enter-dicom-service-name.png" alt-text="Screenshot showing the DICOM service name." lightbox="media/enter-dicom-service-name.png":::
+

 1. (Optional) Select **Next: Tags**.

@@ -48,11 +52,11 @@ To deploy the DICOM service, you need a workspace created in the Azure portal. F

 1. After the deployment process is finished, select **Go to resource**.

-   [![Screenshot that shows Go to resource.](media/go-to-resource.png) ](media/go-to-resource.png#lightbox)
+   :::image type="content" source="media/go-to-resource.png" alt-text="Screenshot showing Go to resource." lightbox="media/go-to-resource.png":::

 The result of the newly deployed DICOM service is shown here.

-[![Screenshot that shows the DICOM finished deployment.](media/results-deployed-dicom-service.png) ](media/results-deployed-dicom-service.png#lightbox)
+:::image type="content" source="media/results-deployed-dicom-service.png" alt-text="Screenshot showing the DICOM finished deployment." lightbox="media/results-deployed-dicom-service.png":::

 ## Next steps

````
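The article above states the service URL pattern and that requests must include an API version. A small sketch of building that versioned URL, assuming `v2` as an example version and placeholder workspace and service names:

```python
# Sketch: build the versioned DICOM service URL described above.
# Workspace/service names are placeholders; "v2" is an example API version.
def dicom_service_url(workspace: str, service: str, api_version: str = "v2") -> str:
    """Compose https://<workspacename-dicomservicename>.dicom.azurehealthcareapis.com/<version>."""
    return f"https://{workspace}-{service}.dicom.azurehealthcareapis.com/{api_version}"

url = dicom_service_url("contosoworkspace", "contosodicom")
# A DICOMweb query would then target, for example, f"{url}/studies"
# with an Azure AD bearer token (the HTTP request itself is omitted here
# because it requires live credentials).
```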

articles/healthcare-apis/dicom/dicom-data-lake.md

Lines changed: 7 additions & 11 deletions

````diff
@@ -1,18 +1,18 @@
 ---
-title: Azure Data Lake Storage integration for the DICOM service in Azure Health Data Services
-description: Learn how to use Azure Data Lake Storage with the DICOM service to store, access, and analyze medical imaging data in the cloud. Explore the benefits, architecture, and data contracts of this integration.
+title: Manage medical imaging data with the DICOM service and Azure Data Lake Storage
+description: Learn how to use the DICOM service in Azure Health Data Services to store, access, and analyze medical imaging data in the cloud. Explore the benefits, architecture, and data contracts of the integration of the DICOM service with Azure Data Lake Storage.
 author: mmitrik
 ms.service: healthcare-apis
 ms.subservice: dicom
 ms.topic: how-to
-ms.date: 11/21/2023
+ms.date: 03/11/2024
 ms.author: mmitrik
 ms.custom: mode-api
 ---

-# Azure Data Lake Storage integration for the DICOM service (Preview)
+# Manage medical imaging data with the DICOM service and Azure Data Lake Storage

-The [DICOM&reg; service](overview.md) provides cloud-scale storage for medical imaging data using the DICOMweb standard. With the integration of Azure Data Lake Storage, you gain full control of your imaging data and increased flexibility for accessing and working with that data through the Azure storage ecosystem and APIs.
+The [DICOM&reg; service](overview.md) provides cloud-scale storage for medical imaging data using the DICOMweb standard. The integration of the DICOM service with Azure Data Lake Storage means you gain full control of your imaging data and increased flexibility for accessing and working with that data through the Azure storage ecosystem and APIs.

 By using Azure Data Lake Storage with the DICOM service, organizations are able to:

@@ -49,9 +49,6 @@ AHDS/{workspace-name}/dicom/{dicom-service-name}/{partition-name}

 In addition to DICOM data, a small file to enable [health checks](#health-check) will be written to this location.

-> [!NOTE]
-> During public preview, the DICOM service writes data to the storage container and reads the data, but user-added data isn't read and indexed by the DICOM service. Similarly, if DICOM data written by the DICOM service is modified or removed, it may result in errors when accessing data with the DICOMweb APIs.
-
 ## Permissions

 The DICOM service is granted access to the data like any other service or application accessing data in a storage account. Access can be revoked at any time without affecting your organization's ability to access the data. The DICOM service needs the ability to read, write, and delete files in the provided file system. This can be provided by granting the [Storage Blob Data Contributor](/azure/role-based-access-control/built-in-roles#storage-blob-data-contributor) role to the system-assigned or user-assigned managed identity attached to the DICOM service.
@@ -69,18 +66,17 @@ If there is an issue with access, status and details are displayed by [Azure Res

 ## Limitations

-During public preview, the DICOM service with data lake storage has these limitations:
+The DICOM service with data lake storage has these limitations:

 - [Bulk Import](import-files.md) isn't supported.
 - UPS-RS work items aren't stored in the data lake storage account.
 - User data added to the data lake storage account isn't read and indexed by the DICOM service. It's possible that a filename collision could occur, so we recommend that you don't write data to the folder structure used by the DICOM service.
 - If DICOM data written by the DICOM service is modified or removed, errors might result when accessing data with the DICOMweb APIs.
-- Configuration of customer-managed keys isn't supported during the creation of a DICOM service when you opt to use external storage.
 - The archive access tier isn't supported. Moving data to the archive tier will result in errors when accessing data with the DICOMweb APIs.

 ## Next steps

-[Deploy the DICOM service with Azure Data Lake Storage (Preview)](deploy-dicom-services-in-azure-data-lake.md)
+[Deploy the DICOM service with Azure Data Lake Storage](deploy-dicom-services-in-azure-data-lake.md)

 [Get started using DICOM data in analytics workloads](get-started-with-analytics-dicom.md)

````
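The dicom-data-lake.md article above documents the folder layout `AHDS/{workspace-name}/dicom/{dicom-service-name}/{partition-name}`. A minimal sketch of building that prefix; using `Microsoft.Default` as the partition name for a service without data partitions is an assumption, not stated in this commit:

```python
# Sketch: the data lake folder layout documented above,
# AHDS/{workspace-name}/dicom/{dicom-service-name}/{partition-name}.
def dicom_blob_prefix(workspace: str, service: str,
                      partition: str = "Microsoft.Default") -> str:
    # "Microsoft.Default" is an ASSUMED default partition name.
    return f"AHDS/{workspace}/dicom/{service}/{partition}"

prefix = dicom_blob_prefix("contosoworkspace", "contosodicom")
# Files under this prefix could then be enumerated with the
# azure-storage-file-datalake package (FileSystemClient.get_paths),
# once an appropriate role such as Storage Blob Data Contributor is
# granted — omitted here because it requires live credentials.
```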

articles/healthcare-apis/dicom/get-started-with-analytics-dicom.md

Lines changed: 8 additions & 10 deletions

````diff
@@ -1,6 +1,6 @@
 ---
 title: Get started using DICOM data in analytics workloads - Azure Health Data Services
-description: This article demonstrates how to use Azure Data Factory and Microsoft Fabric to perform analytics on DICOM data.
+description: Learn how to use Azure Data Factory and Microsoft Fabric to perform analytics on DICOM data.
 services: healthcare-apis
 author: mmitrik
 ms.service: healthcare-apis
@@ -21,7 +21,7 @@ Before you get started, complete these steps:
 * Create a [storage account with Azure Data Lake Storage Gen2 capabilities](../../storage/blobs/create-data-lake-storage-account.md) by enabling a hierarchical namespace:
     * Create a container to store DICOM metadata, for example, named `dicom`.
 * Deploy an instance of the [DICOM service](deploy-dicom-services-in-azure.md).
-* (_Optional_) Deploy the [DICOM service with Data Lake Storage (Preview)](deploy-dicom-services-in-azure-data-lake.md) to enable direct access to DICOM files.
+* (_Optional_) Deploy the [DICOM service with Data Lake Storage](deploy-dicom-services-in-azure-data-lake.md) to enable direct access to DICOM files.
 * Create a [Data Factory](../../data-factory/quickstart-create-data-factory.md) instance:
     * Enable a [system-assigned managed identity](../../data-factory/data-factory-service-identity.md).
 * Create a [lakehouse](/fabric/data-engineering/tutorial-build-lakehouse) in Fabric.
@@ -105,15 +105,15 @@ Data Factory pipelines are a collection of _activities_ that perform a task, lik

 1. Select **Use this template** to create the new pipeline.

-### Create a pipeline for DICOM data (Preview)
+### Create a pipeline for DICOM data

-If you created the DICOM service with Azure Data Lake Storage (Preview), you need to use a custom template to include a new `fileName` parameter in the metadata pipeline. Instead of using the template from the template gallery, follow these steps to configure the pipeline.
+If you created the DICOM service with Azure Data Lake Storage, you need to use a custom template to include a new `fileName` parameter in the metadata pipeline. Instead of using the template from the template gallery, follow these steps to configure the pipeline.

-1. Download the [preview template](https://github.com/microsoft/dicom-server/blob/main/samples/templates/Copy%20DICOM%20Metadata%20Changes%20to%20ADLS%20Gen2%20in%20Delta%20Format.zip) from GitHub. The template file is a compressed (zipped) folder. You don't need to extract the files because they're already uploaded in compressed form.
+1. Download the [template](https://github.com/microsoft/dicom-server/blob/main/samples/templates/Copy%20DICOM%20Metadata%20Changes%20to%20ADLS%20Gen2%20in%20Delta%20Format.zip) from GitHub. The template file is a compressed (zipped) folder. You don't need to extract the files because they're already uploaded in compressed form.

 1. In Azure Data Factory, select **Author** from the left menu. On the **Factory Resources** pane, select the plus sign (+) to add a new resource. Select **Pipeline** and then select **Import from pipeline template**.

-1. In the **Open** window, select the preview template that you downloaded. Select **Open**.
+1. In the **Open** window, select the template that you downloaded. Select **Open**.

 1. In the **Inputs** section, select the linked services created for the DICOM service and Azure Data Lake Storage Gen2 account.

@@ -259,7 +259,7 @@ If you're using a [DICOM service with Data Lake Storage](dicom-data-lake.md), yo

 1. Enter a **Shortcut Name** that describes the DICOM data. For example, **contoso-dicom-files**.

-1. Enter the **Sub Path** that matches the name of the storage container and folder used by the DICOM service. For example, if you wanted to link to the root folder the Sub Path would be **/dicom/AHDS**. Note, the root folder will always be `AHDS`, but you can optionally link to a child folder for a specific workspace or DICOM service instance.
+1. Enter the **Sub Path** that matches the name of the storage container and folder used by the DICOM service. For example, if you wanted to link to the root folder the Sub Path would be **/dicom/AHDS**. Note that the root folder is always `AHDS`, but you can optionally link to a child folder for a specific workspace or DICOM service instance.

 1. Select **Create** to create the shortcut.

@@ -289,7 +289,7 @@ After a few seconds, the results of the query appear in a table underneath the c

 #### Access DICOM file data in notebooks

-If you used the preview template to create the pipeline and created a shortcut to the DICOM file data, you can use the `filePath` column in the `instance` table to correlate instance metadata to file data.
+If you used the template to create the pipeline and created a shortcut to the DICOM file data, you can use the `filePath` column in the `instance` table to correlate instance metadata to file data.

 ``` SQL
 SELECT sopInstanceUid, filePath from instance
@@ -308,8 +308,6 @@ In this article, you learned how to:

 ## Next steps

-Learn more about Data Factory pipelines:
-
 * [Pipelines and activities in Data Factory](../../data-factory/concepts-pipelines-activities.md)
 * [Use Microsoft Fabric notebooks](/fabric/data-engineering/how-to-use-notebook)

````
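The analytics article above correlates the `filePath` column of the `instance` table with files reachable through a Fabric lakehouse shortcut. A hedged sketch of that mapping; the shortcut name, the `Files/` root, and the relative path format are illustrative assumptions, not taken from the article:

```python
# Hypothetical sketch: map a filePath value from the instance table onto a
# OneLake shortcut rooted at Sub Path /dicom/AHDS. The relative path format
# below is an ASSUMPTION for illustration.
def shortcut_file_path(shortcut_name: str, file_path: str) -> str:
    """Join a lakehouse shortcut name with a DICOM filePath value."""
    return f"Files/{shortcut_name}/{file_path.lstrip('/')}"

p = shortcut_file_path(
    "contoso-dicom-files",
    "contosoworkspace/dicom/contosodicom/Microsoft.Default/example.dcm",
)
```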
