**articles/healthcare-apis/dicom/deploy-dicom-services-in-azure-data-lake.md** (4 additions, 4 deletions)
ms.author: mmitrik
ms.custom: mode-api, devx-track-arm-template
---

# Deploy the DICOM service with Azure Data Lake Storage
Deploying the [DICOM® service with Azure Data Lake Storage](dicom-data-lake.md) enables organizations to store and process imaging data in a standardized, secure, and scalable way.
After deployment completes, you can use the Azure portal to see the details about the DICOM service, including the service URL.
> [!NOTE]
> The Azure Data Lake Storage option is only available for new instances of the DICOM service. After the option becomes generally available, we plan to offer a migration path for existing DICOM service instances.
## Deploy the DICOM service with Azure Data Lake Storage by using the Azure portal
1. On the **Resource group** page of the Azure portal, select the name of the **Azure Health Data Services workspace**.
1. Enter a name for the DICOM service.
1. Select **Data Lake Storage (default)** for the storage location.
:::image type="content" source="media/deploy-data-lake/create-dicom-service-data-lake-sml.png" alt-text="Screenshot showing the storage location option." lightbox="media/deploy-data-lake/create-dicom-service-data-lake-lrg.png":::
1. Select the **subscription** and **resource group** that contains the storage account.
**articles/healthcare-apis/dicom/deploy-dicom-services-in-azure.md** (13 additions, 9 deletions)
author: mmitrik
ms.service: healthcare-apis
ms.topic: how-to
ms.date: 03/11/2024
ms.author: mmitrik
ms.custom: mode-api
---
# Deploy the DICOM service by using the Azure portal
In this quickstart, you learn how to deploy the DICOM® service by using the Azure portal.
After deployment completes, you can use the Azure portal to see the details about the DICOM service, including the service URL. The service URL to access your DICOM service is ```https://<workspacename-dicomservicename>.dicom.azurehealthcareapis.com```. Make sure to specify the API version as part of the URL when you make requests. For more information, see [API versioning for the DICOM service](api-versioning-dicom-service.md).
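As a minimal sketch of this convention, the versioned request URL can be assembled as follows; the workspace and service names are placeholders, and `v2` is used here only as an example API version:

```python
# Sketch: build the versioned base URL for DICOMweb requests.
# Assumptions: placeholder workspace/service names; "v2" as the example API version.

def dicom_base_url(workspace: str, dicom_service: str, api_version: str = "v2") -> str:
    """Build the versioned base URL for requests against the DICOM service."""
    host = f"https://{workspace}-{dicom_service}.dicom.azurehealthcareapis.com"
    return f"{host}/{api_version}"

# For example, a query endpoint for all studies:
studies_url = f"{dicom_base_url('contosoworkspace', 'contosodicom')}/studies"
print(studies_url)
# https://contosoworkspace-contosodicom.dicom.azurehealthcareapis.com/v2/studies
```

Requests sent to this URL also need a bearer token in the `Authorization` header.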
> [!NOTE]
> The DICOM service with Azure Data Lake Storage is generally available. This capability provides greater flexibility and control over your imaging data. Learn more: [Deploy the DICOM service with Azure Data Lake Storage](deploy-dicom-services-in-azure-data-lake.md)
## Prerequisites
To deploy the DICOM service, you need a workspace created in the Azure portal.
1. On the **Resource group** page of the Azure portal, select the name of your **Azure Health Data Services workspace**.
:::image type="content" source="media/workspace-deploy-dicom-services.png" alt-text="Screenshot showing deployment of the DICOM service." lightbox="media/workspace-deploy-dicom-services.png":::
1. Select **Add DICOM service**.
:::image type="content" source="media/add-dicom-service.png" alt-text="Screenshot showing how to add the DICOM service." lightbox="media/add-dicom-service.png":::
1. Enter a name for the DICOM service, and then select **Review + create**.
:::image type="content" source="media/enter-dicom-service-name.png" alt-text="Screenshot showing the DICOM service name." lightbox="media/enter-dicom-service-name.png":::
1. (Optional) Select **Next: Tags**.
1. After the deployment process is finished, select **Go to resource**.
:::image type="content" source="media/go-to-resource.png" alt-text="Screenshot showing Go to resource." lightbox="media/go-to-resource.png":::
The result of the newly deployed DICOM service is shown here.
:::image type="content" source="media/results-deployed-dicom-service.png" alt-text="Screenshot showing the DICOM finished deployment." lightbox="media/results-deployed-dicom-service.png":::
**articles/healthcare-apis/dicom/dicom-data-lake.md** (7 additions, 11 deletions)
---
title: Manage medical imaging data with the DICOM service and Azure Data Lake Storage
description: Learn how to use the DICOM service in Azure Health Data Services to store, access, and analyze medical imaging data in the cloud. Explore the benefits, architecture, and data contracts of the integration of the DICOM service with Azure Data Lake Storage.
author: mmitrik
ms.service: healthcare-apis
ms.subservice: dicom
ms.topic: how-to
ms.date: 03/11/2024
ms.author: mmitrik
ms.custom: mode-api
---
# Manage medical imaging data with the DICOM service and Azure Data Lake Storage
The [DICOM® service](overview.md) provides cloud-scale storage for medical imaging data using the DICOMweb standard. The integration of the DICOM service with Azure Data Lake Storage means you gain full control of your imaging data and increased flexibility for accessing and working with that data through the Azure storage ecosystem and APIs.
By using Azure Data Lake Storage with the DICOM service, organizations are able to:
In addition to DICOM data, a small file to enable [health checks](#health-check) will be written to this location.
## Permissions
The DICOM service is granted access to the data like any other service or application accessing data in a storage account. Access can be revoked at any time without affecting your organization's ability to access the data. The DICOM service needs the ability to read, write, and delete files in the provided file system. This can be provided by granting the [Storage Blob Data Contributor](/azure/role-based-access-control/built-in-roles#storage-blob-data-contributor) role to the system-assigned or user-assigned managed identity attached to the DICOM service.
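As an illustrative sketch (not part of the original article), the resource IDs involved in such a role assignment can be composed as follows; the subscription, resource group, and account names are placeholders, and the GUID is the well-known built-in ID of the Storage Blob Data Contributor role:

```python
# Sketch: compose the role-definition ID and storage-account scope used when
# granting the DICOM service's managed identity access to the data lake.
# Assumptions: placeholder subscription, resource group, and account names.

# Built-in role ID for Storage Blob Data Contributor.
STORAGE_BLOB_DATA_CONTRIBUTOR = "ba92f5b4-2d11-453d-a403-e96b0029c9fe"

def role_definition_id(subscription_id: str) -> str:
    """Fully qualified resource ID of the built-in role definition."""
    return (f"/subscriptions/{subscription_id}/providers/"
            f"Microsoft.Authorization/roleDefinitions/{STORAGE_BLOB_DATA_CONTRIBUTOR}")

def storage_account_scope(subscription_id: str, resource_group: str, account: str) -> str:
    """Scope of the assignment; a container-level scope would be narrower still."""
    return (f"/subscriptions/{subscription_id}/resourceGroups/{resource_group}"
            f"/providers/Microsoft.Storage/storageAccounts/{account}")

print(role_definition_id("00000000-0000-0000-0000-000000000000"))
```

The actual assignment is then made through your usual tooling (the Azure portal, CLI, or an ARM template), targeting the managed identity attached to the DICOM service.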
If there is an issue with access, status and details are displayed by Azure Resource Health.
## Limitations
The DICOM service with data lake storage has these limitations:
- [Bulk Import](import-files.md) isn't supported.
- UPS-RS work items aren't stored in the data lake storage account.
- User data added to the data lake storage account isn't read and indexed by the DICOM service. It's possible that a filename collision could occur, so we recommend that you don't write data to the folder structure used by the DICOM service.
- If DICOM data written by the DICOM service is modified or removed, errors might result when accessing data with the DICOMweb APIs.
- The archive access tier isn't supported. Moving data to the archive tier will result in errors when accessing data with the DICOMweb APIs.
## Next steps
[Deploy the DICOM service with Azure Data Lake Storage](deploy-dicom-services-in-azure-data-lake.md)
[Get started using DICOM data in analytics workloads](get-started-with-analytics-dicom.md)
**articles/healthcare-apis/dicom/get-started-with-analytics-dicom.md** (8 additions, 10 deletions)
---
title: Get started using DICOM data in analytics workloads - Azure Health Data Services
description: Learn how to use Azure Data Factory and Microsoft Fabric to perform analytics on DICOM data.
services: healthcare-apis
author: mmitrik
ms.service: healthcare-apis
Before you get started, complete these steps:
* Create a [storage account with Azure Data Lake Storage Gen2 capabilities](../../storage/blobs/create-data-lake-storage-account.md) by enabling a hierarchical namespace:
* Create a container to store DICOM metadata, for example, named `dicom`.
* Deploy an instance of the [DICOM service](deploy-dicom-services-in-azure.md).
* (_Optional_) Deploy the [DICOM service with Data Lake Storage](deploy-dicom-services-in-azure-data-lake.md) to enable direct access to DICOM files.
* Create a [Data Factory](../../data-factory/quickstart-create-data-factory.md) instance:
* Enable a [system-assigned managed identity](../../data-factory/data-factory-service-identity.md).
* Create a [lakehouse](/fabric/data-engineering/tutorial-build-lakehouse) in Fabric.
Data Factory pipelines are a collection of _activities_ that perform a task.
1. Select **Use this template** to create the new pipeline.
### Create a pipeline for DICOM data
If you created the DICOM service with Azure Data Lake Storage, you need to use a custom template to include a new `fileName` parameter in the metadata pipeline. Instead of using the template from the template gallery, follow these steps to configure the pipeline.
1. Download the [template](https://github.com/microsoft/dicom-server/blob/main/samples/templates/Copy%20DICOM%20Metadata%20Changes%20to%20ADLS%20Gen2%20in%20Delta%20Format.zip) from GitHub. The template file is a compressed (zipped) folder. You don't need to extract the files because they're already uploaded in compressed form.
1. In Azure Data Factory, select **Author** from the left menu. On the **Factory Resources** pane, select the plus sign (+) to add a new resource. Select **Pipeline** and then select **Import from pipeline template**.
1. In the **Open** window, select the template that you downloaded. Select **Open**.
1. In the **Inputs** section, select the linked services created for the DICOM service and Azure Data Lake Storage Gen2 account.
If you're using a [DICOM service with Data Lake Storage](dicom-data-lake.md), you can create a shortcut to the DICOM file data in the lakehouse.
1. Enter a **Shortcut Name** that describes the DICOM data. For example, **contoso-dicom-files**.
1. Enter the **Sub Path** that matches the name of the storage container and folder used by the DICOM service. For example, if you wanted to link to the root folder the Sub Path would be **/dicom/AHDS**. Note that the root folder is always `AHDS`, but you can optionally link to a child folder for a specific workspace or DICOM service instance.
1. Select **Create** to create the shortcut.
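To make the Sub Path convention above concrete, here's a hedged sketch of composing it; it assumes the container is named `dicom` and that the optional child folders sit directly under the `AHDS` root, which may differ in your deployment:

```python
# Sketch: compose the OneLake shortcut Sub Path rooted at the AHDS folder.
# Assumptions: container named "dicom"; child-folder names are placeholders.

def shortcut_sub_path(container: str = "dicom", *child_folders: str) -> str:
    """Join the container, the AHDS root folder, and any optional child folders."""
    return "/" + "/".join([container, "AHDS", *child_folders])

print(shortcut_sub_path())                     # /dicom/AHDS
print(shortcut_sub_path("dicom", "contoso-workspace"))
# /dicom/AHDS/contoso-workspace
```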
After a few seconds, the results of the query appear in a table underneath the cell.
#### Access DICOM file data in notebooks
If you used the template to create the pipeline and created a shortcut to the DICOM file data, you can use the `filePath` column in the `instance` table to correlate instance metadata to file data.
```SQL
SELECT sopInstanceUid, filePath from instance
```
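If you also created a file shortcut, a hedged sketch of resolving `filePath` values to readable lakehouse paths follows; the shortcut name `contoso-dicom-files` and the `pydicom` package are assumptions for illustration, not part of the original walkthrough:

```python
# Sketch: map a filePath value from the instance table onto the lakehouse shortcut.
# Assumption: a shortcut named "contoso-dicom-files" under the lakehouse Files area.
import posixpath

SHORTCUT_ROOT = "/lakehouse/default/Files/contoso-dicom-files"

def to_lakehouse_path(file_path: str, shortcut_root: str = SHORTCUT_ROOT) -> str:
    """Resolve a filePath column value to a path inside the lakehouse."""
    return posixpath.join(shortcut_root, file_path.lstrip("/"))

print(to_lakehouse_path("folder/instance.dcm"))
# /lakehouse/default/Files/contoso-dicom-files/folder/instance.dcm

# In a Fabric notebook you could then parse the file, for example with pydicom:
# row = spark.sql("SELECT filePath FROM instance LIMIT 1").first()
# import pydicom
# ds = pydicom.dcmread(to_lakehouse_path(row.filePath))
```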
## Next steps
* [Pipelines and activities in Data Factory](../../data-factory/concepts-pipelines-activities.md)
* [Use Microsoft Fabric notebooks](/fabric/data-engineering/how-to-use-notebook)