**articles/healthcare-apis/dicom/get-started-with-analytics-dicom.md** (8 additions, 10 deletions)
````diff
@@ -1,6 +1,6 @@
 ---
 title: Get started using DICOM data in analytics workloads - Azure Health Data Services
-description: This article demonstrates how to use Azure Data Factory and Microsoft Fabric to perform analytics on DICOM data.
+description: Learn how to use Azure Data Factory and Microsoft Fabric to perform analytics on DICOM data.
 services: healthcare-apis
 author: mmitrik
 ms.service: healthcare-apis
````
````diff
@@ -21,7 +21,7 @@ Before you get started, complete these steps:
 * Create a [storage account with Azure Data Lake Storage Gen2 capabilities](../../storage/blobs/create-data-lake-storage-account.md) by enabling a hierarchical namespace:
 * Create a container to store DICOM metadata, for example, named `dicom`.
 * Deploy an instance of the [DICOM service](deploy-dicom-services-in-azure.md).
-* (_Optional_) Deploy the [DICOM service with Data Lake Storage (Preview)](deploy-dicom-services-in-azure-data-lake.md) to enable direct access to DICOM files.
+* (_Optional_) Deploy the [DICOM service with Data Lake Storage](deploy-dicom-services-in-azure-data-lake.md) to enable direct access to DICOM files.
 * Create a [Data Factory](../../data-factory/quickstart-create-data-factory.md) instance:
 * Enable a [system-assigned managed identity](../../data-factory/data-factory-service-identity.md).
 * Create a [lakehouse](/fabric/data-engineering/tutorial-build-lakehouse) in Fabric.
````
````diff
@@ -105,15 +105,15 @@ Data Factory pipelines are a collection of _activities_ that perform a task, lik
 1. Select **Use this template** to create the new pipeline.

-### Create a pipeline for DICOM data (Preview)
+### Create a pipeline for DICOM data

-If you created the DICOM service with Azure Data Lake Storage (Preview), you need to use a custom template to include a new `fileName` parameter in the metadata pipeline. Instead of using the template from the template gallery, follow these steps to configure the pipeline.
+If you created the DICOM service with Azure Data Lake Storage, you need to use a custom template to include a new `fileName` parameter in the metadata pipeline. Instead of using the template from the template gallery, follow these steps to configure the pipeline.

-1. Download the [preview template](https://github.com/microsoft/dicom-server/blob/main/samples/templates/Copy%20DICOM%20Metadata%20Changes%20to%20ADLS%20Gen2%20in%20Delta%20Format.zip) from GitHub. The template file is a compressed (zipped) folder. You don't need to extract the files because they're already uploaded in compressed form.
+1. Download the [template](https://github.com/microsoft/dicom-server/blob/main/samples/templates/Copy%20DICOM%20Metadata%20Changes%20to%20ADLS%20Gen2%20in%20Delta%20Format.zip) from GitHub. The template file is a compressed (zipped) folder. You don't need to extract the files because they're already uploaded in compressed form.

 1. In Azure Data Factory, select **Author** from the left menu. On the **Factory Resources** pane, select the plus sign (+) to add a new resource. Select **Pipeline** and then select **Import from pipeline template**.

-1. In the **Open** window, select the preview template that you downloaded. Select **Open**.
+1. In the **Open** window, select the template that you downloaded. Select **Open**.

 1. In the **Inputs** section, select the linked services created for the DICOM service and Azure Data Lake Storage Gen2 account.
````
````diff
@@ -259,7 +259,7 @@ If you're using a [DICOM service with Data Lake Storage](dicom-data-lake.md), yo
 1. Enter a **Shortcut Name** that describes the DICOM data. For example, **contoso-dicom-files**.

-1. Enter the **Sub Path** that matches the name of the storage container and folder used by the DICOM service. For example, if you wanted to link to the root folder the Sub Path would be **/dicom/AHDS**. Note, the root folder will always be`AHDS`, but you can optionally link to a child folder for a specific workspace or DICOM service instance.
+1. Enter the **Sub Path** that matches the name of the storage container and folder used by the DICOM service. For example, to link to the root folder, the Sub Path would be **/dicom/AHDS**. Note that the root folder is always `AHDS`, but you can optionally link to a child folder for a specific workspace or DICOM service instance.

 1. Select **Create** to create the shortcut.
````
````diff
@@ -289,7 +289,7 @@ After a few seconds, the results of the query appear in a table underneath the c
 #### Access DICOM file data in notebooks

-If you used the preview template to create the pipeline and created a shortcut to the DICOM file data, you can use the `filePath` column in the `instance` table to correlate instance metadata to file data.
+If you used the template to create the pipeline and created a shortcut to the DICOM file data, you can use the `filePath` column in the `instance` table to correlate instance metadata to file data.

 ```SQL
 SELECT sopInstanceUid, filePath from instance
````
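To make the correlation concrete, here is a self-contained sketch that runs the same query shape against a toy `instance` table. SQLite stands in for the lakehouse SQL endpoint, and the UID and path values are fabricated for illustration; in Fabric you would run the SQL directly in a notebook cell:

```python
import sqlite3

# Toy stand-in for the lakehouse `instance` table (values are fabricated).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE instance (sopInstanceUid TEXT, filePath TEXT)")
conn.execute(
    "INSERT INTO instance VALUES (?, ?)",
    ("1.2.276.0.50.192168001092.11156604.14547392.303",
     "AHDS/example-workspace/dicom/instance-file.dcm"),
)

# Same query shape as in the notebook example above: each row pairs an
# instance identifier with the relative path of its DICOM file.
rows = conn.execute("SELECT sopInstanceUid, filePath FROM instance").fetchall()
for uid, path in rows:
    print(uid, "->", path)
```

The returned `filePath` values are relative paths under the storage root, so prefixing them with the shortcut location yields a path the notebook can read the file from.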
````diff
@@ -308,8 +308,6 @@ In this article, you learned how to:
 ## Next steps

-Learn more about Data Factory pipelines:
-
 * [Pipelines and activities in Data Factory](../../data-factory/concepts-pipelines-activities.md)
 * [Use Microsoft Fabric notebooks](/fabric/data-engineering/how-to-use-notebook)
````
**articles/healthcare-apis/github-projects.md** (2 additions, 2 deletions)
````diff
@@ -5,7 +5,7 @@ services: healthcare-apis
 author: evachen96
 ms.service: healthcare-apis
 ms.topic: reference
-ms.date: 10/18/2023
+ms.date: 03/11/2024
 ms.author: evach
 ---
````
````diff
@@ -80,7 +80,7 @@ This solution enables you to transform the data into tabular format as it gets w
 ## DICOM service

-The DICOM service provides an open-source [Medical Imaging Server](https://github.com/microsoft/dicom-server) for DICOM that is easily deployed on Azure. It allows standards-based communication with any DICOMweb™ enabled systems, and injects DICOM metadata into a FHIR server to create a holistic view of patient data. See [DICOM service](./dicom/get-started-with-dicom.md) for more information.
+The DICOM service provides an open-source [Medical Imaging Server](https://github.com/microsoft/dicom-server) for DICOM that is easily deployed on Azure. It allows standards-based communication with any DICOMweb™-enabled system, and injects DICOM metadata into a FHIR server to create a holistic view of patient data. For more information, see [Manage medical imaging data with the DICOM service](./dicom/dicom-data-lake.md).
````