**articles/healthcare-apis/dicom/api-versioning-dicom-service.md** (2 additions, 2 deletions)

```diff
@@ -35,7 +35,7 @@ The OpenAPI Doc for the supported versions can be found at the following url:
 `<service_url>/v<version>/api.yaml`
 
 ## DICOM Conformance Statement
-All versions of the DICOM APIs conform to the DICOMweb™ Standard specifications, but different versions might expose different APIs. See the specific version of the conformance statement for details:
+All versions of the DICOM APIs conform to the DICOMweb™ Standard specifications, but different versions might expose different APIs. See the specific version of the conformance statement for details:
 
-[](media/api-supported-deprecated-versions.png#lightbox)
+[](media/api-supported-deprecated-versions.png#lightbox)
```
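The URL pattern described in this hunk can be illustrated with a short sketch. The workspace and service names here are hypothetical placeholders, and `v2` is used only as an example version:

```python
# Sketch: building versioned DICOM service URLs.
# "myworkspace" and "mydicom" are hypothetical placeholder names.
def service_url(workspace: str, dicom_service: str) -> str:
    return f"https://{workspace}-{dicom_service}.dicom.azurehealthcareapis.com"

def openapi_doc_url(base: str, version: int) -> str:
    # The OpenAPI doc for a given API version lives at <service_url>/v<version>/api.yaml
    return f"{base}/v{version}/api.yaml"

base = service_url("myworkspace", "mydicom")
print(openapi_doc_url(base, 2))
# → https://myworkspace-mydicom.dicom.azurehealthcareapis.com/v2/api.yaml
```

Requests to the service itself follow the same pattern: the version segment comes immediately after the base URL.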
**articles/healthcare-apis/dicom/data-partitions.md** (6 additions, 4 deletions)

```diff
@@ -11,9 +11,9 @@ ms.author: mmitrik
 # Enable data partitioning
 
-Data partitioning allows you to set up a lightweight data partition scheme to store multiple copies of the same image with the same unique identifier (UID) in a single DICOM instance.
+Data partitioning allows you to set up a lightweight data partition scheme to store multiple copies of the same image with the same unique identifier (UID) in a single DICOM® instance.
 
-Although UIDs should be [unique across all contexts](http://dicom.nema.org/dicom/2013/output/chtml/part05/chapter_9.html), it's common practice for healthcare providers to write DICOM files to portable storage media and then give them to a patient. The patient then gives the files to another healthcare provider, who then transfers the files into a new DICOM storage system. Therefore, multiple copies of one DICOM file do commonly exist in isolated DICOM systems. Data partitioning provides an on-ramp for your existing data stores and workflows.
+Although UIDs should be [unique across all contexts](http://dicom.nema.org/dicom/2013/output/chtml/part05/chapter_9.html), it's common practice for healthcare providers to write DICOM files to portable storage media and give them to a patient. The patient then gives the files to another healthcare provider, who transfers the files into a new DICOM storage system. As a result, multiple copies of one DICOM file commonly exist in several isolated DICOM systems. Data partitioning provides an on-ramp for your existing data stores and workflows.
 
 ## Limitations
@@ -60,7 +60,7 @@ GET /partitions
 After partitions are enabled, STOW, WADO, QIDO, delete, export, update, and worklist requests must include a data partition URI segment after the base URI, with the form `/partitions/{partitionName}`, where `partitionName` is:
 
 - Up to 64 characters long.
-- Any combination of alphanumeric characters, `.`, `-`, and `_`, to allow both DICOM UID and GUID formats, as well as human-readable identifiers.
+- Any combination of alphanumeric characters, `.`, `-`, and `_` (to allow both DICOM UID and GUID formats), as well as human-readable identifiers.
```
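The partition-name rules in this hunk can be expressed as a short validation sketch. The helper names are hypothetical (the service performs its own validation), and the base URL here already includes the API version segment:

```python
import re

# Matches the documented rules: 1-64 characters, alphanumeric plus '.', '-', '_'.
# This permits DICOM UIDs (digits and dots), GUIDs, and human-readable names.
_PARTITION_NAME = re.compile(r"[A-Za-z0-9._-]{1,64}")

def is_valid_partition_name(name: str) -> bool:
    return _PARTITION_NAME.fullmatch(name) is not None

def partitioned_url(base: str, partition: str, path: str) -> str:
    # Partitioned requests insert /partitions/{partitionName} after the base URI.
    if not is_valid_partition_name(partition):
        raise ValueError(f"invalid partition name: {partition!r}")
    return f"{base}/partitions/{partition}/{path.lstrip('/')}"
```

For example, a QIDO query against a hypothetical partition named `clinic-a` would target `.../partitions/clinic-a/studies`.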
**articles/healthcare-apis/dicom/deploy-dicom-services-in-azure-data-lake.md** (6 additions, 6 deletions)

````diff
@@ -18,9 +18,9 @@ After deployment completes, you can use the Azure portal to see the details abou
 ## Prerequisites
 
--**Deploy an Azure Health Data Services workspace**. For more information, see [Deploy a workspace in the Azure portal](../healthcare-apis-quickstart.md).
--**Create a storage account with a hierarchical namespace**. For more information, see [Create a storage account to use with Azure Data Lake Storage Gen2](/azure/storage/blobs/create-data-lake-storage-account).
--**Create a blob container in the storage account**. The container is used by the DICOM service to store DICOM files. For more information, see [Manage blob containers using the Azure portal](/azure/storage/blobs/blob-containers-portal).
+-**Deploy an Azure Health Data Services workspace**. For more information, see [Deploy a workspace in the Azure portal](../healthcare-apis-quickstart.md).
+-**Create a storage account with a hierarchical namespace**. For more information, see [Create a storage account to use with Azure Data Lake Storage Gen2](/azure/storage/blobs/create-data-lake-storage-account).
+-**Create a blob container in the storage account**. The container is used by the DICOM service to store DICOM files. For more information, see [Manage blob containers using the Azure portal](/azure/storage/blobs/blob-containers-portal).
 
 > [!NOTE]
 > The Azure Data Lake Storage option is only available for new instances of the DICOM service. After the option becomes generally available, we plan to offer a migration path for existing DICOM service instances.
@@ -65,7 +65,7 @@ After deployment completes, you can use the Azure portal to see the details abou
 ## Deploy the DICOM service with Data Lake Storage by using an ARM template
 
-Use the Azure portal to **Deploy a custom template** and then use the sample ARM template to deploy the DICOM service with Azure Data Lake Storage. For more information, see [Create and deploy ARM templates by using the Azure portal](../../azure-resource-manager/templates/quickstart-create-templates-use-the-portal.md).
+Use the Azure portal to **Deploy a custom template**. Then use the sample ARM template to deploy the DICOM service with Azure Data Lake Storage. For more information, see [Create and deploy ARM templates by using the Azure portal](../../azure-resource-manager/templates/quickstart-create-templates-use-the-portal.md).
 
 ```json
 {
````
```diff
@@ -212,13 +212,13 @@ Use the Azure portal to **Deploy a custom template** and then use the sample ARM
 ### Connectivity
 
-To be alerted to store health and connectivity failures, please sign up for [Resource Health alerts](/azure/service-health/resource-health-alert-monitor-guide).
+To receive alerts about store health and connectivity failures, sign up for [Resource Health alerts](/azure/service-health/resource-health-alert-monitor-guide).
 
 ### 424 Failed Dependency
 
 When the response status code is `424 Failed Dependency`, the issue lies with a dependency configured with DICOM and it may be the data lake store.
 The response body indicates which dependency failed and provides more context on the failure. For data lake storage account failures, an error code is provided which was received when attempting to interact with the store. For more information, see [Azure Blob Storage error codes](/rest/api/storageservices/blob-service-error-codes).
-Note that a code of `ConditionNotMet` typically indicates the blob file has been moved, deleted or modified without using DICOM APIs. The best way to mitigate such a situation is to use the DICOM API to DELETE the instance to remove the index and then reupload the modified file. This will enable you to continue to reference and interact with the file through DICOM APIs.
+Note that a `ConditionNotMet` code typically indicates the blob file has been moved, deleted, or modified without using DICOM APIs. The best way to mitigate this situation is to use the DICOM API to DELETE the instance to remove the index, and then reupload the modified file. This enables you to continue to reference and interact with the file through DICOM APIs.
 
 ## Next steps
 [Receive resource health alerts](/azure/service-health/resource-health-alert-monitor-guide)
```
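The delete-then-reupload mitigation described under `424 Failed Dependency` can be sketched as follows. The service URL and the study, series, and instance UIDs are hypothetical placeholders, and a real call also needs an `Authorization` header:

```python
from urllib.request import Request

# Hypothetical service URL; v2 is used only as an example API version.
BASE = "https://myworkspace-mydicom.dicom.azurehealthcareapis.com/v2"

def delete_instance_request(study_uid: str, series_uid: str, instance_uid: str) -> Request:
    # DELETE removes the instance and its index entry, after which the
    # modified file can be reuploaded via STOW-RS.
    url = f"{BASE}/studies/{study_uid}/series/{series_uid}/instances/{instance_uid}"
    return Request(url, method="DELETE")

req = delete_instance_request("1.2.3", "1.2.3.4", "1.2.3.4.5")
print(req.get_method(), req.full_url)
```

This only constructs the request; sending it (and the subsequent reupload) is left out of the sketch.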
**articles/healthcare-apis/dicom/deploy-dicom-services-in-azure.md** (2 additions, 2 deletions)

```diff
@@ -13,10 +13,10 @@ ms.custom: mode-api
 In this quickstart, you learn how to deploy the DICOM® service by using the Azure portal.
 
-After deployment completes, you can use the Azure portal to see the details about the DICOM service, including the service URL. The service URL to access your DICOM service is ```https://<workspacename-dicomservicename>.dicom.azurehealthcareapis.com```. Make sure to specify the API version as part of the URL when you make requests. For more information, see [API versioning for the DICOM service](api-versioning-dicom-service.md).
+After deployment completes, you can use the Azure portal to see the details about the DICOM service, including the service URL. The service URL used to access your DICOM service is ```https://<workspacename-dicomservicename>.dicom.azurehealthcareapis.com```. Make sure to specify the API version as part of the URL when you make requests. For more information, see [API versioning for the DICOM service](api-versioning-dicom-service.md).
 
 > [!NOTE]
-> The DICOM service with Azure Data Lake Storage is generally available. This capability provides greater flexibility and control over your imaging data. Learn more:[Deploy the DICOM service with Azure Data Lake Storage](deploy-dicom-services-in-azure-data-lake.md)
+> The DICOM service with Azure Data Lake Storage is generally available. This capability provides greater flexibility and control over your imaging data. For more information, see [Deploy the DICOM service with Azure Data Lake Storage](deploy-dicom-services-in-azure-data-lake.md).
```
**articles/healthcare-apis/dicom/dicom-configure-azure-rbac.md** (3 additions, 3 deletions)

```diff
@@ -16,19 +16,19 @@ In this article, you'll learn how to use [Azure role-based access control (Azure
 To grant users, service principals, or groups access to the DICOM data plane, select the **Access control (IAM)** blade. Select the **Role assignments** tab, and select **+ Add**.
 
-[](media/dicom-access-control.png#lightbox)
+[](media/dicom-access-control.png#lightbox)
 
 In the **Role** selection, search for one of the built-in roles for the DICOM data plane:
 
-[](media/rbac-add-role-assignment.png#lightbox)
+[](media/rbac-add-role-assignment.png#lightbox)
 
 You can choose between:
 
 * DICOM Data Owner: Full access to DICOM data.
 * DICOM Data Reader: Read and search DICOM data.
 
-If these roles aren't sufficient for your need, you can use PowerShell to create custom roles. For information about creating custom roles, see [Create a custom role using Azure PowerShell](../../role-based-access-control/tutorial-custom-role-powershell.md).
+If these roles aren't sufficient for your needs, you can use PowerShell to create custom roles. For information about creating custom roles, see [Create a custom role using Azure PowerShell](../../role-based-access-control/tutorial-custom-role-powershell.md).
 
 In the **Select** box, search for a user, service principal, or group that you want to assign the role to.
```
**articles/healthcare-apis/dicom/dicom-data-lake.md** (5 additions, 5 deletions)

```diff
@@ -12,7 +12,7 @@ ms.custom: mode-api
 # Manage medical imaging data with the DICOM service and Azure Data Lake Storage
 
-The [DICOM® service](overview.md) provides cloud-scale storage for medical imaging data using the DICOMweb standard. The integration of the DICOM service with Azure Data Lake Storage means you gain full control of your imaging data and increased flexibility for accessing and working with that data through the Azure storage ecosystem and APIs.
+The [DICOM® service](overview.md) provides cloud-scale storage for medical imaging data using the DICOMweb standard. The integration of the DICOM service with Azure Data Lake Storage means you gain full control of your imaging data. It also provides increased flexibility for accessing and working with that data through the Azure storage ecosystem and APIs.
 
 By using Azure Data Lake Storage with the DICOM service, organizations are able to:
@@ -21,7 +21,7 @@ By using Azure Data Lake Storage with the DICOM service, organizations are able
 -**Unlock new analytics and AI/ML scenarios** by using services that natively integrate with Azure Data Lake Storage, including Azure Synapse, Azure Databricks, Azure Machine Learning, and Microsoft Fabric.
 -**Grant controls to manage storage permissions, access controls, tiers, and rules**.
 
-Another benefit of Azure Data Lake Storage is that it connects to [Microsoft Fabric](/fabric/get-started/microsoft-fabric-overview). Microsoft Fabric is an end-to-end, unified analytics platform that brings together all the data and analytics tools that organizations need to unlock the potential of their data and lay the foundation for AI scenarios. By using Microsoft Fabric, you can use the rich ecosystem of Azure services to perform advanced analytics and AI/ML with medical imaging data, such as building and deploying machine learning models, creating cohorts for clinical trials, and generating insights for patient care and outcomes.
+Another benefit of Azure Data Lake Storage is that it connects to [Microsoft Fabric](/fabric/get-started/microsoft-fabric-overview). Microsoft Fabric is an end-to-end, unified analytics platform that brings together all the data and analytics tools that organizations need. It gives organizations a unified way to unlock the potential of their data and lay the foundation for AI scenarios. By using Microsoft Fabric, you can use the rich ecosystem of Azure services to perform advanced analytics and AI/ML with medical imaging data. Examples include building and deploying machine learning models, creating cohorts for clinical trials, and generating insights for patient care and outcomes.
 
 To learn more about using Microsoft Fabric with imaging data, see [Get started using DICOM data in analytics workloads](get-started-with-analytics-dicom.md).
```
```diff
@@ -51,7 +51,7 @@ In addition to DICOM data, a small file to enable [health checks](#health-check)
 ## Permissions
 
-The DICOM service is granted access to the data like any other service or application accessing data in a storage account. Access can be revoked at any time without affecting your organization's ability to access the data. The DICOM service needs the ability to read, write, and delete files in the provided file system. This can be provided by granting the [Storage Blob Data Contributor](/azure/role-based-access-control/built-in-roles#storage-blob-data-contributor) role to the system-assigned or user-assigned managed identity attached to the DICOM service.
+The DICOM service is granted access to the data like any other service or application accessing data in a storage account. Access can be revoked at any time without affecting your organization's ability to access the data. The DICOM service needs the ability to read, write, and delete files in the provided file system. You can provide this access by granting the [Storage Blob Data Contributor](/azure/role-based-access-control/built-in-roles#storage-blob-data-contributor) role to the system-assigned or user-assigned managed identity attached to the DICOM service.
 
 ## Access tiers
```
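The role grant described in the Permissions hunk is typically done with the Azure CLI. The sketch below only composes the `az role assignment create` command; the principal ID and storage account scope are hypothetical placeholders:

```python
import shlex

# Sketch: composing the Azure CLI command that grants the DICOM service's
# managed identity the Storage Blob Data Contributor role on a storage account.
def role_assignment_command(principal_id: str, storage_account_scope: str) -> str:
    args = [
        "az", "role", "assignment", "create",
        "--assignee", principal_id,
        "--role", "Storage Blob Data Contributor",
        "--scope", storage_account_scope,
    ]
    return shlex.join(args)

print(role_assignment_command(
    "00000000-0000-0000-0000-000000000000",  # hypothetical managed identity object ID
    "/subscriptions/<sub>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<account>",
))
```

Running the printed command requires an authenticated Azure CLI session with permission to create role assignments at that scope.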
```diff
@@ -61,8 +61,8 @@ To learn more about access tiers, including cost tradeoffs and best practices, s
 ## Health check
 
-The DICOM service writes a small file to the data lake every 30 seconds, following the [Data Contract](#data-contracts) to ensure it maintains access. Making any changes to files stored under the `healthCheck`sub-directory might result in incorrect status of the health check.
-If there is an issue with access, status and details are displayed by [Azure Resource Health](../../service-health/overview.md). Azure Resource Health specifies if any action is required to restore access, for example reinstating a role to the DICOM service's identity.
+The DICOM service writes a small file to the data lake every 30 seconds, following the [Data Contract](#data-contracts), to ensure that it maintains access. Changing any files stored under the `healthCheck` subdirectory might result in an incorrect health check status.
+If there's an issue with access, the status and details are displayed by [Azure Resource Health](../../service-health/overview.md). Azure Resource Health specifies whether any action is required to restore access, for example, reinstating a role to the DICOM service's identity.
```