
Commit 3c5e6b8

incorp ketki's feedback about
1 parent 0cfe459 commit 3c5e6b8

File tree

1 file changed: +24 -24 lines changed


articles/healthcare-apis/fhir/import-data.md

Lines changed: 24 additions & 24 deletions
@@ -11,11 +11,11 @@ ms.author: kesheth
 
 # Import FHIR data
 
-The import operation allows you to ingest FHIR® data into the FHIR server with high throughput.
+The `import` operation allows you to ingest FHIR® data into the FHIR server with high throughput.
 
 ## Import operation modes
 
-The `$import` operation supports two modes: initial mode and incremental mode. Each mode has different features and use cases.
+The `import` operation supports two modes: initial mode and incremental mode. Each mode has different features and use cases.
 
 #### Initial mode
 
@@ -36,37 +36,37 @@ The `$import` operation supports two modes: initial mode and incremental mode. E
 - Allows you to ingest soft deleted resources. This capability is beneficial when you migrate from Azure API for FHIR to the FHIR service in Azure Health Data Services.
 
 > [!IMPORTANT]
-> The `$import` operation doesn't support conditional references in resources.
+> The `import` operation doesn't support conditional references in resources.
 >
 > Also, if multiple resources share the same resource ID, then only one of those resources is imported at random. An error is logged for the resources sharing the same resource ID.
 
 ## Performance considerations
 
-To achieve the best performance with the `$import` operation, consider these factors:
+To achieve the best performance with the `import` operation, consider these factors:
 
-- **Use large files for import.** The file size of a single import operation should be more than 200 MB. Smaller files might result in slower import times.
+- **Use large files for import.** The file size of a single `import` operation should be more than 200 MB. Smaller files might result in slower import times.
 
-- **Import FHIR resource files as a single batch.** For optimal performance, import all the FHIR resource files that you want to ingest in the FHIR server in one import operation. Importing all the files in one operation reduces the overhead of creating and managing multiple import jobs.
+- **Import FHIR resource files as a single batch.** For optimal performance, import all the FHIR resource files that you want to ingest in the FHIR server in one `import` operation. Importing all the files in one operation reduces the overhead of creating and managing multiple import jobs.
 
-- **Limit the number of parallel import jobs.** You can run multiple import jobs at the same time, but running multiple jobs might affect the overall throughput of the import operation. The FHIR server can handle up to five parallel import jobs. If you exceed this limit, the FHIR server might throttle or reject your requests.
+- **Limit the number of parallel import jobs.** You can run multiple `import` jobs at the same time, but running multiple jobs might affect the overall throughput of the import operation. The FHIR server can handle up to five parallel `import` jobs. If you exceed this limit, the FHIR server might throttle or reject your requests.
 
-## Perform the $import operation
+## Perform the import operation
 
 ### Prerequisites
 
-- You need the **FHIR Data Importer** role on the FHIR server to use `$import`.
+- You need the **FHIR Data Importer** role on the FHIR server to use the `import` operation.
 
-- Configure the FHIR server for `$import`. The FHIR data must be stored in resource-specific files in FHIR NDJSON format on the Azure blob store. For more information, see [Configure import settings](configure-import-data.md).
+- Configure the FHIR server. The FHIR data must be stored in resource-specific files in FHIR NDJSON format on the Azure blob store. For more information, see [Configure import settings](configure-import-data.md).
 
 - All the resources in a file must be the same type. You can have multiple files per resource type.
 
 - The data must be in the same tenant as the FHIR service.
 
-- The maximum number of files allowed per `$import` operation is 10,000.
+- The maximum number of files allowed per `import` operation is 10,000.
 
 ### Call $import
 
-Make a ```POST``` call to ```<<FHIR service base URL>>/$import``` with the request header and body shown.
+Make a `POST` call to `<<FHIR service base URL>>/$import` with the request header and body shown.
 
 #### Request header
 
@@ -80,14 +80,14 @@ Content-Type:application/fhir+json
 | Parameter Name | Description | Card. | Accepted values |
 | ----------- | ----------- | ----------- | ----------- |
 | inputFormat | String representing the name of the data source format. Only FHIR NDJSON files are supported. | 1..1 | ```application/fhir+ndjson``` |
-| mode | Import mode value | 1..1 | For an initial mode import, use the `InitialLoad` mode value. For incremental mode import, use the `IncrementalLoad` mode value. If no mode value is provided, the `IncrementalLoad`` mode value is used by default. |
+| mode | Import mode value | 1..1 | For an initial mode `import`, use the `InitialLoad` mode value. For incremental mode `import`, use the `IncrementalLoad` mode value. If no mode value is provided, the `IncrementalLoad` mode value is used by default. |
 | input | Details of the input files. | 1..* | A JSON array with the three parts described in the table. |
 
 | Input part name | Description | Card. | Accepted values |
 | ----------- | ----------- | ----------- | ----------- |
 | type | Resource type of input file | 1..1 | A valid [FHIR resource type](https://www.hl7.org/fhir/resourcelist.html) that matches the input file. |
 | URL | Azure storage URL of the input file | 1..1 | URL value of the input file. The value can't be modified. |
-| etag | Etag of the input file on Azure storage; used to verify the file content isn't changed after $import registration. | 0..1 | Etag value of the input file. |
+| etag | Etag of the input file on Azure storage; used to verify the file content isn't changed after `import` registration. | 0..1 | Etag value of the input file. |
 
 ```json
 {
@@ -139,9 +139,9 @@ Content-Type:application/fhir+json
 
 After an $import is initiated, an empty response body with a `callback` link is returned in the `Content-Location` header of the response, together with a `202 Accepted` status code. Store the callback link to check the import status.
 
-The `$import` operation registration is implemented as an idempotent call. The same registration payload yields the same registration, which affects ability to reprocess files with the same name. Refrain from updating files in-place, instead we suggest you use different file name for updated data, or, if update in-place with same file name is unavoidable, add e-tags in the registration payload.
+The `import` operation registration is implemented as an idempotent call. The same registration payload yields the same registration, which affects the ability to reprocess files that have the same name. Avoid updating files in place. Instead, use a different file name for updated data, or, if an in-place update with the same file name is unavoidable, add etags in the registration payload.
 
-To check import status, make the REST call with the ```GET``` method to the `callback` link returned in the previous step.
+To check the import status, make a REST call with the `GET` method to the `callback` link returned in the previous step.
 
 Interpret the response by using the table:
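The flow described above (register the job, store the `Content-Location` callback, poll it with `GET`) can be sketched end to end. The following Python snippet is an illustration only, not part of the article: the base URL, token, storage URL, and the `Prefer: respond-async` header are assumptions, and the `valueString`/`valueUri` field names follow the standard FHIR Parameters shape rather than the article's own JSON example, which is elided from this diff.

```python
# Illustrative sketch only. Replace FHIR_URL, TOKEN, and the blob URL with real values.
import time
import requests

FHIR_URL = "https://<workspace>-<fhirservice>.fhir.azurehealthcareapis.com"
TOKEN = "<access token for an identity with the FHIR Data Importer role>"

headers = {
    "Authorization": f"Bearer {TOKEN}",
    "Content-Type": "application/fhir+json",
    "Prefer": "respond-async",  # assumed async header; see the article's request header section
}

# Payload shaped after the parameters table: inputFormat, mode, and one input part per file.
payload = {
    "resourceType": "Parameters",
    "parameter": [
        {"name": "inputFormat", "valueString": "application/fhir+ndjson"},
        {"name": "mode", "valueString": "IncrementalLoad"},
        {
            "name": "input",
            "part": [
                {"name": "type", "valueString": "Patient"},
                {"name": "url", "valueUri": "https://<storageaccount>.blob.core.windows.net/fhirimport/Patient.ndjson"},
            ],
        },
    ],
}

# Register the import job. A 202 Accepted response carries the callback link
# in the Content-Location header.
register = requests.post(f"{FHIR_URL}/$import", json=payload, headers=headers)
register.raise_for_status()
callback = register.headers["Content-Location"]

# Poll the callback with GET. A 202 response generally means the job is still
# running; 200 OK with a response body means it completed (see the status table).
while True:
    status = requests.get(callback, headers={"Authorization": f"Bearer {TOKEN}"})
    if status.status_code != 202:
        break
    time.sleep(30)

print(status.status_code, status.json())
```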

@@ -188,15 +188,15 @@ Table describes the important fields in the response body:
 }
 ```
 ### Ingest soft deleted resources
-Incremental mode import supports ingestion of soft deleted resources. You need to use the extension to ingest soft deleted resources in the FHIR service.
+Incremental mode `import` supports ingestion of soft deleted resources. You need to use the extension to ingest soft deleted resources in the FHIR service.
 
 Add the extension to the resource to inform the FHIR service that the resource was soft deleted.
 
 ```ndjson
 {"resourceType": "Patient", "id": "example10", "meta": { "lastUpdated": "2023-10-27T04:00:00.000Z", "versionId": 4, "extension": [ { "url": "http://azurehealthcareapis.com/data-extensions/deleted-state", "valueString": "soft-deleted" } ] } }
 ```
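When you prepare incremental-mode import files, a small helper can stamp resources with the extension shown in the NDJSON example above. This Python sketch is illustrative and not part of the article; the file name and resource values are placeholders.

```python
# Illustrative helper: add the deleted-state extension from the example above
# to a resource before writing it to an NDJSON import file.
import json

DELETED_STATE_EXTENSION = {
    "url": "http://azurehealthcareapis.com/data-extensions/deleted-state",
    "valueString": "soft-deleted",
}

def mark_soft_deleted(resource: dict) -> dict:
    """Append the deleted-state extension under meta.extension."""
    meta = resource.setdefault("meta", {})
    meta.setdefault("extension", []).append(DELETED_STATE_EXTENSION)
    return resource

patient = {
    "resourceType": "Patient",
    "id": "example10",
    "meta": {"lastUpdated": "2023-10-27T04:00:00.000Z", "versionId": 4},
}

# NDJSON: one resource per line.
with open("Patient.ndjson", "a", encoding="utf-8") as ndjson_file:
    ndjson_file.write(json.dumps(mark_soft_deleted(patient)) + "\n")
```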
 
-After the `$import` operation completes successfully, perform history search on the resource to validate soft deleted resources. If you know the ID of the deleted resource, use the URL pattern in the example.
+After the `import` operation completes successfully, perform a history search on the resource to validate soft deleted resources. If you know the ID of the deleted resource, use the URL pattern in the example.
 
 ```json
 <FHIR_URL>/<resource-type>/<resource-id>/_history
@@ -207,13 +207,13 @@ If you don't know the ID of the resource, do a history search on the resource ty
 <FHIR_URL>/<resource-type>/_history
 ```
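To check the result programmatically, you can call the history URL patterns above with any HTTP client. The following Python sketch is illustrative only; the base URL, token, and the `Patient/example10` resource (from the earlier NDJSON example) are placeholders, and the note about `DELETE` entries reflects general FHIR history-bundle behavior rather than wording in the article.

```python
# Illustrative sketch: inspect the history bundle for a resource that was
# imported as soft deleted.
import requests

FHIR_URL = "https://<workspace>-<fhirservice>.fhir.azurehealthcareapis.com"
TOKEN = "<access token>"

history = requests.get(
    f"{FHIR_URL}/Patient/example10/_history",
    headers={"Authorization": f"Bearer {TOKEN}"},
)
history.raise_for_status()

# Deleted versions typically appear as history entries whose request.method is DELETE.
for entry in history.json().get("entry", []):
    method = entry.get("request", {}).get("method")
    version_id = entry.get("resource", {}).get("meta", {}).get("versionId")
    print(method, version_id)
```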

-## Troubleshoot the $import operation
+## Troubleshoot the import operation
 
-Here are the error messages that occur if the `$import` operation fails, and recommended actions to take to resolve the issue.
+Here are the error messages that occur if the `import` operation fails, and the recommended actions to resolve each issue.
 
 #### 200 OK, but there's an error with the URL in the response
 
-**Behavior:** The `$import` operation succeeds and returns `200 OK`. However, `error.url` is present in the response body. Files present at the `error.url` location contain JSON fragments similar to the example:
+**Behavior:** The `import` operation succeeds and returns `200 OK`. However, `error.url` is present in the response body. Files present at the `error.url` location contain JSON fragments similar to the example:
 
 ```json
 {
@@ -236,7 +236,7 @@ Here are the error messages that occur if the `$import` operation fails, and rec
 
 #### 400 Bad Request
 
-**Behavior:** The `$import` operation failed and `400 Bad Request` is returned. The response body includes this content:
+**Behavior:** The `import` operation failed and `400 Bad Request` is returned. The response body includes this content:
 
 ```json
 {
@@ -256,7 +256,7 @@ Here are the error messages that occur if the `$import` operation fails, and rec
 
 #### 403 Forbidden
 
-**Behavior:** The `$import` operation failed and `403 Forbidden` is returned. The response body contains this content:
+**Behavior:** The `import` operation failed and `403 Forbidden` is returned. The response body contains this content:
 
 ```json
 {
@@ -278,7 +278,7 @@ Here are the error messages that occur if the `$import` operation fails, and rec
 
 #### 500 Internal Server Error
 
-**Behavior:** The `$import` operation failed and `500 Internal Server Error` is returned. The response body contains this content:
+**Behavior:** The `import` operation failed and `500 Internal Server Error` is returned. The response body contains this content:
 
 ```json
 {
