Comma-separated values (CSV) parser ingestion provides the capability to ingest CSV files into an Azure Data Manager for Energy instance.

In this tutorial, you learn how to:

> [!div class="checklist"]
>
> * Ingest a sample wellbore data CSV file into an Azure Data Manager for Energy instance by using `cURL`.
> * Search for storage metadata records created during CSV ingestion by using `cURL`.

## Prerequisites

Before you start this tutorial, complete the following prerequisites:

* An Azure subscription
* An instance of [Azure Data Manager for Energy](quickstart-create-microsoft-energy-data-services-instance.md) created in your Azure subscription
* The cURL command-line tool installed on your machine
* A service principal access token to call the Seismic APIs. For details, see [How to generate auth token](how-to-generate-auth-token.md).

### Get details for the Azure Data Manager for Energy instance

* You need an Azure Data Manager for Energy instance. If you don't already have one, create one by following the steps in [Quickstart: Create an Azure Data Manager for Energy instance](quickstart-create-microsoft-energy-data-services-instance.md).
* For this tutorial, you need the following parameters:

| Parameter | Value to use | Example | Where to find this value |
|----|----|----|----|
|`DNS`| URI |`<instance>.energy.azure.com`| Find this value on the overview page of the Azure Data Manager for Energy instance. |
|`data-partition-id`| Data partitions |`<data-partition-id>`| Find this value in the **Data Partitions** section of the Azure Data Manager for Energy instance. |
|`access_token`| Access token value |`0.ATcA01-XWHdJ0ES-qDevC6r...........`| Follow [How to generate auth token](how-to-generate-auth-token.md) to create an access token and save it. |
Follow the [Manage users](how-to-manage-users.md) guide to add appropriate entitlements for the user who's running this tutorial.
### Set up your environment

Ensure that you have `cURL` installed on your system. You use it to make the API calls in this tutorial.

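To confirm that cURL is available before you begin, you can print its version:

```shell
# Print the installed cURL version; an error here means cURL isn't on your PATH.
curl --version | head -n 1
```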
## Ingest wellbore data by using `cURL`

To ingest a sample wellbore data CSV file into the Azure Data Manager for Energy instance, complete the following steps. Replace the placeholders (`<DNS>`, `<access_token>`, and so on) with the values from your instance details.

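As an optional convenience, you can export the values once as environment variables and reference them in later commands (for example, `https://$DNS/api/...`). The variable names in this sketch are illustrative, not part of the service:

```shell
# Placeholder values; substitute the details of your own instance.
export DNS="<instance>.energy.azure.com"
export DATA_PARTITION_ID="<data-partition-id>"
export ACCESS_TOKEN="<access_token>"
```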
### 1. Create a Schema
Run the following `cURL` command to create a schema:

```bash
# schema-request.json (an illustrative file name) contains a schema definition
# that matches the columns in the CSV file; the payload isn't shown here.
curl -X POST "https://<DNS>/api/schema-service/v1/schema" \
  -H "Authorization: Bearer <access_token>" \
  -H "data-partition-id: <data-partition-id>" \
  -H "Content-Type: application/json" \
  -d "@schema-request.json"
```

Save the `SignedURL` and `FileSource` from the response for use in the next steps.
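If you save the response to a file, you can extract these two values instead of copying them manually. The sketch below assumes a response with top-level `SignedURL` and `FileSource` fields and uses an illustrative file name (`upload_url.json`); adjust the keys to match the response that you actually receive:

```shell
# Illustrative saved response, for example captured with: curl ... -o upload_url.json
cat > upload_url.json <<'EOF'
{"SignedURL": "https://storage.example.com/file.csv?sig=abc", "FileSource": "/osdu-user/1234/wellbore.csv"}
EOF

# Pull the two fields out of the saved JSON response.
SIGNED_URL=$(python3 -c "import json; print(json.load(open('upload_url.json'))['SignedURL'])")
FILE_SOURCE=$(python3 -c "import json; print(json.load(open('upload_url.json'))['FileSource'])")
echo "SignedURL:  $SIGNED_URL"
echo "FileSource: $FILE_SOURCE"
```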

### 4. Upload a CSV File

Download the [Wellbore.csv](https://github.com/microsoft/meds-samples/blob/main/test-data/wellbore.csv) sample to your local machine. Then, run the following `cURL` command to upload the file:

```bash
curl -X PUT -T "Wellbore.csv" "<SignedURL>" -H "x-ms-blob-type: BlockBlob"
```

A successful upload returns HTTP status `201 Created` with an empty response body.

### 5. Upload CSV File Metadata
Run the following `cURL` command to upload metadata for the CSV file:

```bash
# file-metadata.json (an illustrative file name) holds the metadata payload,
# including the FileSource value that you saved earlier.
curl -X POST "https://<DNS>/api/file/v2/files/metadata" \
  -H "Authorization: Bearer <access_token>" \
  -H "data-partition-id: <data-partition-id>" \
  -H "Content-Type: application/json" \
  -d "@file-metadata.json"
```
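The metadata request posts a JSON payload that describes the uploaded file. As a rough, hypothetical starting point, the following sketch writes a skeleton `file-metadata.json`; the `kind`, ACL group names, and legal tag shown are assumptions based on common OSDU conventions, so replace them with values that are valid in your data partition and set `FileSource` to the value that you saved earlier:

```shell
# Hypothetical skeleton payload; every bracketed value must be replaced.
cat > file-metadata.json <<'EOF'
{
  "kind": "osdu:wks:dataset--File.Generic:1.0.0",
  "acl": {
    "viewers": ["data.default.viewers@<data-partition-id>.dataservices.energy"],
    "owners": ["data.default.owners@<data-partition-id>.dataservices.energy"]
  },
  "legal": {
    "legaltags": ["<legal-tag-name>"],
    "otherRelevantDataCountries": ["US"],
    "status": "compliant"
  },
  "data": {
    "DatasetProperties": {
      "FileSourceInfo": {
        "FileSource": "<FileSource>"
      }
    }
  }
}
EOF

# Sanity-check that the file is valid JSON before sending it.
python3 -c "import json; json.load(open('file-metadata.json'))"
```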