articles/energy-data-services/how-to-convert-segy-to-zgy.md
Lines changed: 18 additions & 31 deletions
@@ -9,33 +9,28 @@ ms.date: 08/18/2022
ms.custom: template-how-to #Required; leave this attribute/value as-is.
---
# How to convert a SEG-Y file to ZGY
In this article, you will learn how to convert SEG-Y formatted data to the ZGY format. Seismic data stored in the industry-standard SEG-Y format can be converted to ZGY for use in applications such as Petrel via the Seismic DMS. See the [ZGY Conversion FAQs](https://community.opengroup.org/osdu/platform/data-flow/ingestion/segy-to-zgy-conversion#faq), and find more background in the OSDU™ community: [SEG-Y to ZGY conversion](https://community.opengroup.org/osdu/platform/data-flow/ingestion/segy-to-zgy-conversion/-/tree/azure/m10-master).
[!INCLUDE [preview features callout](./includes/preview/preview-callout.md)]
## Prerequisites
1. Download and install the [Postman](https://www.postman.com/) desktop app.
2. Import the [oZGY Conversions.postman_collection](https://github.com/microsoft/meds-samples/blob/main/postman/SegyToZgyConversion%20Workflow%20using%20SeisStore%20R3%20CI-CD%20v1.0.postman_collection.json) into Postman. All curl commands used below are added to this collection. Update your Environment file accordingly.
3. Ensure that your Microsoft Energy Data Services Preview instance is already created.
4. Clone the **sdutil** repo; a sample clone command is shown after this list.
5. Install the [jq command](https://stedolan.github.io/jq/download/), using your favorite tool on your favorite OS.
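A minimal clone sketch for item 4; the repository URL and the `azure/stable` branch are inferred from the sdutil link used later in this article, so adjust them if your setup differs:

```bash
# Clone the sdutil tool; the azure/stable branch is assumed from the sdutil link in step 7 below
git clone --branch azure/stable https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-sdutil.git
cd seismic-store-sdutil
```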
## Convert SEG-Y file to ZGY file
1. The user needs to be part of the `users.datalake.admins` group and needs to generate a valid refresh token. See [How to generate a refresh token](how-to-generate-refresh-token.md) for further instructions. If you continue to follow other "how-to" documentation, you'll use this refresh token again, so once you've generated it, store it in a place where you'll be able to access it in the future. If the user isn't present in the group, add the group for the member ID; in this case, use the app ID you have been using for everything as the `user-email`. Additionally, the `data-partition-id` should be in the format `<instance-name>-<data-partition-name>` in both the header and the URL, and will be for any following command that requires `data-partition-id`.
```bash
curl --location --request POST "<url>/api/entitlements/v2/groups/users.datalake.admins@<data-partition>.<domain>.com/members" \
@@ -105,10 +100,9 @@ Seismic data stored in industry standard SEG-Y format can be converted to ZGY fo
}'
```
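To confirm the member was added, a hedged read-back sketch is shown below. The `GET .../members` path is assumed to mirror the POST above, and the `jq` filter assumes the response wraps results in a `members` array, so check both against your instance's Entitlements API reference:

```bash
# List the members of the users.datalake.admins group (endpoint and response shape assumed)
curl --location --request GET "<url>/api/entitlements/v2/groups/users.datalake.admins@<data-partition>.<domain>.com/members" \
--header "data-partition-id: <data-partition-id>" \
--header "Authorization: Bearer <access-token>" \
| jq -r '.members[].email'
```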
5. Create a subproject. Use your previously created entitlements groups that you would like to add as ACL (access control list) admins and viewers. If you haven't yet created entitlements groups, follow the directions in [How to manage users](how-to-manage-users.md). If you would like to see which groups you have, use [Get entitlements groups for a given user](how-to-manage-users.md#get-entitlements-groups-for-a-given-user). Data access isolation is achieved with this dedicated ACL per object within a given data partition. You may have many subprojects within a data partition, so this command allows you to provide access to a specific subproject without providing access to the entire data partition. Data partition entitlements don't necessarily translate to the subprojects within it, so it's important to be explicit about the ACLs for each subproject, regardless of which data partition it is in.
Later in this tutorial, you'll need at least one `owner` and at least one `viewer`. These user groups will look like `data.default.owners` and `data.default.viewers`. Make sure to include one of each in your list of `acls` in the request below.
```bash
curl --location --request POST '<url>/seistore-svc/api/v3/subproject/tenant/<data-partition>/subproject/<subproject>' \
@@ -158,7 +152,7 @@ Seismic data stored in industry standard SEG-Y format can be converted to ZGY fo
}'
```
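To double-check the result, a hedged read-back sketch follows; the GET path is assumed to mirror the POST above, so confirm it against the Seismic DMS API reference for your instance:

```bash
# Read back the subproject you just created (GET path assumed to mirror the POST above)
curl --location --request GET '<url>/seistore-svc/api/v3/subproject/tenant/<data-partition>/subproject/<subproject>' \
--header 'Authorization: Bearer <access-token>' \
--header 'data-partition-id: <data-partition-id>' \
| jq .
```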
6. Patch the subproject with the legal tag you created above. Recall that the format of the legal tag will be prefixed with the Microsoft Energy Data Services instance name and data partition name, so it looks like `<instancename>-<datapartitionname>-<legaltagname>`.
@@ -182,10 +176,7 @@ Seismic data stored in industry standard SEG-Y format can be converted to ZGY fo
7. Open the [sdutil](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-sdutil/-/tree/azure/stable) codebase and edit the `config.yaml` at the root. Replace the contents of this config file with the following YAML. See [How to generate a refresh token](how-to-generate-refresh-token.md) to generate the required refresh token; once you've generated it, store it in a place where you'll be able to access it in the future.
```yaml
seistore:
@@ -208,10 +199,7 @@ Seismic data stored in industry standard SEG-Y format can be converted to ZGY fo
empty: none
```
8. Run the following commands using **sdutil** to verify that it's working. Follow the directions in [Setup and Usage for Azure env](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-sdutil/-/tree/azure/stable#setup-and-usage-for-azure-env). Depending on your OS and Python version, you may have to run the `python3` command instead of `python`. If you run into errors with these commands, refer to the [SDUTIL tutorial](/tutorials/tutorial-seismic-ddms-sdutil.md). See [How to generate a refresh token](how-to-generate-refresh-token.md); once you've generated the token, store it in a place where you'll be able to access it in the future.
> [!NOTE]
> When running `python sdutil config init`, you don't need to enter anything when prompted with `Insert the azure (azureGlabEnv) application key:`.
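The verification commands themselves fall outside this excerpt; the sketch below assumes the standard sdutil subcommands (`config init`, `auth login`, `ls`) available in the cloned `azure/stable` branch and the `sd://<data-partition>/<subproject>` path format:

```bash
# Initialize the sdutil configuration and sign in (standard sdutil subcommands assumed)
python sdutil config init
python sdutil auth login

# List what the signed-in user can see under the subproject created earlier
python sdutil ls sd://<data-partition>/<subproject>
```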
@@ -239,22 +227,21 @@ Seismic data stored in industry standard SEG-Y format can be converted to ZGY fo
10. Create the manifest file (otherwise known as the records file)
ZGY conversion uses a manifest file that you'll upload to your storage account in order to run the conversion. This manifest file is created by using multiple JSON files and running a script. The JSON files for this process are stored [here](https://community.opengroup.org/osdu/platform/data-flow/ingestion/segy-to-zgy-conversion/-/tree/master/doc/sample-records/volve). For more information on Volve, such as where the dataset definitions come from, visit [their website](https://www.equinor.com/en/what-we-do/digitalisation-in-our-dna/volve-field-data-village-download.html). Complete the following steps in order to create the manifest file:
* Clone the [repo](https://community.opengroup.org/osdu/platform/data-flow/ingestion/segy-to-zgy-conversion/-/tree/master/) and navigate to the folder `doc/sample-records/volve`.
* Edit the values in the `prepare-records.sh` bash script. Recall that the format of the legal tag will be prefixed with the Microsoft Energy Data Services instance name and data partition name, so it looks like `<instancename>-<datapartitionname>-<legaltagname>`.
* Run the `prepare-records.sh` script.
* The output will be a JSON array with all objects and will be saved in the `all_records.json` file.
* Save the `filecollection_segy_id` and the `work_product_id` values from that JSON file to use in the conversion step. That way the converter knows where to look for the contents of your `all_records.json`.
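A minimal inspection sketch using the jq prerequisite; it assumes `all_records.json` is a JSON array of records that each carry `kind` and `id` fields, which matches how the script's output is described above:

```bash
# Count the records, then list each record's kind and id so you can spot the
# work-product and SEG-Y file-collection entries (field names assumed)
jq 'length' all_records.json
jq -r '.[] | "\(.kind)  \(.id)"' all_records.json
```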
11. Insert the contents of your `all_records.json` file in storage for work-product, seismic trace data, seismic grid, and file collection. In other words, copy and paste the contents of that file to the `--data-raw` field in the following command. If the above steps have produced two sets, you can run this command twice, using each set once.
```bash
curl --location --request PUT '<url>/api/storage/v2/records' \