articles/data-factory/connector-oracle-cloud-storage.md (+1 −1)

@@ -1,6 +1,6 @@
 ---
 title: Copy data from Oracle Cloud Storage
-description: Learn about how to copy data from Oracle Cloud Storage to supported sink data stores using a Azure Data Factory or Synapse Analytics pipeline.
+description: Learn about how to copy data from Oracle Cloud Storage to supported sink data stores using an Azure Data Factory or Synapse Analytics pipeline.
articles/energy-data-services/concepts-csv-parser-ingestion.md (+6 −6)

@@ -1,25 +1,25 @@
 ---
-title: Microsoft Energy Data Services Preview csv parser ingestion workflow concept #Required; page title is displayed in search results. Include the brand.
+title: Microsoft Azure Data Manager for Energy Preview CSV parser ingestion workflow concept #Required; page title is displayed in search results. Include the brand.
 description: Learn how to use CSV parser ingestion. #Required; article description that is displayed in search results.
 author: bharathim #Required; your GitHub user alias, with correct capitalization.
 ms.author: bselvaraj #Required; microsoft alias of author; optional team alias.
 ms.service: energy-data-services #Required; service per approved list. slug assigned by ACOM.
 ms.topic: conceptual #Required; leave this attribute/value as-is.
-ms.date: 08/18/2022
+ms.date: 02/10/2023
 ms.custom: template-concept #Required; leave this attribute/value as-is.
 ---

 # CSV parser ingestion concepts
 A CSV (comma-separated values) file is a comma-delimited text file that is used to save data in a table-structured format.

-A CSV Parser [DAG](https://airflow.apache.org/docs/apache-airflow/1.10.12/concepts.html#dags) allows a customer to load data into Microsoft Energy Data Services Preview instance based on a custom schema that is, a schema that doesn't match the [OSDU™](https://osduforum.org) canonical schema. Customers must create and register the custom schema using the Schema service before loading the data.
+A CSV Parser [DAG](https://airflow.apache.org/docs/apache-airflow/1.10.12/concepts.html#dags) allows a customer to load data into a Microsoft Azure Data Manager for Energy Preview instance based on a custom schema, that is, a schema that doesn't match the [OSDU™](https://osduforum.org) canonical schema. Customers must create and register the custom schema using the Schema service before loading the data.

 A CSV Parser DAG implements an ELT (Extract, Load, and Transform) approach to data loading, that is, data is first extracted from the source system in a CSV format, and it's loaded into the Microsoft Energy Data Service Preview instance. It can then be transformed to the [OSDU™](https://osduforum.org) canonical schema using a mapping service.

 [!INCLUDE [preview features callout](./includes/preview/preview-callout.md)]

 ## What does CSV ingestion do?
-A CSV Parser DAG allows the customers to load the CSV data into the Microsoft Energy Data Services Preview instance. It parses each row of a CSV file and creates a storage metadata record. It performs `schema validation` to ensure that the CSV data conforms to the registered custom schema. It automatically performs `type coercion` on the columns based on the schema data type definition. It generates `unique id` for each row of the CSV record by combining source, entity type and a Base64 encoded string formed by concatenating natural key(s) in the data. It performs `unit conversion` by converting declared frame of reference information into appropriate persistable reference using the Unit service. It performs `CRS conversion` for spatially aware columns based on the Frame of Reference (FoR) information present in the schema. It creates `relationships` metadata as declared in the source schema. Finally, it `persists` the metadata record using the Storage service.
+A CSV Parser DAG allows customers to load CSV data into the Microsoft Azure Data Manager for Energy Preview instance. It parses each row of a CSV file and creates a storage metadata record. It performs `schema validation` to ensure that the CSV data conforms to the registered custom schema. It automatically performs `type coercion` on the columns based on the schema data type definition. It generates a `unique id` for each row of the CSV record by combining the source, the entity type, and a Base64-encoded string formed by concatenating the natural key(s) in the data. It performs `unit conversion` by converting declared frame-of-reference information into an appropriate persistable reference using the Unit service. It performs `CRS conversion` for spatially aware columns based on the Frame of Reference (FoR) information present in the schema. It creates `relationships` metadata as declared in the source schema. Finally, it `persists` the metadata record using the Storage service.

 ## CSV parser ingestion components

@@ -41,11 +41,11 @@ To execute the CSV Parser DAG workflow, the user must have a valid authorization
 The below workflow diagram illustrates the CSV Parser DAG workflow:
 :::image type="content" source="media/concepts-csv-parser-ingestion/csv-ingestion-sequence-diagram.png" alt-text="Screenshot of the CSV ingestion sequence diagram." lightbox="media/concepts-csv-parser-ingestion/csv-ingestion-sequence-diagram-expanded.png":::

-To execute the CSV Parser DAG workflow, the user must first create and register the schema using the workflow service. Once the schema is created, the user then uses the File service to upload the CSV file to the Microsoft Energy Data Services Preview instances, and also creates the storage record of file generic kind. The file service then provides a file ID to the user, which is used while triggering the CSV Parser workflow using the Workflow service. The Workflow service provides a run ID, which the user could use to track the status of the CSV Parser workflow run.
+To execute the CSV Parser DAG workflow, the user must first create and register the schema using the Schema service. Once the schema is created, the user then uses the File service to upload the CSV file to the Microsoft Azure Data Manager for Energy Preview instance, and also creates a storage record of the file generic kind. The File service then provides a file ID to the user, which is used while triggering the CSV Parser workflow using the Workflow service. The Workflow service provides a run ID, which the user can use to track the status of the CSV Parser workflow run.

 OSDU™ is a trademark of The Open Group.

 ## Next steps
 Advance to the CSV parser tutorial and learn how to perform a CSV parser ingestion
 > [!div class="nextstepaction"]
-> [Tutorial: Sample steps to perform a CSV parser ingestion](tutorial-csv-ingestion.md)
+> [Tutorial: Sample steps to perform a CSV parser ingestion](tutorial-csv-ingestion.md)
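The unique-id rule described in the changed paragraph above (source + entity type + a Base64 encoding of the row's concatenated natural keys) can be sketched roughly as follows. The `:` separator and plain string concatenation are illustrative assumptions; the actual CSV Parser DAG may format record IDs differently.

```python
import base64

def make_row_id(source: str, entity_type: str, natural_keys: list[str]) -> str:
    """Sketch of the CSV Parser's unique-id scheme: combine the source,
    the entity type, and a Base64 string built from the row's natural key(s).
    Separator and key concatenation are assumptions for illustration."""
    key_blob = "".join(natural_keys).encode("utf-8")
    encoded = base64.b64encode(key_blob).decode("ascii")
    return f"{source}:{entity_type}:{encoded}"

# Hypothetical example: a wellbore row whose natural keys are name and country.
row_id = make_row_id("mysource", "wellbore", ["WELL-123", "US"])
```

Because the ID is derived only from the row's content, re-ingesting the same row reproduces the same ID, which is what makes the scheme useful for deduplication.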
articles/energy-data-services/concepts-ddms.md (+6 −6)

@@ -42,23 +42,23 @@ OSDU™ Technical Standard defines the following types of OSDU™ applic

 ## Who did we build this for?

-**IT Developers** build systems to connect data to domain applications (internal and external – for example, Petrel) which enables data managers to deliver projects to geoscientists. The DDMS suite on Microsoft Energy Data Services helps automate these workflows and eliminates time spent managing updates.
+**IT Developers** build systems to connect data to domain applications (internal and external – for example, Petrel), which enables data managers to deliver projects to geoscientists. The DDMS suite on Azure Data Manager for Energy Preview helps automate these workflows and eliminates time spent managing updates.

-**Geoscientists** use domain applications for key Exploration and Production workflows such as Seismic interpretation and Well tie analysis. While these users won't directly interact with the DDMS, their expectations for data performance and accessibility will drive requirements for the DDMS in the Foundation Tier. Azure will enable geoscientists to stream cross domain data instantly in OSDU™ compatible applications (for example, Petrel) connected to Microsoft Energy Data Services.
+**Geoscientists** use domain applications for key Exploration and Production workflows such as Seismic interpretation and Well tie analysis. While these users won't directly interact with the DDMS, their expectations for data performance and accessibility will drive requirements for the DDMS in the Foundation Tier. Azure will enable geoscientists to stream cross-domain data instantly in OSDU™ compatible applications (for example, Petrel) connected to Azure Data Manager for Energy Preview.

 **Data managers** spend a significant amount of time fulfilling requests for data retrieval and delivery. The Seismic, Wellbore, and Petrel Data Services enable them to discover and manage data in one place while tracking version changes as derivatives are created.

 ## Platform landscape

-Microsoft Energy Data Services is an OSDU™ compatible product, meaning that its landscape and release model are dependent on OSDU™.
+Azure Data Manager for Energy Preview is an OSDU™ compatible product, meaning that its landscape and release model are dependent on OSDU™.

-Currently, OSDU™ certification and release process are not fully defined yet and this topic should be defined as a part of the Microsoft Energy Data Services Foundation Architecture.
+Currently, the OSDU™ certification and release process is not fully defined yet, and this topic should be defined as part of the Azure Data Manager for Energy Preview Foundation Architecture.

-OSDU™ R3 M8 is the base for the scope of the Microsoft Energy Data Services Foundation Private Preview – as a latest stable, tested version of the platform.
+OSDU™ R3 M8 is the base for the scope of the Azure Data Manager for Energy Preview Foundation Private Preview, as the latest stable, tested version of the platform.

 ## Learn more: OSDU™ DDMS community principles

-[OSDU™ community DDMS Overview](https://community.opengroup.org/osdu/documentation/-/wikis/OSDU™-(C)/Design-and-Implementation/Domain-&-Data-Management-Services#ddms-requirements) provides an extensive overview of DDMS motivation and community requirements from a user, technical, and business perspective. These principles are extended to Microsoft Energy Data Services.
+[OSDU™ community DDMS Overview](https://community.opengroup.org/osdu/documentation/-/wikis/OSDU™-(C)/Design-and-Implementation/Domain-&-Data-Management-Services#ddms-requirements) provides an extensive overview of DDMS motivation and community requirements from a user, technical, and business perspective. These principles are extended to Azure Data Manager for Energy Preview.
articles/energy-data-services/concepts-entitlements.md (+4 −4)

@@ -1,11 +1,11 @@
 ---
-title: Microsoft Energy Data Services Preview entitlement concepts #Required; page title is displayed in search results. Include the brand.
-description: This article describes the various concepts regarding the entitlement services in Microsoft Energy Data Services Preview #Required; article description that is displayed in search results.
+title: Microsoft Azure Data Manager for Energy Preview entitlement concepts #Required; page title is displayed in search results. Include the brand.
+description: This article describes the various concepts regarding the entitlement services in Azure Data Manager for Energy Preview #Required; article description that is displayed in search results.
 author: Lakshmisha-KS #Required; your GitHub user alias, with correct capitalization.
 ms.author: lakshmishaks #Required; microsoft alias of author; optional team alias.
 ms.service: energy-data-services #Required; service per approved list. slug assigned by ACOM.
 ms.topic: conceptual #Required; leave this attribute/value as-is.
-ms.date: 08/19/2022
+ms.date: 02/10/2023
 ms.custom: template-concept #Required; leave this attribute/value as-is.
 ---

@@ -17,7 +17,7 @@ Access management is a critical function for any service or resource. Entitlemen

 ## Groups

-The entitlements service of Microsoft Energy Data Services allows you to create groups, and an entitlement group defines permissions on services/data sources for your Microsoft Energy Data Services instance. Users added by you to that group obtain the associated permissions.
+The entitlements service of Azure Data Manager for Energy Preview allows you to create groups; an entitlement group defines permissions on services and data sources for your Azure Data Manager for Energy Preview instance. Users you add to a group obtain the associated permissions.

 The main motivation for entitlements service is data authorization, but the functionality enables three use cases:
articles/energy-data-services/concepts-index-and-search.md (+3 −3)

@@ -1,16 +1,16 @@
 ---
-title: Microsoft Energy Data Services Preview - index and search workflow concepts #Required; page title is displayed in search results. Include the brand.
+title: Microsoft Azure Data Manager for Energy Preview - index and search workflow concepts #Required; page title is displayed in search results. Include the brand.
 description: Learn how to use indexing and search workflows #Required; article description that is displayed in search results.
 author: vivekkalra #Required; your GitHub user alias, with correct capitalization.
 ms.author: vivekkalra #Required; microsoft alias of author; optional team alias.
 ms.service: energy-data-services #Required; service per approved list. slug assigned by ACOM.
 ms.topic: conceptual #Required; leave this attribute/value as-is.
-ms.date: 08/23/2022
+ms.date: 02/10/2023
 ms.custom: template-concept #Required; leave this attribute/value as-is.

 #Customer intent: As a developer, I want to understand indexing and search workflows so that I could search for ingested data in the platform.
 ---
-# Microsoft Energy Data Services Preview indexing and search workflows
+# Azure Data Manager for Energy Preview indexing and search workflows

 All data and associated metadata ingested into the platform are indexed to enable search. The metadata is accessible to ensure awareness even when the data isn't available.
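Once records are indexed, clients retrieve them through the platform's Search service. A rough sketch of composing a search request body follows; the field names (`kind`, `query`, `limit`) and the example kind string mirror the shape of the OSDU community Search API and are assumptions for illustration, not details taken from this article.

```python
import json

def build_search_request(kind: str, query: str, limit: int = 10) -> dict:
    """Build a request body for a hypothetical metadata search call.
    Field names follow the OSDU community Search API shape (assumed)."""
    return {"kind": kind, "query": query, "limit": limit}

# Hypothetical query: find wellbore metadata records by facility name.
body = build_search_request(
    "osdu:wks:master-data--Wellbore:1.0.0",
    'data.FacilityName:"WELL-123"',
)
payload = json.dumps(body)  # would be POSTed to the instance's search endpoint
```

Because only the metadata record is indexed, such a query can succeed even when the underlying dataset itself is unavailable, which is the point the paragraph above makes.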
articles/energy-data-services/concepts-manifest-ingestion.md (+4 −4)

@@ -1,5 +1,5 @@
 ---
-title: Microsoft Energy Data Services Preview manifest ingestion concepts #Required; page title is displayed in search results. Include the brand.
+title: Microsoft Azure Data Manager for Energy Preview manifest ingestion concepts #Required; page title is displayed in search results. Include the brand.
 description: This article describes manifest ingestion concepts #Required; article description that is displayed in search results.
 author: bharathim #Required; your GitHub user alias, with correct capitalization.
 ms.author: bselvaraj #Required; microsoft alias of author; optional team alias.

-Manifest-based file ingestion provides end-users and systems a robust mechanism for loading metadata about datasets in Microsoft Energy Data Services Preview instance. This metadata is indexed by the system and allows the end-user to search the datasets.
+Manifest-based file ingestion provides end-users and systems a robust mechanism for loading metadata about datasets into an Azure Data Manager for Energy Preview instance. This metadata is indexed by the system and allows the end-user to search the datasets.

 Manifest-based file ingestion is an opaque ingestion that does not parse or understand the file contents. It creates a metadata record based on the manifest and makes the record searchable.

@@ -41,7 +41,7 @@ Any arrays are ordered. should there be interdependencies, the dependent items m

 ## Manifest-based file ingestion workflow

-Microsoft Energy Data Services Preview instance has out-of-the-box support for Manifest-based file ingestion workflow. `Osdu_ingest` Airflow DAG is pre-configured in your instance.
+An Azure Data Manager for Energy Preview instance has out-of-the-box support for the Manifest-based file ingestion workflow. The `Osdu_ingest` Airflow DAG is pre-configured in your instance.
 The Manifest-based file ingestion workflow consists of the following components:
@@ -54,7 +54,7 @@ The Manifest-based file ingestion workflow consists of the following components:
 * **Search Service** is used to perform referential integrity checks during the manifest ingestion process.

 ### Pre-requisites
-Before running the Manifest-based file ingestion workflow, customers must ensure that the user accounts running the workflow have access to the core services (Search, Storage, Schema, Entitlement and Legal) and Workflow service (see [Entitlement roles](https://community.opengroup.org/osdu/platform/deployment-and-operations/infra-azure-provisioning/-/blob/master/docs/osdu-entitlement-roles.md) for details). As part of Microsoft Energy Data Services instance provisioning, the OSDU™ standard schemas and associated reference data are pre-loaded. Customers must ensure that the user account used for ingesting the manifests is included in appropriate owners and viewers ACLs. Customers must ensure that manifests are configured with correct legal tags, owners and viewers ACLs, reference data, etc.
+Before running the Manifest-based file ingestion workflow, customers must ensure that the user accounts running the workflow have access to the core services (Search, Storage, Schema, Entitlement, and Legal) and the Workflow service (see [Entitlement roles](https://community.opengroup.org/osdu/platform/deployment-and-operations/infra-azure-provisioning/-/blob/master/docs/osdu-entitlement-roles.md) for details). As part of Azure Data Manager for Energy Preview instance provisioning, the OSDU™ standard schemas and associated reference data are pre-loaded. Customers must ensure that the user account used for ingesting the manifests is included in the appropriate owners and viewers ACLs, and that manifests are configured with correct legal tags, owners and viewers ACLs, reference data, etc.

 ### Workflow sequence
 The following illustration provides the Manifest-based file ingestion workflow: