
Commit 9a0bef3: Merge pull request #226688 from bmicro/workerbranch2
changing Microsoft Data Manager for Energy to Azure Data Manager for …
2 parents: fae39d8 + 40b3ac7

37 files changed: 311 additions, 309 deletions

articles/data-factory/connector-oracle-cloud-storage.md
Lines changed: 1 addition & 1 deletion

@@ -1,6 +1,6 @@
 ---
 title: Copy data from Oracle Cloud Storage
-description: Learn about how to copy data from Oracle Cloud Storage to supported sink data stores using a Azure Data Factory or Synapse Analytics pipeline.
+description: Learn about how to copy data from Oracle Cloud Storage to supported sink data stores using an Azure Data Factory or Synapse Analytics pipeline.
 titleSuffix: Azure Data Factory & Azure Synapse
 author: jianleishen
 ms.service: data-factory

articles/energy-data-services/concepts-csv-parser-ingestion.md
Lines changed: 6 additions & 6 deletions

@@ -1,25 +1,25 @@
 ---
-title: Microsoft Energy Data Services Preview csv parser ingestion workflow concept #Required; page title is displayed in search results. Include the brand.
+title: Microsoft Azure Data Manager for Energy Preview csv parser ingestion workflow concept #Required; page title is displayed in search results. Include the brand.
 description: Learn how to use CSV parser ingestion. #Required; article description that is displayed in search results.
 author: bharathim #Required; your GitHub user alias, with correct capitalization.
 ms.author: bselvaraj #Required; microsoft alias of author; optional team alias.
 ms.service: energy-data-services #Required; service per approved list. slug assigned by ACOM.
 ms.topic: conceptual #Required; leave this attribute/value as-is.
-ms.date: 08/18/2022
+ms.date: 02/10/2023
 ms.custom: template-concept #Required; leave this attribute/value as-is.
 ---

 # CSV parser ingestion concepts
 A CSV (comma-separated values) file is a comma delimited text file that is used to save data in a table structured format.

-A CSV Parser [DAG](https://airflow.apache.org/docs/apache-airflow/1.10.12/concepts.html#dags) allows a customer to load data into Microsoft Energy Data Services Preview instance based on a custom schema that is, a schema that doesn't match the [OSDU™](https://osduforum.org) canonical schema. Customers must create and register the custom schema using the Schema service before loading the data.
+A CSV Parser [DAG](https://airflow.apache.org/docs/apache-airflow/1.10.12/concepts.html#dags) allows a customer to load data into a Microsoft Azure Data Manager for Energy Preview instance based on a custom schema, that is, a schema that doesn't match the [OSDU™](https://osduforum.org) canonical schema. Customers must create and register the custom schema using the Schema service before loading the data.

 A CSV Parser DAG implements an ELT (Extract Load and Transform) approach to data loading, that is, data is first extracted from the source system in a CSV format, and it's loaded into the Microsoft Energy Data Service Preview instance. It could then be transformed to the [OSDU™](https://osduforum.org) canonical schema using a mapping service.

 [!INCLUDE [preview features callout](./includes/preview/preview-callout.md)]

 ## What does CSV ingestion do?
-A CSV Parser DAG allows the customers to load the CSV data into the Microsoft Energy Data Services Preview instance. It parses each row of a CSV file and creates a storage metadata record. It performs `schema validation` to ensure that the CSV data conforms to the registered custom schema. It automatically performs `type coercion` on the columns based on the schema data type definition. It generates `unique id` for each row of the CSV record by combining source, entity type and a Base64 encoded string formed by concatenating natural key(s) in the data. It performs `unit conversion` by converting declared frame of reference information into appropriate persistable reference using the Unit service. It performs `CRS conversion` for spatially aware columns based on the Frame of Reference (FoR) information present in the schema. It creates `relationships` metadata as declared in the source schema. Finally, it `persists` the metadata record using the Storage service.
+A CSV Parser DAG allows customers to load CSV data into the Microsoft Azure Data Manager for Energy Preview instance. It parses each row of a CSV file and creates a storage metadata record. It performs `schema validation` to ensure that the CSV data conforms to the registered custom schema. It automatically performs `type coercion` on the columns based on the schema data type definition. It generates a `unique id` for each row of the CSV record by combining source, entity type, and a Base64-encoded string formed by concatenating the natural key(s) in the data. It performs `unit conversion` by converting declared frame of reference information into an appropriate persistable reference using the Unit service. It performs `CRS conversion` for spatially aware columns based on the Frame of Reference (FoR) information present in the schema. It creates `relationships` metadata as declared in the source schema. Finally, it `persists` the metadata record using the Storage service.
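The `unique id` step in the paragraph above can be sketched as a few lines of Python. This is a minimal illustration, not the service's actual implementation: the colon delimiter and overall ID layout are assumptions; only the ingredients (source, entity type, and a Base64-encoded concatenation of the natural keys) come from the text.

```python
import base64

def make_record_id(source: str, entity_type: str, natural_keys: list[str]) -> str:
    # Concatenate the natural key(s), Base64-encode the result, and combine
    # it with source and entity type. The ":" delimiter is an assumption.
    key_blob = base64.b64encode(":".join(natural_keys).encode("utf-8")).decode("ascii")
    return f"{source}:{entity_type}:{key_blob}"
```

Because the ID is derived deterministically from the row's natural keys, the same input row always maps to the same record ID.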
 
 ## CSV parser ingestion components
 
@@ -41,11 +41,11 @@ To execute the CSV Parser DAG workflow, the user must have a valid authorization
 The below workflow diagram illustrates the CSV Parser DAG workflow:
 :::image type="content" source="media/concepts-csv-parser-ingestion/csv-ingestion-sequence-diagram.png" alt-text="Screenshot of the CSV ingestion sequence diagram." lightbox="media/concepts-csv-parser-ingestion/csv-ingestion-sequence-diagram-expanded.png":::
 
-To execute the CSV Parser DAG workflow, the user must first create and register the schema using the workflow service. Once the schema is created, the user then uses the File service to upload the CSV file to the Microsoft Energy Data Services Preview instances, and also creates the storage record of file generic kind. The file service then provides a file ID to the user, which is used while triggering the CSV Parser workflow using the Workflow service. The Workflow service provides a run ID, which the user could use to track the status of the CSV Parser workflow run.
+To execute the CSV Parser DAG workflow, the user must first create and register the schema using the Schema service. Once the schema is created, the user uses the File service to upload the CSV file to the Microsoft Azure Data Manager for Energy Preview instance and also creates a storage record of file generic kind. The File service then provides a file ID to the user, which is used to trigger the CSV Parser workflow through the Workflow service. The Workflow service returns a run ID, which the user can use to track the status of the CSV Parser workflow run.
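The trigger step described above amounts to an authenticated HTTP POST to the Workflow service. The sketch below only assembles the request; the endpoint path, the `csv-parser` workflow name, and the payload shape are assumptions modeled on the community OSDU Workflow service API, not an official Azure Data Manager for Energy reference.

```python
def build_trigger_request(base_url: str, partition_id: str, file_id: str):
    """Assemble URL, headers, and body for triggering a CSV Parser workflow run.
    All endpoint details here are assumptions; check your instance's API docs."""
    url = f"{base_url}/api/workflow/v1/workflow/csv-parser/workflowRun"
    headers = {
        "data-partition-id": partition_id,   # target data partition
        "Content-Type": "application/json",  # a bearer token header is also required
    }
    body = {"executionContext": {"id": file_id}}  # file ID returned by the File service
    return url, headers, body
```

The JSON response to such a POST would carry the run ID used to poll the workflow run's status.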
 
 OSDU™ is a trademark of The Open Group.
 
 ## Next steps
 Advance to the CSV parser tutorial and learn how to perform a CSV parser ingestion
 > [!div class="nextstepaction"]
-> [Tutorial: Sample steps to perform a CSV parser ingestion](tutorial-csv-ingestion.md)
+> [Tutorial: Sample steps to perform a CSV parser ingestion](tutorial-csv-ingestion.md)

articles/energy-data-services/concepts-ddms.md
Lines changed: 6 additions & 6 deletions

@@ -42,23 +42,23 @@ OSDU™ Technical Standard defines the following types of OSDU™ applic
 
 ## Who did we build this for?
 
-**IT Developers** build systems to connect data to domain applications (internal and external – for example, Petrel) which enables data managers to deliver projects to geoscientists. The DDMS suite on Microsoft Energy Data Services helps automate these workflows and eliminates time spent managing updates.
+**IT Developers** build systems to connect data to domain applications (internal and external – for example, Petrel), which enables data managers to deliver projects to geoscientists. The DDMS suite on Azure Data Manager for Energy Preview helps automate these workflows and eliminates time spent managing updates.
 
-**Geoscientists** use domain applications for key Exploration and Production workflows such as Seismic interpretation and Well tie analysis. While these users won't directly interact with the DDMS, their expectations for data performance and accessibility will drive requirements for the DDMS in the Foundation Tier. Azure will enable geoscientists to stream cross domain data instantly in OSDU™ compatible applications (for example, Petrel) connected to Microsoft Energy Data Services.
+**Geoscientists** use domain applications for key Exploration and Production workflows such as Seismic interpretation and Well tie analysis. While these users won't directly interact with the DDMS, their expectations for data performance and accessibility will drive requirements for the DDMS in the Foundation Tier. Azure will enable geoscientists to stream cross-domain data instantly in OSDU™ compatible applications (for example, Petrel) connected to Azure Data Manager for Energy Preview.
 
 **Data managers** spend a significant number of time fulfilling requests for data retrieval and delivery. The Seismic, Wellbore, and Petrel Data Services enable them to discover and manage data in one place while tracking version changes as derivatives are created.
 
 ## Platform landscape
 
-Microsoft Energy Data Services is an OSDU™ compatible product, meaning that its landscape and release model are dependent on OSDU™.
+Azure Data Manager for Energy Preview is an OSDU™ compatible product, meaning that its landscape and release model are dependent on OSDU™.
 
-Currently, OSDU™ certification and release process are not fully defined yet and this topic should be defined as a part of the Microsoft Energy Data Services Foundation Architecture.
+Currently, the OSDU™ certification and release processes are not fully defined; this topic should be addressed as part of the Azure Data Manager for Energy Preview Foundation Architecture.
 
-OSDU™ R3 M8 is the base for the scope of the Microsoft Energy Data Services Foundation Private Preview – as a latest stable, tested version of the platform.
+OSDU™ R3 M8, the latest stable, tested version of the platform, is the base for the scope of the Azure Data Manager for Energy Preview Foundation Private Preview.
 
 ## Learn more: OSDU™ DDMS community principles
 
-[OSDU™ community DDMS Overview](https://community.opengroup.org/osdu/documentation/-/wikis/OSDU™-(C)/Design-and-Implementation/Domain-&-Data-Management-Services#ddms-requirements) provides an extensive overview of DDMS motivation and community requirements from a user, technical, and business perspective. These principles are extended to Microsoft Energy Data Services.
+[OSDU™ community DDMS Overview](https://community.opengroup.org/osdu/documentation/-/wikis/OSDU™-(C)/Design-and-Implementation/Domain-&-Data-Management-Services#ddms-requirements) provides an extensive overview of DDMS motivation and community requirements from a user, technical, and business perspective. These principles are extended to Azure Data Manager for Energy Preview.
 
 ## DDMS requirements
 

articles/energy-data-services/concepts-entitlements.md
Lines changed: 4 additions & 4 deletions

@@ -1,11 +1,11 @@
 ---
-title: Microsoft Energy Data Services Preview entitlement concepts #Required; page title is displayed in search results. Include the brand.
-description: This article describes the various concepts regarding the entitlement services in Microsoft Energy Data Services Preview #Required; article description that is displayed in search results.
+title: Microsoft Azure Data Manager for Energy Preview entitlement concepts #Required; page title is displayed in search results. Include the brand.
+description: This article describes the various concepts regarding the entitlement services in Azure Data Manager for Energy Preview #Required; article description that is displayed in search results.
 author: Lakshmisha-KS #Required; your GitHub user alias, with correct capitalization.
 ms.author: lakshmishaks #Required; microsoft alias of author; optional team alias.
 ms.service: energy-data-services #Required; service per approved list. slug assigned by ACOM.
 ms.topic: conceptual #Required; leave this attribute/value as-is.
-ms.date: 08/19/2022
+ms.date: 02/10/2023
 ms.custom: template-concept #Required; leave this attribute/value as-is.
 ---
 
@@ -17,7 +17,7 @@ Access management is a critical function for any service or resource. Entitlemen
 
 ## Groups
 
-The entitlements service of Microsoft Energy Data Services allows you to create groups, and an entitlement group defines permissions on services/data sources for your Microsoft Energy Data Services instance. Users added by you to that group obtain the associated permissions.
+The entitlements service of Azure Data Manager for Energy Preview allows you to create groups; an entitlement group defines permissions on services and data sources for your Azure Data Manager for Energy Preview instance. Users you add to a group obtain the associated permissions.
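The group-and-membership model described above can be illustrated with a small sketch. The email-style group address format shown here is an assumption based on community OSDU entitlements conventions, and the toy class is for illustration only, not the service's data model.

```python
def build_group_email(group_name: str, partition_id: str, domain: str) -> str:
    # OSDU-style entitlement groups are addressed like email addresses,
    # e.g. "{group_name}@{partition}.{domain}" (format assumed here).
    return f"{group_name}@{partition_id}.{domain}"

class EntitlementGroup:
    """Toy model: a group ties a permission set to a list of members."""
    def __init__(self, email: str):
        self.email = email
        self.members: set[str] = set()

    def add_member(self, user_id: str) -> None:
        # Users added to the group obtain the group's associated permissions.
        self.members.add(user_id)
```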
 
 The main motivation for entitlements service is data authorization, but the functionality enables three use cases:
 

articles/energy-data-services/concepts-index-and-search.md
Lines changed: 3 additions & 3 deletions

@@ -1,16 +1,16 @@
 ---
-title: Microsoft Energy Data Services Preview - index and search workflow concepts #Required; page title is displayed in search results. Include the brand.
+title: Microsoft Azure Data Manager for Energy Preview - index and search workflow concepts #Required; page title is displayed in search results. Include the brand.
 description: Learn how to use indexing and search workflows #Required; article description that is displayed in search results.
 author: vivekkalra #Required; your GitHub user alias, with correct capitalization.
 ms.author: vivekkalra #Required; microsoft alias of author; optional team alias.
 ms.service: energy-data-services #Required; service per approved list. slug assigned by ACOM.
 ms.topic: conceptual #Required; leave this attribute/value as-is.
-ms.date: 08/23/2022
+ms.date: 02/10/2023
 ms.custom: template-concept #Required; leave this attribute/value as-is.
 
 #Customer intent: As a developer, I want to understand indexing and search workflows so that I could search for ingested data in the platform.
 ---
-# Microsoft Energy Data Services Preview indexing and search workflows
+# Azure Data Manager for Energy Preview indexing and search workflows
 
 All data and associated metadata ingested into the platform are indexed to enable search. The metadata is accessible to ensure awareness even when the data isn't available.
 
articles/energy-data-services/concepts-manifest-ingestion.md
Lines changed: 4 additions & 4 deletions

@@ -1,5 +1,5 @@
 ---
-title: Microsoft Energy Data Services Preview manifest ingestion concepts #Required; page title is displayed in search results. Include the brand.
+title: Microsoft Azure Data Manager for Energy Preview manifest ingestion concepts #Required; page title is displayed in search results. Include the brand.
 description: This article describes manifest ingestion concepts #Required; article description that is displayed in search results.
 author: bharathim #Required; your GitHub user alias, with correct capitalization.
 ms.author: bselvaraj #Required; microsoft alias of author; optional team alias.

@@ -10,7 +10,7 @@ ms.custom: template-concept #Required; leave this attribute/value as-is.
 ---
 
 # Manifest-based ingestion concepts
-Manifest-based file ingestion provides end-users and systems a robust mechanism for loading metadata about datasets in Microsoft Energy Data Services Preview instance. This metadata is indexed by the system and allows the end-user to search the datasets.
+Manifest-based file ingestion provides end-users and systems a robust mechanism for loading metadata about datasets in an Azure Data Manager for Energy Preview instance. This metadata is indexed by the system and allows the end-user to search the datasets.
 
 Manifest-based file ingestion is an opaque ingestion that do not parse or understand the file contents. It creates a metadata record based on the manifest and makes the record searchable.
 
@@ -41,7 +41,7 @@ Any arrays are ordered. should there be interdependencies, the dependent items m
 
 ## Manifest-based file ingestion workflow
 
-Microsoft Energy Data Services Preview instance has out-of-the-box support for Manifest-based file ingestion workflow. `Osdu_ingest` Airflow DAG is pre-configured in your instance.
+An Azure Data Manager for Energy Preview instance has out-of-the-box support for the Manifest-based file ingestion workflow. The `Osdu_ingest` Airflow DAG is pre-configured in your instance.
 
 ### Manifest-based file ingestion workflow components
 The Manifest-based file ingestion workflow consists of the following components:

@@ -54,7 +54,7 @@ The Manifest-based file ingestion workflow consists of the following components:
 * **Search Service** is used to perform referential integrity check during the manifest ingestion process.
 
 ### Pre-requisites
-Before running the Manifest-based file ingestion workflow, customers must ensure that the user accounts running the workflow have access to the core services (Search, Storage, Schema, Entitlement and Legal) and Workflow service (see [Entitlement roles](https://community.opengroup.org/osdu/platform/deployment-and-operations/infra-azure-provisioning/-/blob/master/docs/osdu-entitlement-roles.md) for details). As part of Microsoft Energy Data Services instance provisioning, the OSDU™ standard schemas and associated reference data are pre-loaded. Customers must ensure that the user account used for ingesting the manifests is included in appropriate owners and viewers ACLs. Customers must ensure that manifests are configured with correct legal tags, owners and viewers ACLs, reference data, etc.
+Before running the Manifest-based file ingestion workflow, customers must ensure that the user accounts running the workflow have access to the core services (Search, Storage, Schema, Entitlement, and Legal) and the Workflow service (see [Entitlement roles](https://community.opengroup.org/osdu/platform/deployment-and-operations/infra-azure-provisioning/-/blob/master/docs/osdu-entitlement-roles.md) for details). As part of Azure Data Manager for Energy Preview instance provisioning, the OSDU™ standard schemas and associated reference data are pre-loaded. Customers must ensure that the user account used for ingesting the manifests is included in the appropriate owners and viewers ACLs, and that manifests are configured with correct legal tags, owners and viewers ACLs, reference data, and so on.
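A manifest record that satisfies these prerequisites carries legal tags and owners/viewers ACLs alongside its payload. The skeleton below (expressed as a Python dict) follows the general shape of an OSDU-style record inside a manifest; the field names are modeled on community OSDU conventions and the values are placeholders, so treat the whole structure as an assumption rather than a schema taken from this article.

```python
# Sketch of a single record as it might appear in a manifest; all values
# are placeholders, and the field names are assumptions (OSDU-style).
manifest_record = {
    "kind": "osdu:wks:master-data--Well:1.0.0",          # placeholder kind
    "legal": {
        "legaltags": ["{partition}-example-legal-tag"],  # must be a valid legal tag
        "otherRelevantDataCountries": ["US"],
    },
    "acl": {
        "owners": ["data.default.owners@{partition}.{domain}"],    # owners ACL
        "viewers": ["data.default.viewers@{partition}.{domain}"],  # viewers ACL
    },
    "data": {},  # entity payload conforming to the registered schema
}
```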
 
 ### Workflow sequence
 The following illustration provides the Manifest-based file ingestion workflow:
