
Commit 90a8be9

Author: Bharathi Selvaraj (committed)

Replace trademark with registered trademark for OSDU

1 parent bc88783 · commit 90a8be9

20 files changed: +83 -83 lines changed
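The change is a mechanical substitution of the OSDU trademark symbol (the ™ character and the `&trade;` entity) with the registered trademark symbol (® / `&reg;`) across the affected Markdown articles. The sketch below shows one way such a bulk replacement could be scripted; it is illustrative only (the tooling actually used for this commit isn't recorded), and the directory path and the two replacement forms are taken from the diffs in this commit.

```python
# Illustrative sketch, not the tooling used for this commit: replace the OSDU
# trademark symbol with the registered trademark symbol across the docs.
from pathlib import Path

REPLACEMENTS = {
    "OSDU&trade;": "OSDU&reg;",  # HTML-entity form used in several articles
    "OSDU™": "OSDU®",            # literal character form
}

DOCS_DIR = Path("articles/energy-data-services")  # path as shown in this commit's diffs

for md_file in sorted(DOCS_DIR.rglob("*.md")):
    text = md_file.read_text(encoding="utf-8")
    updated = text
    for old, new in REPLACEMENTS.items():
        updated = updated.replace(old, new)
    if updated != text:
        md_file.write_text(updated, encoding="utf-8")
        print(f"updated {md_file}")
```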

articles/energy-data-services/concepts-csv-parser-ingestion.md

Lines changed: 3 additions & 3 deletions
@@ -12,9 +12,9 @@ ms.custom: template-concept
 # CSV parser ingestion concepts
 A CSV (comma-separated values) file is a comma delimited text file that is used to save data in a table structured format.

-A CSV Parser [DAG](https://airflow.apache.org/docs/apache-airflow/1.10.12/concepts.html#dags) allows a customer to load data into Microsoft Azure Data Manager for Energy instance based on a custom schema that is, a schema that doesn't match the [OSDU™](https://osduforum.org) Well Known Schema (WKS). Customers must create and register the custom schema using the Schema service before loading the data.
+A CSV Parser [DAG](https://airflow.apache.org/docs/apache-airflow/1.10.12/concepts.html#dags) allows a customer to load data into Microsoft Azure Data Manager for Energy instance based on a custom schema that is, a schema that doesn't match the [OSDU®](https://osduforum.org) Well Known Schema (WKS). Customers must create and register the custom schema using the Schema service before loading the data.

-A CSV Parser DAG implements an ELT (Extract Load and Transform) approach to data loading, that is, data is first extracted from the source system in a CSV format, and it's loaded into the Azure Data Manager for Energy instance. It could then be transformed to the [OSDU™](https://osduforum.org) Well Known Schema using a mapping service.
+A CSV Parser DAG implements an ELT (Extract Load and Transform) approach to data loading, that is, data is first extracted from the source system in a CSV format, and it's loaded into the Azure Data Manager for Energy instance. It could then be transformed to the [OSDU®](https://osduforum.org) Well Known Schema using a mapping service.


 ## What does CSV ingestion do?
@@ -42,7 +42,7 @@ The below workflow diagram illustrates the CSV Parser DAG workflow:

 To execute the CSV Parser DAG workflow, the user must first create and register the schema using the workflow service. Once the schema is created, the user then uses the File service to upload the CSV file to the Microsoft Azure Data Manager for Energy instances, and also creates the storage record of file generic kind. The file service then provides a file ID to the user, which is used while triggering the CSV Parser workflow using the Workflow service. The Workflow service provides a run ID, which the user could use to track the status of the CSV Parser workflow run.

-OSDU™ is a trademark of The Open Group.
+OSDU® is a trademark of The Open Group.

 ## Next steps
 Advance to the CSV parser tutorial and learn how to perform a CSV parser ingestion

articles/energy-data-services/concepts-ddms.md

Lines changed: 17 additions & 17 deletions
@@ -11,53 +11,53 @@ ms.custom: template-concept

 # Domain data management service concepts

-**Domain Data Management Service (DDMS)** – is a platform component that extends [OSDU™](https://osduforum.org) core data platform with domain specific model and optimizations. DDMS is a mechanism of a platform extension that:
+**Domain Data Management Service (DDMS)** – is a platform component that extends [OSDU®](https://osduforum.org) core data platform with domain specific model and optimizations. DDMS is a mechanism of a platform extension that:

 * delivers optimized handling of data for each (non-overlapping) "domain."
 * pertains to a single vertical discipline or business area, for example, Petrophysics, Geophysics, Seismic
 * serves a functional aspect of one or more vertical disciplines or business areas, for example, Earth Model
-* delivers high performance capabilities not supported by OSDU™ generic normal APIs.
-* helps achieve the extension of OSDU™ scope to new business areas.
+* delivers high performance capabilities not supported by OSDU® generic normal APIs.
+* helps achieve the extension of OSDU® scope to new business areas.
 * may be developed in a distributed manner with separate resources/sponsors.

-OSDU™ Technical Standard defines the following types of OSDU™ application types:
+OSDU® Technical Standard defines the following types of OSDU® application types:

 | Application Type | Description |
 | --------------------------- | ----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
-| OSDU™™ Embedded Applications | An application developed and managed within the OSDU™ Open-Source community that is built on and deployed as part of the OSDU™ Data Platform distribution. |
-| ISV Extension Applications | An application, developed and managed in the marketplace that is NOT part of THE OSDU™ Data Platform distributions, and when selected is deployed within the OSDU™ Data Platform as add-ons |
-| ISV third Party Applications | An application, developed and managed in the marketplace that integrates with the OSDU™ Data Platform, and runs outside the OSDU™ Data Platform |
+| OSDU®™ Embedded Applications | An application developed and managed within the OSDU® Open-Source community that is built on and deployed as part of the OSDU® Data Platform distribution. |
+| ISV Extension Applications | An application, developed and managed in the marketplace that is NOT part of THE OSDU® Data Platform distributions, and when selected is deployed within the OSDU® Data Platform as add-ons |
+| ISV third Party Applications | An application, developed and managed in the marketplace that integrates with the OSDU® Data Platform, and runs outside the OSDU® Data Platform |


 | Characteristics | Embedded | Extension | Third Party |
 | ----------------------------------------- | ---------------------------------- | --------------------------- | --------- |
-| Developed, managed, and deployed by | The OSDU™ Data Platform | ISV | ISV |
+| Developed, managed, and deployed by | The OSDU® Data Platform | ISV | ISV |
 | Software License | Apache 2 | ISV | ISV |
-| Mandatory as part of an OSDU™ distribution | Yes | No | No |
+| Mandatory as part of an OSDU® distribution | Yes | No | No |
 | Replaceable | Yes, with preservation of behavior | Yes | Yes |
-| Architecture Compliance | The OSDU™ Standard | The OSDU™ Standard | ISV |
+| Architecture Compliance | The OSDU® Standard | The OSDU® Standard | ISV |
 | Examples | OS CRS <br /> Wellbore DDMS | ESRI CRS <br /> Petrel DS | Petrel |


 ## Who did we build this for?

 **IT Developers** build systems to connect data to domain applications (internal and external – for example, Petrel) which enables data managers to deliver projects to geoscientists. The DDMS suite on Azure Data Manager for Energy helps automate these workflows and eliminates time spent managing updates.

-**Geoscientists** use domain applications for key Exploration and Production workflows such as Seismic interpretation and Well tie analysis. While these users won't directly interact with the DDMS, their expectations for data performance and accessibility will drive requirements for the DDMS in the Foundation Tier. Azure will enable geoscientists to stream cross domain data instantly in OSDU&trade; compatible applications (for example, Petrel) connected to Azure Data Manager for Energy.
+**Geoscientists** use domain applications for key Exploration and Production workflows such as Seismic interpretation and Well tie analysis. While these users won't directly interact with the DDMS, their expectations for data performance and accessibility will drive requirements for the DDMS in the Foundation Tier. Azure will enable geoscientists to stream cross domain data instantly in OSDU&reg; compatible applications (for example, Petrel) connected to Azure Data Manager for Energy.

 **Data managers** spend a significant number of time fulfilling requests for data retrieval and delivery. The Seismic, Wellbore, and Petrel Data Services enable them to discover and manage data in one place while tracking version changes as derivatives are created.

 ## Platform landscape

-Azure Data Manager for Energy is an OSDU&trade; compatible product, meaning that its landscape and release model are dependent on OSDU&trade;.
+Azure Data Manager for Energy is an OSDU&reg; compatible product, meaning that its landscape and release model are dependent on OSDU&reg;.

-Currently, OSDU&trade; certification and release process are not fully defined yet and this topic should be defined as a part of the Azure Data Manager for Energy Foundation Architecture.
+Currently, OSDU&reg; certification and release process are not fully defined yet and this topic should be defined as a part of the Azure Data Manager for Energy Foundation Architecture.

-OSDU&trade; R3 M8 is the base for the scope of the Azure Data Manager for Energy Foundation Private – as a latest stable, tested version of the platform.
+OSDU&reg; R3 M8 is the base for the scope of the Azure Data Manager for Energy Foundation Private – as a latest stable, tested version of the platform.

-## Learn more: OSDU&trade; DDMS community principles
+## Learn more: OSDU&reg; DDMS community principles

-[OSDU&trade; community DDMS Overview](https://community.opengroup.org/osdu/documentation/-/wikis/OSDU&trade;-(C)/Design-and-Implementation/Domain-&-Data-Management-Services#ddms-requirements) provides an extensive overview of DDMS motivation and community requirements from a user, technical, and business perspective. These principles are extended to Azure Data Manager for Energy.
+[OSDU&reg; community DDMS Overview](https://community.opengroup.org/osdu/documentation/-/wikis/OSDU&reg;-(C)/Design-and-Implementation/Domain-&-Data-Management-Services#ddms-requirements) provides an extensive overview of DDMS motivation and community requirements from a user, technical, and business perspective. These principles are extended to Azure Data Manager for Energy.

 ## DDMS requirements

@@ -85,7 +85,7 @@ A DDMS meets the following requirements, further classified into capability, arc
 | 18 | Workflow composability and customizations | | Openness and Extensibility |
 | 19 | Data-Centric Extensibility | | Openness and Extensibility |

-OSDU&trade; is a trademark of The Open Group.
+OSDU&reg; is a trademark of The Open Group.

 ## Next steps
 Advance to the seismic DDMS sdutil tutorial to learn how to use sdutil to load seismic data into seismic store.

articles/energy-data-services/concepts-entitlements.md

Lines changed: 1 addition & 1 deletion
@@ -91,7 +91,7 @@ For a full list of Entitlement API endpoints, see [OSDU entitlement service](htt
 > [!NOTE]
 > The OSDU documentation refers to v1 endpoints, but the scripts noted in this documentation refer to v2 endpoints, which work and have been successfully validated.

-OSDU&trade; is a trademark of The Open Group.
+OSDU&reg; is a trademark of The Open Group.

 ## Next steps

articles/energy-data-services/concepts-index-and-search.md

Lines changed: 4 additions & 4 deletions
@@ -35,22 +35,22 @@ When the *recordChangedMessages* event is received by the `Indexer Service`, it

 :::image type="content" source="media/concepts-index-and-search/concept-indexer-sequence.png" alt-text="Diagram that shows Indexing sequence flow.":::

-For more information, see [Indexer service OSDU&trade; documentation](https://community.opengroup.org/osdu/platform/system/indexer-service/-/blob/release/0.15/docs/tutorial/IndexerService.md) provides information on indexer service
+For more information, see [Indexer service OSDU&reg; documentation](https://community.opengroup.org/osdu/platform/system/indexer-service/-/blob/release/0.15/docs/tutorial/IndexerService.md) provides information on indexer service

 ## Search workflow

 `Search service` provides a mechanism for discovering indexed metadata documents. The Search API supports full-text search on string fields, range queries on date, numeric, or string field, etc. along with geo-spatial searches.

 When metadata records are loaded onto the Platform using `Storage service`, we can configure permissions for viewers and owners of the metadata records under the *acl* field. The viewers and owners are assigned via groups as defined in the `Entitlement service`. When performing a search as a user, the matched metadata records will only show up for users who are assigned to the Group.

-For a detailed tutorial on `Search service`, refer [Search service OSDU&trade; documentation](https://community.opengroup.org/osdu/platform/system/search-service/-/blob/release/0.15/docs/tutorial/SearchService.md)
+For a detailed tutorial on `Search service`, refer [Search service OSDU&reg; documentation](https://community.opengroup.org/osdu/platform/system/search-service/-/blob/release/0.15/docs/tutorial/SearchService.md)


 ## Reindex workflow
 Reindex API allows users to reindex a kind without reingesting the records via storage API. For detailed information, refer to
-[Reindex OSDU&trade; documentation](https://community.opengroup.org/osdu/platform/system/indexer-service/-/blob/release/0.15/docs/tutorial/IndexerService.md#reindex)
+[Reindex OSDU&reg; documentation](https://community.opengroup.org/osdu/platform/system/indexer-service/-/blob/release/0.15/docs/tutorial/IndexerService.md#reindex)

-OSDU&trade; is a trademark of The Open Group.
+OSDU&reg; is a trademark of The Open Group.

 ## Next steps
 <!-- Add a context sentence for the following links -->

articles/energy-data-services/concepts-manifest-ingestion.md

Lines changed: 5 additions & 5 deletions
@@ -20,7 +20,7 @@ A manifest is a JSON document that has a pre-determined structure for capturing

 You can find an example manifest json document [here](https://community.opengroup.org/osdu/data/data-definitions/-/tree/master/Examples/manifest#manifest-example).

-The manifest schema has containers for the following OSDU&trade; [Group types](https://community.opengroup.org/osdu/data/data-definitions/-/blob/master/Guides/Chapters/02-GroupType.md#2-group-type):
+The manifest schema has containers for the following OSDU&reg; [Group types](https://community.opengroup.org/osdu/data/data-definitions/-/blob/master/Guides/Chapters/02-GroupType.md#2-group-type):

 * **ReferenceData** (*zero or more*) - A set of permissible values to be used by other (master or transaction) data fields. Examples include *Unit of Measure (feet)*, *Currency*, etc.
 * **MasterData** (*zero or more*) - A single source of basic business data used across multiple systems, applications, and/or process. Examples include *Wells* and *Wellbores*
@@ -45,15 +45,15 @@ Azure Data Manager for Energy instance has out-of-the-box support for Manifest-b
 ### Manifest-based file ingestion workflow components
 The Manifest-based file ingestion workflow consists of the following components:
 * **Workflow Service** - A wrapper service running on top of the Airflow workflow engine.
-* **Airflow engine** - A workflow orchestration engine that executes workflows registered as DAGs (Directed Acyclic Graphs). Airflow is the chosen workflow engine by the [OSDU&trade;](https://osduforum.org/) community to orchestrate and run ingestion workflows. Airflow isn't directly exposed, instead its features are accessed through the workflow service.
+* **Airflow engine** - A workflow orchestration engine that executes workflows registered as DAGs (Directed Acyclic Graphs). Airflow is the chosen workflow engine by the [OSDU&reg;](https://osduforum.org/) community to orchestrate and run ingestion workflows. Airflow isn't directly exposed, instead its features are accessed through the workflow service.
 * **Storage Service** - A service that is used to save the manifest metadata records into the data platform.
-* **Schema Service** - A service that manages OSDU&trade; defined schemas in the data platform. Schemas are being referenced during the Manifest-based file ingestion.
+* **Schema Service** - A service that manages OSDU&reg; defined schemas in the data platform. Schemas are being referenced during the Manifest-based file ingestion.
 * **Entitlements Service** - A service that manages access groups. This service is used during the ingestion for verification of ingestion permissions. This service is also used during the metadata record retrieval for validation of "read" writes.
 * **Legal Service** - A service that validates compliance through legal tags.
 * **Search Service** is used to perform referential integrity check during the manifest ingestion process.

 ### Pre-requisites
-Before running the Manifest-based file ingestion workflow, customers must ensure that the user accounts running the workflow have access to the core services (Search, Storage, Schema, Entitlement and Legal) and Workflow service (see [Entitlement roles](https://community.opengroup.org/osdu/platform/deployment-and-operations/infra-azure-provisioning/-/blob/master/docs/osdu-entitlement-roles.md) for details). As part of Azure Data Manager for Energy instance provisioning, the OSDU&trade; standard schemas and associated reference data are pre-loaded. Customers must ensure that the user account used for ingesting the manifests is included in appropriate owners and viewers ACLs. Customers must ensure that manifests are configured with correct legal tags, owners and viewers ACLs, reference data, etc.
+Before running the Manifest-based file ingestion workflow, customers must ensure that the user accounts running the workflow have access to the core services (Search, Storage, Schema, Entitlement and Legal) and Workflow service (see [Entitlement roles](https://community.opengroup.org/osdu/platform/deployment-and-operations/infra-azure-provisioning/-/blob/master/docs/osdu-entitlement-roles.md) for details). As part of Azure Data Manager for Energy instance provisioning, the OSDU&reg; standard schemas and associated reference data are pre-loaded. Customers must ensure that the user account used for ingesting the manifests is included in appropriate owners and viewers ACLs. Customers must ensure that manifests are configured with correct legal tags, owners and viewers ACLs, reference data, etc.

 ### Workflow sequence
 The following illustration provides the Manifest-based file ingestion workflow:
@@ -65,7 +65,7 @@ The workflow service executes a series of manifest `syntax validation` like mani

 Once the validations are successful, the system processes the content into storage by writing each valid entity into the data platform using the Storage Service API.

-OSDU&trade; is a trademark of The Open Group.
+OSDU&reg; is a trademark of The Open Group.

 ## Next steps
 - [Tutorial: Sample steps to perform a manifest-based file ingestion](tutorial-manifest-ingestion.md)
