
Commit e4e9f84

Merge pull request #265929 from ChaseCrum/CEOL-9
EOL message insert
2 parents abf22bb + 5790327 commit e4e9f84

9 files changed: +140 -110 lines changed

articles/azure-netapp-files/nfs-access-control-lists.md

Lines changed: 3 additions & 0 deletions
@@ -11,6 +11,9 @@ ms.author: anfdocs
 
 # Understand NFSv4.x access control lists in Azure NetApp Files
 
+> [!CAUTION]
+> This article references CentOS, a Linux distribution that is nearing End Of Life (EOL) status. Please consider your use and planning accordingly.
+
 The NFSv4.x protocol can provide access control in the form of [access control lists (ACLs)](/windows/win32/secauthz/access-control-lists), which are conceptually similar to ACLs used in [SMB via Windows NTFS permissions](network-attached-file-permissions-smb.md). An NFSv4.x ACL consists of individual [Access Control Entries (ACEs)](/windows/win32/secauthz/access-control-entries), each of which provides an access control directive to the server.
 
 :::image type="content" source="./media/nfs-access-control-lists/access-control-entity-to-client-diagram.png" alt-text="Diagram of access control entity to Azure NetApp Files." lightbox="./media/nfs-access-control-lists/access-control-entity-to-client-diagram.png":::
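
For context on the ACE model referenced above, a minimal, hypothetical sketch of reading and extending an NFSv4.x ACL from a Linux client. It assumes the `nfs4-acl-tools` utilities (`nfs4_getfacl`, `nfs4_setfacl`) are installed and that the volume is mounted with NFSv4.x; the mount path and principal are placeholders, not values from this commit.

```python
# Hypothetical sketch: inspect and extend an NFSv4.x ACL from a Linux NFS client.
# Assumes the nfs4-acl-tools utilities (nfs4_getfacl, nfs4_setfacl) are installed
# and the volume is mounted with NFSv4.x at MOUNT_PATH (placeholder values).
import subprocess

MOUNT_PATH = "/mnt/anf-volume/file.txt"   # placeholder path
NEW_ACE = "A::user1@contoso.com:rwx"      # Allow ACE in type:flags:principal:permissions form

def show_acl(path: str) -> str:
    """Return the current ACL (one ACE per line) for a file on an NFSv4.x mount."""
    result = subprocess.run(["nfs4_getfacl", path], capture_output=True, text=True, check=True)
    return result.stdout

def add_ace(path: str, ace: str) -> None:
    """Append an Access Control Entry to the file's ACL."""
    subprocess.run(["nfs4_setfacl", "-a", ace, path], check=True)

if __name__ == "__main__":
    print("Before:\n", show_acl(MOUNT_PATH))
    add_ace(MOUNT_PATH, NEW_ACE)
    print("After:\n", show_acl(MOUNT_PATH))
```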

articles/cloud-services/cloud-services-python-how-to-use-service-management.md

Lines changed: 3 additions & 0 deletions
@@ -12,6 +12,9 @@ ms.custom: compute-evergreen, devx-track-python
 
 # Use service management from Python
 
+> [!CAUTION]
+> This article references CentOS, a Linux distribution that is nearing End Of Life (EOL) status. Please consider your use and planning accordingly.
+
 [!INCLUDE [Cloud Services (classic) deprecation announcement](includes/deprecation-announcement.md)]
 
 This guide shows you how to programmatically perform common service management tasks from Python. The **ServiceManagementService** class in the [Azure SDK for Python](https://github.com/Azure/azure-sdk-for-python) supports programmatic access to much of the service management-related functionality that is available in the [Azure portal]. You can use this functionality to create, update, and delete cloud services, deployments, data management services, and virtual machines. This functionality can be useful in building applications that need programmatic access to service management.
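
A minimal sketch of the usage this paragraph describes, assuming the legacy `azure-servicemanagement-legacy` package and a management certificate uploaded to the subscription; the subscription ID and certificate path below are placeholders.

```python
# Minimal sketch: list cloud services with the legacy Service Management API.
# Assumes `pip install azure-servicemanagement-legacy` and a management
# certificate (.pem) registered with the subscription; IDs and paths are placeholders.
from azure.servicemanagement import ServiceManagementService

subscription_id = "00000000-0000-0000-0000-000000000000"   # placeholder
certificate_path = "/home/user/azure-management.pem"        # placeholder

sms = ServiceManagementService(subscription_id, certificate_path)

# Enumerate existing cloud services (hosted services) in the subscription.
for service in sms.list_hosted_services():
    print(service.service_name, service.hosted_service_properties.location)
```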

articles/databox/data-box-deploy-copy-data-via-rest.md

Lines changed: 8 additions & 4 deletions
@@ -11,9 +11,13 @@ ms.topic: tutorial
 ms.date: 12/29/2022
 ms.author: shaas
 #Customer intent: As an IT admin, I need to be able to copy data to Data Box to upload on-premises data from my server onto Azure.
+
 ---
 
-# Tutorial: Use REST APIs to Copy data to Azure Data Box Blob storage
+# Tutorial: Use REST APIs to Copy data to Azure Data Box Blob storage
+
+> [!CAUTION]
+> This article references CentOS, a Linux distribution that is nearing End Of Life (EOL) status. Please consider your use and planning accordingly.
 
 This tutorial describes procedures to connect to Azure Data Box Blob storage via REST APIs over *http* or *https*. Once connected, the steps required to copy the data to Data Box Blob storage and prepare the Data Box to ship, are also described.

@@ -69,7 +73,7 @@ Each of these steps is described in the following sections.
 
 Connection to Azure Blob storage REST APIs over https requires the following steps:
 
-* Download the certificate from Azure portal. This certificate is used for connecting to the web UI and Azure Blob storage REST APIs.
+* Download the certificate from Azure portal. This certificate is used for connecting to the web UI and Azure Blob storage REST APIs.
 * Import the certificate on the client or remote host
 * Add the device IP and blob service endpoint to the client or remote host
 * Configure third-party software and verify the connection
@@ -122,7 +126,7 @@ Follow these steps to import the `.cer` file into the root store of a Windows or
 
 The method to import a certificate varies by distribution.
 
-Several, such as Ubuntu and Debian, use the `update-ca-certificates` command.
+Several, such as Ubuntu and Debian, use the `update-ca-certificates` command.
 
 * Rename the Base64-encoded certificate file to have a `.crt` extension and copy it into the `/usr/local/share/ca-certificates` directory.
 * Run the command `update-ca-certificates`.
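
The two bullets above can be scripted; a minimal sketch assuming root privileges, the Base64 `.cer` file downloaded from the portal, and the standard `update-ca-certificates` layout on Ubuntu/Debian. File names are placeholders.

```python
# Sketch: automate the Ubuntu/Debian trust-store import described above.
# Assumes root privileges, a Base64 (.cer) certificate downloaded from the
# Azure portal, and the update-ca-certificates tool; paths are placeholders.
import shutil
import subprocess

SOURCE_CERT = "/tmp/mydatabox-order.cer"                     # placeholder
TARGET_CRT = "/usr/local/share/ca-certificates/databox.crt"  # .crt extension required

# Copy the certificate into the directory scanned by update-ca-certificates.
shutil.copyfile(SOURCE_CERT, TARGET_CRT)

# Rebuild the system certificate store so the Data Box certificate is trusted.
subprocess.run(["update-ca-certificates"], check=True)
```
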
@@ -134,7 +138,7 @@ Recent versions of RHEL, Fedora, and CentOS use the `update-ca-trust` command.
 
 Consult the documentation specific to your distribution for details.
 
-### Add device IP address and blob service endpoint
+### Add device IP address and blob service endpoint
 
 Follow the same steps to [add device IP address and blob service endpoint when connecting over *http*](#add-device-ip-address-and-blob-service-endpoint).

articles/databox/data-box-disk-deploy-set-up.md

Lines changed: 104 additions & 100 deletions
Large diffs are not rendered by default.

articles/databox/data-box-disk-system-requirements.md

Lines changed: 3 additions & 0 deletions
@@ -15,6 +15,9 @@ ms.author: shaas
 
 # Azure Data Box Disk system requirements
 
+> [!CAUTION]
+> This article references CentOS, a Linux distribution that is nearing End Of Life (EOL) status. Please consider your use and planning accordingly.
+
 This article describes the important system requirements for your Microsoft Azure Data Box Disk solution and for the clients connecting to the Data Box Disk. We recommend that you review the information carefully before you deploy your Data Box Disk, and then refer back to it as necessary during the deployment and subsequent operation.
 
 The system requirements include the supported platforms for clients connecting to disks, supported storage accounts, and storage types.

articles/databox/data-box-heavy-deploy-copy-data-via-rest.md

Lines changed: 10 additions & 6 deletions
@@ -10,9 +10,13 @@ ms.topic: tutorial
 ms.date: 07/03/2019
 ms.author: shaas
 #Customer intent: As an IT admin, I need to be able to copy data to Data Box Heavy to upload on-premises data from my server onto Azure.
+
 ---
 
-# Tutorial: Copy data to Azure Data Box Blob storage via REST APIs
+# Tutorial: Copy data to Azure Data Box Blob storage via REST APIs
+
+> [!CAUTION]
+> This article references CentOS, a Linux distribution that is nearing End Of Life (EOL) status. Please consider your use and planning accordingly.
 
 This tutorial describes procedures to connect to Azure Data Box Blob storage via REST APIs over *http* or *https*. Once connected, the steps required to copy the data to Data Box Blob storage are described.

@@ -32,7 +36,7 @@ Before you begin, make sure that:
 3. You've reviewed the [system requirements for Data Box Blob storage](data-box-system-requirements-rest.md) and are familiar with supported versions of APIs, SDKs, and tools.
 4. You've access to a host computer that has the data that you want to copy over to Data Box Heavy. Your host computer must
 - Run a [Supported operating system](data-box-system-requirements.md).
-- Be connected to a high-speed network. For fastest copy speeds, two 40-GbE connections (one per node) can be utilized in parallel. If you do not have 40-GbE connection available, we recommend that you have at least two 10-GbE connections (one per node).
+- Be connected to a high-speed network. For fastest copy speeds, two 40-GbE connections (one per node) can be utilized in parallel. If you do not have 40-GbE connection available, we recommend that you have at least two 10-GbE connections (one per node).
 5. [Download AzCopy 7.1.0](https://aka.ms/azcopyforazurestack20170417) on your host computer. You'll use AzCopy to copy data to Azure Data Box Blob storage from your host computer.

@@ -92,8 +96,8 @@ Use the Azure portal to download certificate.
 3. Under **Device credentials**, go to **API access** to device. Click **Download**. This action downloads a **\<your order name>.cer** certificate file. **Save** this file. You will install this certificate on the client or host computer that you will use to connect to the device.
 
 ![Download certificate in Azure portal](media/data-box-deploy-copy-data-via-rest/download-cert-1.png)
-
-### Import certificate
+
+### Import certificate
 
 Accessing Data Box Blob storage over HTTPS requires a TLS/SSL certificate for the device. The way in which this certificate is made available to the client application varies from application to application and across operating systems and distributions. Some applications can access the certificate after it is imported into the system’s certificate store, while other applications do not make use of that mechanism.

@@ -132,7 +136,7 @@ The method to import a certificate varies by distribution.
 > [!IMPORTANT]
 > For Data Box Heavy, you'll need to repeat all the connection instructions to connect to the second node.
 
-Several, such as Ubuntu and Debian, use the `update-ca-certificates` command.
+Several, such as Ubuntu and Debian, use the `update-ca-certificates` command.
 
 - Rename the Base64-encoded certificate file to have a `.crt` extension and copy it into the `/usr/local/share/ca-certificates` directory.
 - Run the command `update-ca-certificates`.
@@ -144,7 +148,7 @@ Recent versions of RHEL, Fedora, and CentOS use the `update-ca-trust` command.
 
 Consult the documentation specific to your distribution for details.
 
-### Add device IP address and blob service endpoint
+### Add device IP address and blob service endpoint
 
 Follow the same steps to [add device IP address and blob service endpoint when connecting over *http*](#add-device-ip-address-and-blob-service-endpoint).
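
The linked *http* section amounts to adding a hosts-file entry on the client; a hypothetical sketch follows. The device IP and blob service endpoint shown are placeholders, not values from this commit, and for Data Box Heavy the step is repeated for the second node.

```python
# Hypothetical sketch: map the Data Box device IP to the blob service endpoint
# in the client's hosts file, as the linked *http* section describes.
# Both values are placeholders; substitute the ones shown for your own device/order.
DEVICE_IP = "10.126.76.138"                                            # placeholder
BLOB_ENDPOINT = "mystorageaccount.blob.mydataboxno.microdatabox.com"   # placeholder

entry = f"{DEVICE_IP}\t{BLOB_ENDPOINT}\n"

# Appending requires root privileges; repeat with the second node's IP for Data Box Heavy.
with open("/etc/hosts", "a", encoding="utf-8") as hosts_file:
    hosts_file.write(entry)

print(f"Added hosts entry: {entry.strip()}")
```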

articles/governance/machine-configuration/overview.md

Lines changed: 3 additions & 0 deletions
@@ -6,6 +6,9 @@ ms.topic: conceptual
 ---
 # Understanding Azure Machine Configuration
 
+> [!CAUTION]
+> This article references CentOS, a Linux distribution that is nearing End Of Life (EOL) status. Please consider your use and planning accordingly.
+
 Azure Policy's machine configuration feature provides native capability to audit or configure
 operating system settings as code for machines running in Azure and hybrid
 [Arc-enabled machines][01]. You can use the feature directly per-machine, or orchestrate it at

articles/governance/policy/samples/guest-configuration-baseline-docker.md

Lines changed: 3 additions & 0 deletions
@@ -7,6 +7,9 @@ ms.custom: generated
 ---
 # Docker security baseline
 
+> [!CAUTION]
+> This article references CentOS, a Linux distribution that is nearing End Of Life (EOL) status. Please consider your use and planning accordingly.
+
 This article details the configuration settings for Docker hosts as applicable in the following
 implementations:

articles/governance/policy/samples/guest-configuration-baseline-linux.md

Lines changed: 3 additions & 0 deletions
@@ -7,6 +7,9 @@ ms.custom: generated
 ---
 # Linux security baseline
 
+> [!CAUTION]
+> This article references CentOS, a Linux distribution that is nearing End Of Life (EOL) status. Please consider your use and planning accordingly.
+
 This article details the configuration settings for Linux guests as applicable in the following
 implementations:

0 commit comments
