
Commit 6141910

Learn Build Service GitHub App authored and committed
Merging changes synced from https://github.com/MicrosoftDocs/azure-docs-pr (branch live)
2 parents 52f5dbb + a4ffa8e commit 6141910

8 files changed: +60 -215 lines changed


articles/api-center/enable-managed-api-analysis-linting.md

Lines changed: 2 additions & 4 deletions
@@ -3,7 +3,7 @@ title: Managed API linting and analysis - Azure API Center
 description: Enable managed linting of API definitions in your API center to analyze compliance of APIs with the organization's API style guide.
 ms.service: azure-api-center
 ms.topic: how-to
-ms.date: 08/23/2024
+ms.date: 11/01/2024
 ms.author: danlep
 author: dlepow
 ms.custom:
@@ -30,9 +30,7 @@ In this scenario:
 * Currently, only OpenAPI specification documents in JSON or YAML format are analyzed.
 * By default, you enable analysis with the [`spectral:oas` ruleset](https://docs.stoplight.io/docs/spectral/4dec24461f3af-open-api-rules). To learn more about the built-in rules, see the [Spectral GitHub repo](https://github.com/stoplightio/spectral/blob/develop/docs/reference/openapi-rules.md).
 * Currently, you configure a single ruleset, and it's applied to all OpenAPI definitions in your API center.
-* The following are limits for maximum number of API definitions linted per 4 hours:
-    * Free tier: 10
-    * Standard tier: 100
+* There are [limits](../azure-resource-manager/management/azure-subscription-service-limits.md?toc=/azure/api-center/toc.json&bc=/azure/api-center/breadcrumb/toc.json#api-center-limits) for the maximum number of API definitions analyzed. Analysis can take a few minutes to up to 24 hours to complete.
 
 ## Prerequisites
 
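As context for the ruleset wording in the hunk above: managed analysis defaults to `spectral:oas`, and Spectral rulesets can extend it. A minimal sketch of such a ruleset file — the file name and the rule override are illustrative assumptions, not part of this commit:

```yaml
# .spectral.yaml — hypothetical example of a ruleset extending spectral:oas.
# Managed analysis applies a single ruleset to all OpenAPI definitions.
extends: spectral:oas
rules:
  # Raise a built-in rule from warning to error (illustrative override).
  operation-description: error
```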
articles/api-center/includes/api-center-service-limits.md

Lines changed: 9 additions & 7 deletions
@@ -7,7 +7,7 @@ author: dlepow
 
 ms.service: azure-api-center
 ms.topic: include
-ms.date: 10/18/2024
+ms.date: 11/07/2024
 ms.author: danlep
 ms.custom: Include file
 ---
@@ -20,14 +20,16 @@ ms.custom: Include file
 | Maximum number of deployments per API | 10 | 10 |
 | Maximum number of environments | 20 | 20 |
 | Maximum number of workspaces | 1 (Default) | 1 (Default) |
-| Maximum number of custom metadata properties per entity<sup>4</sup> | 10 | 20 |
+| Maximum number of custom metadata properties per entity<sup>3</sup> | 10 | 20 |
 | Maximum number of child properties in custom metadata property of type "object" | 10 |10 |
 | Maximum requests per minute (data plane) | 3,000 | 6,000 |
-| Maximum number of API definitions [linted](../enable-managed-api-analysis-linting.md) per 4 hours | 10 | 100 |
+| Maximum number of APIs accessed through data plane API | 5 | 10,000 |
+| Maximum number of API definitions [analyzed](../enable-managed-api-analysis-linting.md) | 10 | 2,000<sup>4</sup> |
 | Maximum number of linked API sources<sup>5</sup> | 1 | 3 |
+| Maximum number of APIs synchronized from a linked API source | 200 | 2,000<sup>4</sup> |
 
-<sup>1</sup> Free plan provided for 90 days, then service is soft-deleted.<br/>
+<sup>1</sup> Free plan provided for 90 days, then service is soft-deleted. Use of full service features including API analysis and access through the data plane API is limited.<br/>
 <sup>2</sup> To increase a limit in the Standard plan, contact [support](https://azure.microsoft.com/support/options/).<br/>
-<sup>3</sup> In the Free plan, use of full service features including API analysis and access through the data plane API is limited to 5 APIs.<br/>
-<sup>4</sup> Custom metadata properties assigned to APIs, deployments, and environments.<br/>
-<sup>5</sup> Sources such as linked API Management instances. In the Free plan, synchronization from a linked API source is limited to 200 APIs and 5 API definitions.
+<sup>3</sup> Custom metadata properties assigned to APIs, deployments, and environments.<br/>
+<sup>4</sup> Process can take a few minutes to up to 24 hours to complete.<br/>
+<sup>5</sup> Sources such as linked API Management instances.

articles/api-center/synchronize-api-management-apis.md

Lines changed: 1 addition & 1 deletion
@@ -27,8 +27,8 @@ When you link an API Management instance as an API source, the following happens
 API Management APIs automatically synchronize to the API center whenever existing APIs' settings change (for example, new versions are added), new APIs are created, or APIs are deleted. This synchronization is one-way from API Management to your Azure API center, meaning API updates in the API center aren't synchronized back to the API Management instance.
 
 > [!NOTE]
-> * API updates in API Management can take a few minutes to synchronize to your API center.
 > * There are [limits](../azure-resource-manager/management/azure-subscription-service-limits.md?toc=/azure/api-center/toc.json&bc=/azure/api-center/breadcrumb/toc.json#api-center-limits) for the number of linked API Management instances (API sources).
+> * API updates in API Management can take a few minutes to up to 24 hours to synchronize to your API center.
 
 ### Entities synchronized from API Management
 
articles/databox/data-box-deploy-copy-data-via-rest.md

Lines changed: 5 additions & 31 deletions
@@ -15,13 +15,6 @@ ms.author: shaas
 
 # Tutorial: Use REST APIs to Copy data to Azure Data Box Blob storage
 
-> [!IMPORTANT]
-> Azure Data Box now supports access tier assignment at the blob level. The steps contained within this tutorial reflect the updated data copy process and are specific to block blobs.
->
-> For help with determining the appropriate access tier for your block blob data, refer to the [Determine appropriate access tiers for block blobs](#determine-appropriate-access-tiers-for-block-blobs) section. Follow the steps containined within the [Copy data to Data Box](#copy-data-to-data-box) section to copy your data to the appropriate access tier.
->
-> The information contained within this section applies to orders placed after April 1, 2024.
-
 > [!CAUTION]
 > This article references CentOS, a Linux distribution that is End Of Life (EOL) status. Please consider your use and planning accordingly. For more information, see the [CentOS End Of Life guidance](/azure/virtual-machines/workloads/centos/centos-end-of-life).
 
@@ -44,7 +37,7 @@ Before you begin, make sure that:
 3. You review the [system requirements for Data Box Blob storage](data-box-system-requirements-rest.md) and are familiar with supported versions of APIs, SDKs, and tools.
 4. You have access to a host computer that has the data that you want to copy over to Data Box. Your host computer must:
     * Run a [Supported operating system](data-box-system-requirements.md).
-    * Be connected to a high-speed network. We strongly recommend that you have at least one 10-GbE connection. If a 10-GbE connection isn't available, a 1-GbE data link can be used but the copy speeds are impacted.
+    * Be connected to a high-speed network. We strongly recommend that you have at least one 10-GbE connection. You can use a 1-GbE data link if a 10-GbE connection isn't available, though copy speeds are impacted.
 5. [Download AzCopy V10](../storage/common/storage-use-azcopy-v10.md) on your host computer. AzCopy is used to copy data to Azure Data Box Blob storage from your host computer.
 
 ## Connect via http or https
@@ -98,7 +91,7 @@ Use the Azure portal to download certificate.
 
 ### Import certificate
 
-Accessing Data Box Blob storage over HTTPS requires a TLS/SSL certificate for the device. The way in which this certificate is made available to the client application varies from application to application and across operating systems and distributions. Some applications can access the certificate after it's imported into the system's certificate store, while other applications don't make use of that mechanism.
+Accessing Data Box Blob storage over HTTPS requires a TLS/SSL certificate for the device. The way in which this certificate is made available to the client application varies from application to application and across operating systems and distributions. Some applications can access the certificate after importing it into the system's certificate store, while other applications don't make use of that mechanism.
 
 Specific information for some applications is mentioned in this section. For more information on other applications, see the documentation for the application and the operating system used.
 
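Certificate import varies by application and OS, as the paragraph in this hunk notes. As a hedged illustration — the file names and the self-signed stand-in certificate are assumptions, not part of the docs change — a downloaded device certificate can be inspected with OpenSSL before being added to a Linux trust store:

```shell
# Hypothetical demonstration: generate a throwaway self-signed certificate
# standing in for the device certificate downloaded from the Azure portal.
openssl req -x509 -newkey rsa:2048 -nodes -days 1 \
  -subj "/CN=example.microsoftdatabox.com" \
  -keyout /tmp/databox-demo.key -out /tmp/databox-demo.cer 2>/dev/null

# Inspect the subject and expiry before trusting any certificate.
openssl x509 -in /tmp/databox-demo.cer -noout -subject -enddate

# To trust it system-wide on a Debian/Ubuntu host (commented out because it
# modifies the machine's certificate store and requires root):
#   sudo cp /tmp/databox-demo.cer /usr/local/share/ca-certificates/databox-demo.crt
#   sudo update-ca-certificates
```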
@@ -152,27 +145,9 @@ Follow the same steps to [add device IP address and blob service endpoint when c
 
 Follow the steps to [Configure partner software that you used while connecting over *http*](#verify-connection-and-configure-partner-software). The only difference is that you should leave the *Use http option* unchecked.
 
-## Determine appropriate access tiers for block blobs
-
-> [!IMPORTANT]
-> The information contained within this section applies to orders placed after April 1<sup>st</sup>, 2024.
-
-Azure Storage allows you to store block blob data in multiple access tiers within the same storage account. This ability allows data to be organized and stored more efficiently based on how often it's accessed. The following table contains information and recommendations about Azure Storage access tiers.
-
-| Tier | Recommendation | Best practice |
-|---------|----------------|---------------|
-| Hot | Useful for online data accessed or modified frequently. This tier has the highest storage costs, but the lowest access costs. | Data in this tier should be in regular and active use. |
-| Cool | Useful for online data accessed or modified infrequently. This tier has lower storage costs and higher access costs than the hot tier. | Data in this tier should be stored for at least 30 days. |
-| Cold | Useful for online data accessed or modified rarely but still requiring fast retrieval. This tier has lower storage costs and higher access costs than the cool tier.| Data in this tier should be stored for a minimum of 90 days. |
-| Archive | Useful for offline data rarely accessed and having lower latency requirements. | Data in this tier should be stored for a minimum of 180 days. Data removed from the archive tier within 180 days is subject to an early deletion charge. |
-
-For more information about blob access tiers, see [Access tiers for blob data](../storage/blobs/access-tiers-overview.md). For more detailed best practices, see [Best practices for using blob access tiers](../storage/blobs/access-tiers-best-practices.md).
-
-You can transfer your block blob data to the appropriate access tier by copying it to the corresponding folder within Data Box. This process is discussed in greater detail within the [Copy data to Azure Data Box](#copy-data-to-data-box) section.
-
 ## Copy data to Data Box
 
-After connecting to one or more Data Box shares, the next step is to copy data. Before you begin the data copy, consider the following limitations:
+After one or more Data Box shares are connected, the next step is to copy data. Before you initiate data copy operations, consider the following limitations:
 
 * While copying data, ensure that the data size conforms to the size limits described in the [Azure storage and Data Box limits](data-box-limits.md).
 * Simultaneous uploads by Data Box and another non-Data Box application could potentially result in upload job failures and data corruption.
@@ -200,8 +175,8 @@ The first step is to create a container, because blobs are always uploaded into
 
 ![Blob Containers context menu, Create Blob Container](media/data-box-deploy-copy-data-via-rest/create-blob-container-1.png)
 
-4. A text box appears below the **Blob Containers** folder. Enter the name for your blob container. See the [Create the container and set permissions](../storage/blobs/storage-quickstart-blobs-dotnet.md) for information on rules and restrictions on naming blob containers.
-5. Press **Enter** when done to create the blob container, or **Esc** to cancel. After the blob container is successfully created, it's displayed under the **Blob Containers** folder for the selected storage account.
+4. A text box appears below the **Blob Containers** folder. Enter the name for your blob container. See the [Create the container and set permissions](../storage/blobs/storage-quickstart-blobs-dotnet.md) for information on rules and restrictions on naming blob containers.
+5. Press **Enter** when done to create the blob container, or **Esc** to cancel. After successful creation, the blob container is displayed under the selected storage account's **Blob Containers** folder.
 
 ![Blob container created](media/data-box-deploy-copy-data-via-rest/create-blob-container-2.png)
 
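The naming restrictions referenced in step 4 of this hunk can be checked before creating a container. A minimal sketch assuming the standard Azure container-naming rules (3-63 characters; lowercase letters, digits, and hyphens; starts and ends with a letter or digit; no consecutive hyphens); the function name is illustrative:

```shell
# Illustrative validator for blob container names under the assumed rules.
is_valid_container_name() {
  local name="$1"
  # Length must be between 3 and 63 characters.
  [[ ${#name} -ge 3 && ${#name} -le 63 ]] || return 1
  # Lowercase letters, digits, hyphens; alphanumeric first and last character.
  [[ "$name" =~ ^[a-z0-9][a-z0-9-]*[a-z0-9]$ ]] || return 1
  # Consecutive hyphens are not permitted.
  [[ "$name" != *--* ]] || return 1
}

is_valid_container_name "databox-import" && echo "valid"
is_valid_container_name "Invalid_Name" || echo "invalid"
```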
@@ -266,7 +241,6 @@ In this tutorial, you learned about Azure Data Box topics such as:
 >
 > * Prerequisites for copy data to Azure Data Box Blob storage using REST APIs
 > * Connecting to Data Box Blob storage via *http* or *https*
-> * Determining appropriate access tiers for block blobs
 > * Copy data to Data Box
 
 Advance to the next tutorial to learn how to ship your Data Box back to Microsoft.

articles/iot-edge/tutorial-configure-est-server.md

Lines changed: 2 additions & 2 deletions
@@ -3,7 +3,7 @@ title: Tutorial - Configure Enrollment over Secure Transport Server (EST) for Az
 description: This tutorial shows you how to set up an Enrollment over Secure Transport (EST) server for Azure IoT Edge.
 author: PatAltimore
 ms.author: patricka
-ms.date: 06/10/2024
+ms.date: 11/07/2024
 ms.topic: tutorial
 ms.service: azure-iot-edge
 services: iot-edge
@@ -335,5 +335,5 @@ You can keep the resources and configurations that you created in this tutorial
 * To use EST server to issue Edge CA certificates, see [example configuration](https://github.com/Azure/iotedge/blob/main/edgelet/doc/est.md#edge-ca-certificate).
 * Using username and password to bootstrap authentication to EST server isn't recommended for production. Instead, consider using long-lived *bootstrap certificates* that can be stored onto the device during manufacturing [similar to the recommended approach for DPS](../iot-hub/iot-hub-x509ca-concept.md). To see how to configure bootstrap certificate for EST server, see [Authenticate a Device Using Certificates Issued Dynamically via EST](https://github.com/Azure/iotedge/blob/main/edgelet/doc/est.md).
 * EST server can be used to issue certificates for all devices in a hierarchy as well. Depending on if you have ISA-95 requirements, it may be necessary to run a chain of EST servers with one at every layer or use the API proxy module to forward the requests. To learn more, see [Kevin's blog](https://kevinsaye.wordpress.com/2021/07/21/deep-dive-creating-hierarchies-of-azure-iot-edge-devices-isa-95-part-3/).
-* For enterprise grade solutions, consider: [GlobalSign IoT Edge Enroll](https://www.globalsign.com/en/iot-edge-enroll) or [DigiCert IoT Device Manager](https://www.digicert.com/iot/iot-device-manager)
+* For enterprise grade solutions, consider: [GlobalSign IoT Edge Enroll](https://www.globalsign.com/en/iot-edge-enroll), [DigiCert IoT Device Manager](https://www.digicert.com/iot/iot-device-manager), and [Keytos EZCA](https://www.keytos.io/docs/azure-pki/azure-iot-hub/how-to-create-azure-iot-est-certificate-authority/).
 * To learn more about certificates, see [Understand how Azure IoT Edge uses certificates](iot-edge-certs.md).

articles/storsimple/storsimple-overview.md

Lines changed: 3 additions & 3 deletions
@@ -6,7 +6,7 @@ manager: alkohli
 ms.assetid: 7144d218-db21-4495-88fb-e3b24bbe45d1
 ms.service: storsimple
 ms.topic: article
-ms.date: 07/10/2023
+ms.date: 11/07/2024
 ms.author: alkohli
 ROBOTS: NOINDEX
 ---
@@ -21,8 +21,8 @@ The following resources are available to help you migrate backup files or to cop
 |Resource |Description |
 |---------------------------|----------------------------|
 |[Azure StorSimple 8000 Series Copy Utility](https://aka.ms/storsimple-copy-utility) |Microsoft is providing a read-only data copy utility to recover and migrate your backup files from StorSimple cloud snapshots. The StorSimple 8000 Series Copy Utility is designed to run in your environment. You can install and configure the Utility, and then use your Service Encryption Key to authenticate and download your metadata from the cloud.|
-|Azure StorSimple 8000 Series Copy Utility documentation |Instructions for use of the Copy Utility. |
-|StorSimple archived documentation |Archived StorSimple articles from Microsoft technical documentation. |
+|[Azure StorSimple 8000 Series Copy Utility documentation](https://aka.ms/storsimple-copy-utility-docs) |Instructions for use of the Copy Utility. |
+|[StorSimple archived documentation](https://aka.ms/storsimple-archive-docs) |Archived StorSimple articles from Microsoft technical documentation. |
 
 ## Copy data and then decommission your appliance
 
