**articles/api-center/enable-managed-api-analysis-linting.md** (2 additions, 4 deletions)

```diff
@@ -3,7 +3,7 @@ title: Managed API linting and analysis - Azure API Center
 description: Enable managed linting of API definitions in your API center to analyze compliance of APIs with the organization's API style guide.
 ms.service: azure-api-center
 ms.topic: how-to
-ms.date: 08/23/2024
+ms.date: 11/01/2024
 ms.author: danlep
 author: dlepow
 ms.custom:
```

```diff
@@ -30,9 +30,7 @@ In this scenario:
 * Currently, only OpenAPI specification documents in JSON or YAML format are analyzed.
 * By default, you enable analysis with the [`spectral:oas` ruleset](https://docs.stoplight.io/docs/spectral/4dec24461f3af-open-api-rules). To learn more about the built-in rules, see the [Spectral GitHub repo](https://github.com/stoplightio/spectral/blob/develop/docs/reference/openapi-rules.md).
 * Currently, you configure a single ruleset, and it's applied to all OpenAPI definitions in your API center.
-* The following are limits for maximum number of API definitions linted per 4 hours:
-    * Free tier: 10
-    * Standard tier: 100
+* There are [limits](../azure-resource-manager/management/azure-subscription-service-limits.md?toc=/azure/api-center/toc.json&bc=/azure/api-center/breadcrumb/toc.json#api-center-limits) for the maximum number of API definitions analyzed. Analysis can take a few minutes to up to 24 hours to complete.
```
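The managed analysis above is built on Spectral. As a hedged illustration (not part of the diff), a rough local approximation might write a minimal ruleset extending the built-in `spectral:oas` ruleset mentioned in the hunk, then run the Spectral CLI against a definition. The file names and the presence of the CLI and of `openapi.json` are assumptions.

```shell
# Sketch only: mirror the managed spectral:oas analysis locally.
# Assumes the Spectral CLI (@stoplight/spectral-cli) and a definition
# named openapi.json may exist; both are guarded below.
cat > .spectral.yaml <<'EOF'
extends: spectral:oas
EOF

# Lint only when both the CLI and a definition are actually present.
if command -v spectral >/dev/null 2>&1 && [ -f openapi.json ]; then
  spectral lint openapi.json --ruleset .spectral.yaml
fi
```

The guard keeps the sketch a no-op on hosts without the CLI; the managed service applies the same ruleset semantics server-side.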
**articles/api-center/includes/api-center-service-limits.md** (9 additions, 7 deletions)

```diff
@@ -7,7 +7,7 @@ author: dlepow
 
 ms.service: azure-api-center
 ms.topic: include
-ms.date: 10/18/2024
+ms.date: 11/07/2024
 ms.author: danlep
 ms.custom: Include file
 ---
```

```diff
@@ -20,14 +20,16 @@ ms.custom: Include file
 | Maximum number of deployments per API | 10 | 10 |
 | Maximum number of environments | 20 | 20 |
 | Maximum number of workspaces | 1 (Default) | 1 (Default) |
-| Maximum number of custom metadata properties per entity<sup>4</sup> | 10 | 20 |
+| Maximum number of custom metadata properties per entity<sup>3</sup> | 10 | 20 |
 | Maximum number of child properties in custom metadata property of type "object" | 10 |10 |
 | Maximum requests per minute (data plane) | 3,000 | 6,000 |
-| Maximum number of API definitions [linted](../enable-managed-api-analysis-linting.md) per 4 hours | 10 | 100 |
+| Maximum number of APIs accessed through data plane API | 5 | 10,000 |
+| Maximum number of API definitions [analyzed](../enable-managed-api-analysis-linting.md)| 10 | 2,000<sup>4</sup> |
 | Maximum number of linked API sources<sup>5</sup> | 1 | 3 |
+| Maximum number of APIs synchronized from a linked API source | 200 | 2,000<sup>4</sup> |
 
-<sup>1</sup> Free plan provided for 90 days, then service is soft-deleted.<br/>
+<sup>1</sup> Free plan provided for 90 days, then service is soft-deleted. Use of full service features including API analysis and access through the data plane API is limited.<br/>
 <sup>2</sup> To increase a limit in the Standard plan, contact [support](https://azure.microsoft.com/support/options/).<br/>
-<sup>3</sup> In the Free plan, use of full service features including API analysis and access through the data plane API is limited to 5 APIs.<br/>
-<sup>4</sup> Custom metadata properties assigned to APIs, deployments, and environments.<br/>
-<sup>5</sup> Sources such as linked API Management instances. In the Free plan, synchronization from a linked API source is limited to 200 APIs and 5 API definitions.
+<sup>3</sup> Custom metadata properties assigned to APIs, deployments, and environments.<br/>
+<sup>4</sup> Process can take a few minutes to up to 24 hours to complete.<br/>
+<sup>5</sup> Sources such as linked API Management instances.
```
**articles/api-center/synchronize-api-management-apis.md** (1 addition, 1 deletion)

```diff
@@ -27,8 +27,8 @@ When you link an API Management instance as an API source, the following happens
 API Management APIs automatically synchronize to the API center whenever existing APIs' settings change (for example, new versions are added), new APIs are created, or APIs are deleted. This synchronization is one-way from API Management to your Azure API center, meaning API updates in the API center aren't synchronized back to the API Management instance.
 
 > [!NOTE]
-> * API updates in API Management can take a few minutes to synchronize to your API center.
 > * There are [limits](../azure-resource-manager/management/azure-subscription-service-limits.md?toc=/azure/api-center/toc.json&bc=/azure/api-center/breadcrumb/toc.json#api-center-limits) for the number of linked API Management instances (API sources).
+> * API updates in API Management can take a few minutes to up to 24 hours to synchronize to your API center.
```
**articles/databox/data-box-deploy-copy-data-via-rest.md** (5 additions, 31 deletions)

```diff
@@ -15,13 +15,6 @@ ms.author: shaas
 
 # Tutorial: Use REST APIs to Copy data to Azure Data Box Blob storage
 
-> [!IMPORTANT]
-> Azure Data Box now supports access tier assignment at the blob level. The steps contained within this tutorial reflect the updated data copy process and are specific to block blobs.
->
-> For help with determining the appropriate access tier for your block blob data, refer to the [Determine appropriate access tiers for block blobs](#determine-appropriate-access-tiers-for-block-blobs) section. Follow the steps contained within the [Copy data to Data Box](#copy-data-to-data-box) section to copy your data to the appropriate access tier.
->
-> The information contained within this section applies to orders placed after April 1, 2024.
-
 > [!CAUTION]
 > This article references CentOS, a Linux distribution that is End Of Life (EOL) status. Please consider your use and planning accordingly. For more information, see the [CentOS End Of Life guidance](/azure/virtual-machines/workloads/centos/centos-end-of-life).
```

```diff
@@ -44,7 +37,7 @@ Before you begin, make sure that:
 3. You review the [system requirements for Data Box Blob storage](data-box-system-requirements-rest.md) and are familiar with supported versions of APIs, SDKs, and tools.
 4. You have access to a host computer that has the data that you want to copy over to Data Box. Your host computer must:
     * Run a [Supported operating system](data-box-system-requirements.md).
-    * Be connected to a high-speed network. We strongly recommend that you have at least one 10-GbE connection. If a 10-GbE connection isn't available, a 1-GbE data link can be used but the copy speeds are impacted.
+    * Be connected to a high-speed network. We strongly recommend that you have at least one 10-GbE connection. You can use a 1-GbE data link if a 10-GbE connection isn't available, though copy speeds are impacted.
 5. [Download AzCopy V10](../storage/common/storage-use-azcopy-v10.md) on your host computer. AzCopy is used to copy data to Azure Data Box Blob storage from your host computer.
 
 ## Connect via http or https
```

```diff
@@ -98,7 +91,7 @@ Use the Azure portal to download certificate.
 
 ### Import certificate
 
-Accessing Data Box Blob storage over HTTPS requires a TLS/SSL certificate for the device. The way in which this certificate is made available to the client application varies from application to application and across operating systems and distributions. Some applications can access the certificate after it's imported into the system's certificate store, while other applications don't make use of that mechanism.
+Accessing Data Box Blob storage over HTTPS requires a TLS/SSL certificate for the device. The way in which this certificate is made available to the client application varies from application to application and across operating systems and distributions. Some applications can access the certificate after importing it into the system's certificate store, while other applications don't make use of that mechanism.
 
 Specific information for some applications is mentioned in this section. For more information on other applications, see the documentation for the application and the operating system used.
```
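For the certificate-import step in the hunk above, one common approach on Debian-based Linux hosts is sketched below; the file name `databox.cer`, its DER encoding, and the destination paths are assumptions, and the exact trust-store mechanism varies by distribution and application.

```shell
# Sketch, assuming the portal-downloaded device certificate was saved as
# databox.cer in DER form. Converts it to PEM and adds it to the system
# trust store on Debian/Ubuntu; guarded so it's a no-op without the file.
if [ -f databox.cer ]; then
  openssl x509 -inform DER -in databox.cer -out databox.pem
  sudo cp databox.pem /usr/local/share/ca-certificates/databox.crt
  sudo update-ca-certificates
fi
```

Applications that don't read the system store (as the hunk notes) need the PEM file supplied through their own configuration instead.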
```diff
@@ -152,27 +145,9 @@ Follow the same steps to [add device IP address and blob service endpoint when c
 
 Follow the steps to [Configure partner software that you used while connecting over *http*](#verify-connection-and-configure-partner-software). The only difference is that you should leave the *Use http option* unchecked.
 
-## Determine appropriate access tiers for block blobs
-
-> [!IMPORTANT]
-> The information contained within this section applies to orders placed after April 1<sup>st</sup>, 2024.
-
-Azure Storage allows you to store block blob data in multiple access tiers within the same storage account. This ability allows data to be organized and stored more efficiently based on how often it's accessed. The following table contains information and recommendations about Azure Storage access tiers.
-
-| Tier | Recommendation | Best practice |
-|---------|----------------|---------------|
-| Hot | Useful for online data accessed or modified frequently. This tier has the highest storage costs, but the lowest access costs. | Data in this tier should be in regular and active use. |
-| Cool | Useful for online data accessed or modified infrequently. This tier has lower storage costs and higher access costs than the hot tier. | Data in this tier should be stored for at least 30 days. |
-| Cold | Useful for online data accessed or modified rarely but still requiring fast retrieval. This tier has lower storage costs and higher access costs than the cool tier. | Data in this tier should be stored for a minimum of 90 days. |
-| Archive | Useful for offline data rarely accessed and having lower latency requirements. | Data in this tier should be stored for a minimum of 180 days. Data removed from the archive tier within 180 days is subject to an early deletion charge. |
-
-For more information about blob access tiers, see [Access tiers for blob data](../storage/blobs/access-tiers-overview.md). For more detailed best practices, see [Best practices for using blob access tiers](../storage/blobs/access-tiers-best-practices.md).
-
-You can transfer your block blob data to the appropriate access tier by copying it to the corresponding folder within Data Box. This process is discussed in greater detail within the [Copy data to Azure Data Box](#copy-data-to-data-box) section.
-
 ## Copy data to Data Box
 
-After connecting to one or more Data Box shares, the next step is to copy data. Before you begin the data copy, consider the following limitations:
+After one or more Data Box shares are connected, the next step is to copy data. Before you initiate data copy operations, consider the following limitations:
 
 * While copying data, ensure that the data size conforms to the size limits described in the [Azure storage and Data Box limits](data-box-limits.md).
 * Simultaneous uploads by Data Box and another non-Data Box application could potentially result in upload job failures and data corruption.
```

```diff
@@ -200,8 +175,8 @@ The first step is to create a container, because blobs are always uploaded into
-4. A text box appears below the **Blob Containers** folder. Enter the name for your blob container. See the [Create the container and set permissions](../storage/blobs/storage-quickstart-blobs-dotnet.md) for information on rules and restrictions on naming blob containers.
-5. Press **Enter** when done to create the blob container, or **Esc** to cancel. After the blob container is successfully created, it's displayed under the **Blob Containers** folder for the selected storage account.
+4. A text box appears below the **Blob Containers** folder. Enter the name for your blob container. See the [Create the container and set permissions](../storage/blobs/storage-quickstart-blobs-dotnet.md) for information on rules and restrictions on naming blob containers.
+5. Press **Enter** when done to create the blob container, or **Esc** to cancel. After successful creation, the blob container is displayed under the selected storage account's **Blob Containers** folder.
```
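The data copy this tutorial's hunks describe is typically driven by AzCopy, downloaded in prerequisite step 5. A hedged sketch follows: the account, device endpoint, and container names are hypothetical placeholders (the real blob service endpoint comes from the device's local web UI), and exact flags should be confirmed against the AzCopy article linked in the prerequisites.

```shell
# Sketch only — account, device endpoint, and container are hypothetical.
ACCOUNT="mystorageaccount"
ENDPOINT="${ACCOUNT}.blob.mydevice.microsoftdatabox.com"
CONTAINER="mycontainer"

# Copy a local folder's contents into the container on the device;
# guarded so the sketch is a no-op on hosts without azcopy installed.
if command -v azcopy >/dev/null 2>&1; then
  azcopy copy "/data/upload/*" "https://${ENDPOINT}/${CONTAINER}" --recursive
fi
```

The endpoint resolves only after the device IP and blob service endpoint are added to the host's hosts file, as the earlier hunks describe.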
**articles/iot-edge/tutorial-configure-est-server.md** (2 additions, 2 deletions)

```diff
@@ -3,7 +3,7 @@ title: Tutorial - Configure Enrollment over Secure Transport Server (EST) for Az
 description: This tutorial shows you how to set up an Enrollment over Secure Transport (EST) server for Azure IoT Edge.
 author: PatAltimore
 ms.author: patricka
-ms.date: 06/10/2024
+ms.date: 11/07/2024
 ms.topic: tutorial
 ms.service: azure-iot-edge
 services: iot-edge
```

```diff
@@ -335,5 +335,5 @@ You can keep the resources and configurations that you created in this tutorial
 * To use EST server to issue Edge CA certificates, see [example configuration](https://github.com/Azure/iotedge/blob/main/edgelet/doc/est.md#edge-ca-certificate).
 * Using username and password to bootstrap authentication to EST server isn't recommended for production. Instead, consider using long-lived *bootstrap certificates* that can be stored onto the device during manufacturing [similar to the recommended approach for DPS](../iot-hub/iot-hub-x509ca-concept.md). To see how to configure bootstrap certificate for EST server, see [Authenticate a Device Using Certificates Issued Dynamically via EST](https://github.com/Azure/iotedge/blob/main/edgelet/doc/est.md).
 * EST server can be used to issue certificates for all devices in a hierarchy as well. Depending on if you have ISA-95 requirements, it may be necessary to run a chain of EST servers with one at every layer or use the API proxy module to forward the requests. To learn more, see [Kevin's blog](https://kevinsaye.wordpress.com/2021/07/21/deep-dive-creating-hierarchies-of-azure-iot-edge-devices-isa-95-part-3/).
-* For enterprise grade solutions, consider: [GlobalSign IoT Edge Enroll](https://www.globalsign.com/en/iot-edge-enroll) or [DigiCert IoT Device Manager](https://www.digicert.com/iot/iot-device-manager)
+* For enterprise grade solutions, consider: [GlobalSign IoT Edge Enroll](https://www.globalsign.com/en/iot-edge-enroll), [DigiCert IoT Device Manager](https://www.digicert.com/iot/iot-device-manager), and [Keytos EZCA](https://www.keytos.io/docs/azure-pki/azure-iot-hub/how-to-create-azure-iot-est-certificate-authority/).
 * To learn more about certificates, see [Understand how Azure IoT Edge uses certificates](iot-edge-certs.md).
```
```diff
 |[Azure StorSimple 8000 Series Copy Utility](https://aka.ms/storsimple-copy-utility)|Microsoft is providing a read-only data copy utility to recover and migrate your backup files from StorSimple cloud snapshots. The StorSimple 8000 Series Copy Utility is designed to run in your environment. You can install and configure the Utility, and then use your Service Encryption Key to authenticate and download your metadata from the cloud.|
-|Azure StorSimple 8000 Series Copy Utility documentation |Instructions for use of the Copy Utility. |
-|StorSimple archived documentation |Archived StorSimple articles from Microsoft technical documentation. |
+|[Azure StorSimple 8000 Series Copy Utility documentation](https://aka.ms/storsimple-copy-utility-docs)|Instructions for use of the Copy Utility. |
+|[StorSimple archived documentation](https://aka.ms/storsimple-archive-docs)|Archived StorSimple articles from Microsoft technical documentation. |
```