
Commit b4fbff9

Merge branch 'main' of https://github.com/MicrosoftDocs/azure-docs-pr into yelevin/atomic-incidents
2 parents 5f6bbe8 + 8940fd7

8 files changed: +110 -9 lines changed

articles/cognitive-services/Anomaly-Detector/How-to/batch-inference.md

Lines changed: 2 additions & 0 deletions
@@ -30,6 +30,8 @@ You could choose the batch inference API, or the streaming inference API for det
  To perform batch inference, provide the blob URL containing the inference data, the start time, and the end time. For inference data volume, provide at least `1 sliding window` length and at most **20,000** timestamps.

+ To get better performance, we recommend that you send no more than 150,000 data points per batch inference. *(Data points = number of variables × number of timestamps.)* For example, 15 variables × 10,000 timestamps = 150,000 data points.

  This inference is asynchronous, so the results aren't returned immediately. Notice that you need to save the link to the results from the **response header**, which contains the `resultId`, so that you know where to get the results afterwards.

  Failures are usually caused by model issues or data issues. You can't perform inference if the model isn't ready or if the data link is invalid. Make sure that the training data and inference data are consistent: they should have **exactly** the same variables, just with different timestamps. More variables, fewer variables, or inference with a different set of variables won't pass the data verification phase, and errors will occur. Data verification is deferred, so you'll get error messages only when you query the results.
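A minimal sketch of the asynchronous flow described above, using the Python `requests` library. The endpoint route, header name, and status values here are illustrative placeholders, not the exact API contract:

```python
import time
import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
HEADERS = {"Ocp-Apim-Subscription-Key": "<your-key>", "Content-Type": "application/json"}

body = {
    "dataSource": "https://<account>.blob.core.windows.net/<container>/data?<sas>",  # blob URL with the inference data
    "startTime": "2023-01-01T00:00:00Z",
    "endTime": "2023-01-02T00:00:00Z",
}

# Kick off batch inference; the link to the results (which contains the resultId)
# comes back in a response header, so save it before doing anything else.
resp = requests.post(f"{ENDPOINT}/<batch-inference-route>", headers=HEADERS, json=body)  # placeholder route
resp.raise_for_status()
result_link = resp.headers["Location"]  # header name may differ by API version

# Poll the saved link until the asynchronous job finishes.
while True:
    result = requests.get(result_link, headers=HEADERS).json()
    if result.get("summary", {}).get("status") in ("READY", "FAILED"):
        break
    time.sleep(5)
```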
articles/cognitive-services/Anomaly-Detector/service-limits.md

Lines changed: 97 additions & 0 deletions

@@ -0,0 +1,97 @@
+---
+title: Service limits - Anomaly Detector service
+titleSuffix: Azure Cognitive Services
+description: Service limits for the Anomaly Detector service, including Univariate Anomaly Detection and Multivariate Anomaly Detection.
+services: cognitive-services
+author: jr-MS
+manager: nitinme
+ms.service: cognitive-services
+ms.subservice: anomaly-detector
+ms.topic: conceptual
+ms.date: 1/31/2023
+ms.author: jingruhan
+ms.custom:
+---
+
+# Anomaly Detector service quotas and limits
+
+This article contains both a quick reference and a detailed description of the Azure Anomaly Detector service quotas and limits for all pricing tiers. It also contains some best practices to help avoid request throttling.
+
+The quotas and limits apply to all versions of the Azure Anomaly Detector service.
+
+## Univariate Anomaly Detection
+
+|Quota<sup>1</sup>|Free (F0)|Standard (S0)|
+|--|--|--|
+| **All APIs per second** | 10 | 500 |
+
+<sup>1</sup> All quotas and limits are defined for one Anomaly Detector resource.
+
+## Multivariate Anomaly Detection
+
+### API calls per minute
+
+|Quota<sup>1</sup>|Free (F0)<sup>2</sup>|Standard (S0)|
+|--|--|--|
+| **Training API per minute** | 1 | 20 |
+| **Get model API per minute** | 1 | 20 |
+| **Batch (async) inference API per minute** | 10 | 60 |
+| **Get inference results API per minute** | 10 | 60 |
+| **Last (sync) inference API per minute** | 10 | 60 |
+| **List model API per minute** | 1 | 20 |
+| **Delete model API per minute** | 1 | 20 |
+
+<sup>1</sup> All quotas and limits are defined for one Anomaly Detector resource.
+
+<sup>2</sup> For the **Free (F0)** pricing tier, see also the monthly allowances on the [pricing page](https://azure.microsoft.com/pricing/details/cognitive-services/anomaly-detector/).
+
+### Concurrent models and inference tasks
+
+|Quota<sup>1</sup>|Free (F0)|Standard (S0)|
+|--|--|--|
+| **Maximum models** *(created, running, ready, failed)* | 20 | 1000 |
+| **Maximum running models** *(created, running)* | 1 | 20 |
+| **Maximum running inference** *(created, running)* | 10 | 60 |
+
+<sup>1</sup> All quotas and limits are defined for one Anomaly Detector resource. If you want to increase a limit, contact [email protected].
+## How to increase the limit for your resource
+
+For the Standard pricing tier, this limit can be increased. Increasing the **concurrent request limit** doesn't directly affect your costs. The Anomaly Detector service uses a "pay only for what you use" model. The limit defines how high the service can scale before it starts throttling your requests.
+
+The **concurrent request limit parameter** isn't visible via the Azure portal, command-line tools, or API requests. To verify the current value, create an Azure support request.
+
+If you would like to increase your limit, you can [enable auto scaling](../autoscale.md) on your resource. You can also submit a support request to increase your Transactions Per Second (TPS).
+
+### Have the required information ready
+
+* Anomaly Detector resource ID
+* Region
+
+#### Retrieve resource ID and region
+
+* Go to the [Azure portal](https://portal.azure.com/)
+* Select the Anomaly Detector resource for which you would like to increase the transaction limit
+* Select Properties (Resource Management group)
+* Copy and save the values of the following fields:
+  * Resource ID
+  * Location (your endpoint region)
+
+### Create and submit a support request
+
+To request a limit increase for your resource, submit a **support request**:
+
+1. Go to the [Azure portal](https://portal.azure.com/)
+2. Select the Anomaly Detector resource for which you would like to increase the limit
+3. Select New support request (Support + troubleshooting group)
+4. A new window will appear with auto-populated information about your Azure subscription and Azure resource
+5. Enter a summary (like "Increase Anomaly Detector TPS limit")
+6. In Problem type, select *"Quota or usage validation"*
+7. Select Next: Solutions
+8. Proceed further with the request creation
+9. Under the Details tab, enter the following in the Description field:
+   * A note that the request is about the Anomaly Detector quota
+   * The TPS expectation you would like to scale to meet
+   * The Azure resource information you collected
+10. Complete entering the required information and select the Create button on the *Review + create* tab
+11. Note the support request number in Azure portal notifications. You'll be contacted shortly for further processing.
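One way to stay within these quotas on the client side is to back off and retry when the service answers with HTTP 429. A minimal sketch, assuming only standard HTTP semantics (the URL and header names you pass in are placeholders, not a documented API surface):

```python
import time
import requests

def post_with_backoff(url: str, headers: dict, payload: dict, max_retries: int = 5):
    """POST with exponential backoff on HTTP 429 (throttled) responses."""
    delay = 1.0
    for _ in range(max_retries):
        resp = requests.post(url, headers=headers, json=payload)
        if resp.status_code != 429:
            resp.raise_for_status()
            return resp
        # Honor the server's Retry-After hint when present; otherwise back off exponentially.
        retry_after = resp.headers.get("Retry-After")
        time.sleep(float(retry_after) if retry_after else delay)
        delay *= 2
    raise RuntimeError("Request still throttled after retries")
```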

articles/cognitive-services/Anomaly-Detector/toc.yml

Lines changed: 2 additions & 0 deletions
@@ -9,6 +9,8 @@

    href: https://aka.ms/adpricing
  - name: What's new
    href: whats-new.md
+ - name: Service limits
+   href: service-limits.md
  - name: FAQ
    href: faq.yml
  - name: Quickstarts

articles/communication-services/concepts/interop/guest/capabilities.md

Lines changed: 1 addition & 1 deletion
@@ -15,7 +15,7 @@ ms.subservice: teams-interop

  In this article, you will learn which capabilities are supported for Teams external users using Azure Communication Services SDKs in Teams meetings. You can find per-platform availability in [voice and video calling capabilities](../../voice-video-calling/calling-sdk-features.md).

- | Group of features | Capability | JavaScript |
+ | Group of features | Capability | Supported |
  | ----------------- | ---------- | --------- |
  | Core Capabilities | Join Teams meeting | ✔️ |
  |                   | Leave meeting      | ✔️ |

articles/iot-hub/iot-hub-create-use-iot-toolkit.md

Lines changed: 3 additions & 3 deletions
@@ -13,15 +13,15 @@ ms.author: junhan

  [!INCLUDE [iot-hub-resource-manager-selector](../../includes/iot-hub-resource-manager-selector.md)]

- This article shows you how to use the [Azure IoT Tools for Visual Studio Code](https://marketplace.visualstudio.com/items?itemName=vsciot-vscode.azure-iot-tools) to create an Azure IoT hub. You can create one without an existing IoT project or create one from an existing IoT project.
+ This article shows you how to use the [Azure IoT Tools for Visual Studio Code](https://marketplace.visualstudio.com/items?itemName=vsciot-vscode.azure-iot-toolkit) to create an Azure IoT hub. You can create one without an existing IoT project or create one from an existing IoT project.

  [!INCLUDE [updated-for-az](../../includes/updated-for-az.md)]

  ## Prerequisites

  - [Visual Studio Code](https://code.visualstudio.com/)

- - [Azure IoT Tools](https://marketplace.visualstudio.com/items?itemName=vsciot-vscode.azure-iot-tools) installed for Visual Studio Code
+ - [Azure IoT Tools](https://marketplace.visualstudio.com/items?itemName=vsciot-vscode.azure-iot-toolkit) installed for Visual Studio Code

  - An Azure resource group: [create a resource group](../azure-resource-manager/management/manage-resource-groups-portal.md#create-resource-groups) in the Azure portal

@@ -107,4 +107,4 @@ Now that you've deployed an IoT hub using the Azure IoT Tools for Visual Studio

  * [Use the Azure IoT Tools for Visual Studio Code for Azure IoT Hub device management](iot-hub-device-management-iot-toolkit.md)
- * [See the Azure IoT Hub for VS Code wiki page](https://github.com/microsoft/vscode-azure-iot-toolkit/wiki).
\ No newline at end of file
+ * [See the Azure IoT Hub for VS Code wiki page](https://github.com/microsoft/vscode-azure-iot-toolkit/wiki).

articles/iot-hub/iot-hub-vscode-iot-toolkit-cloud-device-messaging.md

Lines changed: 1 addition & 1 deletion
@@ -27,7 +27,7 @@ In this article, you learn how to use Azure IoT Tools for Visual Studio Code to

  * [Visual Studio Code](https://code.visualstudio.com/)

- * [Azure IoT Tools for VS Code](https://marketplace.visualstudio.com/items?itemName=vsciot-vscode.azure-iot-tools) or copy and paste this URL into a browser window: `vscode:extension/vsciot-vscode.azure-iot-tools`
+ * [Azure IoT Tools for VS Code](https://marketplace.visualstudio.com/items?itemName=vsciot-vscode.azure-iot-toolkit) or copy and paste this URL into a browser window: `vscode:extension/vsciot-vscode.azure-iot-toolkit`

  ## Sign in to access your IoT hub
articles/postgresql/flexible-server/overview.md

Lines changed: 1 addition & 1 deletion
@@ -113,7 +113,7 @@ One advantage of running your workload in Azure is global reach. The flexible se

  | Sweden Central | :heavy_check_mark: | :x: | :heavy_check_mark: | :x: |
  | Switzerland North | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark: |
  | Switzerland West | :heavy_check_mark: | :x: | :heavy_check_mark: | :heavy_check_mark: |
- | UAE North | :heavy_check_mark: | :x: | :heavy_check_mark: | :x: |
+ | UAE North | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark: | :x: |
  | US Gov Arizona | :heavy_check_mark: | :x: | :heavy_check_mark: | :x: |
  | US Gov Virginia | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark: | :x: |
  | UK South | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark: |

articles/synapse-analytics/sql/resources-self-help-sql-on-demand.md

Lines changed: 3 additions & 3 deletions
@@ -678,7 +678,7 @@ If your query returns NULL values instead of partitioning columns or can't find

  The error `Inserting value to batch for column type DATETIME2 failed` indicates that the serverless pool can't read the date values from the underlying files. The datetime value stored in the Parquet or Delta Lake file can't be represented as a `DATETIME2` column.

- Inspect the minimum value in the file by using Spark, and check that some dates are less than 0001-01-03. If you stored the files by using Spark 2.4, the datetime values before are written by using the Julian calendar that isn't aligned with the proleptic Gregorian calendar used in serverless SQL pools.
+ Inspect the minimum value in the file by using Spark, and check that some dates are less than 0001-01-03. If you stored the files by using Spark 2.4, or a later Spark version that still uses the legacy datetime storage format, the earlier datetime values are written by using the Julian calendar, which isn't aligned with the proleptic Gregorian calendar used in serverless SQL pools.
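A quick way to run that check, sketched with PySpark; the Delta path is illustrative, and the column name `MyDateTimeColumn` is borrowed from the update example below:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Illustrative path and column name -- substitute your own table.
df = spark.read.format("delta").load("/mnt/delta/my-table")

# Surface the earliest datetime; values below 0001-01-03 are the ones that
# can't be represented as DATETIME2 in serverless SQL pool.
df.select(F.min("MyDateTimeColumn").alias("min_datetime")).show(truncate=False)
```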

  There might be a two-day difference between the Julian calendar used to write the values in Parquet (in some Spark versions) and the proleptic Gregorian calendar used in serverless SQL pool. This difference might cause conversion to a negative date value, which is invalid.

@@ -695,7 +695,7 @@ deltaTable.update(col("MyDateTimeColumn") < '0001-02-02', { "MyDateTimeColumn":

  This change removes the values that can't be represented. The other date values might be properly loaded but incorrectly represented, because there's still a difference between the Julian and proleptic Gregorian calendars. You might see unexpected date shifts even for dates before `1900-01-01` if you use Spark 3.0 or older versions.

- Consider [migrating to Spark 3.1 or higher](https://spark.apache.org/docs/latest/sql-migration-guide.html). It uses a proleptic Gregorian calendar that's aligned with the calendar in serverless SQL pool. Reload your legacy data with the higher version of Spark, and use the following setting to correct the dates:
+ Consider [migrating to Spark 3.1 or higher](https://spark.apache.org/docs/latest/sql-migration-guide.html) and switching to the proleptic Gregorian calendar. The latest Spark versions use a proleptic Gregorian calendar by default, which is aligned with the calendar in serverless SQL pool. Reload your legacy data with the higher version of Spark, and use the following setting to correct the dates:

  ```spark
  spark.conf.set("spark.sql.legacy.parquet.int96RebaseModeInWrite", "CORRECTED")
  ```
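A hedged sketch of that reload step, assuming Spark 3.1+ and hypothetical input/output paths:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# CORRECTED writes timestamps on the proleptic Gregorian calendar (no legacy rebase).
spark.conf.set("spark.sql.legacy.parquet.int96RebaseModeInWrite", "CORRECTED")

# Hypothetical paths -- read the legacy files and rewrite them to a new location.
# Note: when reading ambiguous legacy values, Spark may also prompt you to set
# spark.sql.legacy.parquet.int96RebaseModeInRead before the read succeeds.
df = spark.read.parquet("/data/legacy-table")
df.write.mode("overwrite").parquet("/data/corrected-table")
```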
@@ -1114,4 +1114,4 @@ You don't need to use separate databases to isolate data for different tenants.

  - [Azure Synapse Analytics frequently asked questions](../overview-faq.yml)
  - [Store query results to storage using serverless SQL pool in Azure Synapse Analytics](create-external-table-as-select.md)
  - [Synapse Studio troubleshooting](../troubleshoot/troubleshoot-synapse-studio.md)
- - [Troubleshoot a slow query on a dedicated SQL Pool](/troubleshoot/azure/synapse-analytics/dedicated-sql/troubleshoot-dsql-perf-slow-query)
\ No newline at end of file
+ - [Troubleshoot a slow query on a dedicated SQL Pool](/troubleshoot/azure/synapse-analytics/dedicated-sql/troubleshoot-dsql-perf-slow-query)
