Commit 6627007
Merge pull request #197438 from MicrosoftDocs/repo_sync_working_branch
Confirm merge from repo_sync_working_branch to main to sync with https://github.com/MicrosoftDocs/azure-docs (branch main)
2 parents ce6c38a + 180bdde commit 6627007

File tree: 11 files changed (+26 −60 lines changed)

articles/active-directory/develop/msal-net-token-cache-serialization.md

Lines changed: 3 additions & 34 deletions

````diff
@@ -278,39 +278,9 @@ You can also specify options to limit the size of the in-memory token cache:
 
 #### Distributed caches
 
-If you use `app.AddDistributedTokenCache`, the token cache is an adapter against the .NET `IDistributedCache` implementation. So you can choose between a distributed memory cache, a SQL Server cache, a Redis cache, or an Azure Cosmos DB cache. For details about the `IDistributedCache` implementations, see [Distributed memory cache](/aspnet/core/performance/caching/distributed).
+If you use `app.AddDistributedTokenCache`, the token cache is an adapter against the .NET `IDistributedCache` implementation. So you can choose between a SQL Server cache, a Redis cache, an Azure Cosmos DB cache, or any other cache implementing the [IDistributedCache](https://docs.microsoft.com/dotnet/api/microsoft.extensions.caching.distributed.idistributedcache?view=dotnet-plat-ext-6.0) interface.
 
-Here's the code for a distributed in-memory token cache:
-
-```CSharp
-// In-memory distributed token cache
-app.AddDistributedTokenCache(services =>
-{
-    // In net462/net472, requires to reference Microsoft.Extensions.Caching.Memory
-    services.AddDistributedMemoryCache();
-
-    // Distributed token caches have an L1/L2 mechanism.
-    // L1 is in memory, and L2 is the distributed cache
-    // implementation that you will choose below.
-    // You can configure them to limit the memory of the
-    // L1 cache, encrypt, and set eviction policies.
-    services.Configure<MsalDistributedTokenCacheAdapterOptions>(options =>
-    {
-        // You can disable the L1 cache if you want
-        options.DisableL1Cache = false;
-
-        // Or limit the memory (by default, this is 500 MB)
-        options.SizeLimit = 1024 * 1024 * 1024; // 1 GB
-
-        // You can choose to encrypt the cache or not
-        options.Encrypt = false;
-
-        // And you can set eviction policies for the distributed
-        // cache
-        options.SlidingExpiration = TimeSpan.FromHours(1);
-    });
-});
-```
+For testing purposes only, you may want to use `services.AddDistributedMemoryCache()`, an in-memory implementation of `IDistributedCache`.
 
 Here's the code for a SQL Server cache:
 
@@ -320,8 +290,7 @@ Here's the code for a SQL Server cache:
 {
     services.AddDistributedSqlServerCache(options =>
     {
-        // In net462/net472, requires to reference Microsoft.Extensions.Caching.Memory
-
+
         // Requires to reference Microsoft.Extensions.Caching.SqlServer
         options.ConnectionString = @"Data Source=(localdb)\MSSQLLocalDB;Initial Catalog=TestCache;Integrated Security=True;Connect Timeout=30;Encrypt=False;TrustServerCertificate=False;ApplicationIntent=ReadWrite;MultiSubnetFailover=False";
         options.SchemaName = "dbo";
````

articles/azure-monitor/logs/log-analytics-tutorial.md

Lines changed: 0 additions & 11 deletions

```diff
@@ -52,17 +52,6 @@ This is the simplest query that we can write. It just returns all the records in
 
 You can see that we do have results. The number of records that the query has returned appears in the lower-right corner.
 
-## Filter query results
-
-Let's add a filter to the query to reduce the number of records that are returned. Select the **Filter** tab on the left pane. This tab shows columns in the query results that you can use to filter the results. The top values in those columns are displayed with the number of records that have that value. Select **200** under **ResultCode**, and then select **Apply & Run**.
-
-:::image type="content" source="media/log-analytics-tutorial/query-filter-pane.png" alt-text="Screenshot that shows the query filter pane." lightbox="media/log-analytics-tutorial/query-filter-pane.png":::
-
-A **where** statement is added to the query with the value that you selected. The results now include only records with that value, so you can see that the record count is reduced.
-
-:::image type="content" source="media/log-analytics-tutorial/query-filter.png" alt-text="Screenshot that shows a filter being applied to the query." lightbox="media/log-analytics-tutorial/query-filter.png":::
-
 ### Time range
 
 All queries return records generated within a set time range. By default, the query returns records generated in the last 24 hours.
```

articles/cognitive-services/Translator/request-limits.md

Lines changed: 4 additions & 4 deletions

```diff
@@ -1,7 +1,7 @@
 ---
 title: Request limits - Translator
 titleSuffix: Azure Cognitive Services
-description: This article lists request limits for the Translator. Charges are incurred based on character count, not request frequency with a limit of 5,000 characters per request. Character limits are subscription-based, with F0 limited to 2 million characters per hour.
+description: This article lists request limits for the Translator. Charges are incurred based on character count, not request frequency, with a limit of 50,000 characters per request. Character limits are subscription-based, with F0 limited to 2 million characters per hour.
 services: cognitive-services
 author: laujan
 manager: nitinme
@@ -18,13 +18,13 @@ This article provides throttling limits for the Translator translation, translit
 
 ## Character and array limits per request
 
-Each translate request is limited to 10,000 characters, across all the target languages you are translating to. For example, sending a translate request of 3,000 characters to translate to three different languages results in a request size of 3000x3 = 9,000 characters, which satisfies the request limit. You're charged per character, not by the number of requests. It's recommended to send shorter requests.
+Each translate request is limited to 50,000 characters, across all the target languages you are translating to. For example, sending a translate request of 3,000 characters to translate to three different languages results in a request size of 3000x3 = 9,000 characters, which satisfies the request limit. You're charged per character, not by the number of requests. It's recommended to send shorter requests.
 
 The following table lists array element and character limits for each operation of the Translator.
 
 | Operation | Maximum Size of Array Element | Maximum Number of Array Elements | Maximum Request Size (characters) |
 |:----|:----|:----|:----|
-| Translate | 10,000| 100| 10,000 |
+| Translate | 50,000| 1,000| 50,000 |
 | Transliterate | 5,000| 10| 5,000 |
 | Detect | 50,000 |100 |50,000 |
 | BreakSentence | 50,000| 100 |50,000 |
@@ -76,4 +76,4 @@ When using the [BreakSentence](./reference/v3-0-break-sentence.md) function, sen
 
 * [Pricing](https://azure.microsoft.com/pricing/details/cognitive-services/translator-text-api/)
 * [Regional availability](https://azure.microsoft.com/global-infrastructure/services/?products=cognitive-services)
-* [v3 Translator reference](./reference/v3-0-reference.md)
+* [v3 Translator reference](./reference/v3-0-reference.md)
```
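The character-limit arithmetic in the updated passage can be sketched as follows. The helper name and the hard-coded cap are illustrative (they mirror the "Translate" row of the table above) and are not part of any Translator SDK:

```python
# Illustrative helper (not part of the Translator SDK): the size checked
# against the per-request limit is source characters multiplied by the
# number of target languages, as the passage above describes.
MAX_TRANSLATE_REQUEST_CHARS = 50_000  # per the "Translate" row in the table

def translate_request_size(text: str, target_languages: list[str]) -> int:
    """Characters counted against the per-request Translate limit."""
    return len(text) * len(target_languages)

# The example from the text: 3,000 characters sent to three target languages.
size = translate_request_size("x" * 3_000, ["de", "fr", "es"])
print(size)                                 # 9000
print(size <= MAX_TRANSLATE_REQUEST_CHARS)  # True -> within the limit
```

Splitting long inputs into several smaller requests keeps each request under the cap without changing the per-character charge.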

articles/cognitive-services/language-service/custom-named-entity-recognition/language-support.md

Lines changed: 1 addition & 1 deletion

```diff
@@ -22,7 +22,7 @@ Use this article to learn about the languages and regions currently supported by
 With custom NER, you can train a model in one language and test in another language. This feature is very powerful because it helps you save time and effort, instead of building separate projects for every language, you can handle multi-lingual dataset in one project. Your dataset doesn't have to be entirely in the same language but you have to specify this option at project creation. If you notice your model performing poorly in certain languages during the evaluation process, consider adding more data in this language to your training set.
 
 > [!NOTE]
-> To enable support for multiple languages, you need to enable this option when [creating your project](how-to/create-project.md) or you can enbale it later form the project settings page.
+> To enable support for multiple languages, you need to enable this option when [creating your project](how-to/create-project.md) or you can enable it later from the project settings page.
 
 ## Language support
```

articles/cognitive-services/language-service/toc.yml

Lines changed: 4 additions & 4 deletions

```diff
@@ -96,10 +96,10 @@ items:
       displayName: Model, Project, Class
     - name: Service limits
       href: custom-classification/service-limits.md
-    - name: REST API (v3.2-preview.2)
+    - name: REST API
       items:
       - name: Authoring API
-        href: https://westus.dev.cognitive.microsoft.com/docs/services/language-authoring-clu-apis-2022-03-01-preview/operations/Projects_TriggerImportProjectJob
+        href: https://westus.dev.cognitive.microsoft.com/docs/services/language-custom-text-authoring-apis-2022-03-01-preview/operations/Projects_TriggerExportProjectJob
       - name: Runtime prediction API
         href: https://aka.ms/ct-runtime-swagger
     - name: SDKs (v3.2-preview.2)
@@ -170,10 +170,10 @@ items:
       href: custom-named-entity-recognition/glossary.md
     - name: Service limits
       href: custom-named-entity-recognition/service-limits.md
-    - name: REST API (v3.2-preview.2)
+    - name: REST API
       items:
       - name: Authoring API
-        href: https://westus.dev.cognitive.microsoft.com/docs/services/language-authoring-clu-apis-2022-03-01-preview/operations/Projects_TriggerImportProjectJob
+        href: https://westus.dev.cognitive.microsoft.com/docs/services/language-custom-text-authoring-apis-2022-03-01-preview/operations/Projects_TriggerExportProjectJob
       - name: Runtime prediction API
         href: https://aka.ms/ct-runtime-swagger
     - name: SDKs (v3.2-preview.2)
```

articles/defender-for-cloud/overview-page.md

Lines changed: 1 addition & 1 deletion

```diff
@@ -33,7 +33,7 @@ The **top menu bar** offers:
 
 In the center of the page are the **feature tiles**, each linking to a high profile feature or dedicated dashboard:
 
-- **Secure score** - Defender for Cloud continually assesses your resources, subscriptions, and organization for security issues. It then aggregates all the findings into a single score so that you can tell, at a glance, your current security situation: the higher the score, the lower the identified risk level. [Learn more](secure-score-security-controls.md).
+- **Security posture** - Defender for Cloud continually assesses your resources, subscriptions, and organization for security issues. It then aggregates all the findings into a single score so that you can tell, at a glance, your current security situation: the higher the score, the lower the identified risk level. [Learn more](secure-score-security-controls.md).
 - **Workload protections** - This is the cloud workload protection platform (CWPP) integrated within Defender for Cloud for advanced, intelligent protection of your workloads running on Azure, on-premises machines, or other cloud providers. For each resource type, there's a corresponding Microsoft Defender plan. The tile shows the coverage of your connected resources (for the currently selected subscriptions) and the recent alerts, color-coded by severity. Learn more about [the enhanced security features](enhanced-security-features-overview.md).
 - **Regulatory compliance** - Defender for Cloud provides insights into your compliance posture based on continuous assessments of your Azure environment. Defender for Cloud analyzes risk factors in your environment according to security best practices. These assessments are mapped to compliance controls from a supported set of standards. [Learn more](regulatory-compliance-dashboard.md).
 - **Firewall Manager** - This tile shows the status of your hubs and networks from [Azure Firewall Manager](../firewall-manager/overview.md).
```

articles/open-datasets/dataset-1000-genomes.md

Lines changed: 8 additions & 0 deletions

```diff
@@ -18,6 +18,8 @@ Phase 3 Analysis: A global reference for human genetic variation Nature 526, 68-
 
 For details on data formats refer to http://www.internationalgenome.org/formats
 
+**[NEW]** The dataset is also available in [parquet format](https://github.com/microsoft/genomicsnotebook/tree/main/vcf2parquet-conversion/1000genomes).
+
 [!INCLUDE [Open Dataset usage notice](../../includes/open-datasets-usage-note.md)]
 
 ## Data source
@@ -40,6 +42,12 @@ West Central US: 'https://dataset1000genomes-secondary.blob.core.windows.net/dat
 
 [SAS Token](../storage/common/storage-sas-overview.md): sv=2019-10-10&si=prod&sr=c&sig=9nzcxaQn0NprMPlSh4RhFQHcXedLQIcFgbERiooHEqM%3D
 
+## Data Access: Curated 1000 genomes dataset in parquet format
+
+East US: https://curated1000genomes.blob.core.windows.net/dataset
+
+SAS Token: sv=2018-03-28&si=prod&sr=c&sig=BgIomQanB355O4FhxqBL9xUgKzwpcVlRZdBewO5%2FM4E%3D
+
 ## Use Terms
 
 Following the final publications, data from the 1000 Genomes Project is publicly available without embargo to anyone for use under the terms provided by the dataset source ([http://www.internationalgenome.org/data](http://www.internationalgenome.org/data)). Use of the data should be cited per details available in the [FAQs]() from the 1000 Genome Project.
```
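The container URL and SAS token added for the curated parquet dataset combine into a single authenticated URL in the usual Azure Blob Storage way. A minimal sketch (the helper and the `chr21.parquet` path are ours, purely illustrative; consult the linked GitHub repository for the actual file layout):

```python
# Sketch: join the curated parquet container URL and SAS token listed in the
# diff above into a URL that Azure Blob Storage accepts. The blob path used
# here is a placeholder, not a documented file name.
BASE_URL = "https://curated1000genomes.blob.core.windows.net/dataset"
SAS_TOKEN = "sv=2018-03-28&si=prod&sr=c&sig=BgIomQanB355O4FhxqBL9xUgKzwpcVlRZdBewO5%2FM4E%3D"

def authenticated_url(blob_path: str) -> str:
    """Append the container-level SAS token to a blob path under the root."""
    return f"{BASE_URL}/{blob_path.lstrip('/')}?{SAS_TOKEN}"

print(authenticated_url("chr21.parquet"))
# Such a URL could then be handed to a parquet reader (e.g. pandas or
# pyarrow with fsspec/adlfs installed), assuming network access.
```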

articles/purview/register-scan-synapse-workspace.md

Lines changed: 1 addition & 1 deletion

```diff
@@ -227,7 +227,7 @@ GRANT REFERENCES ON DATABASE SCOPED CREDENTIAL::[scoped_credential] TO [PurviewA
 
 > [!IMPORTANT]
 > Currently, we do not support setting up scans for an Azure Synapse workspace from the Microsoft Purview governance portal, if you cannot enable **Allow Azure services and resources to access this workspace** on your Azure Synapse workspaces. In this case:
-> - You can use [Microsoft Purview Rest API - Scans - Create Or Update](/rest/api/purview/scanningdataplane/scans/create-or-update/) to create a new scan for your Synapse workspaces including dedicated and serverless pools.
+> - You can use [Microsoft Purview REST API - Scans - Create Or Update](/rest/api/purview/scanningdataplane/scans/create-or-update/) to create a new scan for your Synapse workspaces including dedicated and serverless pools.
 > - You must use **SQL Auth** as authentication mechanism.
 
 ### Create and run scan
```
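The note above points at the "Scans - Create Or Update" REST operation as the fallback when the portal can't be used. A hypothetical sketch of the endpoint shape follows; it only assembles the URL, the account/data source/scan names are placeholders, and the `api-version` value is an assumption to verify against the linked reference:

```python
# Hypothetical sketch (not from the Purview docs): build the PUT endpoint for
# the scanning data plane "Scans - Create Or Update" operation. All names and
# the api-version below are placeholders for illustration.
def scan_create_or_update_url(account: str, data_source: str, scan: str,
                              api_version: str = "2022-02-01-preview") -> str:
    """PUT endpoint for creating or updating a scan on a data source."""
    return (f"https://{account}.purview.azure.com/scan"
            f"/datasources/{data_source}/scans/{scan}"
            f"?api-version={api_version}")

print(scan_create_or_update_url("contoso-purview", "synapse-ws", "scan-1"))
# An actual call would PUT a JSON body describing the scan (its kind, scope,
# and the SQL Auth credential reference required by the note above), with an
# Azure AD bearer token attached.
```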

articles/service-bus-messaging/service-bus-tutorial-topics-subscriptions-portal.md

Lines changed: 1 addition & 1 deletion

```diff
@@ -349,7 +349,7 @@ private async Task ReceiveMessages(string subscription)
     // to the broker for the specified amount of seconds and the broker returns messages as soon as they arrive. The client then initiates
     // a new connection. So in reality you would not want to break out of the loop.
     // Also note that the code shows how to batch receive, which you would do for performance reasons. For convenience you can also always
-    // use the regular receive pump which we show in our Quick Start and in other github samples.
+    // use the regular receive pump which we show in our Quick Start and in other GitHub samples.
     while (true)
     {
         try
```

articles/synapse-analytics/sql/develop-openrowset.md

Lines changed: 2 additions & 2 deletions

````diff
@@ -29,7 +29,7 @@ The `OPENROWSET` function can optionally contain a `DATA_SOURCE` parameter to sp
 ```sql
 SELECT *
 FROM OPENROWSET(BULK 'http://<storage account>.dfs.core.windows.net/container/folder/*.parquet',
-                FORMAT = 'PARQUET') AS file
+                FORMAT = 'PARQUET') AS [file]
 ```
 
 This is a quick and easy way to read the content of the files without pre-configuration. This option enables you to use the basic authentication option to access the storage (Azure AD passthrough for Azure AD logins and SAS token for SQL logins).
@@ -40,7 +40,7 @@ This is a quick and easy way to read the content of the files without pre-config
 SELECT *
 FROM OPENROWSET(BULK '/folder/*.parquet',
                 DATA_SOURCE='storage', --> Root URL is in LOCATION of DATA SOURCE
-                FORMAT = 'PARQUET') AS [file]
+                FORMAT = 'PARQUET') AS [file]
 ```
````
