
Commit c6f9008

Merge pull request #223328 from MicrosoftDocs/repo_sync_working_branch
Confirm merge from repo_sync_working_branch to main to sync with https://github.com/MicrosoftDocs/azure-docs (branch main)
2 parents 6b55427 + cb49f0e commit c6f9008

File tree

9 files changed: +41 -9 lines changed

articles/active-directory/develop/tutorial-blazor-server.md

Lines changed: 1 addition & 1 deletion

@@ -54,7 +54,7 @@ Finally, because the app calls a protected API (in this case Microsoft Graph), i

  Run the following command to download the templates for `Microsoft.Identity.Web`, which we'll make use of in this tutorial.

  ```dotnetcli
- dotnet new install Microsoft.Identity.Web.ProjectTemplates
+ dotnet new --install Microsoft.Identity.Web.ProjectTemplates
  ```

  Then, run the following command to create the application. Replace the placeholders in the command with the proper information from your app's overview page and execute the command in a command shell. The output location specified with the `-o|--output` option creates a project folder if it doesn't exist and becomes part of the app's name.

articles/azure-functions/analyze-telemetry-data.md

Lines changed: 29 additions & 0 deletions

@@ -103,6 +103,35 @@ traces

  The runtime provides the `customDimensions.LogLevel` and `customDimensions.Category` fields. You can provide additional fields in logs that you write in your function code. For an example in C#, see [Structured logging](functions-dotnet-class-library.md#structured-logging) in the .NET class library developer guide.

+ ## Query function invocations
+
+ Every function invocation is assigned a unique ID. `InvocationId` is included in the custom dimensions and can be used to correlate all the logs from a particular function execution.
+
+ ```kusto
+ traces
+ | project customDimensions["InvocationId"], message
+ ```
+
+ ## Telemetry correlation
+
+ Logs from different functions can be correlated using `operation_Id`. Use the following query to return all the logs for a specific logical operation.
+
+ ```kusto
+ traces
+ | where operation_Id == '45fa5c4f8097239efe14a2388f8b4e29'
+ | project timestamp, customDimensions["InvocationId"], message
+ | order by timestamp
+ ```
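The correlation semantics of that query can be sketched in a few lines of Python. This is only an illustration: the trace records and their values are hypothetical stand-ins for real Application Insights rows, not output from the query above.

```python
# Sketch of correlating logs by operation_Id, mirroring the Kusto query above.
# The records and their values are hypothetical stand-ins for real trace rows.
traces = [
    {"timestamp": "2022-11-07T10:00:02Z",
     "operation_Id": "45fa5c4f8097239efe14a2388f8b4e29", "message": "Executed"},
    {"timestamp": "2022-11-07T10:00:00Z",
     "operation_Id": "45fa5c4f8097239efe14a2388f8b4e29", "message": "Executing"},
    {"timestamp": "2022-11-07T10:00:01Z",
     "operation_Id": "ffffffffffffffffffffffffffffffff", "message": "Unrelated"},
]

def logs_for_operation(rows, operation_id):
    """Return all logs for one logical operation, ordered by timestamp."""
    matching = [r for r in rows if r["operation_Id"] == operation_id]
    return sorted(matching, key=lambda r: r["timestamp"])

result = logs_for_operation(traces, "45fa5c4f8097239efe14a2388f8b4e29")
print([r["message"] for r in result])  # ['Executing', 'Executed']
```

The `where`/`order by` pair in the Kusto query plays the same role as the filter and sort here: everything sharing one `operation_Id` comes back as a single time-ordered story.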
+
+ ## Sampling percentage
+
+ Sampling configuration can be used to reduce the volume of telemetry. Use the following query to determine whether sampling is operational. If `RetainedPercentage` for any type is less than 100, that type of telemetry is being sampled.
+
+ ```kusto
+ union requests,dependencies,pageViews,browserTimings,exceptions,traces
+ | where timestamp > ago(1d)
+ | summarize RetainedPercentage = 100/avg(itemCount) by bin(timestamp, 1h), itemType
+ ```
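The `RetainedPercentage` arithmetic is easy to sanity-check in Python. Under sampling, each stored item's `itemCount` records how many original telemetry items it represents, so `100 / avg(itemCount)` recovers the fraction retained; the `itemCount` values below are hypothetical.

```python
# Sketch of the RetainedPercentage calculation from the Kusto query above.
# itemCount on a retained item = how many original items it stands in for.
# These values are hypothetical.
item_counts = [4, 4, 4, 4]    # each stored item represents 4 originals

retained_percentage = 100 / (sum(item_counts) / len(item_counts))
print(retained_percentage)    # 25.0 -> only a quarter of the telemetry was kept

unsampled_counts = [1, 1, 1]  # itemCount of 1 means nothing was dropped
print(100 / (sum(unsampled_counts) / len(unsampled_counts)))  # 100.0
```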
  ## Query scale controller logs

  _This feature is in preview._

articles/azure-monitor/app/azure-ad-authentication.md

Lines changed: 1 addition & 0 deletions

@@ -17,6 +17,7 @@ Using various authentication systems can be cumbersome and risky because it's di

  The following are prerequisites to enable Azure AD authenticated ingestion.

+ - Must be in public cloud
  - Familiarity with:
    - [Managed identity](../../active-directory/managed-identities-azure-resources/overview.md).
    - [Service principal](../../active-directory/develop/howto-create-service-principal-portal.md).

articles/azure-monitor/essentials/prometheus-metrics-enable.md

Lines changed: 1 addition & 1 deletion

@@ -269,7 +269,7 @@ The `aks-preview` extension needs to be installed using the following command. F

  ```azurecli
  az extension add --name aks-preview
  ```
- Use the following command to remove the agent from the cluster nodes and delete the recording rules created for the data being collected from the cluster. This doesn't remove the DCE, DCR, or the data already collected and stored in your Azure Monitor workspace.
+ Use the following command to remove the agent from the cluster nodes and delete the recording rules created for the data being collected from the cluster, along with the Data Collection Rule Associations (DCRA) that link the DCE or DCR with your cluster. This doesn't remove the DCE, DCR, or the data already collected and stored in your Azure Monitor workspace.

  ```azurecli
  az aks update --disable-azuremonitormetrics -n <cluster-name> -g <cluster-resource-group>

articles/cognitive-services/language-service/summarization/includes/quickstarts/python-sdk.md

Lines changed: 2 additions & 3 deletions

@@ -10,7 +10,7 @@ ms.custom: ignite-fall-2021

  # [Document summarization](#tab/document-summarization)

- [Reference documentation](/python/api/azure-ai-textanalytics/azure.ai.textanalytics?preserve-view=true&view=azure-python-preview) | [Additional samples](https://github.com/Azure/azure-sdk-for-python/tree/main/sdk/textanalytics/azure-ai-textanalytics/samples) | [Package (PyPi)](https://pypi.org/project/azure-ai-textanalytics/5.2.0b1/) | [Library source code](https://github.com/Azure/azure-sdk-for-python/tree/main/sdk/textanalytics/azure-ai-textanalytics)
+ [Reference documentation](/python/api/azure-ai-textanalytics/azure.ai.textanalytics?preserve-view=true&view=azure-python-preview) | [Additional samples](https://github.com/Azure/azure-sdk-for-python/tree/main/sdk/textanalytics/azure-ai-textanalytics/samples) | [Package (PyPi)](https://pypi.org/project/azure-ai-textanalytics/5.3.0b1/) | [Library source code](https://github.com/Azure/azure-sdk-for-python/tree/main/sdk/textanalytics/azure-ai-textanalytics)

  # [Conversation summarization](#tab/conversation-summarization)

@@ -43,7 +43,7 @@ After installing Python, you can install the client library with:

  # [Document summarization](#tab/document-summarization)

  ```console
- pip install azure-ai-textanalytics==5.2.0b4
+ pip install azure-ai-textanalytics==5.3.0b1
  ```

  # [Conversation summarization](#tab/conversation-summarization)

@@ -251,7 +251,6 @@ resolution: Asked customer to try the following steps | Asked customer for the p

  ---

-
  ## Clean up resources

  If you want to clean up and remove a Cognitive Services subscription, you can delete the resource or resource group. Deleting the resource group also deletes any other resources associated with it.

articles/confidential-computing/quick-create-confidential-vm-arm-amd.md

Lines changed: 3 additions & 0 deletions

@@ -165,6 +165,9 @@ Use this example to create a custom parameter file for a Linux-based confidentia

  az account set --subscription <subscription-id>
  ```
  1. Grant confidential VM Service Principal `Confidential VM Orchestrator` to tenant
+
+    For this step you need to be a Global Admin or you need to have the User Access Administrator RBAC role.
+
  ```azurecli
  Connect-AzureAD -Tenant "your tenant ID"
  New-AzureADServicePrincipal -AppId bf7b6499-ff71-4aa2-97a4-f372087be7f0 -DisplayName "Confidential VM Orchestrator"

articles/cosmos-db/distributed-nosql.md

Lines changed: 2 additions & 2 deletions

@@ -28,11 +28,11 @@ One of the challenges when maintaining a database system is that many database e

  ## Distributed databases

- [Distributed databases](https://en.wikipedia.org/wiki/Distributed_database) refer to databases that scale across many different instances or locations. While many NoSQL databases are designed for scale, not all are necessarily distributed databases. Even more, many NoSQL databases require time and effort to distribute across redundant nodes for local-redundancy or globally for geo-redundancy. The planning, implementation, and networking requirements for a globally distribute database can be complex.
+ [Distributed databases](https://en.wikipedia.org/wiki/Distributed_database) refer to databases that scale across many different instances or locations. While many NoSQL databases are designed for scale, not all are necessarily distributed databases. Even more, many NoSQL databases require time and effort to distribute across redundant nodes for local-redundancy or globally for geo-redundancy. The planning, implementation, and networking requirements for a globally distributed database can be complex.

  ## Azure Cosmos DB

- With a distributed database that is also a NoSQL database, high transactional workloads suddenly became easier to build and manage.[Azure Cosmos DB](introduction.md) is a database platform that offers distributed data APIs in both NoSQL and relational variants. Specifically, many of the NoSQL APIs offer various consistency options that allow you to fine tune the level of consistency or availability that meets your real-world application requirements. Your database could be configured to offer high consistency with tradeoffs to speed and availability. Similarly, your database could be configured to offer the best performance with predictable tradeoffs to consistency and latency of your replicated data. Azure Cosmos DB will automatically and dynamically distribute your data across local instances or globally. Azure Cosmos DB can also provide ACID guarantees and scale throughput to map to your application’s requirements.
+ With a distributed database that is also a NoSQL database, high transactional workloads suddenly became easier to build and manage. [Azure Cosmos DB](introduction.md) is a database platform that offers distributed data APIs in both NoSQL and relational variants. Specifically, many of the NoSQL APIs offer various consistency options that allow you to fine tune the level of consistency or availability that meets your real-world application requirements. Your database could be configured to offer high consistency with tradeoffs to speed and availability. Similarly, your database could be configured to offer the best performance with predictable tradeoffs to consistency and latency of your replicated data. Azure Cosmos DB will automatically and dynamically distribute your data across local instances or globally. Azure Cosmos DB can also provide ACID guarantees and scale throughput to map to your application’s requirements.

  ## Next steps

articles/postgresql/flexible-server/how-to-autovacuum-tuning.md

Lines changed: 1 addition & 1 deletion

@@ -115,7 +115,7 @@ Use the following query to list the tables in a database and identify the tables

      'pg_catalog'
      ,'information_schema'
    )
-   AND N.nspname ! ~ '^pg_toast'
+   AND N.nspname !~ '^pg_toast'
  ) AS av
  ORDER BY av_needed DESC ,n_dead_tup DESC;
  ```
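The fix restores PostgreSQL's negated regular-expression match operator `!~` (the stray space in `! ~` broke it), which excludes every schema whose name starts with `pg_toast`. Its effect can be mimicked with Python's `re` module; the schema names below are hypothetical.

```python
import re

# Stand-in for the SQL filter  N.nspname !~ '^pg_toast' : keep only schema
# names that do NOT match the anchored pattern. Names here are hypothetical.
schemas = ["public", "pg_toast", "pg_toast_temp_1", "sales"]

kept = [s for s in schemas if not re.match(r"^pg_toast", s)]
print(kept)  # ['public', 'sales']
```

Because the pattern is anchored with `^`, only names beginning with `pg_toast` are dropped; a name merely containing that substring elsewhere would survive.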

articles/synapse-analytics/guidance/implementation-success-assess-environment.md

Lines changed: 1 addition & 1 deletion

@@ -13,7 +13,7 @@ ms.date: 05/31/2022

  [!INCLUDE [implementation-success-context](includes/implementation-success-context.md)]

- The first step when implementing Azure Synapse Analytics is to assessment your environment. An assessment provides you with the opportunity to gather all the available information about your existing environment, environmental requirements, project requirements, constraints, timelines, and pain points. This information will form the basis of later evaluations and checkpoint activities. It will prove invaluable when it comes time to validate and compare against the project solution as it's planned, designed, and developed. We recommend that you dedicate a good amount of time to gather all the information and be sure to have necessary discussions with relevant groups. Relevant groups can include project stakeholders, business users, solution designers, and subject matter experts (SMEs) of the existing solution and environment.
+ The first step when implementing Azure Synapse Analytics is to conduct an assessment of your environment. An assessment provides you with the opportunity to gather all the available information about your existing environment, environmental requirements, project requirements, constraints, timelines, and pain points. This information will form the basis of later evaluations and checkpoint activities. It will prove invaluable when it comes time to validate and compare against the project solution as it's planned, designed, and developed. We recommend that you dedicate a good amount of time to gather all the information and be sure to have necessary discussions with relevant groups. Relevant groups can include project stakeholders, business users, solution designers, and subject matter experts (SMEs) of the existing solution and environment.

  The assessment will become a guide to help you evaluate the solution design and make informed technology recommendations to implement Azure Synapse.
1919
