
Commit 741cbe8

Merge branch 'master' of https://github.com/MicrosoftDocs/azure-docs-pr into freshness_c66
2 parents 5416084 + d980f1f commit 741cbe8

10 files changed: +16 -16 lines changed


articles/aks/troubleshooting.md

Lines changed: 1 addition & 1 deletion
@@ -53,7 +53,7 @@ The reason for the warnings on the dashboard is that the cluster is now enabled

 ## I can't connect to the dashboard. What should I do?

-The easiest way to access your service outside the cluster is to run `kubectl proxy`, which proxies requests sent to your localhost port 8001 to the Kubernetes API server. From there, the API server can proxy to your service: `http://localhost:8001/api/v1/namespaces/kube-system/services/kubernetes-dashboard/proxy/#!/node?namespace=default`.
+The easiest way to access your service outside the cluster is to run `kubectl proxy`, which proxies requests sent to your localhost port 8001 to the Kubernetes API server. From there, the API server can proxy to your service: `http://localhost:8001/api/v1/namespaces/kube-system/services/kubernetes-dashboard/proxy/`.

 If you don't see the Kubernetes dashboard, check whether the `kube-proxy` pod is running in the `kube-system` namespace. If it isn't in a running state, delete the pod and it will restart.

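The check described above maps to `kubectl get pods -n kube-system` followed by `kubectl delete pod <pod-name> -n kube-system`. As an illustrative sketch only (not part of this commit), the same check can be scripted with the Kubernetes Python client, assuming the `kubernetes` package is installed and a kubeconfig for the AKS cluster is available:

```python
# Hypothetical helper: delete any kube-proxy pod in kube-system that isn't Running.
# The DaemonSet recreates deleted pods automatically.
from kubernetes import client, config

config.load_kube_config()  # assumes ~/.kube/config points at the AKS cluster
v1 = client.CoreV1Api()

for pod in v1.list_namespaced_pod(namespace="kube-system").items:
    if pod.metadata.name.startswith("kube-proxy") and pod.status.phase != "Running":
        print(f"Deleting {pod.metadata.name} (phase: {pod.status.phase})")
        v1.delete_namespaced_pod(name=pod.metadata.name, namespace="kube-system")
```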
articles/azure-functions/functions-reference-python.md

Lines changed: 1 addition & 1 deletion
@@ -17,7 +17,7 @@ Azure Functions expects a function to be a stateless method in your Python scrip

 Data from triggers and bindings is bound to the function via method attributes using the `name` property defined in the *function.json* file. For example, the _function.json_ below describes a simple function triggered by an HTTP request named `req`:

-:::code language="son" source="~/functions-quickstart-templates/Functions.Templates/Templates/HttpTrigger-Python/function.json":::
+:::code language="json" source="~/functions-quickstart-templates/Functions.Templates/Templates/HttpTrigger-Python/function.json":::

 Based on this definition, the `__init__.py` file that contains the function code might look like the following example:

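That example isn't reproduced in this diff. A minimal `__init__.py` consistent with an HTTP trigger binding named `req` might look like the sketch below; the greeting logic is illustrative rather than copied from the template:

```python
import azure.functions as func


def main(req: func.HttpRequest) -> func.HttpResponse:
    # "req" matches the binding name declared in function.json.
    name = req.params.get("name")
    if name:
        return func.HttpResponse(f"Hello, {name}!")
    return func.HttpResponse(
        "Pass a name in the query string, for example ?name=Azure.",
        status_code=400,
    )
```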
articles/azure-resource-manager/management/resources-without-resource-group-limit.md

Lines changed: 3 additions & 2 deletions
@@ -2,7 +2,9 @@
 title: Resources without 800 count limit
 description: Lists the Azure resource types that can have more than 800 instances in a resource group.
 ms.topic: conceptual
-ms.date: 04/06/2020
+author: davidsmatlak
+ms.author: v-dasmat
+ms.date: 05/04/2020
 ---

 # Resources not limited to 800 instances per resource group
@@ -11,7 +13,6 @@ By default, you can deploy up to 800 instances of a resource type in each resour

 For some resource types, you need to contact support to have the 800 instance limit removed. Those resource types are noted in this article.

-
 ## Microsoft.Automation

 * automationAccounts

articles/cosmos-db/monitor-cosmos-db.md

Lines changed: 3 additions & 3 deletions
@@ -104,15 +104,15 @@ Following are queries that you can use to help you monitor your Azure Cosmos dat

 ```Kusto
 AzureDiagnostics
-| where ResourceProvider=="MICROSOFT.DOCUMENTDB" and Category=="DataPlaneRequests"
+| where ResourceProvider=="Microsoft.DocumentDb" and Category=="DataPlaneRequests"

 ```

 * To query for all operations, grouped by resource:

 ```Kusto
 AzureActivity
-| where ResourceProvider=="MICROSOFT.DOCUMENTDB" and Category=="DataPlaneRequests"
+| where ResourceProvider=="Microsoft.DocumentDb" and Category=="DataPlaneRequests"
 | summarize count() by Resource

 ```
@@ -121,7 +121,7 @@ Following are queries that you can use to help you monitor your Azure Cosmos dat

 ```Kusto
 AzureActivity
-| where Caller == "[email protected]" and ResourceProvider=="MICROSOFT.DOCUMENTDB" and Category=="DataPlaneRequests"
+| where Caller == "[email protected]" and ResourceProvider=="Microsoft.DocumentDb" and Category=="DataPlaneRequests"
 | summarize count() by Resource
 ```

articles/media-services/azure-media-player/azure-media-player-localization.md

Lines changed: 2 additions & 2 deletions
@@ -39,12 +39,12 @@ Azure Media Player currently supports the following languages with their corresp
 | French | fr | Norwegian - Nynorsk | nn | Chinese - simplified | zh-hans |
 | Galician | gl | Polish | pl | Chinese - traditional | zh-hant |
 | Hebrew | he | Portuguese - Brazil | pt-br | | |
-| Hindi | hu | Portuguese - Portugal | pt-pt | | |
+| Hindi | hi | Portuguese - Portugal | pt-pt | | |


 > [!NOTE]
 > If you do not want any localization to occur you must force the language to English

 ## Next steps ##

-- [Azure Media Player Quickstart](azure-media-player-quickstart.md)
+- [Azure Media Player Quickstart](azure-media-player-quickstart.md)

articles/migrate/migrate-support-matrix-vmware.md

Lines changed: 1 addition & 1 deletion
@@ -7,7 +7,7 @@ ms.date: 05/04/2020

 # Support matrix for VMware assessment

-This article summarizes prerequisites and support requirements when you assess VMware VMs for migration to Azure, using the Azure Migrate:Server Assessment](migrate-services-overview.md#azure-migrate-server-assessment-tool) tool. If you want to migrate VMware VMs to Azure, review the [migration support matrix](migrate-support-matrix-vmware-migration.md).
+This article summarizes prerequisites and support requirements when you assess VMware VMs for migration to Azure, using the [Azure Migrate:Server Assessment](migrate-services-overview.md#azure-migrate-server-assessment-tool) tool. If you want to migrate VMware VMs to Azure, review the [migration support matrix](migrate-support-matrix-vmware-migration.md).

 To assess VMware VMs, you create an Azure Migrate project, and then add the Server Assessment tool to the project. After the tool is added, you deploy the [Azure Migrate appliance](migrate-appliance.md). The appliance continuously discovers on-premises machines, and sends machine metadata and performance data to Azure. After discovery is complete, you gather discovered machines into groups, and run an assessment for a group.

articles/storage/blobs/data-lake-storage-quickstart-create-databricks-account.md

Lines changed: 1 addition & 1 deletion
@@ -73,7 +73,7 @@ In this section, you create an Azure Databricks workspace using the Azure portal

 For more information on creating clusters, see [Create a Spark cluster in Azure Databricks](https://docs.azuredatabricks.net/user-guide/clusters/create.html).

-## Create storage account container
+## Create notebook

 In this section, you create a notebook in Azure Databricks workspace and then run code snippets to configure the storage account.

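The snippets themselves aren't part of this diff. As a rough illustration only, a first notebook cell that configures access to the storage account might look like the sketch below; it assumes account-key authentication and a Databricks secret scope (the quickstart may use a service principal instead), and every name is a placeholder:

```python
# Run in a Python notebook cell; `spark` and `dbutils` are predefined in Databricks.
storage_account = "<storage-account-name>"   # placeholder
container = "<container-name>"               # placeholder

# Account-key auth for ABFS; the key is read from a secret scope rather than hard-coded.
spark.conf.set(
    f"fs.azure.account.key.{storage_account}.dfs.core.windows.net",
    dbutils.secrets.get(scope="<scope-name>", key="<storage-key-name>"),
)

# Quick sanity check: list the container root.
display(dbutils.fs.ls(f"abfss://{container}@{storage_account}.dfs.core.windows.net/"))
```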
articles/storage/blobs/storage-blob-static-website-how-to.md

Lines changed: 1 addition & 2 deletions
@@ -154,7 +154,7 @@ Upload objects to the *$web* container from a source directory.
 This example assumes that you're running commands from Azure Cloud Shell session.

 ```azurecli-interactive
-az storage blob upload-batch -s <source-path> -d \$web --account-name <storage-account-name> --content-type 'text/html; charset=utf-8'
+az storage blob upload-batch -s <source-path> -d \$web --account-name <storage-account-name>
 ```

 * Replace the `<storage-account-name>` placeholder value with the name of your storage account.
@@ -173,7 +173,6 @@ Upload objects to the *$web* container from a source directory.
 ```powershell
 # upload a file
 set-AzStorageblobcontent -File "<path-to-file>" `
--Properties @{ ContentType = "text/html; charset=utf-8";} `
 -Container `$web `
 -Blob "<blob-name>" `
 -Context $ctx
articles/storage/blobs/storage-blob-static-website.md

Lines changed: 2 additions & 2 deletions
@@ -36,8 +36,8 @@ Files in the **$web** container are case-sensitive, served through anonymous acc
 You can use any of these tools to upload content to the **$web** container:

 > [!div class="checklist"]
-> * [Azure CLI](storage-blob-static-website-how-to.md#cli)
-> * [Azure PowerShell module](storage-blob-static-website-how-to.md#powershell)
+> * [Azure CLI](storage-blob-static-website-how-to.md?tabs=azure-cli)
+> * [Azure PowerShell module](storage-blob-static-website-how-to.md?tabs=azure-powershell)
 > * [AzCopy](../common/storage-use-azcopy-v10.md)
 > * [Azure Storage Explorer](https://azure.microsoft.com/features/storage-explorer/)
 > * [Azure Pipelines](https://azure.microsoft.com/services/devops/pipelines/)

articles/synapse-analytics/sql-data-warehouse/sql-data-warehouse-source-control-integration.md

Lines changed: 1 addition & 1 deletion
@@ -75,4 +75,4 @@ This tutorial outlines how to integrate your SQL Server Data tools (SSDT) databa

 ## Next steps

-- [Developing for Developing for SQL pool](sql-data-warehouse-overview-develop.md)
+- [Developing for SQL pool](sql-data-warehouse-overview-develop.md)
