Commit 4895819

Merge pull request #102680 from MicrosoftDocs/master
1/29 AM Publish
2 parents: 5d6ce6d + 7ed9545

20 files changed: +213 -110 lines

.openpublishing.redirection.json

Lines changed: 1 addition & 1 deletion
@@ -14968,7 +14968,7 @@
 "redirect_document_id": false
 },
 {
-"source_path": "articles/machine-learning/machine-learning/how-to-ui-sample-classification-predict-flight-delay.md",
+"source_path": "articles/machine-learning/how-to-ui-sample-classification-predict-flight-delay.md",
 "redirect_url": "/azure/machine-learning/how-to-designer-sample-classification-flight-delay",
 "redirect_document_id": false
 },

articles/active-directory-b2c/quickstart-native-app-desktop.md

Lines changed: 1 addition & 1 deletion
@@ -25,7 +25,7 @@ Azure Active Directory B2C (Azure AD B2C) provides cloud identity management to
 
 - [Visual Studio 2019](https://www.visualstudio.com/downloads/) with the **ASP.NET and web development** workload.
 - A social account from either Facebook, Google, or Microsoft.
-- [Download a zip file](https://github.com/Azure-Samples/active-directory-b2c-dotnet-desktop/archive/master.zip) or clone the sample web app from GitHub.
+- [Download a zip file](https://github.com/Azure-Samples/active-directory-b2c-dotnet-desktop/archive/msalv3.zip) or clone the [Azure-Samples/active-directory-b2c-dotnet-desktop](https://github.com/Azure-Samples/active-directory-b2c-dotnet-desktop) repository from GitHub.
 
 ```
 git clone https://github.com/Azure-Samples/active-directory-b2c-dotnet-desktop.git

articles/active-directory/b2b/tutorial-bulk-invite.md

Lines changed: 1 addition & 0 deletions
@@ -28,6 +28,7 @@ ms.collection: M365-identity-device-management
 
 > [!NOTE]
 > As of 12/22/2019, the Bulk invite users (Preview) feature has been temporarily disabled.
+> There is currently no known date for when this feature will be re-enabled.
 
 If you use Azure Active Directory (Azure AD) B2B collaboration to work with external partners, you can invite multiple guest users to your organization at the same time. In this tutorial, you learn how to use the Azure portal to send bulk invitations to external users. Specifically, you do the following:
 

articles/active-directory/cloud-provisioning/how-to-prerequisites.md

Lines changed: 3 additions & 0 deletions
@@ -25,6 +25,9 @@ You need the following to use Azure AD Connect cloud provisioning:
 - An on-premises server for the provisioning agent with Windows 2012 R2 or later.
 - On-premises firewall configurations.
 
+>[!NOTE]
+>The provisioning agent can currently only be installed on English language servers. Installing an English language pack on a non-English server is not a valid workaround and will result in the agent failing to install.
+
 The rest of the document provides step-by-step instructions for these prerequisites.
 
 ### In the Azure Active Directory admin center

articles/api-management/api-management-howto-log-event-hubs.md

Lines changed: 3 additions & 54 deletions
@@ -26,60 +26,9 @@ This article is a companion to the [Integrate Azure API Management with Event Hu
 For detailed steps on how to create an event hub and get connection strings that you need to send and receive events to and from the Event Hub, see [Create an Event Hubs namespace and an event hub using the Azure portal](https://docs.microsoft.com/azure/event-hubs/event-hubs-create).
 
 ## Create an API Management logger
-Now that you have an Event Hub, the next step is to configure a [Logger](https://docs.microsoft.com/rest/api/apimanagement/apimanagementrest/azure-api-management-rest-api-logger-entity) in your API Management service so that it can log events to the Event Hub.
+Now that you have an Event Hub, the next step is to configure a [Logger](https://docs.microsoft.com/rest/api/apimanagement/2019-01-01/logger) in your API Management service so that it can log events to the Event Hub.
 
-API Management loggers are configured using the [API Management REST API](https://aka.ms/smapi). Before using the REST API for the first time, review the [prerequisites](https://docs.microsoft.com/rest/api/apimanagement/apimanagementrest/api-management-rest) and ensure that you have [enabled access to the REST API](https://docs.microsoft.com/rest/api/apimanagement/apimanagementrest/api-management-rest#EnableRESTAPI).
-
-To create a logger, make an HTTP PUT request using the following URL template:
-
-`https://{your service}.management.azure-api.net/loggers/{new logger name}?api-version=2017-03-01`
-
-* Replace `{your service}` with the name of your API Management service instance.
-* Replace `{new logger name}` with the desired name for your new logger. You reference this name when you configure the [log-to-eventhub](/azure/api-management/api-management-advanced-policies#log-to-eventhub) policy
-
-Add the following headers to the request:
-
-* Content-Type : application/json
-* Authorization : SharedAccessSignature 58...
-* For instructions on generating the `SharedAccessSignature` see [Azure API Management REST API Authentication](https://docs.microsoft.com/rest/api/apimanagement/apimanagementrest/azure-api-management-rest-api-authentication).
-
-Specify the request body using the following template:
-
-```json
-{
-"loggerType" : "AzureEventHub",
-"description" : "Sample logger description",
-"credentials" : {
-"name" : "Name of the Event Hub from the portal",
-"connectionString" : "Endpoint=Event Hub Sender connection string"
-}
-}
-```
-
-* `loggerType` must be set to `AzureEventHub`.
-* `description` provides an optional description of the logger and can be a zero length string if desired.
-* `credentials` contains the `name` and `connectionString` of your Azure Event Hub.
-
-When you make the request, if the logger is created, a status code of `201 Created` is returned. A sample response based on the above sample request is shown below.
-
-```json
-{
-"id": "/loggers/{new logger name}",
-"loggerType": "azureEventHub",
-"description": "Sample logger description",
-"credentials": {
-"name": "Name of the Event Hub from the Portal",
-"connectionString": "{{Logger-Credentials-xxxxxxxxxxxxxxx}}"
-},
-"isBuffered": true,
-"resourceId": null
-}
-```
-
-> [!NOTE]
-> For other possible return codes and their reasons, see [Create a Logger](https://docs.microsoft.com/rest/api/apimanagement/apimanagementrest/azure-api-management-rest-api-logger-entity#PUT). To see how to perform other operations such as list, update, and delete, see the [Logger](https://docs.microsoft.com/rest/api/apimanagement/apimanagementrest/azure-api-management-rest-api-logger-entity) entity documentation.
->
->
+API Management loggers are configured using the [API Management REST API](https://aka.ms/apimapi). For detailed request examples, see [how to create Loggers](https://docs.microsoft.com/rest/api/apimanagement/2019-01-01/logger/createorupdate).
 
 ## Configure log-to-eventhubs policies
 
@@ -112,7 +61,7 @@ Click **Save** to save the updated policy configuration. As soon as it is saved
 * [Receive messages with EventProcessorHost](../event-hubs/event-hubs-dotnet-standard-getstarted-receive-eph.md)
 * [Event Hubs programming guide](../event-hubs/event-hubs-programming-guide.md)
 * Learn more about API Management and Event Hubs integration
-* [Logger entity reference](https://docs.microsoft.com/rest/api/apimanagement/apimanagementrest/azure-api-management-rest-api-logger-entity)
+* [Logger entity reference](https://docs.microsoft.com/rest/api/apimanagement/2019-01-01/logger)
 * [log-to-eventhub policy reference](https://docs.microsoft.com/azure/api-management/api-management-advanced-policies#log-to-eventhub)
 * [Monitor your APIs with Azure API Management, Event Hubs, and Moesif](api-management-log-to-eventhub-sample.md)
 * Learn more about [integration with Azure Application Insights](api-management-howto-app-insights.md)
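The rewritten section above defers all request examples to the 2019-01-01 Logger reference. As a hedged sketch of what that createOrUpdate call looks like through the ARM-based endpoint: the subscription, resource group, service, logger, and Event Hub names below are hypothetical placeholders, and the request body shape follows the Logger entity's `properties` wrapper.

```shell
# Hypothetical identifiers; substitute your own before running.
SUBSCRIPTION_ID="00000000-0000-0000-0000-000000000000"
RESOURCE_GROUP="my-resource-group"
SERVICE_NAME="my-apim-service"
LOGGER_ID="my-eventhub-logger"

# ARM-style Logger createOrUpdate URL for the 2019-01-01 API version.
URL="https://management.azure.com/subscriptions/${SUBSCRIPTION_ID}/resourceGroups/${RESOURCE_GROUP}/providers/Microsoft.ApiManagement/service/${SERVICE_NAME}/loggers/${LOGGER_ID}?api-version=2019-01-01"

# Request body: loggerType plus the Event Hub name and sender connection string.
BODY='{
  "properties": {
    "loggerType": "azureEventHub",
    "description": "Sample logger description",
    "credentials": {
      "name": "my-event-hub",
      "connectionString": "Endpoint=sb://..."
    }
  }
}'

# With a valid Azure AD bearer token this would create the logger:
# curl -X PUT "$URL" -H "Authorization: Bearer $TOKEN" \
#      -H "Content-Type: application/json" -d "$BODY"
echo "$URL"
```

This replaces the retired direct-management endpoint (`{service}.management.azure-api.net` with a SharedAccessSignature) that the removed text described; the newer API authenticates with Azure AD instead.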

articles/azure-databricks/databricks-extract-load-sql-data-warehouse.md

Lines changed: 6 additions & 6 deletions
@@ -7,7 +7,7 @@ ms.reviewer: jasonh
 ms.service: azure-databricks
 ms.custom: mvc
 ms.topic: tutorial
-ms.date: 06/20/2019
+ms.date: 01/29/2020
 ---
 # Tutorial: Extract, transform, and load data by using Azure Databricks
 
@@ -57,7 +57,7 @@ Complete these tasks before you begin this tutorial:
 
 If you'd prefer to use an access control list (ACL) to associate the service principal with a specific file or directory, reference [Access control in Azure Data Lake Storage Gen2](../storage/blobs/data-lake-storage-access-control.md).
 
-* When performing the steps in the [Get values for signing in](https://docs.microsoft.com/azure/active-directory/develop/howto-create-service-principal-portal#get-values-for-signing-in) section of the article, paste the tenant ID, app ID, and password values into a text file. You'll need those soon.
+* When performing the steps in the [Get values for signing in](https://docs.microsoft.com/azure/active-directory/develop/howto-create-service-principal-portal#get-values-for-signing-in) section of the article, paste the tenant ID, app ID, and secret values into a text file. You'll need those soon.
 
 * Sign in to the [Azure portal](https://portal.azure.com/).
 
@@ -165,23 +165,23 @@ In this section, you create a notebook in Azure Databricks workspace and then ru
 ```scala
 val storageAccountName = "<storage-account-name>"
 val appID = "<app-id>"
-val password = "<password>"
+val secret = "<secret>"
 val fileSystemName = "<file-system-name>"
 val tenantID = "<tenant-id>"
 
 spark.conf.set("fs.azure.account.auth.type." + storageAccountName + ".dfs.core.windows.net", "OAuth")
 spark.conf.set("fs.azure.account.oauth.provider.type." + storageAccountName + ".dfs.core.windows.net", "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider")
 spark.conf.set("fs.azure.account.oauth2.client.id." + storageAccountName + ".dfs.core.windows.net", "" + appID + "")
-spark.conf.set("fs.azure.account.oauth2.client.secret." + storageAccountName + ".dfs.core.windows.net", "" + password + "")
+spark.conf.set("fs.azure.account.oauth2.client.secret." + storageAccountName + ".dfs.core.windows.net", "" + secret + "")
 spark.conf.set("fs.azure.account.oauth2.client.endpoint." + storageAccountName + ".dfs.core.windows.net", "https://login.microsoftonline.com/" + tenantID + "/oauth2/token")
 spark.conf.set("fs.azure.createRemoteFileSystemDuringInitialization", "true")
 dbutils.fs.ls("abfss://" + fileSystemName + "@" + storageAccountName + ".dfs.core.windows.net/")
 spark.conf.set("fs.azure.createRemoteFileSystemDuringInitialization", "false")
 ```
 
-6. In this code block, replace the `<app-id>`, `<password>`, `<tenant-id>`, and `<storage-account-name>` placeholder values in this code block with the values that you collected while completing the prerequisites of this tutorial. Replace the `<file-system-name>` placeholder value with whatever name you want to give the file system.
+6. In this code block, replace the `<app-id>`, `<secret>`, `<tenant-id>`, and `<storage-account-name>` placeholder values in this code block with the values that you collected while completing the prerequisites of this tutorial. Replace the `<file-system-name>` placeholder value with whatever name you want to give the file system.
 
-* The `<app-id>`, and `<password>` are from the app that you registered with active directory as part of creating a service principal.
+* The `<app-id>`, and `<secret>` are from the app that you registered with active directory as part of creating a service principal.
 
 * The `<tenant-id>` is from your subscription.
 
articles/azure-monitor/platform/action-groups.md

Lines changed: 1 addition & 1 deletion
@@ -88,7 +88,7 @@ Send email to the members of the subscription's role.
 You may have a limited number of email actions in an Action Group. See the [rate limiting information](./../../azure-monitor/platform/alerts-rate-limiting.md) article.
 
 ### Function
-The function keys for Function Apps configured as actions are read through the Functions API, which currently requires v2 function apps to configure the app setting "AzureWebJobsSecretStorageType" to "files". For more information, see [Changes to Key Management in Functions V2](https://aka.ms/funcsecrets).
+Calls an existing HTTP trigger endpoint in [Azure Functions](../../azure-functions/functions-create-first-azure-function.md#create-a-function-app).
 
 You may have a limited number of Function actions in an Action Group.
 
articles/azure-monitor/platform/collect-custom-metrics-linux-telegraf.md

Lines changed: 1 addition & 1 deletion
@@ -15,7 +15,7 @@ By using Azure Monitor, you can collect custom metrics via your application tele
 
 ## InfluxData Telegraf agent
 
-[Telegraf](https://docs.influxdata.com/telegraf/v1.7/) is a plug-in-driven agent that enables the collection of metrics from over 150 different sources. Depending on what workloads run on your VM, you can configure the agent to leverage specialized input plug-ins to collect metrics. Examples are MySQL, NGINX, and Apache. By using output plug-ins, the agent can then write to destinations that you choose. The Telegraf agent has integrated directly with the Azure Monitor custom metrics REST API. It supports an Azure Monitor output plug-in. By using this plug-in, the agent can collect workload-specific metrics on your Linux VM and submit them as custom metrics to Azure Monitor.
+[Telegraf](https://docs.influxdata.com/telegraf/) is a plug-in-driven agent that enables the collection of metrics from over 150 different sources. Depending on what workloads run on your VM, you can configure the agent to leverage specialized input plug-ins to collect metrics. Examples are MySQL, NGINX, and Apache. By using output plug-ins, the agent can then write to destinations that you choose. The Telegraf agent has integrated directly with the Azure Monitor custom metrics REST API. It supports an Azure Monitor output plug-in. By using this plug-in, the agent can collect workload-specific metrics on your Linux VM and submit them as custom metrics to Azure Monitor.
 
 ![Telegraph agent overview](./media/collect-custom-metrics-linux-telegraf/telegraf-agent-overview.png)
 
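The paragraph above describes the input-plug-in/output-plug-in pipeline in prose. A minimal sketch of what that wiring looks like in practice, assuming the `[[outputs.azure_monitor]]` plug-in name from the Telegraf docs; the file name, input choice, and settings here are illustrative only:

```shell
# Write a hypothetical minimal Telegraf config: one input plug-in (CPU)
# routed to the Azure Monitor output plug-in described in the article.
cat > telegraf-azure.conf <<'EOF'
[[inputs.cpu]]
  percpu = true
  totalcpu = true

[[outputs.azure_monitor]]
  # With no explicit credentials, this plug-in authenticates to the
  # custom metrics REST API using the VM's managed identity.
EOF

# On the VM you would then point the agent at this file, for example:
# telegraf --config telegraf-azure.conf
```

The design mirrors the prose: inputs decide what is collected, the output decides where it goes, and swapping in `[[inputs.mysql]]` or `[[inputs.nginx]]` changes only the collection side.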

articles/blockchain/templates/hyperledger-fabric-consortium-azure-kubernetes-service.md

Lines changed: 1 addition & 1 deletion
@@ -462,7 +462,7 @@ npm run queryCC -- -o $ORGNAME -u $USER_IDENTITY -n $CC_NAME -c $CHANNEL -f <que
 
 ```
 
-Pass query function name and comma separated list of arguments in `<queryFunction>` and `<queryFuncArgs>` respectively. Again, taking `fabcar` chaincode as reference, to query all the cars in the world state set `<queryFunction>` to `"queryAllCars"` and `<queryArgs>' to `""`.
+Pass query function name and comma separated list of arguments in `<queryFunction>` and `<queryFuncArgs>` respectively. Again, taking `fabcar` chaincode as reference, to query all the cars in the world state set `<queryFunction>` to `"queryAllCars"` and `<queryArgs>` to `""`.
 
 Refer command help for more details on the arguments passed in the command
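To make the corrected placeholder text concrete, here is a hedged sketch of a filled-in `queryAllCars` invocation. The org, user, chaincode, and channel values are hypothetical, and the `-a` flag for the argument list is an assumption (the hunk header truncates the command; check the command help the article mentions):

```shell
# Hypothetical values; only -o, -u, -n, -c, -f come from the article's command.
ORGNAME="org1"
USER_IDENTITY="admin.${ORGNAME}"
CC_NAME="fabcar"
CHANNEL="mychannel"

# queryAllCars takes no chaincode arguments, so the argument list is "".
CMD="npm run queryCC -- -o $ORGNAME -u $USER_IDENTITY -n $CC_NAME -c $CHANNEL -f \"queryAllCars\" -a \"\""
echo "$CMD"   # run from the sample application directory after deployment
```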
