Commit 1a4b856

Merge branch 'master' of https://github.com/MicrosoftDocs/azure-docs-pr into fSD
2 parents: e710771 + fb9c5e6

14 files changed (+65 / -64 lines)

articles/active-directory/app-provisioning/use-scim-to-provision-users-and-groups.md

Lines changed: 0 additions & 4 deletions
@@ -1224,10 +1224,6 @@ To help drive awareness and demand of our joint integration, we recommend you up
 )
 * **Customer communication.** Alert customers of the new integration through your customer communication (monthly newsletters, email campaigns, product release notes).
 
-### Allow IP addresses used by the Azure AD provisioning service to make SCIM requests
-
-Certain apps allow inbound traffic to their app. In order for the Azure AD provisioning service to function as expected, the IP addresses used must be allowed. For a list of IP addresses for each service tag/region, see the JSON file - [Azure IP Ranges and Service Tags - Public Cloud](https://www.microsoft.com/download/details.aspx?id=56519). You can download and program these IPs into your firewall as needed. The reserved IP ranges for Azure AD provisioning can be found under "AzureActiveDirectoryDomainServices."
-
 ## Related articles
 
 * [Automate user provisioning and deprovisioning to SaaS apps](user-provisioning.md)
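The deleted section above pointed readers at the downloadable "Azure IP Ranges and Service Tags - Public Cloud" JSON file. As a hedged sketch of how those reserved ranges could be extracted (the `values` / `properties.addressPrefixes` layout mirrors the published file, but the snippet runs against an inline sample rather than the live download, and the second tag entry is illustrative only):

```python
import json

# Minimal inline sample mirroring the layout of the published
# "Azure IP Ranges and Service Tags - Public Cloud" JSON file.
# The "Storage" entry below is an illustrative placeholder.
sample = """
{
  "values": [
    {
      "name": "AzureActiveDirectoryDomainServices",
      "properties": {"addressPrefixes": ["20.190.128.0/18", "40.126.0.0/18"]}
    },
    {
      "name": "Storage",
      "properties": {"addressPrefixes": ["20.38.96.0/23"]}
    }
  ]
}
"""

def prefixes_for_tag(doc: dict, tag: str) -> list:
    """Return the address prefixes listed for one service tag, or []."""
    for entry in doc.get("values", []):
        if entry.get("name") == tag:
            return entry["properties"]["addressPrefixes"]
    return []

doc = json.loads(sample)
print(prefixes_for_tag(doc, "AzureActiveDirectoryDomainServices"))
```

The returned CIDR prefixes are what you would program into a firewall allowlist.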

articles/active-directory/develop/authentication-vs-authorization.md

Lines changed: 6 additions & 6 deletions
@@ -1,5 +1,5 @@
 ---
-title: Authentication vs. authorization | Azure
+title: Authentication vs authorization | Azure
 titleSuffix: Microsoft identity platform
 description: Learn about the basics of authentication and authorization in Microsoft identity platform (v2.0).
 services: active-directory
@@ -10,14 +10,14 @@ ms.service: active-directory
 ms.subservice: develop
 ms.topic: conceptual
 ms.workload: identity
-ms.date: 05/18/2020
+ms.date: 05/22/2020
 ms.author: ryanwi
 ms.reviewer: jmprieur, saeeda, sureshja, hirsin
 ms.custom: aaddev, identityplatformtop40, scenarios:getting-started
 #Customer intent: As an application developer, I want to understand the basic concepts of authentication and authorization in Microsoft identity platform
 ---
 
-# Authentication vs. authorization
+# Authentication vs authorization
 
 This article defines authentication and authorization and briefly covers how you can use the Microsoft identity platform to authenticate and authorize users in your web apps, web APIs, or apps calling protected web APIs. If you see a term you aren't familiar with, try our [glossary](developer-glossary.md) or our [Microsoft identity platform videos](identity-videos.md) which cover basic concepts.
 
@@ -39,9 +39,9 @@ Microsoft identity platform simplifies authorization and authentication for appl
 
 Following is a brief comparison of the various protocols used by Microsoft identity platform:
 
-* **OAuth vs. OpenID Connect**: OAuth is used for authorization and OpenID Connect (OIDC) is used for authentication. OpenID Connect is built on top of OAuth 2.0, so the terminology and flow are similar between the two. You can even both authenticate a user (using OpenID Connect) and get authorization to access a protected resource that the user owns (using OAuth 2.0) in one request. For more information, see [OAuth 2.0 and OpenID Connect protocols](active-directory-v2-protocols.md) and [OpenID Connect protocol](v2-protocols-oidc.md).
-* **OAuth vs. SAML**: OAuth is used for authorization and SAML is used for authentication. See [Microsoft identity platform and OAuth 2.0 SAML bearer assertion flow](v2-saml-bearer-assertion.md) for more information on how the two protocols can be used together to both authenticate a user (using SAML) and get authorization to access a protected resource (using OAuth 2.0).
-* **OpenID Connect vs. SAML**: Both OpenID Connect and SAML are used to authenticate a user and are used to enable Single Sign On. SAML authentication is commonly used with identity providers such as Active Directory Federation Services (ADFS) federated to Azure AD and is therefore frequently used in enterprise applications. OpenID Connect is commonly used for apps that are purely in the cloud, such as mobile apps, web sites, and web APIs.
+* **OAuth vs OpenID Connect**: OAuth is used for authorization and OpenID Connect (OIDC) is used for authentication. OpenID Connect is built on top of OAuth 2.0, so the terminology and flow are similar between the two. You can even both authenticate a user (using OpenID Connect) and get authorization to access a protected resource that the user owns (using OAuth 2.0) in one request. For more information, see [OAuth 2.0 and OpenID Connect protocols](active-directory-v2-protocols.md) and [OpenID Connect protocol](v2-protocols-oidc.md).
+* **OAuth vs SAML**: OAuth is used for authorization and SAML is used for authentication. See [Microsoft identity platform and OAuth 2.0 SAML bearer assertion flow](v2-saml-bearer-assertion.md) for more information on how the two protocols can be used together to both authenticate a user (using SAML) and get authorization to access a protected resource (using OAuth 2.0).
+* **OpenID Connect vs SAML**: Both OpenID Connect and SAML are used to authenticate a user and are used to enable Single Sign On. SAML authentication is commonly used with identity providers such as Active Directory Federation Services (ADFS) federated to Azure AD and is therefore frequently used in enterprise applications. OpenID Connect is commonly used for apps that are purely in the cloud, such as mobile apps, web sites, and web APIs.
 
 ## Next steps
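The OAuth-vs-OIDC bullet in this file notes that one request can both authenticate a user (OpenID Connect) and get authorization to a protected resource (OAuth 2.0). A hedged sketch of what that looks like against the Microsoft identity platform v2.0 authorize endpoint (the tenant, client ID, and redirect URI below are hypothetical placeholders; only standard-library URL building is used):

```python
from urllib.parse import urlencode

# Hypothetical placeholders -- substitute your own tenant and app registration.
TENANT = "contoso.onmicrosoft.com"
CLIENT_ID = "00000000-0000-0000-0000-000000000000"
REDIRECT_URI = "https://localhost/callback"

def build_authorize_url(scopes):
    """Build a single v2.0 authorization request.

    Including the "openid" scope makes this an OpenID Connect sign-in
    (authentication); the remaining scopes request OAuth 2.0 access to a
    protected resource (authorization) -- both in one request.
    """
    params = {
        "client_id": CLIENT_ID,
        "response_type": "code",
        "redirect_uri": REDIRECT_URI,
        "scope": " ".join(scopes),
    }
    return (f"https://login.microsoftonline.com/{TENANT}"
            f"/oauth2/v2.0/authorize?{urlencode(params)}")

url = build_authorize_url(
    ["openid", "profile", "https://graph.microsoft.com/User.Read"])
print(url)
```

The auth code returned to the redirect URI can then be exchanged for both an ID token (who the user is) and an access token (what they may access).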

articles/cognitive-services/LUIS/includes/sign-in-process.md

Lines changed: 2 additions & 2 deletions
Original file line numberDiff line numberDiff line change
@@ -16,7 +16,7 @@ ms.author: diberry
1616

1717
A new user to LUIS needs to follow this procedure:
1818

19-
1. Sign in to [LUIS portal](https://www.luis.ai), select your country/region and agree to the terms of use. If you see **My Apps** instead, a LUIS resource already exists and you should skip ahead to create an app.
19+
1. Sign in to [LUIS portal](https://www.luis.ai), select your country/region and agree to the terms of use. If you see **My Apps** instead, a LUIS resource already exists and you should skip ahead to create an app. For supported regions, visit [authoring and publishing regions and the associated keys](https://docs.microsoft.com/azure/cognitive-services/luis/luis-reference-regions).
2020

2121
1. Select **Create Azure resource** then select **Create an authoring resource to migrate your apps to.**
2222

@@ -41,4 +41,4 @@ A new user to LUIS needs to follow this procedure:
4141

4242
1. Confirm by selecting **Continue**.
4343

44-
![Create authoring resource](../media/sign-in/sign-in-confirm-continue.png)
44+
![Create authoring resource](../media/sign-in/sign-in-confirm-continue.png)

articles/cost-management-billing/manage/change-credit-card.md

Lines changed: 8 additions & 3 deletions
@@ -23,9 +23,9 @@ If you have a Microsoft Customer Agreement, your payment methods are associated
 
 ## Manage credit cards for an Azure subscription
 
-The following sections apply to customers who have a Microsoft Online Services Program billing account. Learn how to [check your billing account type](#check-the-type-of-your-account). If your billing account type is Microsoft Online Services Program, payment methods are associated with individual Azure subscriptions.
+The following sections apply to customers who have a Microsoft Online Services Program billing account. Learn how to [check your billing account type](#check-the-type-of-your-account). If your billing account type is Microsoft Online Services Program, payment methods are associated with individual Azure subscriptions. If you get an error after you add the credit card, see [Credit card declined at Azure sign-up](../../billing/billing-credit-card-fails-during-azure-sign-up.md).
 
-### Change credit card for a subscription
+### Change credit card for a subscription by adding a new credit card
 
 You can change the default credit of your Azure subscription to a new credit card or previously saved credit card in the Azure portal. You must be the Account Administrator to change the credit card. If more than one of your subscriptions have the same active payment method, then changing the active payment method on any of these subscriptions also updates the active payment method on the others.
 
@@ -51,7 +51,7 @@ You can change your subscription's default credit card to a new one by following
 
 1. Select **Next**.
 
-If you get an error after you add the credit card, see [Credit card declined at Azure sign-up](../../billing/billing-credit-card-fails-during-azure-sign-up.md).
+### Change credit card for a subscription to a previously saved credit card
 
 You can also change your subscription's default credit card to a one that is already saved to your account by following these steps:
 
@@ -110,6 +110,7 @@ If your credit card is the active payment method for any of your Microsoft subsc
 The following sections apply to customers who have a Microsoft Customer Agreement and signed up for Azure online with a credit card. [Learn how to check if you have a Microsoft Customer Agreement](#check-the-type-of-your-account).
 
 ### Change default credit card
+
 If you have a Microsoft Customer Agreement, your credit card is associated with a billing profile. To change the payment method for a billing profile, you must be the person who signed up for Azure and created the billing account.
 
 If you'd like to change your billing profile's default payment method to check/wire transfer, see [Pay for Azure subscriptions by invoice](pay-by-invoice.md).
@@ -153,9 +154,11 @@ To edit or delete a credit card, follow these steps:
 1. To delete your credit card, select **Delete** from the context menu.
 
 ## Troubleshooting
+
 We do not support virtual or prepaid cards. If you are getting errors when adding or updating a valid credit card, try opening your browser in private mode.
 
 ## Frequently asked questions
+
 The following sections answer commonly asked questions about changing your credit card information.
 
 ### My subscription is disabled. Why can't I remove my credit card now?
@@ -183,11 +186,13 @@ If you're [paying by invoice](pay-by-invoice.md), send your payment to the locat
 To add or update tax ID, update your profile in the [Azure Account Center](https://account.azure.com/Profile), then select **Tax record**. This tax ID is used for tax exemption calculations and appears on your invoice.
 
 ## Check the type of your account
+
 [!INCLUDE [billing-check-mca](../../../includes/billing-check-account-type.md)]
 
 ## Need help? Contact us.
 
 If you have questions or need help, [create a support request](https://go.microsoft.com/fwlink/?linkid=2083458).
 
 ## Next steps
+
 - Learn about [Azure reservations](../reservations/save-compute-costs-reservations.md) to see if they can save you money.

articles/data-factory/author-visually.md

Lines changed: 2 additions & 2 deletions
@@ -26,7 +26,7 @@ To open the **authoring canvas**, click on the pencil icon.
 
 ![Authoring Canvas](media/author-visually/authoring-canvas.png)
 
-Here, you will author the pipelines, activities, datasets, linked services, data flows, triggers, and integration runtimes that comprise your factory. To get started building a pipeline using the authoring canvas, see [Copy data using the copy Activity](tutorial-copy-data-portal.md).
+Here, you author the pipelines, activities, datasets, linked services, data flows, triggers, and integration runtimes that comprise your factory. To get started building a pipeline using the authoring canvas, see [Copy data using the copy Activity](tutorial-copy-data-portal.md).
 
 The default visual authoring experience is directly working with the Data Factory service. Azure Repos Git or GitHub integration is also supported to allow source control and collaboration for work on your data factory pipelines. To learn more about the differences between these authoring experiences, see [Source control in Azure Data Factory](source-control.md).
 
@@ -36,7 +36,7 @@ For top-level resources such as pipelines, datasets, and data flows, high-level
 
 ![Authoring Canvas](media/author-visually/properties-pane.png)
 
-The properties pane will only be open by default on resource creation. To edit it, click on the properties pane icon located in the top-right corner of the canvas.
+The properties pane only opens by default on resource creation. To edit it, click on the properties pane icon located in the top-right corner of the canvas.
 
 ## Expressions and functions

articles/data-factory/concepts-data-flow-monitoring.md

Lines changed: 11 additions & 11 deletions
@@ -14,17 +14,17 @@ ms.date: 04/17/2020
 
 [!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
 
-After you have completed building and debugging your data flow, you will want to schedule your data flow to execute on a schedule within the context of a pipeline. You can schedule the pipeline from Azure Data Factory using Triggers. Or you can use the Trigger Now option from the Azure Data Factory Pipeline Builder to execute a single-run execution to test your data flow within the pipeline context.
+After you have completed building and debugging your data flow, you want to schedule your data flow to execute on a schedule within the context of a pipeline. You can schedule the pipeline from Azure Data Factory using Triggers. Or you can use the Trigger Now option from the Azure Data Factory Pipeline Builder to execute a single-run execution to test your data flow within the pipeline context.
 
-When you execute your pipeline, you will be able to monitor the pipeline and all of the activities contained in the pipeline including the Data Flow activity. Click on the monitor icon in the left-hand Azure Data Factory UI panel. You will see a screen similar to the one below. The highlighted icons will allow you to drill into the activities in the pipeline, including the Data Flow activity.
+When you execute your pipeline, you can monitor the pipeline and all of the activities contained in the pipeline including the Data Flow activity. Click on the monitor icon in the left-hand Azure Data Factory UI panel. You can see a screen similar to the one below. The highlighted icons allow you to drill into the activities in the pipeline, including the Data Flow activity.
 
 ![Data Flow Monitoring](media/data-flow/mon001.png "Data Flow Monitoring")
 
-You will see statistics at this level as well including the run times and status. The Run ID at the activity level is different that the Run ID at the pipeline level. The Run ID at the previous level is for the pipeline. Clicking the eyeglasses will give you deep details on your data flow execution.
+You see statistics at this level as well including the run times and status. The Run ID at the activity level is different than the Run ID at the pipeline level. The Run ID at the previous level is for the pipeline. Selecting the eyeglasses gives you deep details on your data flow execution.
 
 ![Data Flow Monitoring](media/data-flow/mon002.png "Data Flow Monitoring")
 
-When you are in the graphical node monitoring view, you will see a simplified view-only version of your data flow graph.
+When you're in the graphical node monitoring view, you can see a simplified view-only version of your data flow graph.
 
 ![Data Flow Monitoring](media/data-flow/mon003.png "Data Flow Monitoring")
 
@@ -34,18 +34,18 @@ Here is a video overview of monitoring performance of your data flows from the A
 ## View Data Flow Execution Plans
 
-When your Data Flow is executed in Spark, Azure Data Factory determines optimal code paths based on the entirety of your data flow. Additionally, the execution paths may occur on different scale-out nodes and data partitions. Therefore, the monitoring graph represents the design of your flow, taking into account the execution path of your transformations. When you click on individual nodes, you will see "groupings" that represent code that was executed together on the cluster. The timings and counts that you see represent those groups as opposed to the individual steps in your design.
+When your Data Flow is executed in Spark, Azure Data Factory determines optimal code paths based on the entirety of your data flow. Additionally, the execution paths may occur on different scale-out nodes and data partitions. Therefore, the monitoring graph represents the design of your flow, taking into account the execution path of your transformations. When you select individual nodes, you can see "groupings" that represent code that was executed together on the cluster. The timings and counts that you see represent those groups as opposed to the individual steps in your design.
 
 ![Data Flow Monitoring](media/data-flow/mon004.png "Data Flow Monitoring")
 
-* When you click on the open space in the monitoring window, the stats in the bottom pane will display timing and row counts for each Sink and the transformations that led to the sink data for transformation lineage.
+* When you select the open space in the monitoring window, the stats in the bottom pane display timing and row counts for each Sink and the transformations that led to the sink data for transformation lineage.
 
-* When you select individual transformations, you will receive additional feedback on the right-hand panel that shows partition stats, column counts, skewness (how evenly is the data distributed across partitions), and kurtosis (how spiky is the data).
+* When you select individual transformations, you receive additional feedback on the right-hand panel that shows partition stats, column counts, skewness (how evenly is the data distributed across partitions), and kurtosis (how spiky is the data).
 
-* When you click on the Sink in the node view, you will see column lineage. There are three different methods that columns are accumulated throughout your data flow to land in the Sink. They are:
+* When you select the Sink in the node view, you can see column lineage. There are three different methods that columns are accumulated throughout your data flow to land in the Sink. They are:
 
-    * Computed: You use the column for conditional processing or within an expression in your data flow, but do not land it in the Sink
-    * Derived: The column is a new column that you generated in your flow, i.e. it was not present in the Source
+    * Computed: You use the column for conditional processing or within an expression in your data flow, but don't land it in the Sink
+    * Derived: The column is a new column that you generated in your flow, that is, it was not present in the Source
 * Mapped: The column originated from the source and your are mapping it to a sink field
 * Data flow status: The current status of your execution
 * Cluster startup time: Amount of time to acquire the JIT Spark compute environment for your data flow execution
@@ -59,4 +59,4 @@ This icon means that the transformation data was already cached on the cluster,
 
 ![Data Flow Monitoring](media/data-flow/mon004.png "Data Flow Monitoring")
 
-You will also see green circle icons in the transformation. They represent a count of the number of sinks that data is flowing into.
+You also see green circle icons in the transformation. They represent a count of the number of sinks that data is flowing into.
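The monitoring doc above describes skewness as "how evenly is the data distributed across partitions". Data Factory computes this internally; purely as an illustrative sketch of the statistic itself (the partition row counts below are made up), sample skewness of per-partition row counts can be computed like so:

```python
import statistics

def partition_skewness(row_counts):
    """Sample skewness of per-partition row counts.

    Near 0 means rows are spread evenly across partitions; a large
    positive value means a few partitions hold most of the rows.
    """
    n = len(row_counts)
    mean = statistics.fmean(row_counts)
    sd = statistics.stdev(row_counts)
    # Adjusted Fisher-Pearson sample skewness
    return (n / ((n - 1) * (n - 2))) * sum(
        ((x - mean) / sd) ** 3 for x in row_counts)

balanced = [1000, 1010, 990, 1005]  # rows spread evenly
skewed = [5000, 10, 12, 8]          # one hot partition

print(round(partition_skewness(balanced), 2))
print(round(partition_skewness(skewed), 2))
```

A heavily positive value on your own partition counts would be the numeric signal behind the "skewed partitions" warnings the monitoring UI surfaces.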
