
Commit 024306a
Merge pull request #42472 from MicrosoftDocs/master
5/29 PM Publish
2 parents: ebd1bd0 + 7da5270

238 files changed (+1838, −2395 lines)


.openpublishing.redirection.json

Lines changed: 25 additions & 0 deletions
@@ -5770,6 +5770,16 @@
     "redirect_url": "/azure/cosmos-db/sql-api-get-started",
     "redirect_document_id": false
   },
+  {
+    "source_path": "articles/cosmos-db/sql-api-index.md",
+    "redirect_url": "/azure/cosmos-db/sql-api-introduction",
+    "redirect_document_id": false
+  },
+  {
+    "source_path": "articles/cosmos-db/mongodb-index.md",
+    "redirect_url": "/azure/cosmos-db/mongodb-introduction",
+    "redirect_document_id": false
+  },
   {
     "source_path": "articles/cosmos-db/documentdb-index.md",
     "redirect_url": "/azure/cosmos-db/sql-api-index",
@@ -6990,6 +7000,16 @@
     "redirect_url": "https://docs.microsoft.com/rest/api/searchservice/add-scoring-profiles-to-a-search-index",
     "redirect_document_id": false
   },
+  {
+    "source_path": "articles/search/ref-create-indexer.md",
+    "redirect_url": "https://docs.microsoft.com/rest/api/searchservice/create-indexer",
+    "redirect_document_id": true
+  },
+  {
+    "source_path": "articles/search/ref-create-skillset.md",
+    "redirect_url": "https://docs.microsoft.com/rest/api/searchservice/create-skillset",
+    "redirect_document_id": true
+  },
   {
     "source_path": "articles/search/search-api-2017-11-11-preview.md",
     "redirect_url": "/azure/search/search-api-preview",
@@ -21975,6 +21995,11 @@
     "source_path": "articles/cognitive-services/LUIS/luis-quickstart-ruby-add-utterance.md",
     "redirect_url": "/azure/cognitive-services/LUIS/luis-get-started-ruby-add-utterance",
     "redirect_document_id": true
+  },
+  {
+    "source_path": "articles/cognitive-services/LUIS/Add-entities.md",
+    "redirect_url": "/azure/cognitive-services/LUIS/luis-how-to-add-entities",
+    "redirect_document_id": true
   }
 ]
}
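Each redirection entry maps a retired article's `source_path` to its new `redirect_url`. As a hedged sketch (the lookup helper below is mine, not part of the OpenPublishing build), resolving a redirect is a simple scan of the entries:

```python
import json

# Two of the entries added in this commit, in the same shape as
# .openpublishing.redirection.json (trimmed for illustration).
redirections_json = """
{
  "redirections": [
    {
      "source_path": "articles/cosmos-db/sql-api-index.md",
      "redirect_url": "/azure/cosmos-db/sql-api-introduction",
      "redirect_document_id": false
    },
    {
      "source_path": "articles/search/ref-create-indexer.md",
      "redirect_url": "https://docs.microsoft.com/rest/api/searchservice/create-indexer",
      "redirect_document_id": true
    }
  ]
}
"""

def resolve_redirect(redirections, source_path):
    """Return the redirect_url for a retired article path, or None."""
    for entry in redirections["redirections"]:
        if entry["source_path"] == source_path:
            return entry["redirect_url"]
    return None

redirections = json.loads(redirections_json)
print(resolve_redirect(redirections, "articles/cosmos-db/sql-api-index.md"))
# -> /azure/cosmos-db/sql-api-introduction
```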

articles/active-directory/authentication/concept-sspr-howitworks.md

Lines changed: 2 additions & 0 deletions
@@ -151,6 +151,8 @@ Custom security questions are not localized for different locales. All custom qu
 
 The maximum length of a custom security question is 200 characters.
 
+To view the password reset portal and questions in a different localized language, append "?mkt=<Locale>" to the end of the password reset URL. For example, the following URL localizes the portal to Spanish: [https://passwordreset.microsoftonline.com/?mkt=es-us](https://passwordreset.microsoftonline.com/?mkt=es-us).
+
 ### Security question requirements
 
 * The minimum answer character limit is three characters.
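The paragraph added in this diff describes the `?mkt=<Locale>` query parameter. A minimal sketch of building such a URL (the helper name is mine, not from the docs):

```python
from urllib.parse import urlencode

RESET_URL = "https://passwordreset.microsoftonline.com/"

def localized_reset_url(locale=None):
    """Build the SSPR portal URL, optionally localized via the mkt parameter."""
    if locale is None:
        return RESET_URL
    return RESET_URL + "?" + urlencode({"mkt": locale})

print(localized_reset_url("es-us"))
# -> https://passwordreset.microsoftonline.com/?mkt=es-us
```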

articles/active-directory/managed-service-identity/bread/toc.yml

Lines changed: 2 additions & 2 deletions
@@ -3,9 +3,9 @@
   topicHref: /azure/index
   items:
   - name: Active Directory
-    tocHref: /azure/active-directory/index
+    tocHref: /azure/active-directory/
     topicHref: /azure/active-directory/index
     items:
     - name: Managed Service Identity
-      tocHref: /azure/active-directory/managed-service-identity/index
+      tocHref: /azure/active-directory/managed-service-identity/
       topicHref: /azure/active-directory/managed-service-identity/index
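One plausible reading of this toc.yml change is that `tocHref` matches pages by URL prefix: the old `/azure/active-directory/index` value matched only the index page itself, while the trailing-slash form matches every article under the service. A hypothetical sketch of such prefix matching (the function and its semantics are my assumption, not the docs build's actual logic):

```python
def toc_matches(toc_href, page_url):
    """Hypothetical prefix match: does a page fall under this TOC node?"""
    return page_url.startswith(toc_href)

# Old value only matched the index page itself:
print(toc_matches("/azure/active-directory/index",
                  "/azure/active-directory/authentication/concept-sspr-howitworks"))
# -> False

# New trailing-slash value matches every page under the service:
print(toc_matches("/azure/active-directory/",
                  "/azure/active-directory/authentication/concept-sspr-howitworks"))
# -> True
```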

articles/active-directory/managed-service-identity/services-support-msi.md

Lines changed: 1 addition & 1 deletion
@@ -26,7 +26,7 @@ The following Azure services support Managed Service Identity.
 | Azure Functions | Preview | September 2017 | [Azure portal](/azure/app-service/app-service-managed-service-identity#using-the-azure-portal)<br>[Azure Resource Manager template](/azure/app-service/app-service-managed-service-identity#using-an-azure-resource-manager-template) | [.NET](/azure/app-service/app-service-managed-service-identity#asal)<br>[REST](/azure/app-service/app-service-managed-service-identity#using-the-rest-protocol) |
 | Azure Data Factory V2 | Preview | November 2017 | [Azure portal](~/articles/data-factory/data-factory-service-identity.md#generate-service-identity)<br>[PowerShell](~/articles/data-factory/data-factory-service-identity.md#generate-service-identity-using-powershell)<br>[REST](~/articles/data-factory/data-factory-service-identity.md#generate-service-identity-using-rest-api)<br>[SDK](~/articles/data-factory/data-factory-service-identity.md#generate-service-identity-using-sdk) |
 | Azure API Management | Preview | October 2017 | [Azure Resource Manager template](/azure/api-management/api-management-howto-use-managed-service-identity) |
-| Azure Storage | Preview | May 2018 | [Azure portal](/azure/storage/common/storage-auth-aad.md)<br>[PowerShell](/azure/storage/common/storage-auth-aad.md)<br>[Azure CLI](/azure/storage/common/storage-auth-aad.md) | [REST](https://docs.microsoft.com/rest/api/storageservices/authenticate-with-azure-active-directory)<br>[.NET](/azure/storage/common/storage-auth-aad-app.md)<br>[Java](/azure/storage/common/storage-auth-aad.md)<br>[Python](/azure/storage/common/storage-auth-aad.md) |
+| Azure Storage | Preview | May 2018 | [Azure portal](/azure/storage/common/storage-auth-aad)<br>[PowerShell](/azure/storage/common/storage-auth-aad)<br>[Azure CLI](/azure/storage/common/storage-auth-aad) | [REST](https://docs.microsoft.com/rest/api/storageservices/authenticate-with-azure-active-directory)<br>[.NET](/azure/storage/common/storage-auth-aad-app)<br>[Java](/azure/storage/common/storage-auth-aad)<br>[Python](/azure/storage/common/storage-auth-aad) |
 
 ## Azure services that support Azure AD authentication

articles/aks/networking-overview.md

Lines changed: 29 additions & 11 deletions
@@ -35,15 +35,41 @@ Nodes in an AKS cluster configured for Advanced networking use the [Azure Contai
 Advanced networking provides the following benefits:
 
 * Deploy your AKS cluster into an existing VNet, or create a new VNet and subnet for your cluster.
-* Every pod in the cluster is assigned an IP address in the VNet, and can directly communicate with other pods in the cluster, and other VMs in the VNet.
+* Every pod in the cluster is assigned an IP address in the VNet, and can directly communicate with other pods in the cluster, and other nodes in the VNet.
 * A pod can connect to other services in a peered VNet, and to on-premises networks over ExpressRoute and site-to-site (S2S) VPN connections. Pods are also reachable from on-premises.
 * Expose a Kubernetes service externally or internally through the Azure Load Balancer. Also a feature of Basic networking.
 * Pods in a subnet that have service endpoints enabled can securely connect to Azure services, for example Azure Storage and SQL DB.
 * Use user-defined routes (UDR) to route traffic from pods to a Network Virtual Appliance.
 * Pods can access resources on the public Internet. Also a feature of Basic networking.
 
 > [!IMPORTANT]
-> Each node in an AKS cluster configured for Advanced networking can host a maximum of **30 pods**. Each VNet provisioned for use with the Azure CNI plugin is limited to **4096 IP addresses** (/20).
+> Each node in an AKS cluster configured for Advanced networking can host a maximum of **30 pods**. Each VNet provisioned for use with the Azure CNI plugin is limited to **4096 configured IP addresses**.
+
+## Advanced networking prerequisites
+
+* The VNet for the AKS cluster must allow outbound internet connectivity.
+* Do not create more than one AKS cluster in the same subnet.
+* Advanced networking for AKS does not support VNets that use Azure Private DNS Zones.
+* AKS clusters may not use `169.254.0.0/16`, `172.30.0.0/16`, or `172.31.0.0/16` for the Kubernetes service address range.
+* The service principal used for the AKS cluster must have `Owner` permissions to the resource group containing the existing VNet.
+
+## Plan IP addressing for your cluster
+
+Clusters configured with Advanced networking require additional planning. The size of your VNet and its subnet must accommodate both the number of pods you plan to run and the number of nodes for the cluster.
+
+IP addresses for the pods and the cluster's nodes are assigned from the specified subnet within the VNet. Each node is configured with a primary IP, which is the IP of the node itself, plus 30 additional IP addresses pre-configured by Azure CNI that are assigned to pods scheduled to the node. When you scale out your cluster, each node is similarly configured with IP addresses from the subnet.
+
+The IP address plan for an AKS cluster consists of a VNet, at least one subnet for nodes and pods, and a Kubernetes service address range.
+
+| Address range / Azure resource | Limits and sizing |
+| --------- | ------------- |
+| Virtual network | An Azure VNet can be as large as /8, but may have only 4096 configured IP addresses. |
+| Subnet | Must be large enough to accommodate the nodes and pods. To calculate the minimum subnet size: (number of nodes) + (number of nodes * pods per node). For a 50-node cluster: (50) + (50 * 30) = 1,550, so the subnet would need to be a /21 or larger. |
+| Kubernetes service address range | This range should not be used by any network element on or connected to this VNet. The service address CIDR must be smaller than /12. |
+| Kubernetes DNS service IP address | An IP address within the Kubernetes service address range that is used by cluster service discovery (kube-dns). |
+| Docker bridge address | The IP address (in CIDR notation) used as the Docker bridge IP address on nodes. Default: 172.17.0.1/16. |
+
+As mentioned previously, each VNet provisioned for use with the Azure CNI plugin is limited to **4096 configured IP addresses**. Each node in a cluster configured for Advanced networking can host a maximum of **30 pods**.
 
 ## Configure advanced networking
 
@@ -63,14 +89,6 @@ The following screenshot from the Azure portal shows an example of configuring t
 
 ![Advanced networking configuration in the Azure portal][portal-01-networking-advanced]
 
-## Plan IP addressing for your cluster
-
-Clusters configured with Advanced networking require additional planning. The size of your VNet and its subnet must accommodate the number of pods you plan to run simultaneously in the cluster, as well as your scaling requirements.
-
-IP addresses for the pods and the cluster's nodes are assigned from the specified subnet within the VNet. Each node is configured with a primary IP, which is the IP of the node itself, and 30 additional IP addresses pre-configured by Azure CNI that are assigned to pods scheduled to the node. When you scale out your cluster, each node is similarly configured with IP addresses from the subnet.
-
-As mentioned previously, each VNet provisioned for use with the Azure CNI plugin is limited to **4096 IP addresses** (/20). Each node in a cluster configured for Advanced networking can host a maximum of **30 pods**.
-
 ## Frequently asked questions
 
 The following questions and answers apply to the **Advanced** networking configuration.
@@ -89,7 +107,7 @@ The following questions and answers apply to the **Advanced** networking configu
 
 * *Is the maximum number of pods deployable to a node configurable?*
 
-    By default, each node can host a maximum of 30 pods. You can currently change the maximum value only by modifying the `maxPods` property when deploying a cluster with a Resource Manager template.
+    By default, each node can host a maximum of 30 pods. You can change the maximum value only by modifying the `maxPods` property when deploying a cluster with a Resource Manager template.
 
 * *How do I configure additional properties for the subnet that I created during AKS cluster creation? For example, service endpoints.*
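The subnet row in the new table gives the sizing formula (number of nodes) + (number of nodes * pods per node). A quick sketch of turning that formula into a minimum CIDR prefix length (the helper is mine and deliberately ignores the handful of addresses Azure reserves in each subnet):

```python
import math

def min_subnet_prefix(nodes, pods_per_node=30):
    """Smallest CIDR prefix length whose address count covers the node
    IPs plus the pod IPs pre-allocated on each node, per the formula
    (nodes) + (nodes * pods_per_node) from the table above."""
    required = nodes + nodes * pods_per_node
    # Smallest power of two >= required; 32 minus that exponent is the prefix.
    return 32 - math.ceil(math.log2(required))

# The 50-node example from the table: 50 + 50*30 = 1,550 addresses -> /21.
print(min_subnet_prefix(50))  # -> 21
```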

articles/api-management/api-management-api-import-restrictions.md

Lines changed: 3 additions & 2 deletions
@@ -21,10 +21,11 @@ ms.author: apipm
 ## About this list
 When importing an API, you might come across some restrictions or identify issues that need to be rectified before you can successfully import. This article documents these, organized by the import format of the API.
 
-## <a name="open-api"> </a>Open API/Swagger
-If you are receiving errors importing your Open API document, ensure you have validated it - either using the designer in the Azure portal (Design - Front End - Open API Specification Editor), or with a third-party tool such as <a href="http://www.swagger.io">Swagger Editor</a>.
+## <a name="open-api"> </a>OpenAPI/Swagger
+If you are receiving errors importing your OpenAPI document, ensure you have validated it - either using the designer in the Azure portal (Design - Front End - OpenAPI Specification Editor), or with a third-party tool such as <a href="http://www.swagger.io">Swagger Editor</a>.
 
 * Only JSON format for OpenAPI is supported.
+* Required parameters across both path and query must have unique names. (In OpenAPI, a parameter name only needs to be unique within a location, for example path, query, or header. API Management, however, lets operations be discriminated by both path and query parameters, which OpenAPI does not support, so parameter names must be unique within the entire URL template.)
 * Schemas referenced using **$ref** properties can't contain other **$ref** properties.
 * **$ref** pointers can't reference external files.
 * **x-ms-paths** and **x-servers** are the only supported extensions.
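The new restriction added above says required parameter names must be unique across path and query combined. A small sketch of that check (the parameter-list shape follows OpenAPI; the validator function itself is illustrative, not API Management's actual import code):

```python
from collections import Counter

def duplicate_required_names(parameters):
    """Return required parameter names that collide across the
    'path' and 'query' locations of a single operation."""
    names = [
        p["name"]
        for p in parameters
        if p.get("in") in ("path", "query") and p.get("required")
    ]
    return sorted(name for name, count in Counter(names).items() if count > 1)

params = [
    {"name": "id", "in": "path", "required": True},
    {"name": "id", "in": "query", "required": True},  # valid OpenAPI, but rejected on import
    {"name": "filter", "in": "query", "required": True},
]
print(duplicate_required_names(params))  # -> ['id']
```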

articles/azure-databricks/databricks-extract-load-sql-data-warehouse.md

Lines changed: 50 additions & 20 deletions
@@ -13,11 +13,10 @@ ms.devlang: na
 ms.topic: tutorial
 ms.tgt_pltfrm: na
 ms.workload: "Active"
-ms.date: 03/23/2018
+ms.date: 05/29/2018
 ms.author: nitinme
 
 ---
-
 # Tutorial: Extract, transform, and load data using Azure Databricks
 
 In this tutorial, you perform an ETL (extract, transform, and load data) operation using Azure Databricks. You extract data from Azure Data Lake Store into Azure Databricks, run transformations on the data in Azure Databricks, and then load the transformed data into Azure SQL Data Warehouse.
@@ -49,15 +48,15 @@ Before you start with this tutorial, make sure to meet the following requirement
 - Create a database master key for the Azure SQL Data Warehouse. Follow the instructions at [Create a Database Master Key](https://docs.microsoft.com/sql/relational-databases/security/encryption/create-a-database-master-key).
 - Create an Azure Blob storage account, and a container within it. Also, retrieve the access key to access the storage account. Follow the instructions at [Quickstart: Create an Azure Blob storage account](../storage/blobs/storage-quickstart-blobs-portal.md).
 
-## Log in to the Azure portal
+## Log in to the Azure Portal
 
 Log in to the [Azure portal](https://portal.azure.com/).
 
 ## Create an Azure Databricks workspace
 
 In this section, you create an Azure Databricks workspace using the Azure portal.
 
-1. In the Azure portal, select **Create a resource** > **Data + Analytics** > **Azure Databricks**.
+1. In the Azure portal, select **Create a resource** > **Data + Analytics** > **Azure Databricks**.
 
     ![Databricks on Azure portal](./media/databricks-extract-load-sql-data-warehouse/azure-databricks-on-portal.png "Databricks on Azure portal")
 
@@ -192,22 +191,6 @@ When programmatically logging in, you need to pass the tenant ID with your authe
 
 ![tenant ID](./media/databricks-extract-load-sql-data-warehouse/copy-directory-id.png)
 
-### Associate service principal with Azure Data Lake Store
-
-In this section, you associate the Azure Data Lake Store account with the Azure Active Directory service principal you created. This ensures that you can access the Data Lake Store account from Azure Databricks.
-
-1. From the [Azure portal](https://portal.azure.com), select the Data Lake Store account you created.
-
-2. From the left pane, select **Access Control** > **Add**.
-
-    ![Add Data Lake Store access](./media/databricks-extract-load-sql-data-warehouse/add-adls-access.png "Add Data Lake Store access")
-
-3. In **Add permissions**, select a role that you want to assign to the service principal. For this tutorial, select **Owner**. For **Assign access to**, select **Azure AD, user, group, or application**. For **Select** enter the name of the service principal you created to filter down the number of service principals to select from.
-
-    ![Select service principal](./media/databricks-extract-load-sql-data-warehouse/select-service-principal.png "Select service principal")
-
-    Select the service principal you created earlier, and then select **Save**. The service principal is now associated with the Azure Data Lake Store account.
-
 ## Upload data to Data Lake Store
 
 In this section, you upload a sample data file to Data Lake Store. You use this file later in Azure Databricks to run some transformations. The sample data (**small_radio_json.json**) that you use in this tutorial is available in this [GitHub repo](https://github.com/Azure/usql/blob/master/Examples/Samples/Data/json/radiowebsite/small_radio_json.json).
@@ -228,6 +211,53 @@ In this section, you upload a sample data file to Data Lake Store. You use this
 
 5. In this tutorial, you uploaded the data file to the root of the Data Lake Store. So, the file is now available at `adl://<YOUR_DATA_LAKE_STORE_ACCOUNT_NAME>.azuredatalakestore.net/small_radio_json.json`.
 
+## Associate service principal with Azure Data Lake Store
+
+In this section, you associate the data in the Azure Data Lake Store account with the Azure Active Directory service principal you created. This ensures that you can access the Data Lake Store account from Azure Databricks. For the scenario in this article, you read the data in Data Lake Store to populate a table in SQL Data Warehouse. According to [Overview of Access Control in Data Lake Store](../data-lake-store/data-lake-store-access-control.md#common-scenarios-related-to-permissions), to have read access on a file in Data Lake Store, you must have:
+
+- **Execute** permissions on all the folders in the folder structure leading up to the file.
+- **Read** permissions on the file itself.
+
+Perform the following steps to grant these permissions.
+
+1. From the [Azure portal](https://portal.azure.com), select the Data Lake Store account you created, and then select **Data Explorer**.
+
+    ![Launch Data Explorer](./media/databricks-extract-load-sql-data-warehouse/azure-databricks-data-explorer.png "Launch Data Explorer")
+
+2. In this scenario, because the sample data file is at the root of the folder structure, you only need to assign **Execute** permissions at the folder root. To do so, from the root of data explorer, select **Access**.
+
+    ![Add ACLs for the folder](./media/databricks-extract-load-sql-data-warehouse/add-adls-access-folder-1.png "Add ACLs for the folder")
+
+3. Under **Access**, select **Add**.
+
+    ![Add ACLs for the folder](./media/databricks-extract-load-sql-data-warehouse/add-adls-access-folder-2.png "Add ACLs for the folder")
+
+4. Under **Assign permissions**, click **Select user or group** and search for the Azure Active Directory service principal you created earlier.
+
+    ![Add Data Lake Store access](./media/databricks-extract-load-sql-data-warehouse/add-adls-access-folder-3.png "Add Data Lake Store access")
+
+    Select the AAD service principal you want to assign and click **Select**.
+
+5. Under **Assign permissions**, click **Select permissions** > **Execute**. Keep the other default values and select **OK** under **Select permissions** and then under **Assign permissions**.
+
+    ![Add Data Lake Store access](./media/databricks-extract-load-sql-data-warehouse/add-adls-access-folder-4.png "Add Data Lake Store access")
+
+6. Go back to the Data Explorer and click the file on which you want to assign the read permission. Under **File Preview**, select **Access**.
+
+    ![Add Data Lake Store access](./media/databricks-extract-load-sql-data-warehouse/add-adls-access-file-1.png "Add Data Lake Store access")
+
+7. Under **Access**, select **Add**. Under **Assign permissions**, click **Select user or group** and search for the Azure Active Directory service principal you created earlier.
+
+    ![Add Data Lake Store access](./media/databricks-extract-load-sql-data-warehouse/add-adls-access-folder-3.png "Add Data Lake Store access")
+
+    Select the AAD service principal you want to assign and click **Select**.
+
+8. Under **Assign permissions**, click **Select permissions** > **Read**. Select **OK** under **Select permissions** and then under **Assign permissions**.
+
+    ![Add Data Lake Store access](./media/databricks-extract-load-sql-data-warehouse/add-adls-access-file-2.png "Add Data Lake Store access")
+
+    The service principal now has sufficient permissions to read the sample data file from Azure Data Lake Store.
+
 ## Extract data from Data Lake Store
 
 In this section, you create a notebook in Azure Databricks workspace and then run code snippets to extract data from Data Lake Store into Azure Databricks.
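The new "Associate service principal" section states the ACL rule plainly: Execute on every ancestor folder, Read on the file. A sketch that enumerates the (path, permission) entries a service principal would need for a given file (pure illustration of that rule, not an Azure SDK call):

```python
import posixpath

def required_acls(file_path):
    """List the (path, permission) pairs needed to read a Data Lake
    Store file: Execute on each ancestor folder, Read on the file."""
    acls = []
    ancestors = []
    folder = posixpath.dirname(file_path)
    while True:
        ancestors.append(folder)
        if folder == "/":
            break
        folder = posixpath.dirname(folder)
    for ancestor in reversed(ancestors):  # root first, deepest folder last
        acls.append((ancestor, "Execute"))
    acls.append((file_path, "Read"))
    return acls

# The tutorial's file sits at the store root, so only two entries are needed:
print(required_acls("/small_radio_json.json"))
# -> [('/', 'Execute'), ('/small_radio_json.json', 'Read')]
```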
(Three binary image files changed: 107 KB, 16.8 KB, 21.1 KB; previews not shown.)
