
Commit 0cb2a53

Merge branch 'master' of https://github.com/MicrosoftDocs/azure-docs-pr into heidist-master

2 parents c6dfb5b + 9672e63

10 files changed, +14 -22 lines

articles/frontdoor/quickstart-create-front-door.md

Lines changed: 6 additions & 6 deletions

```diff
@@ -38,7 +38,7 @@ If you don't already have a web app, use the following steps to set up example w
 
 1. Select **Web** > **Web App**.
 
-![Create a web app in the Azure portal](media/quickstart-create-front-door/create-web-app-for-front-door.png)
+![Create a web app in the Azure portal](media/quickstart-create-front-door/create-web-app-azure-front-door.png)
 
 1. In **Web App**, select the **Subscription** to use.
 
@@ -56,7 +56,7 @@ If you don't already have a web app, use the following steps to set up example w
 
 1. Select **Review + create**, review the **Summary**, and then select **Create**. It might take several minutes for the deployment to complete.
 
-![Review summary for web app](media/quickstart-create-front-door/summary-for-web-app-for-front-door.png)
+![Review summary for web app](media/quickstart-create-front-door/web-app-summary-azure-front-door.png)
 
 After your deployment is complete, create a second web app. Use the same procedure with the same values, except for the following values:
 
@@ -83,7 +83,7 @@ Configure Azure Front Door to direct user traffic based on lowest latency betwee
 
 1. For **Host name**, enter a globally unique hostname. This example uses *contoso-frontend*. Select **Add**.
 
-![Add a frontend host for Azure Front Door](media/quickstart-create-front-door/add-frontend-host-for-front-door.png)
+![Add a frontend host for Azure Front Door](media/quickstart-create-front-door/add-frontend-host-azure-front-door.png)
 
 Next, create a backend pool that contains your two web apps.
 
@@ -99,7 +99,7 @@ Next, create a backend pool that contains your two web apps.
 
 1. Select your subscription, again, and choose the second web app you created from **Backend host name**. Select **Add**.
 
-![Add a backend host to your Front Door](media/quickstart-create-front-door/add-backend-host-to-pool-for-front-door.png)
+![Add a backend host to your Front Door](media/quickstart-create-front-door/add-backend-host-pool-azure-front-door.png)
 
 Finally, add a routing rule. A routing rule maps your frontend host to the backend pool. The rule forwards a request for `contoso-frontend.azurefd.net` to **myBackendPool**.
 
@@ -112,7 +112,7 @@ Finally, add a routing rule. A routing rule maps your frontend host to the backe
 
 1. Select **Review + Create**, and then **Create**.
 
-![Configured Azure Front Door](media/quickstart-create-front-door/configuration-of-front-door.png)
+![Configured Azure Front Door](media/quickstart-create-front-door/configuration-azure-front-door.png)
 
 ## View Azure Front Door in action
 
@@ -137,7 +137,7 @@ To test instant global failover in action, try the following steps:
 
 1. Refresh your browser. This time, you should see an error message.
 
-![Both instances of the web app stopped](media/quickstart-create-front-door/service-has-been-stopped.png)
+![Both instances of the web app stopped](media/quickstart-create-front-door/web-app-stopped-message.png)
 
 ## Clean up resources
 
```
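The quickstart under diff routes `contoso-frontend.azurefd.net` to the backend pool and then tests failover in a browser. The same check can be scripted, as a sketch: `contoso-frontend` is only the doc's example hostname and must be replaced with the frontend host you actually created, and the command requires the Front Door deployment to be live.

```shell
# Spot-check the Front Door frontend (hostname is the doc's example;
# substitute your own). A healthy backend returns HTTP 200; with both
# web apps stopped, Front Door returns an error status instead.
curl -sSI "https://contoso-frontend.azurefd.net"
```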

articles/hdinsight/set-up-pyspark-interactive-environment.md

Lines changed: 4 additions & 12 deletions

```diff
@@ -7,7 +7,7 @@ ms.author: hrasheed
 ms.reviewer: jasonh
 ms.service: hdinsight
 ms.topic: conceptual
-ms.date: 04/14/2020
+ms.date: 04/23/2020
 ---
 
 # Set up the PySpark interactive environment for Visual Studio Code
@@ -18,9 +18,9 @@ We use **python/pip** command to build virtual environment in your Home path. If
 
 1. Install [Python](https://www.python.org/downloads/) and [pip](https://pip.pypa.io/en/stable/installing/).
 
-+ Install Python from [https://www.python.org/downloads/](https://www.python.org/downloads/).
-+ Install pip from [https://pip.pypa.io/en/stable/installing](https://pip.pypa.io/en/stable/installing/) (if it's not installed from the Python installation).
-+ Validate that Python and pip are installed successfully by using the following commands. (Optional)
+* Install Python from [https://www.python.org/downloads/](https://www.python.org/downloads/).
+* Install pip from [https://pip.pypa.io/en/stable/installing](https://pip.pypa.io/en/stable/installing/) (if it's not installed from the Python installation).
+* Validate that Python and pip are installed successfully by using the following commands. (Optional)
 
 ![Check Python pip version command](./media/set-up-pyspark-interactive-environment/check-python-pip-version.png)
 
```
```diff
@@ -59,12 +59,4 @@ Restart VS Code, and then go back to the script editor that's running **HDInsigh
 
 * [Use Azure HDInsight Tool for Visual Studio Code](hdinsight-for-vscode.md)
 * [Use Azure Toolkit for IntelliJ to create and submit Apache Spark Scala applications](spark/apache-spark-intellij-tool-plugin.md)
-* [Use Azure Toolkit for IntelliJ to debug Apache Spark applications remotely through SSH](spark/apache-spark-intellij-tool-debug-remotely-through-ssh.md)
-* [Use Azure Toolkit for IntelliJ to debug Apache Spark applications remotely through VPN](spark/apache-spark-intellij-tool-plugin-debug-jobs-remotely.md)
-* [Use HDInsight Tools in Azure Toolkit for Eclipse to create Apache Spark applications](spark/apache-spark-eclipse-tool-plugin.md)
-* [Use Apache Zeppelin notebooks with an Apache Spark cluster on HDInsight](spark/apache-spark-zeppelin-notebook.md)
-* [Kernels available for Jupyter notebook in an Apache Spark cluster for HDInsight](spark/apache-spark-jupyter-notebook-kernels.md)
-* [Use external packages with Jupyter notebooks](spark/apache-spark-jupyter-notebook-use-external-packages.md)
 * [Install Jupyter on your computer and connect to an HDInsight Spark cluster](spark/apache-spark-jupyter-notebook-install-locally.md)
-* [Visualize Apache Hive data with Microsoft Power BI in Azure HDInsight](hadoop/apache-hadoop-connect-hive-power-bi.md)
-* [Use Apache Zeppelin to run Apache Hive queries in Azure HDInsight](./interactive-query/hdinsight-connect-hive-zeppelin.md)
```
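The PySpark article's intro (quoted in the hunk header above) mentions using python/pip to build a virtual environment in your home path. A minimal sketch of that step, assuming Python 3's built-in `venv` module; the environment name `pyspark-venv` is illustrative:

```shell
# Create a virtual environment under the home directory and
# activate it; pip then installs packages into that environment.
python3 -m venv ~/pyspark-venv
. ~/pyspark-venv/bin/activate
python -m pip --version
```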

articles/hdinsight/spark/apache-spark-jupyter-notebook-install-locally.md

Lines changed: 3 additions & 3 deletions

```diff
@@ -7,7 +7,7 @@ ms.reviewer: jasonh
 ms.service: hdinsight
 ms.topic: conceptual
 ms.custom: hdinsightactive
-ms.date: 04/02/2020
+ms.date: 04/23/2020
 ---
 
 # Install Jupyter notebook on your computer and connect to Apache Spark on HDInsight
@@ -177,5 +177,5 @@ Reasons to install Jupyter on your computer and then connect it to an Apache Spa
 ## Next steps
 
 * [Overview: Apache Spark on Azure HDInsight](apache-spark-overview.md)
-* [Apache Spark with BI: Analyze Apache Spark data using Power BI in HDInsight](apache-spark-use-bi-tools.md)
-* [Apache Spark with Machine Learning: Use Spark in HDInsight for analyzing building temperature using HVAC data](apache-spark-ipython-notebook-machine-learning.md)
+* [Kernels for Jupyter notebook on Apache Spark](apache-spark-jupyter-notebook-kernels.md)
+* [Use external packages with Jupyter notebooks in Apache Spark](apache-spark-jupyter-notebook-use-external-packages.md)
```

articles/key-vault/general/overview-soft-delete.md

Lines changed: 1 addition & 1 deletion

```diff
@@ -41,7 +41,7 @@ You cannot reuse the name of a key vault that has been soft-deleted until the re
 
 ### Purge protection
 
-Purge protection is an optional Key Vault behavior and is **not enabled by default**. It can be turned on via [CLI](soft-delete-cli.md#enabling-purge-protection) or [PowerShell](soft-delete-powershell.md#enabling-purge-protection).
+Purge protection is an optional Key Vault behavior and is **not enabled by default**. Purge protection can only be enabled once soft-delete is enabled. It can be turned on via [CLI](soft-delete-cli.md#enabling-purge-protection) or [PowerShell](soft-delete-powershell.md#enabling-purge-protection).
 
 When purge protection is on, a vault or an object in the deleted state cannot be purged until the retention period has passed. Soft-deleted vaults and objects can still be recovered, ensuring that the retention policy will be followed.
 
```
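The sentence added in this hunk says purge protection can be turned on via the CLI once soft-delete is enabled. An Azure CLI sketch of that step: `ContosoVault` is a placeholder vault name, and the command assumes a signed-in `az` session against a vault that already has soft-delete enabled.

```shell
# Enable purge protection on an existing vault (placeholder name).
# Note: once enabled, purge protection cannot be turned back off.
az keyvault update --name ContosoVault --enable-purge-protection true
```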
