**articles/frontdoor/quickstart-create-front-door.md** (6 additions, 6 deletions)

```diff
@@ -38,7 +38,7 @@ If you don't already have a web app, use the following steps to set up example w
 
 1. Select **Web** > **Web App**.
 
-
+
 
 1. In **Web App**, select the **Subscription** to use.
 
@@ -56,7 +56,7 @@ If you don't already have a web app, use the following steps to set up example w
 
 1. Select **Review + create**, review the **Summary**, and then select **Create**. It might take several minutes for the deployment to complete.
 
-
+
 
 After your deployment is complete, create a second web app. Use the same procedure with the same values, except for the following values:
 
@@ -83,7 +83,7 @@ Configure Azure Front Door to direct user traffic based on lowest latency betwee
 
 1. For **Host name**, enter a globally unique hostname. This example uses *contoso-frontend*. Select **Add**.
 
-
+
 
 Next, create a backend pool that contains your two web apps.
 
@@ -99,7 +99,7 @@ Next, create a backend pool that contains your two web apps.
 
 1. Select your subscription again, and choose the second web app you created from **Backend host name**. Select **Add**.
 
-
+
 
 Finally, add a routing rule. A routing rule maps your frontend host to the backend pool. The rule forwards a request for `contoso-frontend.azurefd.net` to **myBackendPool**.
 
@@ -112,7 +112,7 @@ Finally, add a routing rule. A routing rule maps your frontend host to the backe
 
 1. Select **Review + Create**, and then **Create**.
 
-
+
 
 ## View Azure Front Door in action
 
@@ -137,7 +137,7 @@ To test instant global failover in action, try the following steps:
 
 1. Refresh your browser. This time, you should see an error message.
 
-
+
```
**articles/hdinsight/set-up-pyspark-interactive-environment.md** (4 additions, 12 deletions)

```diff
@@ -7,7 +7,7 @@ ms.author: hrasheed
 ms.reviewer: jasonh
 ms.service: hdinsight
 ms.topic: conceptual
-ms.date: 04/14/2020
+ms.date: 04/23/2020
 ---
 
 # Set up the PySpark interactive environment for Visual Studio Code
@@ -18,9 +18,9 @@ We use **python/pip** command to build virtual environment in your Home path. If
 
 1. Install [Python](https://www.python.org/downloads/) and [pip](https://pip.pypa.io/en/stable/installing/).
 
-   + Install Python from [https://www.python.org/downloads/](https://www.python.org/downloads/).
-   + Install pip from [https://pip.pypa.io/en/stable/installing](https://pip.pypa.io/en/stable/installing/) (if it's not installed from the Python installation).
-   + Validate that Python and pip are installed successfully by using the following commands. (Optional)
+   * Install Python from [https://www.python.org/downloads/](https://www.python.org/downloads/).
+   * Install pip from [https://pip.pypa.io/en/stable/installing](https://pip.pypa.io/en/stable/installing/) (if it's not installed from the Python installation).
+   * Validate that Python and pip are installed successfully by using the following commands. (Optional)
 
@@ -59,12 +59,4 @@ Restart VS Code, and then go back to the script editor that's running **HDInsigh
 
 * [Use Azure HDInsight Tool for Visual Studio Code](hdinsight-for-vscode.md)
 * [Use Azure Toolkit for IntelliJ to create and submit Apache Spark Scala applications](spark/apache-spark-intellij-tool-plugin.md)
-* [Use Azure Toolkit for IntelliJ to debug Apache Spark applications remotely through SSH](spark/apache-spark-intellij-tool-debug-remotely-through-ssh.md)
-* [Use Azure Toolkit for IntelliJ to debug Apache Spark applications remotely through VPN](spark/apache-spark-intellij-tool-plugin-debug-jobs-remotely.md)
-* [Use HDInsight Tools in Azure Toolkit for Eclipse to create Apache Spark applications](spark/apache-spark-eclipse-tool-plugin.md)
-* [Use Apache Zeppelin notebooks with an Apache Spark cluster on HDInsight](spark/apache-spark-zeppelin-notebook.md)
-* [Kernels available for Jupyter notebook in an Apache Spark cluster for HDInsight](spark/apache-spark-jupyter-notebook-kernels.md)
-* [Use external packages with Jupyter notebooks](spark/apache-spark-jupyter-notebook-use-external-packages.md)
 * [Install Jupyter on your computer and connect to an HDInsight Spark cluster](spark/apache-spark-jupyter-notebook-install-locally.md)
-* [Visualize Apache Hive data with Microsoft Power BI in Azure HDInsight](hadoop/apache-hadoop-connect-hive-power-bi.md)
-* [Use Apache Zeppelin to run Apache Hive queries in Azure HDInsight](./interactive-query/hdinsight-connect-hive-zeppelin.md)
```
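The optional validation step in the diff above ("Validate that Python and pip are installed successfully") can be sketched as a couple of shell commands. The exact executable names vary by platform and installer (`python` vs. `python3`, `pip` vs. `pip3`), so the fallbacks below are an assumption about your PATH, not part of the source article.

```shell
# Optional check that Python and pip are available before building the
# virtual environment. The interpreter may be installed as `python` or
# `python3` depending on the platform, so try both.
python --version 2>/dev/null || python3 --version

# pip is usually bundled with the Python installation; fall back to
# invoking it through the interpreter, and report if it is missing.
pip --version 2>/dev/null || python3 -m pip --version || echo "pip is not installed"
```

If either command prints a version string, that tool is on your PATH and the virtual-environment setup described in the article can proceed.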
**articles/hdinsight/spark/apache-spark-jupyter-notebook-install-locally.md** (3 additions, 3 deletions)

```diff
@@ -7,7 +7,7 @@ ms.reviewer: jasonh
 ms.service: hdinsight
 ms.topic: conceptual
 ms.custom: hdinsightactive
-ms.date: 04/02/2020
+ms.date: 04/23/2020
 ---
 
 # Install Jupyter notebook on your computer and connect to Apache Spark on HDInsight
@@ -177,5 +177,5 @@ Reasons to install Jupyter on your computer and then connect it to an Apache Spa
 ## Next steps
 
 * [Overview: Apache Spark on Azure HDInsight](apache-spark-overview.md)
-* [Apache Spark with BI: Analyze Apache Spark data using Power BI in HDInsight](apache-spark-use-bi-tools.md)
-* [Apache Spark with Machine Learning: Use Spark in HDInsight for analyzing building temperature using HVAC data](apache-spark-ipython-notebook-machine-learning.md)
+* [Kernels for Jupyter notebook on Apache Spark](apache-spark-jupyter-notebook-kernels.md)
+* [Use external packages with Jupyter notebooks in Apache Spark](apache-spark-jupyter-notebook-use-external-packages.md)
```
**articles/key-vault/general/overview-soft-delete.md** (1 addition, 1 deletion)

```diff
@@ -41,7 +41,7 @@ You cannot reuse the name of a key vault that has been soft-deleted until the re
 
 ### Purge protection
 
-Purge protection is an optional Key Vault behavior and is **not enabled by default**. It can be turned on via [CLI](soft-delete-cli.md#enabling-purge-protection) or [PowerShell](soft-delete-powershell.md#enabling-purge-protection).
+Purge protection is an optional Key Vault behavior and is **not enabled by default**. Purge protection can only be enabled once soft-delete is enabled. It can be turned on via [CLI](soft-delete-cli.md#enabling-purge-protection) or [PowerShell](soft-delete-powershell.md#enabling-purge-protection).
 
 When purge protection is on, a vault or an object in the deleted state cannot be purged until the retention period has passed. Soft-deleted vaults and objects can still be recovered, ensuring that the retention policy will be followed.
```
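The added sentence above clarifies the ordering: soft-delete must be on before purge protection can be enabled. As a sketch of the CLI path the paragraph links to, the Azure CLI exposes this as a flag on `az keyvault update`; the vault name `ContosoVault` here is a placeholder, not from the source.

```azurecli
# Sketch only: enables purge protection on an existing vault.
# Prerequisite per the article: soft-delete must already be enabled
# on the vault. "ContosoVault" is a hypothetical vault name.
az keyvault update --name ContosoVault --enable-purge-protection true
```

Once set, purge protection cannot be disabled, which is why the docs call it out as opt-in rather than a default.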