
Commit 24f6e45

Merge pull request #293760 from MicrosoftDocs/repo_sync_working_branch
Confirm merge from repo_sync_working_branch to main to sync with https://github.com/MicrosoftDocs/azure-docs (branch main)
2 parents 47918f4 + 879addd commit 24f6e45

File tree

10 files changed (+36 / -36 lines)


articles/app-service/tutorial-php-mysql-app.md

Lines changed: 2 additions & 2 deletions
@@ -12,7 +12,7 @@ ms.custom: mvc, cli-validate, devdivchpfy22, AppServiceConnectivity
 
 # Tutorial: Deploy a PHP, MySQL, and Redis app to Azure App Service
 
-This tutorial shows how to create a secure PHP app in Azure App Service that's connected to a MySQL database (using Azure Database for MySQL flexible server). You'll also deploy an Azure Cache for Redis to enable the caching code in your application. Azure App Service is a highly scalable, self-patching, web-hosting service that can easily deploy apps on Windows or Linux. When you're finished, you'll have a Laravel app running on Azure App Service on Linux.
+This tutorial shows how to create a secure PHP app in Azure App Service that's connected to a MySQL database (using Azure Database for MySQL Flexible Server). You'll also deploy an Azure Cache for Redis to enable the caching code in your application. Azure App Service is a highly scalable, self-patching, web-hosting service that can easily deploy apps on Windows or Linux. When you're finished, you'll have a Laravel app running on Azure App Service on Linux.
 
 :::image type="content" source="./media/tutorial-php-mysql-app/azure-portal-browse-app-2.png" alt-text="Screenshot of the Azure app example titled Task List showing new tasks added.":::

@@ -85,7 +85,7 @@ Sign in to the [Azure portal](https://portal.azure.com/) and follow these steps
 - **Virtual network** → Integrated with the App Service app and isolates back-end network traffic.
 - **Private endpoints** → Access endpoints for the database server and the Redis cache in the virtual network.
 - **Network interfaces** → Represent private IP addresses, one for each of the private endpoints.
-- **Azure Database for MySQL flexible server** → Accessible only from behind its private endpoint. A database and a user are created for you on the server.
+- **Azure Database for MySQL Flexible Server** → Accessible only from behind its private endpoint. A database and a user are created for you on the server.
 - **Azure Cache for Redis** → Accessible only from behind its private endpoint.
 - **Private DNS zones** → Enable DNS resolution of the database server and the Redis cache in the virtual network.
 :::column-end:::

articles/event-hubs/event-hubs-scalability.md

Lines changed: 1 addition & 1 deletion
@@ -22,7 +22,7 @@ The throughput capacity of event hubs is controlled by **throughput units**. Thr
 
 Beyond the capacity of the purchased throughput units, ingress is throttled and Event Hubs throws an [EventHubsException](/dotnet/api/azure.messaging.eventhubs.eventhubsexception) (with a Reason value of ServiceBusy). Egress doesn't produce throttling exceptions but is still limited to the capacity of the purchased throughput units. If you receive publishing rate exceptions or expect to see higher egress, check how many throughput units you've purchased for the namespace. You can manage throughput units on the **Scale** page of the namespace in the [Azure portal](https://portal.azure.com). You can also manage throughput units programmatically by using the [Event Hubs APIs](./event-hubs-samples.md).
 
-Throughput units are prepurchased and are billed per hour. Once purchased, throughput units are billed for a minimum of one hour. Up to 40 throughput units can be purchased for an Event Hubs namespace and are shared across all event hubs in that namespace.
+Throughput units are prepurchased and are billed per hour. Once purchased, throughput units are billed for a minimum of one hour. Up to 40 throughput units can be purchased for an Event Hubs namespace and are shared across all event hubs in that namespace. The total ingress and egress capacity of these throughput units is also shared among all partitions and consumers within each event hub, meaning multiple consumers reading from the same partition must share the available bandwidth.
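
To make the shared-capacity math concrete, here's a minimal sketch that estimates how many throughput units (TUs) a workload needs. It assumes the commonly documented per-TU limits of roughly 1 MB/s of ingress and 2 MB/s of egress, which aren't stated in this excerpt; the 40-unit cap comes from the paragraph above, and all names are illustrative.

```scala
// Hedged sketch: estimate throughput units (TUs) for an Event Hubs namespace.
// Assumes ~1 MB/s ingress and ~2 MB/s egress per TU (documented service limits,
// not stated in this excerpt); the 40-TU cap comes from the paragraph above.
object ThroughputUnitEstimate {
  val IngressMBpsPerTU   = 1.0
  val EgressMBpsPerTU    = 2.0
  val MaxTUsPerNamespace = 40

  def requiredTUs(ingressMBps: Double, egressMBps: Double): Int = {
    val forIngress = math.ceil(ingressMBps / IngressMBpsPerTU).toInt
    val forEgress  = math.ceil(egressMBps / EgressMBpsPerTU).toInt
    math.max(1, math.max(forIngress, forEgress))
  }

  def main(args: Array[String]): Unit = {
    val tus = requiredTUs(ingressMBps = 12.0, egressMBps = 30.0)
    require(tus <= MaxTUsPerNamespace, s"$tus TUs exceeds the $MaxTUsPerNamespace-TU namespace cap")
    println(s"Provision $tus throughput units") // 30 MB/s egress / 2 MB/s per TU => 15 TUs
  }
}
```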
 
 The **Auto-inflate** feature of Event Hubs automatically scales up by increasing the number of throughput units, to meet usage needs. Increasing throughput units prevents throttling scenarios, in which:
articles/event-hubs/includes/event-hubs-partition-count.md

Lines changed: 1 addition & 1 deletion
@@ -10,6 +10,6 @@ ms.custom: "include file"
 
 ---
 
-As [partition](../event-hubs-features.md#partitions) is a data organization mechanism that allows you to publish and consume data in a parallel manner. We recommend that you balance scaling units (throughput units for the standard tier, processing units for the premium tier, or capacity units for the dedicated tier) and partitions to achieve optimal scale. In general, we recommend a maximum throughput of 1 MB/s per partition. Therefore, a rule of thumb for calculating the number of partitions would be to divide the maximum expected throughput by 1 MB/s. For example, if your use case requires 20 MB/s, we recommend that you choose at least 20 partitions to achieve the optimal throughput.
+A [partition](../event-hubs-features.md#partitions) is a data organization mechanism that enables parallel publishing and consumption. While it supports parallel processing and scaling, total capacity remains limited by the namespace’s scaling allocation. We recommend that you balance scaling units (throughput units for the standard tier, processing units for the premium tier, or capacity units for the dedicated tier) and partitions to achieve optimal scale. In general, we recommend a maximum throughput of 1 MB/s per partition. Therefore, a rule of thumb for calculating the number of partitions would be to divide the maximum expected throughput by 1 MB/s. For example, if your use case requires 20 MB/s, we recommend that you choose at least 20 partitions to achieve the optimal throughput.
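
The rule of thumb in the changed line above is simple arithmetic; here's a quick sketch of it (the function name is illustrative, not from any SDK):

```scala
// Sketch of the rule of thumb above: divide the maximum expected throughput
// by the ~1 MB/s-per-partition guideline and round up.
def recommendedPartitionCount(maxExpectedMBps: Double): Int =
  math.max(1, math.ceil(maxExpectedMBps / 1.0).toInt)

// Example from the text: a 20 MB/s workload suggests at least 20 partitions.
assert(recommendedPartitionCount(20.0) == 20)
```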
 
 However, if you have a model in which your application has an affinity to a particular partition, increasing the number of partitions isn't beneficial. For more information, see [availability and consistency](../event-hubs-availability-and-consistency.md).

articles/hdinsight/hdinsight-hadoop-windows-tools.md

Lines changed: 3 additions & 3 deletions
@@ -46,15 +46,15 @@ Examples of tasks you can do with the .NET SDK in Visual Studio:
 * [Run Apache Hive queries using the .NET SDK](hadoop/apache-hadoop-use-hive-dotnet-sdk.md).
 * [Use C# user-defined functions with Apache Hive and Apache Pig streaming on Apache Hadoop](hadoop/apache-hadoop-hive-pig-udf-dotnet-csharp.md).
 
-## Intellij IDEA and Eclipse IDE for Spark clusters
+## IntelliJ IDEA and Eclipse IDE for Spark clusters
 
-Both [Intellij IDEA](https://www.jetbrains.com/idea/download) and the [Eclipse IDE](https://www.eclipse.org/downloads/) can be used to:
+Both [IntelliJ IDEA](https://www.jetbrains.com/idea/download) and the [Eclipse IDE](https://www.eclipse.org/downloads/) can be used to:
 * Develop and submit a Scala Spark application on an HDInsight Spark cluster.
 * Access Spark cluster resources.
 * Develop and run a Scala Spark application locally (see the sketch after this list).
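
For context on what "develop and run locally" looks like, below is a minimal sketch of the kind of Scala Spark application these IDEs build and submit. The object name, input path, and `local[*]` master are illustrative placeholders, not taken from the linked articles.

```scala
import org.apache.spark.sql.SparkSession

// Minimal illustrative Scala Spark app; names and paths are placeholders.
object WordCount {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("WordCount")
      .master("local[*]") // run locally; drop this when submitting to an HDInsight cluster
      .getOrCreate()

    val counts = spark.sparkContext
      .textFile("data/input.txt")  // placeholder input path
      .flatMap(_.split("\\s+"))    // tokenize on whitespace
      .map(word => (word, 1))
      .reduceByKey(_ + _)

    counts.take(10).foreach(println)
    spark.stop()
  }
}
```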
 
 These articles show how:
-* Intellij IDEA: [Create Apache Spark applications using the Azure Toolkit for Intellij plug-in and the Scala SDK.](spark/apache-spark-intellij-tool-plugin.md)
+* IntelliJ IDEA: [Create Apache Spark applications using the Azure Toolkit for IntelliJ plug-in and the Scala SDK.](spark/apache-spark-intellij-tool-plugin.md)
 * Eclipse IDE or Scala IDE for Eclipse: [Create Apache Spark applications and the Azure Toolkit for Eclipse](spark/apache-spark-eclipse-tool-plugin.md)
 
 ## Notebooks on Spark for data scientists

articles/hdinsight/spark/apache-spark-intellij-tool-debug-remotely-through-ssh.md

Lines changed: 15 additions & 15 deletions
@@ -40,7 +40,7 @@ This article provides step-by-step guidance on how to use HDInsight Tools in [Az
 * **Maven** for Scala project-creation wizard support.
 * **SBT** for managing the dependencies and building for the Scala project.
 
-:::image type="content" source="./media/apache-spark-intellij-tool-debug-remotely-through-ssh/hdinsight-create-projectfor-debug-remotely.png" alt-text="Intellij Create New Project Spark." border="true":::
+:::image type="content" source="./media/apache-spark-intellij-tool-debug-remotely-through-ssh/hdinsight-create-projectfor-debug-remotely.png" alt-text="IntelliJ Create New Project Spark." border="true":::
 
 1. Select **Next**.

@@ -53,7 +53,7 @@ This article provides step-by-step guidance on how to use HDInsight Tools in [Az
 |Project SDK|If blank, select **New...** and navigate to your JDK.|
 |Spark Version|The creation wizard integrates the proper version for Spark SDK and Scala SDK. If the Spark cluster version is earlier than 2.0, select **Spark 1.x**. Otherwise, select **Spark 2.x**. This example uses **Spark 2.3.0 (Scala 2.11.8)**.|
 
-:::image type="content" source="./media/apache-spark-intellij-tool-debug-remotely-through-ssh/hdinsight-new-project.png" alt-text="Intellij New Project select Spark version." border="true":::
+:::image type="content" source="./media/apache-spark-intellij-tool-debug-remotely-through-ssh/hdinsight-new-project.png" alt-text="IntelliJ New Project select Spark version." border="true":::
 
 1. Select **Finish**. It may take a few minutes before the project becomes available. Watch the bottom right-hand corner for progress.

@@ -65,11 +65,11 @@ This article provides step-by-step guidance on how to use HDInsight Tools in [Az
 
 1. Once the local run completes, you can see the output file saved under **data** > **__default__** in your project explorer.
 
-:::image type="content" source="./media/apache-spark-intellij-tool-debug-remotely-through-ssh/spark-local-run-result.png" alt-text="Intellij Project local run result." border="true":::
+:::image type="content" source="./media/apache-spark-intellij-tool-debug-remotely-through-ssh/spark-local-run-result.png" alt-text="IntelliJ Project local run result." border="true":::
 
 1. Our tools set the default local run configuration automatically when you perform a local run or local debug. Open the configuration **[Spark on HDInsight] XXX** in the upper-right corner; you can see that **[Spark on HDInsight] XXX** is already created under **Apache Spark on HDInsight**. Switch to the **Locally Run** tab.
 
-:::image type="content" source="./media/apache-spark-intellij-tool-debug-remotely-through-ssh/local-run-configuration.png" alt-text="Intellij Run debug configurations local run." border="true":::
+:::image type="content" source="./media/apache-spark-intellij-tool-debug-remotely-through-ssh/local-run-configuration.png" alt-text="IntelliJ Run debug configurations local run." border="true":::
 
 - [Environment variables](#prerequisites): If you already set the system environment variable **HADOOP_HOME** to **C:\WinUtils**, the tool detects it automatically; there's no need to add it manually.
 - [WinUtils.exe Location](#prerequisites): If you haven't set the system environment variable, you can find the location by selecting its button.
@@ -89,35 +89,35 @@ This article provides step-by-step guidance on how to use HDInsight Tools in [Az
 
 1. In the **Run/Debug Configurations** dialog box, select the plus sign (**+**). Then select the **Apache Spark on HDInsight** option.
 
-:::image type="content" source="./media/apache-spark-intellij-tool-debug-remotely-through-ssh/hdinsight-add-new-Configuration.png" alt-text="Intellij Add new configuration." border="true":::
+:::image type="content" source="./media/apache-spark-intellij-tool-debug-remotely-through-ssh/hdinsight-add-new-Configuration.png" alt-text="IntelliJ Add new configuration." border="true":::
 
 1. Switch to the **Remotely Run in Cluster** tab. Enter information for **Name**, **Spark cluster**, and **Main class name**. Then select **Advanced configuration (Remote Debugging)**. Our tools support debugging with **Executors**. The default value of **numExecutors** is 5; we recommend that you not set it higher than 3.
 
-:::image type="content" source="./media/apache-spark-intellij-tool-debug-remotely-through-ssh/hdinsight-run-debug-configurations.png" alt-text="Intellij Run debug configurations." border="true":::
+:::image type="content" source="./media/apache-spark-intellij-tool-debug-remotely-through-ssh/hdinsight-run-debug-configurations.png" alt-text="IntelliJ Run debug configurations." border="true":::
 
 1. In the **Advanced Configuration (Remote Debugging)** section, select **Enable Spark remote debug**. Enter the SSH username, and then enter a password or use a private key file. This setting is required only if you want to perform remote debugging; there's no need to set it if you just want to use remote run.
 
-:::image type="content" source="./media/apache-spark-intellij-tool-debug-remotely-through-ssh/hdinsight-enable-spark-remote-debug.png" alt-text="Intellij Advanced Configuration enable spark remote debug." border="true":::
+:::image type="content" source="./media/apache-spark-intellij-tool-debug-remotely-through-ssh/hdinsight-enable-spark-remote-debug.png" alt-text="IntelliJ Advanced Configuration enable spark remote debug." border="true":::
 
 1. The configuration is now saved with the name you provided. To view the configuration details, select the configuration name. To make changes, select **Edit Configurations**.
 
 1. After you complete the configuration settings, you can run the project against the remote cluster or perform remote debugging.
 
-:::image type="content" source="./media/apache-spark-intellij-tool-debug-remotely-through-ssh/perform-remote-run-button.png" alt-text="Intellij Debug Remote Spark Job Remote run button." border="true":::
+:::image type="content" source="./media/apache-spark-intellij-tool-debug-remotely-through-ssh/perform-remote-run-button.png" alt-text="IntelliJ Debug Remote Spark Job Remote run button." border="true":::
 
 1. Select the **Disconnect** button so that the submission logs no longer appear in the left panel. However, the job is still running on the back end.
 
-:::image type="content" source="./media/apache-spark-intellij-tool-debug-remotely-through-ssh/spark-remote-run-result.png" alt-text="Intellij Debug Remote Spark Job Remote run result." border="true":::
+:::image type="content" source="./media/apache-spark-intellij-tool-debug-remotely-through-ssh/spark-remote-run-result.png" alt-text="IntelliJ Debug Remote Spark Job Remote run result." border="true":::
 
 ## Perform remote debugging
 
 1. Set up breakpoints, and then select the **Remote debug** icon. The difference from remote submission is that the SSH username and password need to be configured.
 
-:::image type="content" source="./media/apache-spark-intellij-tool-debug-remotely-through-ssh/hdinsight-debug-icon.png" alt-text="Intellij Debug Remote Spark Job debug icon." border="true":::
+:::image type="content" source="./media/apache-spark-intellij-tool-debug-remotely-through-ssh/hdinsight-debug-icon.png" alt-text="IntelliJ Debug Remote Spark Job debug icon." border="true":::
 
 1. When the program execution reaches the breakpoint, you see a **Driver** tab and two **Executor** tabs in the **Debugger** pane. Select the **Resume Program** icon to continue running the code, which then reaches the next breakpoint. You need to switch to the correct **Executor** tab to find the target executor to debug. You can view the execution logs on the corresponding **Console** tab.
 
-:::image type="content" source="./media/apache-spark-intellij-tool-debug-remotely-through-ssh/hdinsight-debugger-tab.png" alt-text="Intellij Debug Remote Spark Job Debugging tab." border="true":::
+:::image type="content" source="./media/apache-spark-intellij-tool-debug-remotely-through-ssh/hdinsight-debugger-tab.png" alt-text="IntelliJ Debug Remote Spark Job Debugging tab." border="true":::
 
 ### Perform remote debugging and bug fixing

@@ -127,21 +127,21 @@ This article provides step-by-step guidance on how to use HDInsight Tools in [Az
 
 1. Select the **Resume Program** icon to continue. The code stops at the second breakpoint. The exception is caught as expected.
 
-:::image type="content" source="./media/apache-spark-intellij-tool-debug-remotely-through-ssh/hdinsight-throw-error.png" alt-text="Intellij Debug Remote Spark Job throw error." border="true":::
+:::image type="content" source="./media/apache-spark-intellij-tool-debug-remotely-through-ssh/hdinsight-throw-error.png" alt-text="IntelliJ Debug Remote Spark Job throw error." border="true":::
 
 1. Select the **Resume Program** icon again. The **HDInsight Spark Submission** window displays a "job run failed" error.
 
-:::image type="content" source="./media/apache-spark-intellij-tool-debug-remotely-through-ssh/hdinsight-error-submission.png" alt-text="Intellij Debug Remote Spark Job Error submission." border="true":::
+:::image type="content" source="./media/apache-spark-intellij-tool-debug-remotely-through-ssh/hdinsight-error-submission.png" alt-text="IntelliJ Debug Remote Spark Job Error submission." border="true":::
 
 1. To dynamically update the variable value by using the IntelliJ debugging capability, select **Debug** again. The **Variables** pane appears again.
 
 1. Right-click the target on the **Debug** tab, and then select **Set Value**. Next, enter a new value for the variable, and then press **Enter** to save it.
 
-:::image type="content" source="./media/apache-spark-intellij-tool-debug-remotely-through-ssh/hdinsight-set-value1.png" alt-text="Intellij Debug Remote Spark Job set value." border="true":::
+:::image type="content" source="./media/apache-spark-intellij-tool-debug-remotely-through-ssh/hdinsight-set-value1.png" alt-text="IntelliJ Debug Remote Spark Job set value." border="true":::
 
 1. Select the **Resume Program** icon to continue running the program. This time, no exception is caught, and the project runs successfully; a sketch of the kind of try/catch pattern being debugged follows this hunk.
 
-:::image type="content" source="./media/apache-spark-intellij-tool-debug-remotely-through-ssh/hdinsight-debug-without-exception.png" alt-text="Intellij Debug Remote Spark Job without exception." border="true":::
+:::image type="content" source="./media/apache-spark-intellij-tool-debug-remotely-through-ssh/hdinsight-debug-without-exception.png" alt-text="IntelliJ Debug Remote Spark Job without exception." border="true":::
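
For orientation, here's a hedged sketch of the shape of code this walkthrough debugs: a value that triggers an exception at one breakpoint and a catch block at the next, so that changing the variable in the **Variables** pane lets the run complete. The object, input value, and variable names are hypothetical, not the article's actual sample.

```scala
import org.apache.spark.sql.SparkSession

// Hypothetical sketch of the debug scenario above: a parse that throws,
// a catch that handles it, and a variable ("rawValue") you could change
// in the debugger's Variables pane to let the run succeed.
object DebugSample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("DebugSample").getOrCreate()

    var rawValue = "not-a-number" // breakpoint 1: set this to e.g. "42" in the debugger
    try {
      val parsed = rawValue.toInt // throws NumberFormatException unless fixed above
      println(s"Parsed value: $parsed")
    } catch {
      case e: NumberFormatException =>
        println(s"Caught as expected: ${e.getMessage}") // breakpoint 2 lands here
        throw e // rethrowing is what surfaces the "job run failed" error
    } finally {
      spark.stop()
    }
  }
}
```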
 
 ## Next steps
