
Commit 0236ece

Merge pull request #125429 from ArieHein/Spelling-58
Spelling Fixes
2 parents 041b61a + 13be85c, commit 0236ece

22 files changed: +28 -28 lines changed

articles/stream-analytics/stream-analytics-build-an-iot-solution-using-stream-analytics.md

Lines changed: 1 addition & 1 deletion
@@ -196,7 +196,7 @@ Follow these steps to start the streaming job:
## Report total time for each car
The average time that's required for a car to pass through the toll helps to assess the efficiency of the process and the customer experience.

- To find the total time, join the EntryTime stream with the ExitTime stream. Join the two input streams on the equal matching TollId and LicencePlate columns. The **JOIN** operator requires you to specify temporal leeway that describes the acceptable time difference between the joined events. Use the **DATEDIFF** function to specify that events should be no more than 15 minutes from each other. Also apply the **DATEDIFF** function to exit and entry times to compute the actual time that a car spends in the toll station. Note the difference of the use of **DATEDIFF** when it's used in a **SELECT** statement rather than a **JOIN** condition.
+ To find the total time, join the EntryTime stream with the ExitTime stream. Join the two input streams on the equal matching TollId and LicensePlate columns. The **JOIN** operator requires you to specify temporal leeway that describes the acceptable time difference between the joined events. Use the **DATEDIFF** function to specify that events should be no more than 15 minutes from each other. Also apply the **DATEDIFF** function to exit and entry times to compute the actual time that a car spends in the toll station. Note the difference of the use of **DATEDIFF** when it's used in a **SELECT** statement rather than a **JOIN** condition.

```sql
SELECT EntryStream.TollId, EntryStream.EntryTime, ExitStream.ExitTime, EntryStream.LicensePlate, DATEDIFF (minute, EntryStream.EntryTime, ExitStream.ExitTime) AS DurationInMinutes
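-- The diff is truncated after the SELECT line. The remainder of the join,
-- reconstructed as a sketch from the prose above (the stream names and the
-- 15-minute bound follow the tutorial's description, not the file's exact text):
FROM EntryStream TIMESTAMP BY EntryTime
JOIN ExitStream TIMESTAMP BY ExitTime
ON (EntryStream.TollId = ExitStream.TollId
    AND EntryStream.LicensePlate = ExitStream.LicensePlate)
    AND DATEDIFF (minute, EntryStream, ExitStream) BETWEEN 0 AND 15
```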

articles/synapse-analytics/migration-guides/migrate-to-synapse-analytics-guide.md

Lines changed: 1 addition & 1 deletion
@@ -62,7 +62,7 @@ For more assistance with completing this migration scenario, see the following r
| --------------------------------------- | --------------------------------------------------------------------------------------------------------------------------------- |
| [Data Workload Assessment Model and Tool](https://www.microsoft.com/download/details.aspx?id=103130) | This tool provides suggested "best fit" target platforms, cloud readiness, and application or database remediation level for a given workload. It offers simple, one-click calculation and report generation that helps to accelerate large estate assessments by providing an automated and uniform target platform decision process. |
| [Handling data encoding issues while loading data to Azure Synapse Analytics](https://azure.microsoft.com/blog/handling-data-encoding-issues-while-loading-data-to-sql-data-warehouse/) | This blog post provides insight on some of the data encoding issues you might encounter while using PolyBase to load data to dedicated SQL pools (formerly SQL data warehouse). This article also provides some options that you can use to overcome such issues and load the data successfully. |
- | [Getting table sizes in Azure Synapse Analytics dedicated SQL pool](https://github.com/Microsoft/DataMigrationTeam/blob/master/Whitepapers/Getting%20table%20sizes%20in%20SQL%20DW.pdf) | One of the key tasks that an architect must perform is to get metrics about a new environment post-migration. Examples include collecting load times from on-premises to the cloud and collecting PolyBase load times. One of the most important tasks is to determine the storage size indedicated SQL pools (formerly SQL data warehouse) compared to the customer's current platform. |
+ | [Getting table sizes in Azure Synapse Analytics dedicated SQL pool](https://github.com/Microsoft/DataMigrationTeam/blob/master/Whitepapers/Getting%20table%20sizes%20in%20SQL%20DW.pdf) | One of the key tasks that an architect must perform is to get metrics about a new environment post-migration. Examples include collecting load times from on-premises to the cloud and collecting PolyBase load times. One of the most important tasks is to determine the storage size in dedicated SQL pools (formerly SQL data warehouse) compared to the customer's current platform. |

The Data SQL Engineering team developed these resources. This team's core charter is to unblock and accelerate complex modernization for data platform migration projects to Microsoft's Azure data platform.

articles/synapse-analytics/quickstart-transform-data-using-spark-job-definition.md

Lines changed: 1 addition & 1 deletion
@@ -93,7 +93,7 @@ On this panel, you can reference to the Spark job definition to run.
|Driver size| Number of cores and memory to be used for driver given in the specified Apache Spark pool for the job.|
|Spark configuration| Specify values for Spark configuration properties listed in the article: Spark Configuration - Application properties. Users can use default configuration and customized configuration. |

- ![spark job definition pipline settings](media/quickstart-transform-data-using-spark-job-definition/spark-job-definition-pipline-settings.png)
+ ![spark job definition pipeline settings](media/quickstart-transform-data-using-spark-job-definition/spark-job-definition-pipline-settings.png)

* You can add dynamic content by clicking the **Add Dynamic Content** button or by pressing the shortcut key <kbd>Alt</kbd>+<kbd>Shift</kbd>+<kbd>D</kbd>. In the **Add Dynamic Content** page, you can use any combination of expressions, functions, and system variables to add to dynamic content.
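
For the **Spark configuration** row in the table above, the values are standard Apache Spark application properties. A minimal sketch of what that field might contain (the property names are standard Spark ones; the values are placeholders, not recommendations):

```
spark.driver.memory 28g
spark.executor.memory 28g
spark.executor.instances 2
spark.dynamicAllocation.enabled false
```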

articles/synapse-analytics/spark/apache-spark-azure-machine-learning-tutorial.md

Lines changed: 1 addition & 1 deletion
@@ -142,7 +142,7 @@ ws = Workspace(workspace_name = workspace_name,
```

## Convert a DataFrame to an Azure Machine Learning dataset
- To submit a remote experiment, convert your dataset into an Azure Machine Learning ```TabularDatset``` instance. [TabularDataset](/python/api/azureml-core/azureml.data.tabulardataset) represents data in a tabular format by parsing the provided files.
+ To submit a remote experiment, convert your dataset into an Azure Machine Learning ```TabularDataset``` instance. [TabularDataset](/python/api/azureml-core/azureml.data.tabulardataset) represents data in a tabular format by parsing the provided files.

The following code gets the existing workspace and the default Azure Machine Learning datastore. It then passes the datastore and file locations to the path parameter to create a new ```TabularDataset``` instance.
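
As a rough sketch of that pattern with azureml-core (here `df` is assumed to be the pandas DataFrame from the earlier steps, and the file and datastore paths are made up for illustration):

```python
from azureml.core import Workspace, Dataset

# Get the existing workspace and its default datastore.
ws = Workspace.from_config()
datastore = ws.get_default_datastore()

# Write the DataFrame to a local CSV file and upload it to the datastore.
df.to_csv("training_data.csv", index=False)
datastore.upload_files(files=["training_data.csv"],
                       target_path="train-dataset/",
                       overwrite=True)

# Parse the uploaded file into a TabularDataset for the remote experiment.
dataset = Dataset.Tabular.from_delimited_files(
    path=[(datastore, "train-dataset/training_data.csv")])
```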

articles/synapse-analytics/spark/apache-spark-development-using-notebooks.md

Lines changed: 2 additions & 2 deletions
@@ -344,8 +344,8 @@ You can use parameterized session configuration to replace values in the `%%conf

```
{
-     "activityParameterName": "paramterNameInPipelineNotebookActivity",
-     "defaultValue": "defaultValueIfNoParamterFromPipelineNotebookActivity"
+     "activityParameterName": "parameterNameInPipelineNotebookActivity",
+     "defaultValue": "defaultValueIfNoParameterFromPipelineNotebookActivity"
}
```
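
In Synapse notebooks, an object of this shape is supplied as the value of a field in the `%%configure` magic, so a pipeline notebook activity parameter can override the default at run time. A minimal sketch (the surrounding field and parameter names are illustrative):

```
%%configure
{
    "driverCores": {
        "activityParameterName": "driverCoresFromNotebookActivity",
        "defaultValue": 4
    }
}
```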

articles/synapse-analytics/spark/apache-spark-troubleshoot-library-errors.md

Lines changed: 1 addition & 1 deletion
@@ -46,7 +46,7 @@ To view the full installation logs:
3. Select the highlighted **Spark history server** option which would open the Spark history server details page in a new tab.
![Screenshot that highlights the details of the failed system reserved library job.](./media/apache-spark-azure-portal-add-libraries/system-reserved-library-job-failure-details.png "View details of failed system library job")
4. In this page, you will see 2 attempts, select **Attempt 1** as shown below.
- ![Screenshot that highlights the executor details in the spark history server page for the failed system reserved library job.](./media/apache-spark-azure-portal-add-libraries/spark-history-server-executors.png "View executor detaols in spark history server page")
+ ![Screenshot that highlights the executor details in the spark history server page for the failed system reserved library job.](./media/apache-spark-azure-portal-add-libraries/spark-history-server-executors.png "View executor details in spark history server page")
5. On the top navigation bar in the Spark history server page, switch to the **Executors** tab.
![Screenshot that highlights the job details in the spark history server page for the failed system reserved library job.](./media/apache-spark-azure-portal-add-libraries/spark-history-server-page.png "View the job details in the spark history server page")
6. Download the **stdout** and **stderr** log files to access the full library management output and error logs.

articles/synapse-analytics/sql-data-warehouse/performance-tuning-materialized-views.md

Lines changed: 1 addition & 1 deletion
@@ -124,7 +124,7 @@ SELECT C, SUM(D)
FROM T
GROUP BY C

- -- You could create a single mateiralized view of this form
+ -- You could create a single materialized view of this form

SELECT A, C, SUM(B), SUM(D)
FROM T
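
-- Illustrative only (not part of this file's diff): materializing the
-- combined query in a dedicated SQL pool could look like the sketch below.
-- The view name, distribution choice, and COUNT_BIG(*) column are assumptions;
-- check the CREATE MATERIALIZED VIEW restrictions for your schema.
CREATE MATERIALIZED VIEW dbo.mvCombined
WITH (DISTRIBUTION = HASH(A))
AS
SELECT A, C, SUM(B) AS SumB, SUM(D) AS SumD, COUNT_BIG(*) AS Cnt
FROM dbo.T
GROUP BY A, C;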

articles/traffic-manager/traffic-manager-powershell-arm.md

Lines changed: 1 addition & 1 deletion
@@ -66,7 +66,7 @@ The cmdlet creates a Traffic Manager profile in Azure and returns a correspondin

## Get a Traffic Manager Profile

- To retrieve an existing Traffic Manager profile object, use the `Get-AzTrafficManagerProfle` cmdlet:
+ To retrieve an existing Traffic Manager profile object, use the `Get-AzTrafficManagerProfile` cmdlet:

```powershell
$TmProfile = Get-AzTrafficManagerProfile -Name MyProfile -ResourceGroupName MyRG
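# The hunk is truncated here. As an illustrative follow-up (not part of
# this file's diff), the usual Az pattern is to modify the local profile
# object and then persist it; the property change below is only an example.
$TmProfile.TrafficRoutingMethod = "Priority"
Set-AzTrafficManagerProfile -TrafficManagerProfile $TmProfile
```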

articles/update-manager/troubleshoot.md

Lines changed: 1 addition & 1 deletion
@@ -48,7 +48,7 @@ To review the logs related to all actions performed by the extension, on Windows

* `WindowsUpdateExtension.log`: Contains information related to the patch actions. This information includes the patches assessed and installed on the machine and any problems encountered in the process.
* `cmd_execution_<numeric>_stdout.txt`: There's a wrapper above the patch action. It's used to manage the extension and invoke specific patch operation. This log contains information about the wrapper. For autopatching, the log has information on whether the specific patch operation was invoked.
- * `cmd_excution_<numeric>_stderr.txt`
+ * `cmd_execution_<numeric>_stderr.txt`

---

articles/virtual-network-manager/create-virtual-network-manager-cli.md

Lines changed: 1 addition & 1 deletion
@@ -244,7 +244,7 @@ For the configuration to take effect, commit the configuration to the target reg
az network manager post-commit \
    --network-manager-name "myAVNM" \
    --commit-type "Connectivity" \
-     --configuration-ids "/subscriptions/<subscription_id>/resourceGroups/myANVMResourceGroup/providers/Microsoft.Network/networkManagers/myAVNM/connectivityConfigurations/connectivityconfig" \
+     --configuration-ids "/subscriptions/<subscription_id>/resourceGroups/myAVNMResourceGroup/providers/Microsoft.Network/networkManagers/myAVNM/connectivityConfigurations/connectivityconfig" \
    --target-locations "westus" \
    --resource-group "myAVNMResourceGroup"
```
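
To verify the configuration ID before committing, you can list the connectivity configurations first; a sketch assuming the `az network manager connect-config` command group, with the same resource names as above:

```azurecli
az network manager connect-config list \
    --network-manager-name "myAVNM" \
    --resource-group "myAVNMResourceGroup"
```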
