Commit ddc32a9

Merge pull request #125172 from ArieHein/Spelling-Wave-42
Spelling Fixes
2 parents (927a364 + 0bb3491), commit ddc32a9

15 files changed (+18, -18 lines)
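Every one of the 18 changed lines follows the same pattern: a known misspelling replaced by its correction. As a minimal sketch (the script and word list are illustrative, not part of the repository's tooling), such typos can be caught automatically with a simple lookup table built from the corrections in this commit:

```python
# Hypothetical spell-check sketch: scan text for known misspellings.
# The typo -> fix pairs below mirror the corrections made in this commit.
KNOWN_TYPOS = {
    "notifacation": "notification",
    "PostgeSQL": "PostgreSQL",
    "demoResrouceGroup": "demoResourceGroup",
    "Destop": "Desktop",
    "Etract": "Extract",
    "Analtyics": "Analytics",
    "Begining": "Beginning",
    "subsription": "subscription",
    "Identit]": "Identity]",
    "retreiving": "retrieving",
    "Analyitcs": "Analytics",
}

def find_typos(text: str) -> list[tuple[str, str]]:
    """Return (typo, suggested fix) pairs found in the text."""
    return [(typo, fix) for typo, fix in KNOWN_TYPOS.items() if typo in text]

def fix_typos(text: str) -> str:
    """Apply all known corrections to the text."""
    for typo, fix in KNOWN_TYPOS.items():
        text = text.replace(typo, fix)
    return text
```

For example, `find_typos("Power BI Destop")` reports `[("Destop", "Desktop")]`, and `fix_typos` applies all replacements in one pass over the table.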

articles/spring-apps/basic-standard/includes/application-observability/application-observability-with-basic-standard-plan.md

Lines changed: 1 addition & 1 deletion
@@ -248,7 +248,7 @@ To set up an action group, use the following steps:
 
 1. On the **Create action group** page, select the subscription and resource group you want to cover. Enter the following information:
 
-- **Action group name**: Enter *email-notifacation*.
+- **Action group name**: Enter *email-notification*.
 - **Short name**: Enter *email*.
 - **Region**: Select the region you want to use.

articles/static-web-apps/database-postgresql.md

Lines changed: 1 addition & 1 deletion
@@ -32,7 +32,7 @@ To complete this tutorial, you need to have an existing Azure Database for Postg
 
 | Resource | Description |
 |---|---|
-| [Azure Database for PostgreSQL Flexible Server](/azure/postgresql/flexible-server/quickstart-create-server-portal) or [Azure Database for PostgreSQL Single Server Database](/azure/postgresql/single-server/quickstart-create-server-database-portal) | If you don't already have one, follow the steps in the [create an Azure Database for PostgreSQL Flexible Server database](/azure/postgresql/flexible-server/quickstart-create-server-portal) guide, or in the [create an Azure Database for PostgeSQL Single Server database](/azure/postgresql/single-server/quickstart-create-server-database-portal) guide. If you plan to use a connection string authentication for Static Web Apps' database connections, ensure that you create your Azure Database for PostgreSQL Server with PostgreSQL authentication. You can change this value if you want to use managed identity later on. |
+| [Azure Database for PostgreSQL Flexible Server](/azure/postgresql/flexible-server/quickstart-create-server-portal) or [Azure Database for PostgreSQL Single Server Database](/azure/postgresql/single-server/quickstart-create-server-database-portal) | If you don't already have one, follow the steps in the [create an Azure Database for PostgreSQL Flexible Server database](/azure/postgresql/flexible-server/quickstart-create-server-portal) guide, or in the [create an Azure Database for PostgreSQL Single Server database](/azure/postgresql/single-server/quickstart-create-server-database-portal) guide. If you plan to use a connection string authentication for Static Web Apps' database connections, ensure that you create your Azure Database for PostgreSQL Server with PostgreSQL authentication. You can change this value if you want to use managed identity later on. |
 | [Existing static web app](getting-started.md) | If you don't already have one, follow the steps in the [getting started](getting-started.md) guide to create a *No Framework* static web app. |
 | [Azure Data Studio, with the PostgreSQL extension](/azure-data-studio/quickstart-postgres) | If you don't already have Azure Data Studio installed, follow the guide to install [Azure Data Studio, with the PostgreSQL extension](/azure-data-studio/quickstart-postgres). Alternatively, you may use any other tool to query your PostgreSQL database, such as PgAdmin. |

articles/storage-mover/endpoint-manage.md

Lines changed: 1 addition & 1 deletion
@@ -178,7 +178,7 @@ The following steps describe the process of creating a source endpoint.
 $storageMoverName = "[Storage mover resource's name]"
 $sourceHost = "[Source share's host name or IP address]"
 $sourceShare = "[Source share's name]"
-$targetResourceID = "/subscriptions/[GUID]/resourceGroups/demoResrouceGroup/"
+$targetResourceID = "/subscriptions/[GUID]/resourceGroups/demoResourceGroup/"
 $targetResourceID += "providers/Microsoft.Storage/storageAccounts/demoAccount"
 
 ## For SMB endpoints

articles/stream-analytics/no-code-power-bi-tutorial.md

Lines changed: 1 addition & 1 deletion
@@ -151,7 +151,7 @@ Before you start, make sure you've completed the following steps:
 WHERE times >= DATEADD(day, -1, GETDATE())
 ```
 
-:::image type="content" source="./media/stream-analytics-no-code/power-bi-direct-query.png" alt-text="Screenshot that shows the configuration of Power BI Destop to connect to Azure Synapse SQL Database." lightbox="./media/stream-analytics-no-code/power-bi-direct-query.png":::
+:::image type="content" source="./media/stream-analytics-no-code/power-bi-direct-query.png" alt-text="Screenshot that shows the configuration of Power BI Desktop to connect to Azure Synapse SQL Database." lightbox="./media/stream-analytics-no-code/power-bi-direct-query.png":::
 
 Switch to **Database** tab, and enter your credentials (user name and password) to connect to the database and run the query.
 1. Select **Load** to load data into the Power BI.

articles/stream-analytics/no-code-stream-processing.md

Lines changed: 2 additions & 2 deletions
@@ -70,7 +70,7 @@ The following screenshot shows a completed Stream Analytics job. It highlights a
 
 :::image type="content" source="./media/no-code-stream-processing/created-stream-analytics-job.png" alt-text="Screenshot that shows the authoring interface sections." lightbox="./media/no-code-stream-processing/created-stream-analytics-job.png" :::
 
-1. **Ribbon**: On the ribbon, sections follow the order of a classic analytics process: an event hub as input (also known as a data source), transformations (streaming Etract, Transform, and Load operations), outputs, a button to save your progress, and a button to start the job.
+1. **Ribbon**: On the ribbon, sections follow the order of a classic analytics process: an event hub as input (also known as a data source), transformations (streaming Extract, Transform, and Load operations), outputs, a button to save your progress, and a button to start the job.
 2. **Diagram view**: This is a graphical representation of your Stream Analytics job, from input to operations to outputs.
 3. **Side pane**: Depending on which component you selected in the diagram view, you see settings to modify input, transformation, or output.
 4. **Tabs for data preview, authoring errors, runtime logs, and metrics**: For each tile, the data preview shows you results for that step (live for inputs; on demand for transformations and outputs). This section also summarizes any authoring errors or warnings that you might have in your job when it's being developed. Selecting each error or warning selects that transform. It also provides the job metrics for you to monitor the running job's health.
@@ -417,7 +417,7 @@ You can select more metrics from the list. To understand all the metrics in deta
 You can save the job anytime while creating it. After you configure the streaming inputs, transformations, and streaming outputs for the job, you can start the job.
 
 > [!NOTE]
-> Although the no-code editor on Azure Stream Analtyics portal is in preview, the Azure Stream Analytics service is generally available.
+> Although the no-code editor on Azure Stream Analytics portal is in preview, the Azure Stream Analytics service is generally available.
 
 :::image type="content" source="./media/no-code-stream-processing/no-code-save-start.png" alt-text="Screenshot that shows the Save and Start buttons." lightbox="./media/no-code-stream-processing/no-code-save-start.png" :::

articles/stream-analytics/power-bi-output.md

Lines changed: 1 addition & 1 deletion
@@ -17,7 +17,7 @@ You can use [Power BI](/power-bi/fundamentals/power-bi-overview) as an output fo
 
 > [!Important]
 > Real-time streaming in Power BI is deprecating. For more information about the retirement of real-time streaming in Power BI, see the [blog](https://powerbi.microsoft.com/en-us/blog/announcing-the-retirement-of-real-time-streaming-in-power-bi/) post.
-> Begining Oct 31,2027 users will not be able to create Stream Analytics jobs with Power BI output connector and the existing jobs running with Power BI connector will be stopped. Microsoft recommends users to explore Real-Time Intelligence in Microsoft Fabric. If you are interested in migrating to Fabric Real-Time Intelligence, you can use the guidance provided in this [blog](https://techcommunity.microsoft.com/blog/analyticsonazure/simplifying-migration-to-fabric-real-time-intelligence-for-power-bi-real-time-re/4283180) post. If you need more migration guidance from Microsoft, such as architecture review, clarification about specific capabilities, please fill out your request [here](https://forms.office.com/r/sQeaA8KLAZ).
+> Beginning Oct 31,2027 users will not be able to create Stream Analytics jobs with Power BI output connector and the existing jobs running with Power BI connector will be stopped. Microsoft recommends users to explore Real-Time Intelligence in Microsoft Fabric. If you are interested in migrating to Fabric Real-Time Intelligence, you can use the guidance provided in this [blog](https://techcommunity.microsoft.com/blog/analyticsonazure/simplifying-migration-to-fabric-real-time-intelligence-for-power-bi-real-time-re/4283180) post. If you need more migration guidance from Microsoft, such as architecture review, clarification about specific capabilities, please fill out your request [here](https://forms.office.com/r/sQeaA8KLAZ).
 
 ## Output configuration

articles/stream-analytics/powerbi-output-managed-identity.md

Lines changed: 1 addition & 1 deletion
@@ -107,7 +107,7 @@ Azure Resource Manager allows you to fully automate the deployment of your Strea
 2. After the job is created, use Azure Resource Manager to retrieve the job's full definition.
 
 ```azurecli
-az resource show --ids /subscriptions/<subsription-id>/resourceGroups/<resource-group>/providers/Microsoft.StreamAnalytics/StreamingJobs/<resource-name>
+az resource show --ids /subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.StreamAnalytics/StreamingJobs/<resource-name>
 ```
 
 The above command will return a response like the below:

articles/stream-analytics/sql-database-output-managed-identity.md

Lines changed: 1 addition & 1 deletion
@@ -239,7 +239,7 @@ Storage Table Data Contributor role. If you do not give your job access, the job
 Repeat the steps if you selected user-assigned managed identity to connect ASA to Synapse:
 1. Create a contained database user. Replace ASA_Job_Name with User-Assigned Managed Identity. See the example below.
 ```sql
-CREATE USER [User-Assigned Managed Identit] FROM EXTERNAL PROVIDER;
+CREATE USER [User-Assigned Managed Identity] FROM EXTERNAL PROVIDER;
 ```
 2. Grant permissions to the User-Assigned Managed Identity. Replace ASA_Job_Name with User-Assigned Managed Identity.

articles/stream-analytics/sql-reference-data.md

Lines changed: 1 addition & 1 deletion
@@ -31,7 +31,7 @@ Use the following steps to add Azure SQL Database as a reference input source us
 
 ![Inputs is selected in the left navigation pane. On Inputs, + Add reference input is selected, revealing a drop-down list that shows the values Blob storage and SQL Database.](./media/sql-reference-data/stream-analytics-inputs.png)
 
-2. Fill out the Stream Analytics Input Configurations. Choose the database name, server name, username and password. If you want your reference data input to refresh periodically, choose “On” to specify the refresh rate in DD:HH:MM. If you have large data sets with a short refresh rate. Delta query enables you to track changes within your reference data by retreiving all of the rows in SQL Database that were inserted or deleted within a start time, @deltaStartTime, and an end time @deltaEndTime.
+2. Fill out the Stream Analytics Input Configurations. Choose the database name, server name, username and password. If you want your reference data input to refresh periodically, choose “On” to specify the refresh rate in DD:HH:MM. If you have large data sets with a short refresh rate. Delta query enables you to track changes within your reference data by retrieving all of the rows in SQL Database that were inserted or deleted within a start time, @deltaStartTime, and an end time @deltaEndTime.
 
 Please see [delta query](sql-reference-data.md#delta-query).
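The delta query mentioned in the corrected line above filters on the @deltaStartTime and @deltaEndTime parameters, which Stream Analytics supplies at each refresh. A hypothetical shape of such a query (the table and column names here are illustrative assumptions, not taken from the article):

```sql
-- Illustrative sketch only: dbo.DeviceThresholds and _timestamp are assumed names.
-- Stream Analytics substitutes @deltaStartTime and @deltaEndTime on each refresh,
-- so only rows changed within that window are retrieved.
SELECT DeviceId, Threshold
FROM dbo.DeviceThresholds
WHERE _timestamp BETWEEN @deltaStartTime AND @deltaEndTime
```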

articles/stream-analytics/start-job.md

Lines changed: 1 addition & 1 deletion
@@ -48,7 +48,7 @@ Start-AzStreamAnalyticsJob `
 
 There are three options for **OutputStartMode**: *JobStartTime*, *CustomTime*, and *LastOutputEventTime*. If this property is absent, the default is *JobStartTime*. See above for more information about these options.
 
-For more information on the `Start-AzStreamAnalyitcsJob` cmdlet, view the [Start-AzStreamAnalyticsJob reference](/powershell/module/az.streamanalytics/start-azstreamanalyticsjob).
+For more information on the `Start-AzStreamAnalyticsJob` cmdlet, view the [Start-AzStreamAnalyticsJob reference](/powershell/module/az.streamanalytics/start-azstreamanalyticsjob).
 
 ## Next steps
