Commit 11dfe60

committed · almost final
1 parent 8d473f8 · commit 11dfe60

3 files changed (+6, -3 lines)


articles/stream-analytics/includes/event-generator-app.md

Lines changed: 1 addition & 1 deletion
@@ -134,7 +134,7 @@ The next step is to define an input source for the job to read data using the ev
 1. For **Subscription**, select the Azure subscription where you created the event hub. The event hub can be in same or a different subscription as the Stream Analytics job.
 1. For **Event Hubs namespace**, select the Event Hubs namespace you created in the previous section. All the namespaces available in your current subscription are listed in the dropdown.
 1. For **Event hub name**, select the event hub you created in the previous section. All the event hubs available in the selected namespace are listed in the dropdown.
-1. For **Event hub consumer group**, keep the **Create new** option selected so taht a new consumer group is created on the event hub. We recommend that you use a distinct consumer group for each Stream Analytics job. If no consumer group is specified, the Stream Analytics job uses the `$Default` consumer group. When a job contains a self-join or has multiple inputs, some inputs later might be read by more than one reader. This situation affects the number of readers in a single consumer group.
+1. For **Event hub consumer group**, keep the **Create new** option selected so that a new consumer group is created on the event hub. We recommend that you use a distinct consumer group for each Stream Analytics job. If no consumer group is specified, the Stream Analytics job uses the `$Default` consumer group. When a job contains a self-join or has multiple inputs, some inputs later might be read by more than one reader. This situation affects the number of readers in a single consumer group.
 1. For **Authentication mode**, select **Connection string**. It's easier to test the tutorial with this option.
 1. For **Event hub policy name**, select **Use existing**, and then select the policy you created earlier.
 1. Select **Save** at the bottom of the page.
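As background for the consumer-group recommendation in the changed step: Event Hubs allows only a small number of concurrent readers per partition within a single consumer group (5, per the Azure quotas documentation), which is why each Stream Analytics job should get its own. A toy sketch of that check, not part of this commit (the constant and function names are illustrative):

```python
# Event Hubs limit: at most 5 concurrent readers per partition within a
# single consumer group (Azure-documented quota; constant name is ours).
MAX_READERS_PER_CONSUMER_GROUP = 5

def exceeds_reader_limit(planned_readers: int) -> bool:
    """True when the readers planned against one partition would exceed the
    per-consumer-group limit, i.e. the job should use a distinct group."""
    return planned_readers > MAX_READERS_PER_CONSUMER_GROUP

# A job with a self-join or multiple inputs may open several readers:
print(exceeds_reader_limit(2))  # False
print(exceeds_reader_limit(6))  # True
```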

articles/stream-analytics/stream-analytics-with-azure-functions.md

Lines changed: 5 additions & 2 deletions
Original file line numberDiff line numberDiff line change
@@ -52,7 +52,7 @@ Before you start, make sure you've completed the following steps:
5252

5353
2. After you create the cache, under **Settings**, select **Access Keys**. Make a note of the **Primary connection string**.
5454

55-
![Screenshot of Azure Cache for Redis connection string](./media/stream-analytics-with-azure-functions/image2.png)
55+
:::image type="content" source="./media/stream-analytics-with-azure-functions/redis-cache-connection-string.png" alt-text="Screenshot showing the selection of the Access Key menu item.":::
5656

5757
## Create a function in Azure Functions that can write data to Azure Cache for Redis
5858

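The **Primary connection string** noted in the step above is a comma-separated list of the form `host:port,key1=value1,...` rather than a URL. A small, illustrative parser, not part of the tutorial (the function name and sample value are made up):

```python
def parse_redis_connection_string(conn: str) -> dict:
    """Split an Azure Cache for Redis connection string into the pieces a
    Redis client needs: host, port, and the key=value options."""
    endpoint, *options = conn.split(",")
    host, _, port = endpoint.partition(":")
    parsed = {"host": host, "port": int(port)}
    for opt in options:
        # partition on the first '=' so values containing '=' stay intact
        key, _, value = opt.partition("=")
        parsed[key.lower()] = value
    return parsed

# Sample value is fabricated for illustration only.
sample = "mycache.redis.cache.windows.net:6380,password=secret,ssl=True,abortConnect=False"
print(parse_redis_connection_string(sample)["port"])  # 6380
```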
@@ -171,7 +171,10 @@ Before you start, make sure you've completed the following steps:
 
 3. Provide a name for the output alias. In this tutorial, it's named **saop1**, but you can use any name of your choice. Fill in other details.
 
-4. Open your Stream Analytics job, and update the query to the following. If you didn't name your output sink **saop1**, remember to change it in the query.
+4. Open your Stream Analytics job, and update the query to the following.
+
+> [!IMPORTANT]
+> If you didn't name your output sink **saop1**, remember to change it in the query.
 
 ```sql
 SELECT
