
Commit 68c9890

Merge branch 'patch-1' of https://github.com/anboisve/azure-docs into public-87078
2 parents 3496dd7 + f7a8fc8

File tree

1 file changed (+20 -0 lines)


articles/stream-analytics/stream-analytics-real-time-fraud-detection.md

Lines changed: 20 additions & 0 deletions
@@ -169,6 +169,26 @@ The next step is to define an input source for the job to read data using the ev

![Configure Azure Stream Analytics input](media/stream-analytics-real-time-fraud-detection/configure-stream-analytics-input.png)

## Create a consumer group

It's recommended to use a distinct consumer group for each Stream Analytics job. If no consumer group is specified, the Stream Analytics job uses the $Default consumer group. When a job contains a self-join or has multiple inputs, some inputs might be read by more than one reader downstream, which affects the number of readers in a single consumer group.

Navigate to your Event Hubs instance to add a new **Consumer group**.
1. In the **Entities** section of the Event Hubs instance, select **Consumer groups**.

2. Select **+ Consumer group**.
3. Provide a name for your new consumer group.

    |**Setting**  |**Suggested value**  |
    |---------|---------|
    |Name*  | MyConsumerGroup  |

4. Select **Create**.

![Screenshot of creating a consumer group in the Event Hubs instance](https://user-images.githubusercontent.com/70035300/151443244-6b342150-13c0-49d3-abe1-24a6e611aff7.png)
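If you prefer to script this step instead of using the portal, the following is a minimal sketch that creates the consumer group with the `azure-mgmt-eventhub` Python management SDK. The subscription, resource group, namespace, and event hub names are placeholders you replace with your own values; only `MyConsumerGroup` comes from the table above.

```python
# Sketch: create a dedicated consumer group programmatically.
# All resource names below are placeholders; substitute your own.
from azure.identity import DefaultAzureCredential
from azure.mgmt.eventhub import EventHubManagementClient
from azure.mgmt.eventhub.models import ConsumerGroup

subscription_id = "<your-subscription-id>"        # placeholder
resource_group = "<your-resource-group>"          # placeholder
namespace_name = "<your-event-hubs-namespace>"    # placeholder
event_hub_name = "<your-event-hub>"               # placeholder
consumer_group_name = "MyConsumerGroup"           # suggested value from the table above

# Authenticate with the default credential chain (Azure CLI login, managed identity, etc.).
client = EventHubManagementClient(DefaultAzureCredential(), subscription_id)

# Create (or update) the consumer group on the event hub.
consumer_group = client.consumer_groups.create_or_update(
    resource_group,
    namespace_name,
    event_hub_name,
    consumer_group_name,
    ConsumerGroup(),  # no extra properties are needed for a basic consumer group
)
print(f"Created consumer group: {consumer_group.name}")
```

Using a group other than $Default keeps this job's readers separate from any other consumers of the event hub, as described above.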
## Configure job output

The last step is to define an output sink where the job can write the transformed data. In this tutorial, you output and visualize data with Power BI.
