Commit bcab0a4 — "Freshness update" (1 parent f3e02b0)

3 files changed: +14 additions, -18 deletions

articles/stream-analytics/includes/event-generator-app.md

Lines changed: 1 addition & 2 deletions

@@ -67,9 +67,8 @@ Before you start the TelcoGenerator app, you should configure it to send data to
 3. Update the `<appSettings>` element in the config file with the following details:

    * Set the value of the **EventHubName** key to the value of the **EntityPath** at the end of the connection string.
-   * Set the value of the **Microsoft.ServiceBus.ConnectionString** key to the connection string **without** the EntityPath value (`;EntityPath=myeventhub`) at the end. **Don't forget** to remove the semicolon that precedes the EntityPath value.
+   * Set the value of the **Microsoft.ServiceBus.ConnectionString** key to the connection string to the namespace. If you use a connection string to an event hub, not a namespace, remove the `EntityPath` value (`;EntityPath=myeventhub`) at the end. **Don't forget** to remove the semicolon that precedes the `EntityPath` value.
 4. Save the file.
-
 5. Next, open a command window and change to the folder where you unzipped the TelcoGenerator application. Then enter the following command:

    ```cmd

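The namespace-versus-event-hub connection string distinction in the updated bullet can be sketched in code. This is a minimal illustration only, not part of the TelcoGenerator app; the helper name and sample values are hypothetical:

```python
# Hypothetical helper: derive the namespace-level connection string and the
# event hub name from an event-hub-scoped connection string, as the updated
# step 3 describes. The semicolon-separated ";EntityPath=..." segment is
# dropped (including its leading semicolon) and returned separately.
def split_connection_string(conn):
    """Return (namespace_connection_string, event_hub_name_or_None)."""
    hub = None
    kept = []
    for part in (p for p in conn.split(";") if p):
        key, _, value = part.partition("=")
        if key == "EntityPath":
            hub = value  # e.g. "myeventhub"
        else:
            kept.append(part)
    return ";".join(kept), hub

# Sample values below are illustrative, not real credentials.
ns, hub = split_connection_string(
    "Endpoint=sb://myns.servicebus.windows.net/;"
    "SharedAccessKeyName=RootManageSharedAccessKey;"
    "SharedAccessKey=abc123=;EntityPath=myeventhub"
)
```

If the input is already a namespace-level connection string (no `EntityPath` segment), the helper returns it unchanged with `None` for the hub name.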
articles/stream-analytics/no-code-power-bi-tutorial.md

Lines changed: 1 addition & 1 deletion

@@ -128,7 +128,7 @@ Before you start, make sure you've completed the following steps:
    :::image type="content" source="./media/stream-analytics-no-code/synapse-settings.png" alt-text="Screenshot of the Synapse tile settings." lightbox="./media/stream-analytics-no-code/synapse-settings.png":::
 1. Select the **Synapse** tile and see the **Data preview** tab at the bottom of the page. You see the data flowing into the dedicated SQL pool.

-   :::image type="content" source="./media/stream-analytics-no-code/synapse-data-preview.png" alt-text="Screenshot of the Synapse tile settings." lightbox="./media/stream-analytics-no-code/synapse-data-preview.png":::
+   :::image type="content" source="./media/stream-analytics-no-code/synapse-data-preview.png" alt-text="Screenshot that shows Data Preview for the Synapse tile." lightbox="./media/stream-analytics-no-code/synapse-data-preview.png":::
 1. Select **Save** in the top ribbon to save your job, and then select **Start**.
    :::image type="content" source="./media/stream-analytics-no-code/start-job-button.png" alt-text="Screenshot that shows the Start button selected on the command bar." lightbox="./media/stream-analytics-no-code/start-job-button.png":::
 1. On the **Start Stream Analytics Job** page, select **Start** to run your job.

articles/stream-analytics/stream-analytics-with-azure-functions.md

Lines changed: 12 additions & 15 deletions

@@ -6,37 +6,34 @@ ms.author: ebnkruma
 ms.service: stream-analytics
 ms.topic: tutorial
 ms.custom: "mvc, devx-track-csharp"
-ms.date: 02/27/2023
+ms.date: 03/29/2024

 #Customer intent: As an IT admin/developer I want to run Azure Functions with Stream Analytics jobs.
 ---

 # Tutorial: Run Azure Functions from Azure Stream Analytics jobs
+In this tutorial, you create an Azure Stream Analytics job that reads events from Azure Event Hubs, runs a query on the event data, and then invokes an Azure function, which writes to an Azure Cache for Redis instance.

-You can run Azure Functions from Azure Stream Analytics by configuring Functions as one of the sinks (outputs) to the Stream Analytics job. Functions are an event-driven, compute-on-demand experience that lets you implement code that is triggered by events occurring in Azure or third-party services. This ability of Functions to respond to triggers makes it a natural output to Stream Analytics jobs.
-
-Stream Analytics invokes Functions through HTTP triggers. The Functions output adapter allows users to connect Functions to Stream Analytics, such that the events can be triggered based on Stream Analytics queries.
+:::image type="content" source="./media/stream-analytics-with-azure-functions/image1.png" alt-text="Screenshot that shows the relationship between Azure services in the solution.":::

 > [!NOTE]
-> Connection to Azure Functions inside a virtual network (VNet) from an Stream Analytics job that is running in a multi-tenant cluster is not supported.
+> - You can run Azure Functions from Azure Stream Analytics by configuring Functions as one of the sinks (outputs) to the Stream Analytics job. Functions are an event-driven, compute-on-demand experience that lets you implement code that is triggered by events occurring in Azure or third-party services. This ability of Functions to respond to triggers makes it a natural output to Stream Analytics jobs.
+> - Stream Analytics invokes Functions through HTTP triggers. The Functions output adapter allows users to connect Functions to Stream Analytics, such that the events can be triggered based on Stream Analytics queries.
+> - Connection to Azure Functions inside a virtual network (VNet) from a Stream Analytics job that is running in a multi-tenant cluster is not supported.

 In this tutorial, you learn how to:

 > [!div class="checklist"]
-> * Create and run a Stream Analytics job
+> * Create an Azure Event Hubs instance
 > * Create an Azure Cache for Redis instance
 > * Create an Azure Function
+> * Create a Stream Analytics job
+> * Configure the event hub as input and the function as output
+> * Run the Stream Analytics job
 > * Check Azure Cache for Redis for results

 If you don’t have an Azure subscription, create a [free account](https://azure.microsoft.com/free/?WT.mc_id=A261C142F) before you begin.

-## Configure a Stream Analytics job to run a function
-
-This section demonstrates how to configure a Stream Analytics job to run a function that writes data to Azure Cache for Redis. The Stream Analytics job reads events from Azure Event Hubs, and runs a query that invokes the function. This function reads data from the Stream Analytics job, and writes it to Azure Cache for Redis.
-
-![Diagram showing relationships among the Azure services](./media/stream-analytics-with-azure-functions/image1.png)
-
 ## Prerequisites

 Before you start, make sure you've completed the following steps:
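The note in the hunk above says Stream Analytics invokes Functions through HTTP triggers, delivering query output over HTTP POST. The following stdlib sketch illustrates that delivery shape with a local handler standing in for the Azure Function; the endpoint path and payload fields are hypothetical, not the actual contract of the Stream Analytics output adapter:

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import Request, urlopen

received = []  # what the stand-in "function" saw

class FunctionStub(BaseHTTPRequestHandler):
    """Local stand-in for an HTTP-triggered function receiving a JSON batch."""
    def do_POST(self):
        length = int(self.headers["Content-Length"])
        received.extend(json.loads(self.rfile.read(length)))
        self.send_response(200)  # positive response: batch acknowledged
        self.end_headers()

    def log_message(self, *args):  # silence per-request logging
        pass

# Serve on an ephemeral local port in a background thread.
server = HTTPServer(("127.0.0.1", 0), FunctionStub)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Hypothetical batched payload; field names are illustrative only.
batch = [{"CallingNum": "123"}, {"CallingNum": "456"}]
req = Request(
    f"http://127.0.0.1:{server.server_port}/api/stub",
    data=json.dumps(batch).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urlopen(req) as resp:
    status = resp.status
server.shutdown()
```

The key point the sketch mirrors is that a whole batch travels in one request and is acknowledged (or not) as a unit, which is what makes the retry behavior discussed later batch-granular.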
@@ -174,7 +171,7 @@ Before you start, make sure you've completed the following steps:
 4. Open your Stream Analytics job, and update the query to the following.

    > [!IMPORTANT]
-   > If you didn't name your output sink **saop1**, remember to change it in the query.
+   > The following sample script assumes that you used **CallStream** for the input name and **saop1** for the output name. If you used different names, don't forget to update the query.

    ```sql
    SELECT
@@ -239,7 +236,7 @@ If a failure occurs while sending events to Azure Functions, Stream Analytics re
 > [!NOTE]
 > The timeout for HTTP requests from Stream Analytics to Azure Functions is set to 100 seconds. If your Azure Functions app takes more than 100 seconds to process a batch, Stream Analytics errors out and will retry the batch.

-Retrying for timeouts may result in duplicate events written to the output sink. When Stream Analytics retries for a failed batch, it retries for all the events in the batch. For example, consider a batch of 20 events that are sent to Azure Functions from Stream Analytics. Assume that Azure Functions takes 100 seconds to process the first 10 events in that batch. After 100 seconds, Stream Analytics suspends the request since it hasn't received a positive response from Azure Functions, and another request is sent for the same batch. The first 10 events in the batch are processed again by Azure Functions, which causes a duplicate.
+Retrying for timeouts might result in duplicate events written to the output sink. When Stream Analytics retries a failed batch, it retries all the events in the batch. For example, consider a batch of 20 events that are sent to Azure Functions from Stream Analytics. Assume that Azure Functions takes 100 seconds to process the first 10 events in that batch. After 100 seconds, Stream Analytics suspends the request because it hasn't received a positive response from Azure Functions, and another request is sent for the same batch. The first 10 events in the batch are processed again by Azure Functions, which causes duplicates.
 ## Known issues
