articles/stream-analytics/includes/event-generator-app.md (1 addition, 2 deletions)
@@ -67,9 +67,8 @@ Before you start the TelcoGenerator app, you should configure it to send data to
3. Update the `<appSettings>` element in the config file with the following details:
* Set the value of the **EventHubName** key to the value of the **EntityPath** at the end of the connection string.
-    * Set the value of the **Microsoft.ServiceBus.ConnectionString** key to the connection string **without** the EntityPath value (`;EntityPath=myeventhub`) at the end. **Don't forget** to remove the semicolon that precedes the EntityPath value.
+    * Set the value of the **Microsoft.ServiceBus.ConnectionString** key to the connection string for the namespace. If you use a connection string for an event hub, not a namespace, remove the `EntityPath` value (`;EntityPath=myeventhub`) at the end. Don't forget to remove the semicolon that precedes the `EntityPath` value.
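For illustration only, the updated `<appSettings>` element might look like the following sketch; the event hub name and connection string are placeholder values, not ones from this tutorial:

```xml
<appSettings>
    <!-- Placeholder values for illustration; substitute your own event hub
         name and namespace-level connection string. -->
    <add key="EventHubName" value="myeventhub" />
    <add key="Microsoft.ServiceBus.ConnectionString"
         value="Endpoint=sb://mynamespace.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=<key>" />
</appSettings>
```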
4. Save the file.
5. Next, open a command window and change to the folder where you unzipped the TelcoGenerator application. Then enter the following command:
articles/stream-analytics/no-code-power-bi-tutorial.md (1 addition, 1 deletion)
@@ -128,7 +128,7 @@ Before you start, make sure you've completed the following steps:
:::image type="content" source="./media/stream-analytics-no-code/synapse-settings.png" alt-text="Screenshot of the Synapse tile settings." lightbox="./media/stream-analytics-no-code/synapse-settings.png":::
1. Select the **Synapse** tile and see the **Data preview** tab at the bottom of the page. You see the data flowing into the dedicated SQL pool.
-    :::image type="content" source="./media/stream-analytics-no-code/synapse-data-preview.png" alt-text="Screenshot of the Synapse tile settings." lightbox="./media/stream-analytics-no-code/synapse-data-preview.png":::
+    :::image type="content" source="./media/stream-analytics-no-code/synapse-data-preview.png" alt-text="Screenshot that shows Data Preview for the Synapse tile." lightbox="./media/stream-analytics-no-code/synapse-data-preview.png":::
132
132
1. Select **Save** in the top ribbon to save your job, and then select **Start**.
133
133
:::image type="content" source="./media/stream-analytics-no-code/start-job-button.png" alt-text="Screenshot that shows the Start button selected on the command bar." lightbox="./media/stream-analytics-no-code/start-job-button.png":::
134
134
1. On the **Start Stream Analytics Job** page, select **Start** to run your job.
articles/stream-analytics/stream-analytics-with-azure-functions.md (12 additions, 15 deletions)
@@ -6,37 +6,34 @@ ms.author: ebnkruma
ms.service: stream-analytics
ms.topic: tutorial
ms.custom: "mvc, devx-track-csharp"
-ms.date: 02/27/2023
+ms.date: 03/29/2024
#Customer intent: As an IT admin/developer I want to run Azure Functions with Stream Analytics jobs.
---
# Tutorial: Run Azure Functions from Azure Stream Analytics jobs
+In this tutorial, you create an Azure Stream Analytics job that reads events from Azure Event Hubs, runs a query on the event data, and then invokes an Azure function, which writes to an Azure Cache for Redis instance.
-You can run Azure Functions from Azure Stream Analytics by configuring Functions as one of the sinks (outputs) to the Stream Analytics job. Functions are an event-driven, compute-on-demand experience that lets you implement code that is triggered by events occurring in Azure or third-party services. This ability of Functions to respond to triggers makes it a natural output to Stream Analytics jobs.
-
-Stream Analytics invokes Functions through HTTP triggers. The Functions output adapter allows users to connect Functions to Stream Analytics, such that the events can be triggered based on Stream Analytics queries.
+:::image type="content" source="./media/stream-analytics-with-azure-functions/image1.png" alt-text="Screenshot that shows the relationship between Azure services in the solution.":::
> [!NOTE]
-> Connection to Azure Functions inside a virtual network (VNet) from a Stream Analytics job that is running in a multi-tenant cluster is not supported.
+> - You can run Azure Functions from Azure Stream Analytics by configuring Functions as one of the sinks (outputs) to the Stream Analytics job. Functions are an event-driven, compute-on-demand experience that lets you implement code that is triggered by events occurring in Azure or third-party services. This ability of Functions to respond to triggers makes it a natural output for Stream Analytics jobs.
+> - Stream Analytics invokes Functions through HTTP triggers. The Functions output adapter allows users to connect Functions to Stream Analytics, such that the events can be triggered based on Stream Analytics queries.
+> - Connection to Azure Functions inside a virtual network (VNet) from a Stream Analytics job that is running in a multi-tenant cluster is not supported.
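As a rough, library-free sketch (not the tutorial's actual function code), an HTTP-triggered function receives each Stream Analytics output batch as a single JSON array of events; the `handle_batch` name and the event field names below are assumptions based on the telco sample data, not part of this tutorial:

```python
import json

def handle_batch(request_body: str) -> int:
    """Hypothetical handler for one Stream Analytics output batch."""
    events = json.loads(request_body)   # Stream Analytics posts a JSON array of events
    for event in events:
        # The tutorial's function would write each event to Azure Cache for Redis;
        # this sketch only reads assumed fields from the telco sample data.
        _ = (event.get("CallingNum"), event.get("SwitchNum"))
    return 200   # a non-2xx response (or a timeout) causes Stream Analytics to retry the batch

print(handle_batch('[{"CallingNum": "4251234567", "SwitchNum": "US"}]'))  # → 200
```

The key design point is that the function acknowledges the whole batch with a single HTTP status code; there is no per-event acknowledgment.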
In this tutorial, you learn how to:
> [!div class="checklist"]
-> * Create and run a Stream Analytics job
+> * Create an Azure Event Hubs instance
> * Create an Azure Cache for Redis instance
> * Create an Azure Function
+> * Create a Stream Analytics job
+> * Configure an event hub as input and a function as output
+> * Run the Stream Analytics job
> * Check Azure Cache for Redis for results
If you don’t have an Azure subscription, create a [free account](https://azure.microsoft.com/free/?WT.mc_id=A261C142F) before you begin.
-## Configure a Stream Analytics job to run a function
-
-This section demonstrates how to configure a Stream Analytics job to run a function that writes data to Azure Cache for Redis. The Stream Analytics job reads events from Azure Event Hubs, and runs a query that invokes the function. This function reads data from the Stream Analytics job, and writes it to Azure Cache for Redis.
-
-
## Prerequisites
Before you start, make sure you've completed the following steps:
@@ -174,7 +171,7 @@ Before you start, make sure you've completed the following steps:
4. Open your Stream Analytics job, and update the query to the following.
> [!IMPORTANT]
-> If you didn't name your output sink **saop1**, remember to change it in the query.
+> The following sample script assumes that you used **CallStream** as the input name and **saop1** as the output name. If you used different names, don't forget to update the query.
```sql
SELECT
@@ -239,7 +236,7 @@ If a failure occurs while sending events to Azure Functions, Stream Analytics re
> [!NOTE]
> The timeout for HTTP requests from Stream Analytics to Azure Functions is set to 100 seconds. If your Azure Functions app takes more than 100 seconds to process a batch, Stream Analytics errors out and retries the batch.
-Retrying for timeouts may result in duplicate events written to the output sink. When Stream Analytics retries for a failed batch, it retries for all the events in the batch. For example, consider a batch of 20 events that are sent to Azure Functions from Stream Analytics. Assume that Azure Functions takes 100 seconds to process the first 10 events in that batch. After 100 seconds, Stream Analytics suspends the request since it hasn't received a positive response from Azure Functions, and another request is sent for the same batch. The first 10 events in the batch are processed again by Azure Functions, which causes a duplicate.
+Retrying for timeouts might result in duplicate events written to the output sink. When Stream Analytics retries a failed batch, it retries all the events in the batch. For example, consider a batch of 20 events that are sent to Azure Functions from Stream Analytics. Assume that Azure Functions takes 100 seconds to process the first 10 events in that batch. After 100 seconds, Stream Analytics suspends the request since it hasn't received a positive response from Azure Functions, and another request is sent for the same batch. The first 10 events in the batch are processed again by Azure Functions, which causes duplicates.
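To make the duplication concrete, here's a toy simulation (not Stream Analytics code; all names are hypothetical) of the 20-event scenario described above:

```python
def deliver(batch, processed, stall_after=None):
    """Pretend delivery that times out after handling `stall_after` events."""
    for i, event in enumerate(batch):
        if stall_after is not None and i >= stall_after:
            return False          # no positive response, so the sender will retry
        processed.append(event)   # these events are already written downstream
    return True

batch = list(range(20))           # a batch of 20 events
processed = []
if not deliver(batch, processed, stall_after=10):  # first attempt stalls after 10 events
    deliver(batch, processed)                      # the retry re-sends the whole batch

duplicates = len(processed) - len(set(processed))
print(duplicates)  # → 10: the first 10 events were processed twice
```

Because acknowledgment is per batch, not per event, any consumer downstream of the function output needs to tolerate (or deduplicate) repeated events.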