ms.author: sidram
ms.reviewer: mamccrea
ms.service: stream-analytics
ms.topic: conceptual
ms.date: 03/31/2020
ms.custom: seodec18
---
# Troubleshoot Azure Stream Analytics outputs
This article describes common issues with Azure Stream Analytics output connections, how to troubleshoot output issues, and how to correct the issues. Many troubleshooting steps require diagnostic logs to be enabled for your Stream Analytics job. If you do not have diagnostic logs enabled, see [Troubleshoot Azure Stream Analytics by using diagnostics logs](stream-analytics-job-diagnostic-logs.md).
## Output not produced by job
1. Verify connectivity to outputs by using the **Test Connection** button for each output.
2. Look at [**Monitoring Metrics**](stream-analytics-monitoring.md) on the **Monitor** tab. Because the values are aggregated, the metrics are delayed by a few minutes.
   * If Input Events is greater than 0, the job is able to read input data. If Input Events is not greater than 0, there is an issue with the job's input. See [Troubleshoot input connections](stream-analytics-troubleshoot-input.md) to learn how to troubleshoot input connection issues.

   * If Data Conversion Errors is greater than 0 and climbing, see [Azure Stream Analytics data errors](data-errors.md) for detailed information about data conversion errors.

   * If Runtime Errors is greater than 0, your job can receive data but it's generating errors while processing the query. To find the errors, go to the [Audit Logs](../azure-resource-manager/management/view-activity-logs.md) and filter on *Failed* status.

   * If Input Events is greater than 0 and Output Events equals 0, one of the following is true:

      * Query processing resulted in zero output events, for example because the query filtered them all out (see the sketch after this list).

      * Events or their fields might be malformed, resulting in zero output after query processing.

      * The job was unable to push data to the output sink for connectivity or authentication reasons.
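
As an illustrative sketch (the input, output, and field names here are hypothetical), the following query reads events successfully but produces zero output events whenever no event matches the `WHERE` predicate, such as when a compared value or its casing doesn't match what the source actually sends:

```sql
-- Hypothetical names. If no incoming event has status = 'Error',
-- Input Events keeps climbing while Output Events stays at 0.
SELECT
    deviceId,
    status,
    eventTime
INTO
    [your-output]
FROM
    [your-input] TIMESTAMP BY eventTime
WHERE
    status = 'Error'
```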
In all the previously mentioned error cases, operations log messages provide additional details about what is happening, except in cases where the query logic filtered out all events. If the processing of multiple events generates errors, the errors are aggregated every 10 minutes.
## Job output is delayed
### First output is delayed
When a Stream Analytics job starts, the input events are read, but in certain circumstances there can be a delay before output is produced.
Large time values in temporal query elements can contribute to the output delay. To produce correct output over large time windows, the streaming job starts by reading data from the latest time possible (up to seven days ago) to fill the time window. During that time, no output is produced until the catch-up read of the outstanding input events is complete. This problem can surface when the system upgrades the streaming jobs, which restarts the job. Such upgrades generally occur once every couple of months.
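
As an illustrative sketch (input, output, and field names are hypothetical), a query that aggregates over a seven-day tumbling window forces a newly started or restarted job to first catch up on as much as seven days of input before it can emit its first result:

```sql
-- Hypothetical names. The 7-day window means up to 7 days of input
-- must be re-read on (re)start before the first output appears.
SELECT
    deviceId,
    COUNT(*) AS eventCount
INTO
    [your-output]
FROM
    [your-input] TIMESTAMP BY eventTime
GROUP BY
    deviceId,
    TumblingWindow(day, 7)
```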
These factors impact the timeliness of the first output that is generated:

- For analytic functions, the output is generated for every event; there is no delay.
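
For instance (hypothetical names again), an analytic function such as `LAG` emits a result for each arriving event instead of waiting for a window to close:

```sql
-- Hypothetical names. LAG produces one output row per input event,
-- so the first output appears as soon as events arrive.
SELECT
    deviceId,
    reading,
    LAG(reading) OVER (PARTITION BY deviceId LIMIT DURATION(minute, 5)) AS previousReading
INTO
    [your-output]
FROM
    [your-input] TIMESTAMP BY eventTime
```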
### Output falls behind
During normal operation of the job, if you find that the job's output is falling behind (showing longer and longer latency), you can pinpoint the root cause by examining these factors:
- Whether the downstream sink is throttled
- Whether the upstream source is throttled
Note the following observations when configuring IGNORE_DUP_KEY for several types of indexes:
* You cannot set IGNORE_DUP_KEY on a primary key or a unique constraint by using ALTER INDEX; you need to drop and recreate the index.
* You can set the IGNORE_DUP_KEY option by using ALTER INDEX for a unique index, which is different from a PRIMARY KEY or UNIQUE constraint and is created by using CREATE INDEX or an INDEX definition.
* IGNORE_DUP_KEY doesn’t apply to columnstore indexes because you can’t enforce uniqueness on them.
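
A minimal T-SQL sketch of these rules, using hypothetical table and index names:

```sql
-- Hypothetical names. IGNORE_DUP_KEY can be set when a unique index is created ...
CREATE UNIQUE INDEX IX_Events_EventId
    ON dbo.Events (EventId)
    WITH (IGNORE_DUP_KEY = ON);

-- ... and changed later with ALTER INDEX ... REBUILD on that unique index.
ALTER INDEX IX_Events_EventId
    ON dbo.Events
    REBUILD WITH (IGNORE_DUP_KEY = ON);

-- For a PRIMARY KEY or UNIQUE constraint, set the option when the constraint
-- is defined; changing it afterward requires dropping and recreating it.
ALTER TABLE dbo.Events
    ADD CONSTRAINT PK_Events PRIMARY KEY (EventId)
    WITH (IGNORE_DUP_KEY = ON);
```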
## Column names are lower-cased by Azure Stream Analytics
When using the original compatibility level (1.0), Azure Stream Analytics changed column names to lower case. This behavior was fixed in later compatibility levels. To preserve the case, move to compatibility level 1.1 or later. For more information, see [Compatibility level for Azure Stream Analytics jobs](https://docs.microsoft.com/azure/stream-analytics/stream-analytics-compatibility-level).
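
As an illustration (hypothetical names), under compatibility level 1.0 a column projected as `DeviceId` in the query below arrives at the output sink as `deviceid`; under level 1.1 and later, the declared casing is preserved:

```sql
-- Hypothetical names. Compatibility level 1.0 writes this column as
-- 'deviceid'; level 1.1 and later preserve 'DeviceId' as declared.
SELECT
    deviceId AS DeviceId
INTO
    [your-output]
FROM
    [your-input]
```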
## Get help
For further assistance, try our [Azure Stream Analytics forum](https://social.msdn.microsoft.com/Forums/azure/home?forum=AzureStreamAnalytics).