
Commit b179904

Merge pull request #109391 from mamccrea/patch-41
Stream Analytics: Update stream-analytics-troubleshoot-output.md
2 parents d33f5c4 + 47bd6e2 · commit b179904

File tree: 1 file changed (+14 −21 lines)


articles/stream-analytics/stream-analytics-troubleshoot-output.md

Lines changed: 14 additions & 21 deletions
@@ -6,41 +6,33 @@ ms.author: sidram
ms.reviewer: mamccrea
ms.service: stream-analytics
ms.topic: conceptual
-ms.date: 12/07/2018
+ms.date: 03/31/2020
ms.custom: seodec18
---

# Troubleshoot Azure Stream Analytics outputs

-This page describes common issues with output connections and how to troubleshoot and address them.
+This article describes common issues with Azure Stream Analytics output connections, how to troubleshoot those issues, and how to correct them. Many troubleshooting steps require diagnostic logs to be enabled for your Stream Analytics job. If you do not have diagnostic logs enabled, see [Troubleshoot Azure Stream Analytics by using diagnostics logs](stream-analytics-job-diagnostic-logs.md).

## Output not produced by job
+
1. Verify connectivity to outputs by using the **Test Connection** button for each output.

2. Look at [**Monitoring Metrics**](stream-analytics-monitoring.md) on the **Monitor** tab. Because the values are aggregated, the metrics are delayed by a few minutes.
-   - If Input Events > 0, the job is able to read input data. If Input Events is not > 0, then:
-     - To see whether the data source has valid data, check it by using [Service Bus Explorer](https://code.msdn.microsoft.com/windowsapps/Service-Bus-Explorer-f2abca5a). This check applies if the job is using Event Hub as input.
-     - Check to see whether the data serialization format and data encoding are as expected.
-     - If the job is using an Event Hub, check to see whether the body of the message is *Null*.
-
-   - If Data Conversion Errors > 0 and climbing, the following might be true:
-     - The output event does not conform to the schema of the target sink.
-     - The event schema might not match the defined or expected schema of the events in the query.
-     - The datatypes of some of the fields in the event might not match expectations.
-
-   - If Runtime Errors > 0, it means that the job can receive the data but is generating errors while processing the query.
-     - To find the errors, go to the [Audit Logs](../azure-resource-manager/management/view-activity-logs.md) and filter on *Failed* status.
-
-   - If InputEvents > 0 and OutputEvents = 0, it means that one of the following is true:
-     - Query processing resulted in zero output events.
-     - Events or its fields might be malformed, resulting in zero output after query processing.
-     - The job was unable to push data to the output sink for connectivity or authentication reasons.
+   * If Input Events are greater than 0, the job is able to read input data. If Input Events are not greater than 0, then there is an issue with the job's input. See [Troubleshoot input connections](stream-analytics-troubleshoot-input.md) to learn how to troubleshoot input connection issues.
+   * If Data Conversion Errors are greater than 0 and climbing, see [Azure Stream Analytics data errors](data-errors.md) for detailed information about data conversion errors.
+   * If Runtime Errors are greater than 0, your job can receive data, but it's generating errors while processing the query. To find the errors, go to the [Audit Logs](../azure-resource-manager/management/view-activity-logs.md) and filter on *Failed* status.
+   * If InputEvents is greater than 0 and OutputEvents equals 0, one of the following is true. For the first case, see the sketch at the end of this section.
+     * Query processing resulted in zero output events.
+     * Events or fields might be malformed, resulting in zero output after query processing.
+     * The job was unable to push data to the output sink for connectivity or authentication reasons.

-In all the previously mentioned error cases, operations log messages explain additional details (including what is happening), except in cases where the query logic filtered out all events. If the processing of multiple events generates errors, Stream Analytics logs the first three error messages of the same type within 10 minutes to Operations logs. It then suppresses additional identical errors with a message that reads "Errors are happening too rapidly, these are being suppressed."
+In all the previously mentioned error cases, operations log messages explain additional details (including what is happening), except in cases where the query logic filtered out all events. If the processing of multiple events generates errors, the errors are aggregated every 10 minutes.
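
As a minimal sketch of the zero-output case (the input, output, and `Temperature` field here are hypothetical), a query with a filter that no incoming event satisfies leaves InputEvents climbing while OutputEvents stays at 0:

```sql
-- Hypothetical names: [input], [output], and Temperature are placeholders.
SELECT
    *
INTO
    [output]
FROM
    [input]
WHERE
    Temperature > 1000 -- if no event ever exceeds 1000, the query emits no output events
```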

## Job output is delayed

### First output is delayed
+
When a Stream Analytics job is started, the input events are read, but there can be a delay in the output being produced in certain circumstances.

Large time values in temporal query elements can contribute to the output delay. To produce correct output over the large time windows, the streaming job starts up by reading data from the latest time possible (up to seven days ago) to fill the time window. During that time, no output is produced until the catch-up read of the outstanding input events is complete. This problem can surface when the system upgrades the streaming jobs, thus restarting the job. Such upgrades generally occur once every couple of months.
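
As a hedged illustration (the input, output, and field names are hypothetical), a query over a large tumbling window like the following forces the job, on start or on a system-upgrade restart, to first catch up on a full day of past input before it can emit its first result:

```sql
-- Hypothetical names; the 24-hour window means no output until the
-- catch-up read of the outstanding input events completes.
SELECT
    DeviceId,
    AVG(Temperature) AS AvgTemperature
INTO
    [output]
FROM
    [input] TIMESTAMP BY EventTime
GROUP BY
    DeviceId,
    TumblingWindow(hour, 24)
```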
@@ -64,6 +56,7 @@ These factors impact the timeliness of the first output that is generated:
- For analytic functions, the output is generated for every event; there is no delay.
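
As a hedged sketch of the analytic-function case (all names are hypothetical), LAG emits one result per input event instead of waiting for a window to close:

```sql
-- Hypothetical names; LAG produces output for every event, so the first
-- output is not held back by a window length.
SELECT
    DeviceId,
    Temperature,
    LAG(Temperature) OVER (PARTITION BY DeviceId LIMIT DURATION(minute, 5)) AS PreviousTemperature
INTO
    [output]
FROM
    [input] TIMESTAMP BY EventTime
```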

### Output falls behind
+
During normal operation of the job, if you find the job’s output is falling behind (longer and longer latency), you can pinpoint the root causes by examining these factors:
- Whether the downstream sink is throttled
- Whether the upstream source is throttled
@@ -83,12 +76,12 @@ Note the following observations when configuring IGNORE_DUP_KEY for several type

* You cannot set IGNORE_DUP_KEY on a primary key or a unique constraint that uses ALTER INDEX; you need to drop and re-create the index.
* You can set the IGNORE_DUP_KEY option using ALTER INDEX for a unique index, which is different from a PRIMARY KEY/UNIQUE constraint and is created using CREATE INDEX or an INDEX definition.
+
* IGNORE_DUP_KEY doesn’t apply to column store indexes because you can’t enforce uniqueness on such indexes.
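
A hedged T-SQL sketch of the distinction above (the table, column, and index names are hypothetical): IGNORE_DUP_KEY can be toggled with ALTER INDEX on a unique index created with CREATE INDEX, while an index that backs a PRIMARY KEY or UNIQUE constraint has to be dropped and re-created:

```sql
-- Hypothetical table and index names.
CREATE TABLE dbo.Events (EventId INT NOT NULL, Payload NVARCHAR(MAX));

-- A unique index created with CREATE INDEX: IGNORE_DUP_KEY can be set later.
CREATE UNIQUE INDEX IX_Events_EventId ON dbo.Events (EventId);
ALTER INDEX IX_Events_EventId ON dbo.Events
    REBUILD WITH (IGNORE_DUP_KEY = ON);

-- An index backing a PRIMARY KEY or UNIQUE constraint cannot have
-- IGNORE_DUP_KEY changed by ALTER INDEX; drop and re-create it instead.
```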

## Column names are lower-cased by Azure Stream Analytics
When using the original compatibility level (1.0), Azure Stream Analytics used to change column names to lowercase. This behavior was fixed in later compatibility levels. To preserve the case, we advise customers to move to compatibility level 1.1 or later. You can find more information in [Compatibility level for Azure Stream Analytics jobs](https://docs.microsoft.com/azure/stream-analytics/stream-analytics-compatibility-level).
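
A minimal sketch, assuming hypothetical input and column names: under compatibility level 1.0 a query like the following emits `deviceid` and `temperature`, while under 1.1 and later the output preserves `DeviceId` and `Temperature`:

```sql
-- Hypothetical names; output column casing depends on the compatibility level.
SELECT
    DeviceId,
    Temperature
INTO
    [output]
FROM
    [input]
```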

-
## Get help

For further assistance, try our [Azure Stream Analytics forum](https://social.msdn.microsoft.com/Forums/azure/home?forum=AzureStreamAnalytics).
