Commit b8ae472

Update stream-analytics-troubleshoot-output.md

1 parent 56d4c47 commit b8ae472
1 file changed: +8 −11 lines changed

articles/stream-analytics/stream-analytics-troubleshoot-output.md

Lines changed: 8 additions & 11 deletions
@@ -6,27 +6,22 @@ ms.author: sidram
 ms.reviewer: mamccrea
 ms.service: stream-analytics
 ms.topic: conceptual
-ms.date: 12/07/2018
+ms.date: 03/27/2020
 ms.custom: seodec18
 ---

 # Troubleshoot Azure Stream Analytics outputs

-This page describes common issues with output connections and how to troubleshoot and address them.
+This page describes common issues with output connections and how to troubleshoot and address them. Enable diagnostic logs as a best practice.

 ## Output not produced by job
 1. Verify connectivity to outputs by using the **Test Connection** button for each output.

 2. Look at [**Monitoring Metrics**](stream-analytics-monitoring.md) on the **Monitor** tab. Because the values are aggregated, the metrics are delayed by a few minutes.
-   - If Input Events > 0, the job is able to read input data. If Input Events is not > 0, then:
-     - To see whether the data source has valid data, check it by using [Service Bus Explorer](https://code.msdn.microsoft.com/windowsapps/Service-Bus-Explorer-f2abca5a). This check applies if the job is using Event Hub as input.
-     - Check to see whether the data serialization format and data encoding are as expected.
-     - If the job is using an Event Hub, check to see whether the body of the message is *Null*.
-
-   - If Data Conversion Errors > 0 and climbing, the following might be true:
-     - The output event does not conform to the schema of the target sink.
-     - The event schema might not match the defined or expected schema of the events in the query.
-     - The datatypes of some of the fields in the event might not match expectations.
+   * If Input Events > 0, the job is able to read input data. If Input Events is not > 0, then there is an issue with the job input. Review the [input troubleshooting](stream-analytics-troubleshoot-input.md) page for more information.
+
+   * If Data Conversion Errors > 0 and climbing, review the [data errors]() page for more information.

 - If Runtime Errors > 0, it means that the job can receive the data but is generating errors while processing the query.
   - To find the errors, go to the [Audit Logs](../azure-resource-manager/management/view-activity-logs.md) and filter on *Failed* status.
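The metrics and audit-log checks above can also be run from the command line. A minimal sketch, assuming the Azure CLI is installed and using the Stream Analytics metric names `InputEvents` and `ConversionErrors`; the resource IDs and names are hypothetical placeholders:

```azurecli
# Hypothetical resource ID -- substitute your own subscription, resource group, and job name.
JOB_ID="/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.StreamAnalytics/streamingjobs/<job-name>"

# Check whether the job is reading input and hitting conversion errors.
az monitor metrics list --resource "$JOB_ID" \
  --metric "InputEvents" "ConversionErrors" \
  --interval PT5M

# Mirror the "filter on Failed status" step against the activity log.
az monitor activity-log list --resource-group "<resource-group>" --status Failed
```

Remember that, as noted above, metric values are aggregated and lag by a few minutes, so a freshly started job may briefly report zero Input Events.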
@@ -36,11 +31,12 @@ This page describes common issues with output connections and how to troubleshoo
 - Events or their fields might be malformed, resulting in zero output after query processing.
 - The job was unable to push data to the output sink for connectivity or authentication reasons.

-- In all the previously mentioned error cases, operations log messages explain additional details (including what is happening), except in cases where the query logic filtered out all events. If the processing of multiple events generates errors, Stream Analytics logs the first three error messages of the same type within 10 minutes to Operations logs. It then suppresses additional identical errors with a message that reads "Errors are happening too rapidly, these are being suppressed."
+- In all the previously mentioned error cases, operations log messages explain additional details (including what is happening), except in cases where the query logic filtered out all events. If the processing of multiple events generates errors, the errors are aggregated every 10 minutes.

 ## Job output is delayed

 ### First output is delayed
+
 When a Stream Analytics job is started, the input events are read, but there can be a delay in the output being produced in certain circumstances.

 Large time values in temporal query elements can contribute to the output delay. To produce correct output over the large time windows, the streaming job starts up by reading data from the latest time possible (up to seven days ago) to fill the time window. During that time, no output is produced until the catch-up read of the outstanding input events is complete. This problem can surface when the system upgrades the streaming jobs, thus restarting the job. Such upgrades generally occur once every couple of months.
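The output start point chosen when starting the job controls how much of this catch-up read happens. A hedged sketch using the `stream-analytics` CLI extension (the exact parameter names are assumptions based on that extension and may vary by version; resource names are placeholders):

```azurecli
# Assumes: az extension add --name stream-analytics

# Start from the job start time: no historical catch-up, so the first
# output appears sooner, but large windows begin only partially filled.
az stream-analytics job start \
  --resource-group "<resource-group>" \
  --job-name "<job-name>" \
  --output-start-mode JobStartTime

# Start from a custom time in the past: the job first reads the backlog
# (up to seven days) and produces no output until catch-up completes.
az stream-analytics job start \
  --resource-group "<resource-group>" \
  --job-name "<job-name>" \
  --output-start-mode CustomTime \
  --output-start-time "2020-03-20T00:00:00Z"
```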
@@ -64,6 +60,7 @@ These factors impact the timeliness of the first output that is generated:
 - For analytic functions, the output is generated for every event; there is no delay.

 ### Output falls behind
+
 During normal operation of the job, if you find the job’s output is falling behind (longer and longer latency), you can pinpoint the root causes by examining these factors:
 - Whether the downstream sink is throttled
 - Whether the upstream source is throttled
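Two job metrics help distinguish these cases. A minimal sketch, assuming the Azure CLI and the Stream Analytics metric names `InputEventsSourcesBacklogged` and `ResourceUtilization`; the resource ID is a placeholder:

```azurecli
# Hypothetical resource ID -- substitute your own.
JOB_ID="/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.StreamAnalytics/streamingjobs/<job-name>"

# A growing backlog of unread input events suggests the job cannot keep up;
# SU utilization near 100% points at the job itself rather than a throttled
# sink or source.
az monitor metrics list --resource "$JOB_ID" \
  --metric "InputEventsSourcesBacklogged" "ResourceUtilization" \
  --interval PT5M
```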

0 commit comments