
Commit 284e308

Merge pull request #121304 from an-emma/patch-10

Update stream-analytics-define-inputs.md

2 parents: b719a29 + 946652b

File tree: 1 file changed (+1, −3 lines)

articles/stream-analytics/stream-analytics-define-inputs.md

Lines changed: 1 addition & 3 deletions
@@ -136,8 +136,6 @@ If an Azure Stream Analytics job is started using *Now* at 13:00, and a blob is
 
 To process the data as a stream using a timestamp in the event payload, you must use the [TIMESTAMP BY](/stream-analytics-query/stream-analytics-query-language-reference) keyword. A Stream Analytics job pulls data from Azure Blob storage or Azure Data Lake Storage Gen2 input every second if the blob file is available. If the blob file is unavailable, there's an exponential backoff with a maximum time delay of 90 seconds.
 
-CSV-formatted inputs require a header row to define fields for the data set, and all header row fields must be unique.
-
 > [!NOTE]
 > Stream Analytics does not support adding content to an existing blob file. Stream Analytics will view each file only once, and any changes that occur in the file after the job has read the data are not processed. Best practice is to upload all the data for a blob file at once and then add additional newer events to a different, new blob file.
@@ -199,4 +197,4 @@ For more information, see [Stream data from Kafka into Azure Stream Analytics (P
 [stream.analytics.introduction]: stream-analytics-introduction.md
 [stream.analytics.get.started]: stream-analytics-real-time-fraud-detection.md
 [stream.analytics.query.language.reference]: /stream-analytics-query/stream-analytics-query-language-reference
-[stream.analytics.rest.api.reference]: /rest/api/streamanalytics/
+[stream.analytics.rest.api.reference]: /rest/api/streamanalytics/
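The paragraph kept by this change references the TIMESTAMP BY keyword for using a timestamp from the event payload. As a minimal sketch only (the input/output aliases, the `EventTime` payload column, and the window size are hypothetical, not part of this article), a Stream Analytics query using it could look like:

```sql
-- Hypothetical query: [blob-input] / [blob-output] are assumed alias names,
-- and EventTime is an assumed timestamp column in the event payload.
SELECT
    DeviceId,
    AVG(Temperature) AS AvgTemperature
INTO
    [blob-output]
FROM
    [blob-input]
TIMESTAMP BY EventTime          -- use the payload timestamp, not arrival time
GROUP BY
    DeviceId,
    TumblingWindow(minute, 5)   -- assumed 5-minute window for illustration
```

Without TIMESTAMP BY, windowing would be applied using arrival time rather than the time recorded in each event.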
