Commit b6cd8d1

Merge pull request #284781 from MicrosoftDocs/repo_sync_working_branch
Confirm merge from repo_sync_working_branch to main to sync with https://github.com/MicrosoftDocs/azure-docs (branch main)
2 parents cf5a445 + c6e33c3 commit b6cd8d1

File tree

1 file changed: +4 −0 lines changed


articles/stream-analytics/sql-database-output.md

Lines changed: 4 additions & 0 deletions
@@ -43,6 +43,10 @@ Partitioning needs to enabled and is based on the PARTITION BY clause in the query

You can configure the max message size by using **Max batch count**. The default maximum is 10,000 and the default minimum is 100 rows per single bulk insert. For more information, see [Azure SQL limits](/azure/azure-sql/database/resource-limits-logical-server). Every batch is initially bulk inserted with maximum batch count. Batch is split in half (until minimum batch count) based on retryable errors from SQL.

+## Output data type mappings
+
+As the schema of the target table in your SQL database must exactly match the fields and their types in your job's output, you can refer to [Data Types (Azure Stream Analytics)](/stream-analytics-query/data-types-azure-stream-analytics) for detailed type mappings between ASA and SQL.
+
## Limitation

Self-signed Secured Sockets Layer (SSL) certificate isn't supported when trying to connect Azure Stream Analytics jobs to SQL on VM.
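For context on the batching behavior described in the changed file, here is a minimal Python sketch of the split-in-half retry strategy: start at the maximum batch count and halve a failing batch down to the minimum batch count on retryable SQL errors. The `bulk_insert` helper, the `RetryableSqlError` type, and the function names are hypothetical stand-ins, not the actual Stream Analytics implementation; only the default 10,000/100 row counts come from the documentation.

```python
# Illustrative sketch only: mirrors the documented batching behavior,
# not the real Azure Stream Analytics SQL output code.

MAX_BATCH_COUNT = 10_000   # default maximum rows per single bulk insert
MIN_BATCH_COUNT = 100      # default minimum rows per single bulk insert


class RetryableSqlError(Exception):
    """Hypothetical stand-in for a transient, retryable SQL error."""


def bulk_insert(rows):
    """Hypothetical helper that bulk inserts `rows` into the target SQL table."""
    raise NotImplementedError


def write_output(rows):
    """Write rows in batches, starting at the maximum batch count."""
    for start in range(0, len(rows), MAX_BATCH_COUNT):
        _insert_batch(rows[start:start + MAX_BATCH_COUNT])


def _insert_batch(batch):
    try:
        bulk_insert(batch)
    except RetryableSqlError:
        if len(batch) <= MIN_BATCH_COUNT:
            raise  # cannot split further; surface the error
        # Split the failing batch in half and retry each half independently.
        mid = len(batch) // 2
        _insert_batch(batch[:mid])
        _insert_batch(batch[mid:])
```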
