articles/stream-analytics/capture-event-hub-data-parquet.md: 2 additions & 2 deletions

@@ -1,8 +1,8 @@
 ---
 title: Capture data from Azure Data Lake Storage Gen2 in Parquet format
 description: Learn how to use the no code editor to automatically capture the streaming data in Event Hubs in an Azure Data Lake Storage Gen2 account in Parquet format.
articles/stream-analytics/cluster-overview.md: 7 additions & 7 deletions

@@ -1,8 +1,8 @@
 ---
 title: Overview of Azure Stream Analytics Clusters
 description: Learn about single tenant dedicated offering of Stream Analytics Cluster.
-author: sidramadoss
-ms.author: sidram
+author: ahartoon
+ms.author: anboisve
 ms.service: stream-analytics
 ms.topic: overview
 ms.custom: mvc, event-tier1-build-2022
@@ -17,9 +17,9 @@ Stream Analytics clusters are billed by Streaming Units (SUs) which represent th

 ## What are Stream Analytics clusters

-Stream Analytics clusters are powered by the same engine that powers Stream Analytics jobs running in a multi-tenant environment. The single tenant, dedicated cluster have the following features:
+Stream Analytics clusters are powered by the same engine that powers Stream Analytics jobs running in a multi-tenant environment. The single tenant, dedicated cluster has the following features:

-* Single tenant hosting with no noise from other tenants. Your resources are truly "isolated" and performs better when there are burst in traffic.
+* Single tenant hosting with no noise from other tenants. Your resources are truly "isolated" and perform better when there are burst in traffic.

 * Scale your cluster between 36 to 396 SUs as your streaming usage increases over time.
@@ -41,13 +41,13 @@ The easiest way to get started is to create and develop a Stream Analytics job t

 Stream Analytics jobs alone don't support VNets. If your inputs or outputs are secured behind a firewall or an Azure Virtual Network, you have the following two options:

-* If your local machine has access to the input and output resources secured by a VNet (for example, Azure Event Hubs or Azure SQL Database), you can [install Azure Stream Analytics tools for Visual Studio](stream-analytics-tools-for-visual-studio-install.md) on your local machine. You can develop and [test Stream Analytics jobs locally](stream-analytics-live-data-local-testing.md) on your device without incurring any cost. Once you are ready to use Stream Analytics in your architecture, you can then create a Stream Analytics cluster, configure private endpoints, and run your jobs at scale.
+* If your local machine has access to the input and output resources secured by a VNet (for example, Azure Event Hubs or Azure SQL Database), you can [install Azure Stream Analytics tools for Visual Studio](stream-analytics-tools-for-visual-studio-install.md) on your local machine. You can develop and [test Stream Analytics jobs locally](stream-analytics-live-data-local-testing.md) on your device without incurring any cost. Once you're ready to use Stream Analytics in your architecture, you can then create a Stream Analytics cluster, configure private endpoints, and run your jobs at scale.

 * You can create a Stream Analytics cluster, configure the cluster with the private endpoints needed for your pipeline, and run your Stream Analytics jobs on the cluster.

 ### What performance can I expect?

-An SU is the same across the Standard and Dedicated offerings. A single job that utilizes a full 36 SU cluster can achieve approximately 36 MB/second throughput with millisecond latency. The exact number depends on the format of events and the type of analytics. Because it is dedicated, Stream Analytics cluster offers more reliable performance guarantees. All the jobs running on your cluster belong only to you.
+An SU is the same across the Standard and Dedicated offerings. A single job that utilizes a full 36 SU cluster can achieve approximately 36 MB/second throughput with millisecond latency. The exact number depends on the format of events and the type of analytics. Because it's dedicated, Stream Analytics cluster offers more reliable performance guarantees. All the jobs running on your cluster belong only to you.

 ### Can I scale my cluster?
@@ -63,7 +63,7 @@ Your Stream Analytics clusters are charged based on the chosen SU capacity. Clus

 ### Which inputs and outputs can I privately connect to from my Stream Analytics cluster?

-Stream Analytics supports various input and output types. You can [create private endpoints](private-endpoints.md) in your cluster that allow jobs to access the input and output resources. Currently Azure SQL Database, Azure Cosmos DB, Azure Storage, Azure Data Lake Storage Gen2, Azure Event Hub, Azure IoT Hubs, Azure Function and Azure Service Bus are supported services for which you can create managed private endpoints.
+Stream Analytics supports various input and output types. You can [create private endpoints](private-endpoints.md) in your cluster that allow jobs to access the input and output resources. Currently Azure SQL Database, Azure Cosmos DB, Azure Storage, Azure Data Lake Storage Gen2, Azure Event Hubs, Azure IoT Hubs, Azure Function and Azure Service Bus are supported services for which you can create managed private endpoints.
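The figures quoted in this file's hunks — clusters scale between 36 and 396 SUs, and a job occupying a full 36 SU cluster reaches roughly 36 MB/second — work out to about 1 MB/second per SU. The C# sketch below turns that into a back-of-the-envelope sizing helper; the helper and the 120 MB/s workload are hypothetical, and real throughput varies with event format and query complexity.

```csharp
// Back-of-the-envelope cluster sizing based on the figures above:
// 36-396 SUs per cluster, roughly 1 MB/s per SU (36 MB/s / 36 SUs).
// Illustrative only; actual throughput depends on the workload.
using System;

static class ClusterSizing
{
    const int MinClusterSus = 36;
    const int MaxClusterSus = 396;
    const double MbPerSecondPerSu = 1.0; // approximation from the article

    // Estimate the SU count for a target ingress rate, clamped to the
    // valid cluster range.
    public static int EstimateSus(double targetMbPerSecond) =>
        Math.Clamp(
            (int)Math.Ceiling(targetMbPerSecond / MbPerSecondPerSu),
            MinClusterSus,
            MaxClusterSus);

    static void Main()
    {
        // Hypothetical workload: 120 MB/s of incoming events -> 120 SUs.
        Console.WriteLine(EstimateSus(120));
    }
}
```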
 description: Troubleshoot Azure Stream Analytics issues with configuration error codes.
-ms.author: sidram
-author: sidramadoss
+author: ahartoon
+ms.author: anboisve
 ms.topic: troubleshooting
 ms.date: 05/07/2020
 ms.service: stream-analytics
@@ -15,21 +15,21 @@ You can use activity logs and resource logs to help debug unexpected behaviors f

 ## EventHubUnauthorizedAccess

-* **Cause**: Event Hub threw an *Unauthorized Access* error.
+* **Cause**: Event Hubs threw an *Unauthorized Access* error.

 ## EventHubReceiverEpochConflict

-* **Cause**: There is more than one Event Hub receiver with different epoch values.
-* **Recommendation**: Ensure *Service Bus Explorer* or an *EventProcessorHost* application is not connected while your Stream Analytics job is running.
+* **Cause**: There's more than one Event Hubs receiver with different epoch values.
+* **Recommendation**: Ensure *Service Bus Explorer* or an *EventProcessorHost* application isn't connected while your Stream Analytics job is running.

 ## EventHubReceiverQuotaExceeded

 * **Cause**: Stream Analytics can't connect to a partition because the maximum number of allowed receivers per partition in a consumer group has been reached.
-* **Recommendation**: Ensure that other Stream Analytics jobs or Service Bus Explorer are not using the same consumer group.
+* **Recommendation**: Ensure that other Stream Analytics jobs or Service Bus Explorer aren't using the same consumer group.

 ## EventHubOutputThrottled

-* **Cause**: An error occurred while writing data to Event Hub due to throttling.
+* **Cause**: An error occurred while writing data to Event Hubs due to throttling.
 * **Recommendation**: If this happens consistently, upgrade the throughput.

 ## EventHubOutputInvalidConnectionConfig
@@ -39,26 +39,26 @@ You can use activity logs and resource logs to help debug unexpected behaviors f

 ## EventHubOutputInvalidHostname

-* **Cause**: The Event Hub host is unreachable.
+* **Cause**: The Event Hubs host is unreachable.
 * **Recommendation**: Ensure the supplied host name is correct.

 ## EventHubOutputUnexpectedPartitionCount

-* **Cause**: The EventHub sender encountered an unexpected EventHub partition count.
-* **Recommendation**: Restart your Stream Analytics job if the EventHub's partition count has changed.
+* **Cause**: The Event Hubs sender encountered an unexpected partition count.
+* **Recommendation**: Restart your Stream Analytics job if the event hub's partition count has changed.

 ## CosmosDBPartitionKeyNotFound

 * **Cause**: Stream Analytics couldn't find the partition key of a particular Azure Cosmos DB collection in the database.
-* **Recommendation**: Ensure there is a valid partition key specified for the collection in Azure Cosmos DB.
+* **Recommendation**: Ensure there's a valid partition key specified for the collection in Azure Cosmos DB.

 ## CosmosDBInvalidPartitionKeyColumn

 * **Cause**: Thrown when a partition key is neither a leaf node nor at the top level.

 ## CosmosDBInvalidIdColumn

-* **Cause**: The query output can't contain the column \[id] if a different column is chosen as the primary key property.
+* **Cause**: The query output can't contain the column \[`id`] if a different column is chosen as the primary key property.

 ## CosmosDBDatabaseNotFound
@@ -89,7 +89,7 @@ You can use activity logs and resource logs to help debug unexpected behaviors f
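The epoch and receiver-quota errors above both stem from extra readers sharing the consumer group a running Stream Analytics job uses. As a minimal sketch of the recommended remedy — give any side reader its own consumer group — here is what a standalone diagnostic reader could look like with the Azure.Messaging.EventHubs package; the connection string, hub, and group names are placeholders, and the `OwnerLevel` line illustrates the epoch mechanism behind EventHubReceiverEpochConflict.

```csharp
// Sketch of a diagnostic reader kept OUT of the consumer group used by the
// Stream Analytics job, per the recommendations above. Assumes the
// Azure.Messaging.EventHubs package; names are placeholders.
using System;
using System.Threading;
using System.Threading.Tasks;
using Azure.Messaging.EventHubs.Consumer;

class DedicatedGroupReader
{
    static async Task Main()
    {
        const string connectionString = "<event-hubs-connection-string>";
        const string eventHubName = "<event-hub-name>";
        // A group reserved for this reader; sharing the job's group risks
        // epoch conflicts and exhausting the per-partition receiver quota.
        const string consumerGroup = "debug-reader";

        await using var consumer = new EventHubConsumerClient(
            consumerGroup, connectionString, eventHubName);

        // OwnerLevel is the "epoch": an exclusive reader with a higher level
        // evicts lower-level readers on the same group and partition, which
        // is the mechanism behind EventHubReceiverEpochConflict.
        var readOptions = new ReadEventOptions { OwnerLevel = 0 };

        using var cts = new CancellationTokenSource(TimeSpan.FromSeconds(30));
        try
        {
            await foreach (PartitionEvent evt in consumer.ReadEventsFromPartitionAsync(
                "0", EventPosition.Earliest, readOptions, cts.Token))
            {
                Console.WriteLine(evt.Data.EventBody.ToString());
            }
        }
        catch (OperationCanceledException)
        {
            // Expected when the 30-second sampling window elapses.
        }
    }
}
```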
articles/stream-analytics/custom-deserializer-examples.md: 15 additions & 15 deletions

@@ -1,8 +1,8 @@
 ---
 title: Read input in any format using .NET custom deserializers in Azure Stream Analytics
 description: This article explains the serialization format and the interfaces that define custom .NET deserializers for Azure Stream Analytics cloud and edge jobs.
-author: sidramadoss
-ms.author: sidram
+author: ahartoon
+ms.author: anboisve
 ms.service: stream-analytics
 ms.topic: conceptual
 ms.date: 6/16/2021
@@ -17,7 +17,7 @@ ms.custom: devx-track-csharp

 Following code samples are the interfaces that define the custom deserializer and implement `StreamDeserializer<T>`.

-`UserDefinedOperator` is the base class for all custom streaming operators. It initializes `StreamingContext`, which provides context which includes mechanism for publishing diagnostics for which you will need to debug any issues with your deserializer.
+`UserDefinedOperator` is the base class for all custom streaming operators. It initializes `StreamingContext`, which provides context, which includes mechanism for publishing diagnostics for which you'll need to debug any issues with your deserializer.

 ```csharp
 public abstract class UserDefinedOperator
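The hunk above ends partway into the quoted C# block. For context, the base class it begins plausibly has the following shape; this is a reconstruction, with the `Initialize` signature inferred from the prose elsewhere in this diff about the `StreamingContext` being passed through `UserDefinedOperator`'s Initialize method.

```csharp
// Reconstructed for context; not part of the diff. The Initialize signature
// is inferred from the surrounding prose, which describes StreamingContext
// being supplied through UserDefinedOperator's Initialize method.
public abstract class UserDefinedOperator
{
    public abstract void Initialize(StreamingContext streamingContext);
}
```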
@@ -28,7 +28,7 @@ Following code samples are the interfaces that define the custom deserializer an

 The following code snippet is the deserialization for streaming data.

-Skippable errors should be emitted using `IStreamingDiagnostics` passed through `UserDefinedOperator`'s Initialize method. All exceptions will be treated as errors and the deserializer will be recreated. After a certain number of errors, the job will go to a failed status.
+Skippable errors should be emitted using `IStreamingDiagnostics` passed through `UserDefinedOperator`'s Initialize method. All exceptions will be treated as errors and the deserializer will be recreated. After some errors, the job will go to a failed status.

 `StreamDeserializer<T>` deserializes a stream into object of type `T`. The following conditions must be met:

@@ -37,8 +37,8 @@ Skippable errors should be emitted using `IStreamingDiagnostics` passed through
 1. One of [sbyte, byte, short, ushort, int, uint, long, DateTime, string, float, double] or their nullable equivalents.
 1. Another struct or class following the same rules.
 1. Array of type `T2` that follows the same rules.
-1. IList`T2` where T2 follows the same rules.
-1. Does not have any recursive types.
+1. `IList<T2>` where T2 follows the same rules.
+1. Doesn't have any recursive types.

 The parameter `stream` is the stream containing the serialized object. `Deserialize` returns a collection of `T` instances.
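Based on the rules above and the surrounding prose ("deserializes a stream into object of type `T`", "returns a collection of `T` instances"), the contract plausibly looks like this — a reconstruction for context, not part of the diff:

```csharp
// Reconstructed for context; not part of the diff. Deserialize consumes the
// serialized stream and yields a collection of T instances, where T obeys
// the type rules listed above.
using System.Collections.Generic;
using System.IO;

public abstract class StreamDeserializer<T> : UserDefinedOperator
{
    public abstract IEnumerable<T> Deserialize(Stream stream);
}
```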
@@ -49,7 +49,7 @@ The parameter `stream` is the stream containing the serialized object. `Deserial
 }
 ```

-`StreamingContext` provides context which includes mechanism for publishing diagnostics for user operator.
+`StreamingContext` provides context, which includes mechanism for publishing diagnostics for user operator.

 ```csharp
 public abstract class StreamingContext
@@ -62,7 +62,7 @@ The parameter `stream` is the stream containing the serialized object. `Deserial

 `WriteError` writes an error message to resource logs and sends the error to diagnostics.

-`briefMessage` is a brief error message. This message shows up in diagnostics and is used by the product team for debugging purposes. Do not include sensitive information, and keep the message less than 200 characters
+`briefMessage` is a brief error message. This message shows up in diagnostics and is used by the product team for debugging purposes. Don't include sensitive information, and keep the message fewer than 200 characters

 `detailedMessage` is a detailed error message that is only added to your resource logs in your storage. This message should be less than 2000 characters.
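Putting the pieces together, here is a minimal sketch of a CSV deserializer that reports malformed rows through `WriteError` as skippable errors instead of throwing (which, per the text above, recreates the deserializer and can eventually fail the job). Only the `StreamDeserializer<T>` contract and `WriteError(briefMessage, detailedMessage)` come from the article; the `SensorEvent` type, the two-column CSV layout, and the assumption that `StreamingContext` exposes its `IStreamingDiagnostics` as a `Diagnostics` member are illustrative.

```csharp
// Minimal end-to-end sketch. The StreamDeserializer<T> contract and
// WriteError(briefMessage, detailedMessage) are from the article; the event
// type, CSV layout, and the StreamingContext.Diagnostics member are
// assumptions for illustration.
using System.Collections.Generic;
using System.IO;

public class SensorEvent // plain class built from supported primitive types
{
    public string DeviceId { get; set; }
    public double Temperature { get; set; }
}

public class CsvDeserializer : StreamDeserializer<SensorEvent>
{
    private IStreamingDiagnostics diagnostics;

    // Keep the diagnostics channel handed in through Initialize so that
    // malformed rows can be reported as skippable errors rather than thrown
    // exceptions.
    public override void Initialize(StreamingContext streamingContext)
    {
        this.diagnostics = streamingContext.Diagnostics;
    }

    public override IEnumerable<SensorEvent> Deserialize(Stream stream)
    {
        using var reader = new StreamReader(stream);
        string line;
        while ((line = reader.ReadLine()) != null)
        {
            var fields = line.Split(',');
            if (fields.Length != 2 ||
                !double.TryParse(fields[1], out double temperature))
            {
                // Brief message under 200 chars, detailed under 2000,
                // per the guidance above.
                diagnostics.WriteError(
                    briefMessage: "Malformed CSV row",
                    detailedMessage: $"Expected 'deviceId,temperature' but got: {line}");
                continue;
            }

            yield return new SensorEvent
            {
                DeviceId = fields[0],
                Temperature = temperature
            };
        }
    }
}
```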
@@ -75,7 +75,7 @@ The parameter `stream` is the stream containing the serialized object. `Deserial

 ## Deserializer examples

-This section shows you how to write custom deserializers for Protobuf and CSV. For additional examples, such as AVRO format for Event Hub Capture, visit [Azure Stream Analytics on GitHub](https://github.com/Azure/azure-stream-analytics/tree/master/CustomDeserializers).
+This section shows you how to write custom deserializers for Protobuf and CSV. For more examples, such as AVRO format for Event Hubs Capture, visit [Azure Stream Analytics on GitHub](https://github.com/Azure/azure-stream-analytics/tree/master/CustomDeserializers).

 ### Protocol buffer (Protobuf) format

@@ -107,7 +107,7 @@ message MessageBodyProto {
 }
 ```

-Running `protoc.exe` from the **Google.Protobuf.Tools** NuGet generates a .cs file with the definition. The generated file is not shown here. You must ensure that the version of Protobuf Nuget you use in your Stream Analytics project matches the Protobuf version that was used to generate the input.
+Running `protoc.exe` from the **Google.Protobuf.Tools** NuGet generates a .cs file with the definition. The generated file isn't shown here. You must ensure that the version of Protobuf NuGet you use in your Stream Analytics project matches the Protobuf version that was used to generate the input.

 The following code snippet is the deserializer implementation assuming the generated file is included in the project. This implementation is just a thin wrapper over the generated file.
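The implementation itself falls outside this hunk, but as an illustration of the "thin wrapper over the generated file" the text describes, a sketch could look like the following. It assumes the protoc-generated `MessageBodyProto` class exposes Google.Protobuf's standard static `Parser`, and that the producer wrote varint length-delimited messages to a seekable stream — both assumptions to adapt to the actual input framing.

```csharp
// Hedged sketch of the "thin wrapper" described above. Assumes a
// protoc-generated MessageBodyProto (with Google.Protobuf's standard static
// Parser) and varint length-delimited framing on a seekable stream.
using System.Collections.Generic;
using System.IO;

public class ProtobufDeserializer : StreamDeserializer<MessageBodyProto>
{
    private IStreamingDiagnostics diagnostics;

    public override void Initialize(StreamingContext streamingContext)
    {
        this.diagnostics = streamingContext.Diagnostics;
    }

    public override IEnumerable<MessageBodyProto> Deserialize(Stream stream)
    {
        // ParseDelimitedFrom reads one length-prefixed message per call.
        while (stream.Position < stream.Length)
        {
            yield return MessageBodyProto.Parser.ParseDelimitedFrom(stream);
        }
    }
}
```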
@@ -227,25 +227,25 @@ This feature is available in the following regions when using Standard SKU:
 * East US 2
 * West Europe

-You can [request support](https://aka.ms/ccodereqregion) for additional regions. However, there is no such region restriction when using [Stream Analytics clusters](./cluster-overview.md).
+You can [request support](https://aka.ms/ccodereqregion) for more regions. However, there's no such region restriction when using [Stream Analytics clusters](./cluster-overview.md).

 ## Frequently asked questions

 ### When will this feature be available in all Azure regions?

-This feature is available in [6 regions](#region-support). If you are interested in using this functionality in another region, you can [submit a request](https://aka.ms/ccodereqregion). Support for all Azure regions is on the roadmap.
+This feature is available in [6 regions](#region-support). If you're interested in using this functionality in another region, you can [submit a request](https://aka.ms/ccodereqregion). Support for all Azure regions is on the roadmap.

 ### Can I access MetadataPropertyValue from my inputs similar to GetMetadataPropertyValue function?

-This functionality is not supported. If you need this capability, you can vote for this request on [UserVoice](https://feedback.azure.com/d365community/idea/b4517302-b925-ec11-b6e6-000d3a4f0f1c).
+This functionality isn't supported. If you need this capability, you can vote for this request on [UserVoice](https://feedback.azure.com/d365community/idea/b4517302-b925-ec11-b6e6-000d3a4f0f1c).

 ### Can I share my deserializer implementation with the community so that others can benefit?

-Once you have implemented your deserializer, you can help others by sharing it with the community. Submit your code to the [Azure Stream Analytics GitHub repo](https://github.com/Azure/azure-stream-analytics/tree/master/CustomDeserializers).
+Once you've implemented your deserializer, you can help others by sharing it with the community. Submit your code to the [Azure Stream Analytics GitHub repo](https://github.com/Azure/azure-stream-analytics/tree/master/CustomDeserializers).

 ### What are the other limitations of using custom deserializers in Stream Analytics?

-If your input is of Protobuf format with a schema containing `MapField` type, you will not be able to implement a custom deserializer. Also, custom deserializers do not support sample data or preview data.
+If your input is of Protobuf format with a schema containing `MapField` type, you won't be able to implement a custom deserializer. Also, custom deserializers don't support sample data or preview data.