Commit 5281095 (2 parents: d4302ff + c2ada08)

Merge pull request #220025 from spelluru/asaownership1130

Sid left MS. Updated ms.author

52 files changed: +156 additions, −167 deletions


articles/stream-analytics/capture-event-hub-data-parquet.md

Lines changed: 2 additions & 2 deletions
@@ -1,8 +1,8 @@
 ---
 title: Capture data from Azure Data Lake Storage Gen2 in Parquet format
 description: Learn how to use the no code editor to automatically capture the streaming data in Event Hubs in an Azure Data Lake Storage Gen2 account in Parquet format.
-author: sidramadoss
-ms.author: sidram
+author: xujxu
+ms.author: xujiang1
 ms.service: stream-analytics
 ms.topic: how-to
 ms.custom: mvc, event-tier1-build-2022

articles/stream-analytics/cluster-overview.md

Lines changed: 7 additions & 7 deletions
@@ -1,8 +1,8 @@
 ---
 title: Overview of Azure Stream Analytics Clusters
 description: Learn about single tenant dedicated offering of Stream Analytics Cluster.
-author: sidramadoss
-ms.author: sidram
+author: ahartoon
+ms.author: anboisve
 ms.service: stream-analytics
 ms.topic: overview
 ms.custom: mvc, event-tier1-build-2022
@@ -17,9 +17,9 @@ Stream Analytics clusters are billed by Streaming Units (SUs) which represent th
 
 ## What are Stream Analytics clusters
 
-Stream Analytics clusters are powered by the same engine that powers Stream Analytics jobs running in a multi-tenant environment. The single tenant, dedicated cluster have the following features:
+Stream Analytics clusters are powered by the same engine that powers Stream Analytics jobs running in a multi-tenant environment. The single tenant, dedicated cluster has the following features:
 
-* Single tenant hosting with no noise from other tenants. Your resources are truly "isolated" and performs better when there are burst in traffic.
+* Single tenant hosting with no noise from other tenants. Your resources are truly "isolated" and perform better when there are bursts in traffic.
 
 * Scale your cluster between 36 to 396 SUs as your streaming usage increases over time.
 
@@ -41,13 +41,13 @@ The easiest way to get started is to create and develop a Stream Analytics job t
 
 Stream Analytics jobs alone don't support VNets. If your inputs or outputs are secured behind a firewall or an Azure Virtual Network, you have the following two options:
 
-* If your local machine has access to the input and output resources secured by a VNet (for example, Azure Event Hubs or Azure SQL Database), you can [install Azure Stream Analytics tools for Visual Studio](stream-analytics-tools-for-visual-studio-install.md) on your local machine. You can develop and [test Stream Analytics jobs locally](stream-analytics-live-data-local-testing.md) on your device without incurring any cost. Once you are ready to use Stream Analytics in your architecture, you can then create a Stream Analytics cluster, configure private endpoints, and run your jobs at scale.
+* If your local machine has access to the input and output resources secured by a VNet (for example, Azure Event Hubs or Azure SQL Database), you can [install Azure Stream Analytics tools for Visual Studio](stream-analytics-tools-for-visual-studio-install.md) on your local machine. You can develop and [test Stream Analytics jobs locally](stream-analytics-live-data-local-testing.md) on your device without incurring any cost. Once you're ready to use Stream Analytics in your architecture, you can then create a Stream Analytics cluster, configure private endpoints, and run your jobs at scale.
 
 * You can create a Stream Analytics cluster, configure the cluster with the private endpoints needed for your pipeline, and run your Stream Analytics jobs on the cluster.
 
 ### What performance can I expect?
 
-An SU is the same across the Standard and Dedicated offerings. A single job that utilizes a full 36 SU cluster can achieve approximately 36 MB/second throughput with millisecond latency. The exact number depends on the format of events and the type of analytics. Because it is dedicated, Stream Analytics cluster offers more reliable performance guarantees. All the jobs running on your cluster belong only to you.
+An SU is the same across the Standard and Dedicated offerings. A single job that utilizes a full 36 SU cluster can achieve approximately 36 MB/second throughput with millisecond latency. The exact number depends on the format of events and the type of analytics. Because it's dedicated, a Stream Analytics cluster offers more reliable performance guarantees. All the jobs running on your cluster belong only to you.
 
 ### Can I scale my cluster?
 
@@ -63,7 +63,7 @@ Your Stream Analytics clusters are charged based on the chosen SU capacity. Clus
 
 ### Which inputs and outputs can I privately connect to from my Stream Analytics cluster?
 
-Stream Analytics supports various input and output types. You can [create private endpoints](private-endpoints.md) in your cluster that allow jobs to access the input and output resources. Currently Azure SQL Database, Azure Cosmos DB, Azure Storage, Azure Data Lake Storage Gen2, Azure Event Hub, Azure IoT Hubs, Azure Function and Azure Service Bus are supported services for which you can create managed private endpoints.
+Stream Analytics supports various input and output types. You can [create private endpoints](private-endpoints.md) in your cluster that allow jobs to access the input and output resources. Currently Azure SQL Database, Azure Cosmos DB, Azure Storage, Azure Data Lake Storage Gen2, Azure Event Hubs, Azure IoT Hub, Azure Functions, and Azure Service Bus are supported services for which you can create managed private endpoints.
 
 ## Next steps
 

articles/stream-analytics/configuration-error-codes.md

Lines changed: 13 additions & 13 deletions
@@ -1,8 +1,8 @@
 ---
 title: Configuration error codes - Azure Stream Analytics
 description: Troubleshoot Azure Stream Analytics issues with configuration error codes.
-ms.author: sidram
-author: sidramadoss
+author: ahartoon
+ms.author: anboisve
 ms.topic: troubleshooting
 ms.date: 05/07/2020
 ms.service: stream-analytics
@@ -15,21 +15,21 @@ You can use activity logs and resource logs to help debug unexpected behaviors f
 
 ## EventHubUnauthorizedAccess
 
-* **Cause**: Event Hub threw an *Unauthorized Access* error.
+* **Cause**: Event Hubs threw an *Unauthorized Access* error.
 
 ## EventHubReceiverEpochConflict
 
-* **Cause**: There is more than one Event Hub receiver with different epoch values.
-* **Recommendation**: Ensure *Service Bus Explorer* or an *EventProcessorHost* application is not connected while your Stream Analytics job is running.
+* **Cause**: There's more than one Event Hubs receiver with different epoch values.
+* **Recommendation**: Ensure *Service Bus Explorer* or an *EventProcessorHost* application isn't connected while your Stream Analytics job is running.
 
 ## EventHubReceiverQuotaExceeded
 
 * **Cause**: Stream Analytics can't connect to a partition because the maximum number of allowed receivers per partition in a consumer group has been reached.
-* **Recommendation**: Ensure that other Stream Analytics jobs or Service Bus Explorer are not using the same consumer group.
+* **Recommendation**: Ensure that other Stream Analytics jobs or Service Bus Explorer aren't using the same consumer group.
 
 ## EventHubOutputThrottled
 
-* **Cause**: An error occurred while writing data to Event Hub due to throttling.
+* **Cause**: An error occurred while writing data to Event Hubs due to throttling.
 * **Recommendation**: If this happens consistently, upgrade the throughput.
 
 ## EventHubOutputInvalidConnectionConfig
@@ -39,26 +39,26 @@ You can use activity logs and resource logs to help debug unexpected behaviors f
 
 ## EventHubOutputInvalidHostname
 
-* **Cause**: The Event Hub host is unreachable.
+* **Cause**: The Event Hubs host is unreachable.
 * **Recommendation**: Ensure the supplied host name is correct.
 
 ## EventHubOutputUnexpectedPartitionCount
 
-* **Cause**: The EventHub sender encountered an unexpected EventHub partition count.
-* **Recommendation**: Restart your Stream Analytics job if the EventHub's partition count has changed.
+* **Cause**: The Event Hubs sender encountered an unexpected partition count.
+* **Recommendation**: Restart your Stream Analytics job if the event hub's partition count has changed.
 
 ## CosmosDBPartitionKeyNotFound
 
 * **Cause**: Stream Analytics couldn't find the partition key of a particular Azure Cosmos DB collection in the database.
-* **Recommendation**: Ensure there is a valid partition key specified for the collection in Azure Cosmos DB.
+* **Recommendation**: Ensure there's a valid partition key specified for the collection in Azure Cosmos DB.
 
 ## CosmosDBInvalidPartitionKeyColumn
 
 * **Cause**: Thrown when a partition key is neither a leaf node nor at the top level.
 
 ## CosmosDBInvalidIdColumn
 
-* **Cause**: The query output can't contain the column \[id] if a different column is chosen as the primary key property.
+* **Cause**: The query output can't contain the column \[`id`] if a different column is chosen as the primary key property.
 
 ## CosmosDBDatabaseNotFound
 
@@ -89,7 +89,7 @@ You can use activity logs and resource logs to help debug unexpected behaviors f
 
 ## SQLDWOutputInvalidServiceEdition
 
-* **Cause**: SQL Database is not supported.
+* **Cause**: SQL Database isn't supported.
 * **Recommendation**: Use dedicated SQL pool.
 
 ## Next steps

articles/stream-analytics/connect-job-to-vnet.md

Lines changed: 2 additions & 3 deletions
@@ -1,9 +1,8 @@
 ---
 title: Connect Stream Analytics jobs to resources in an Azure Virtual Network (VNET)
 description: This article describes how to connect an Azure Stream Analytics job with resources that are in a VNET.
-author: sidramadoss
-ms.author: sidram
-
+author: ahartoon
+ms.author: anboisve
 ms.service: stream-analytics
 ms.topic: conceptual
 ms.date: 01/04/2021

articles/stream-analytics/create-cluster.md

Lines changed: 2 additions & 2 deletions
@@ -2,8 +2,8 @@
 title: Create an Azure Stream Analytics Cluster quickstart
 description: Learn how to create an Azure Stream Analytics cluster.
 ms.service: stream-analytics
-author: sidramadoss
-ms.author: sidram
+author: xujxu
+ms.author: xujiang1
 ms.topic: quickstart
 ms.custom: mvc, mode-ui, event-tier1-build-2022
 ms.date: 05/10/2022

articles/stream-analytics/custom-deserializer-examples.md

Lines changed: 15 additions & 15 deletions
@@ -1,8 +1,8 @@
 ---
 title: Read input in any format using .NET custom deserializers in Azure Stream Analytics
 description: This article explains the serialization format and the interfaces that define custom .NET deserializers for Azure Stream Analytics cloud and edge jobs.
-author: sidramadoss
-ms.author: sidram
+author: ahartoon
+ms.author: anboisve
 ms.service: stream-analytics
 ms.topic: conceptual
 ms.date: 6/16/2021
@@ -17,7 +17,7 @@ ms.custom: devx-track-csharp
 
 Following code samples are the interfaces that define the custom deserializer and implement `StreamDeserializer<T>`.
 
-`UserDefinedOperator` is the base class for all custom streaming operators. It initializes `StreamingContext`, which provides context which includes mechanism for publishing diagnostics for which you will need to debug any issues with your deserializer.
+`UserDefinedOperator` is the base class for all custom streaming operators. It initializes `StreamingContext`, which provides context, including a mechanism for publishing diagnostics that you'll need to debug any issues with your deserializer.
 
 ```csharp
 public abstract class UserDefinedOperator
@@ -28,7 +28,7 @@ Following code samples are the interfaces that define the custom deserializer an
 
 The following code snippet is the deserialization for streaming data.
 
-Skippable errors should be emitted using `IStreamingDiagnostics` passed through `UserDefinedOperator`'s Initialize method. All exceptions will be treated as errors and the deserializer will be recreated. After a certain number of errors, the job will go to a failed status.
+Skippable errors should be emitted using `IStreamingDiagnostics` passed through `UserDefinedOperator`'s Initialize method. All exceptions will be treated as errors and the deserializer will be recreated. After some errors, the job will go to a failed status.
 
 `StreamDeserializer<T>` deserializes a stream into object of type `T`. The following conditions must be met:
 
@@ -37,8 +37,8 @@ Skippable errors should be emitted using `IStreamingDiagnostics` passed through
 1. One of [sbyte, byte, short, ushort, int, uint, long, DateTime, string, float, double] or their nullable equivalents.
 1. Another struct or class following the same rules.
 1. Array of type `T2` that follows the same rules.
-1. IList`T2` where T2 follows the same rules.
-1. Does not have any recursive types.
+1. `IList<T2>` where `T2` follows the same rules.
+1. Doesn't have any recursive types.
 
 The parameter `stream` is the stream containing the serialized object. `Deserialize` returns a collection of `T` instances.
 
@@ -49,7 +49,7 @@ The parameter `stream` is the stream containing the serialized object. `Deserial
 }
 ```
 
-`StreamingContext` provides context which includes mechanism for publishing diagnostics for user operator.
+`StreamingContext` provides context, including a mechanism for publishing diagnostics for the user operator.
 
 ```csharp
 public abstract class StreamingContext
@@ -62,7 +62,7 @@ The parameter `stream` is the stream containing the serialized object. `Deserial
 
 `WriteError` writes an error message to resource logs and sends the error to diagnostics.
 
-`briefMessage` is a brief error message. This message shows up in diagnostics and is used by the product team for debugging purposes. Do not include sensitive information, and keep the message less than 200 characters
+`briefMessage` is a brief error message. This message shows up in diagnostics and is used by the product team for debugging purposes. Don't include sensitive information, and keep the message fewer than 200 characters.
 
 `detailedMessage` is a detailed error message that is only added to your resource logs in your storage. This message should be less than 2000 characters.
 
@@ -75,7 +75,7 @@ The parameter `stream` is the stream containing the serialized object. `Deserial
 
 ## Deserializer examples
 
-This section shows you how to write custom deserializers for Protobuf and CSV. For additional examples, such as AVRO format for Event Hub Capture, visit [Azure Stream Analytics on GitHub](https://github.com/Azure/azure-stream-analytics/tree/master/CustomDeserializers).
+This section shows you how to write custom deserializers for Protobuf and CSV. For more examples, such as AVRO format for Event Hubs Capture, visit [Azure Stream Analytics on GitHub](https://github.com/Azure/azure-stream-analytics/tree/master/CustomDeserializers).
 
 ### Protocol buffer (Protobuf) format
 
@@ -107,7 +107,7 @@ message MessageBodyProto {
 }
 ```
 
-Running `protoc.exe` from the **Google.Protobuf.Tools** NuGet generates a .cs file with the definition. The generated file is not shown here. You must ensure that the version of Protobuf Nuget you use in your Stream Analytics project matches the Protobuf version that was used to generate the input.
+Running `protoc.exe` from the **Google.Protobuf.Tools** NuGet generates a .cs file with the definition. The generated file isn't shown here. You must ensure that the version of the Protobuf NuGet you use in your Stream Analytics project matches the Protobuf version that was used to generate the input.
 
 The following code snippet is the deserializer implementation assuming the generated file is included in the project. This implementation is just a thin wrapper over the generated file.
 
@@ -227,25 +227,25 @@ This feature is available in the following regions when using Standard SKU:
 * East US 2
 * West Europe
 
-You can [request support](https://aka.ms/ccodereqregion) for additional regions. However, there is no such region restriction when using [Stream Analytics clusters](./cluster-overview.md).
+You can [request support](https://aka.ms/ccodereqregion) for more regions. However, there's no such region restriction when using [Stream Analytics clusters](./cluster-overview.md).
 
 ## Frequently asked questions
 
 ### When will this feature be available in all Azure regions?
 
-This feature is available in [6 regions](#region-support). If you are interested in using this functionality in another region, you can [submit a request](https://aka.ms/ccodereqregion). Support for all Azure regions is on the roadmap.
+This feature is available in [6 regions](#region-support). If you're interested in using this functionality in another region, you can [submit a request](https://aka.ms/ccodereqregion). Support for all Azure regions is on the roadmap.
 
 ### Can I access MetadataPropertyValue from my inputs similar to GetMetadataPropertyValue function?
 
-This functionality is not supported. If you need this capability, you can vote for this request on [UserVoice](https://feedback.azure.com/d365community/idea/b4517302-b925-ec11-b6e6-000d3a4f0f1c).
+This functionality isn't supported. If you need this capability, you can vote for this request on [UserVoice](https://feedback.azure.com/d365community/idea/b4517302-b925-ec11-b6e6-000d3a4f0f1c).
 
 ### Can I share my deserializer implementation with the community so that others can benefit?
 
-Once you have implemented your deserializer, you can help others by sharing it with the community. Submit your code to the [Azure Stream Analytics GitHub repo](https://github.com/Azure/azure-stream-analytics/tree/master/CustomDeserializers).
+Once you've implemented your deserializer, you can help others by sharing it with the community. Submit your code to the [Azure Stream Analytics GitHub repo](https://github.com/Azure/azure-stream-analytics/tree/master/CustomDeserializers).
 
 ### What are the other limitations of using custom deserializers in Stream Analytics?
 
-If your input is of Protobuf format with a schema containing `MapField` type, you will not be able to implement a custom deserializer. Also, custom deserializers do not support sample data or preview data.
+If your input is of Protobuf format with a schema containing `MapField` type, you won't be able to implement a custom deserializer. Also, custom deserializers don't support sample data or preview data.
 
 ## Next Steps
 
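The deserializer rules changed in the diff above (`Deserialize` consumes a `Stream` and yields `T` instances whose public fields use only the supported types) can be sketched with a minimal CSV example. This is an illustrative sketch, not the article's own sample: `SensorEvent`, `CsvDeserializer`, and the local `StreamDeserializer<T>` stand-in are hypothetical names, and the real base class ships in the Azure Stream Analytics NuGet package rather than being declared inline.

```csharp
using System.Collections.Generic;
using System.Globalization;
using System.IO;

// Stand-in for the abstract base class shown in the diff; the real type
// comes from the Azure Stream Analytics custom-deserializer NuGet package.
public abstract class StreamDeserializer<T>
{
    public abstract IEnumerable<T> Deserialize(Stream stream);
}

// Hypothetical event type: public fields of supported primitive types only,
// per the rules listed in the article.
public class SensorEvent
{
    public string DeviceId;
    public double Temperature;
}

// Minimal CSV deserializer: one "deviceId,temperature" record per line.
public class CsvDeserializer : StreamDeserializer<SensorEvent>
{
    public override IEnumerable<SensorEvent> Deserialize(Stream stream)
    {
        using var reader = new StreamReader(stream);
        string line;
        while ((line = reader.ReadLine()) != null)
        {
            var parts = line.Split(',');
            yield return new SensorEvent
            {
                DeviceId = parts[0],
                Temperature = double.Parse(parts[1], CultureInfo.InvariantCulture),
            };
        }
    }
}
```

In a real job the engine supplies the raw input stream to `Deserialize`; skippable per-record errors would be reported through `IStreamingDiagnostics` rather than thrown, since, as the diff notes, repeated exceptions recreate the deserializer and eventually fail the job.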

articles/stream-analytics/custom-deserializer.md

Lines changed: 2 additions & 2 deletions
@@ -1,8 +1,8 @@
 ---
 title: Custom .NET deserializers for Azure Stream Analytics cloud jobs
 description: This doc demonstrates how to create a custom .NET deserializer for an Azure Stream Analytics cloud job using Visual Studio.
-author: sidramadoss
-ms.author: sidram
+author: ahartoon
+ms.author: anboisve
 ms.service: stream-analytics
 ms.topic: tutorial
 ms.date: 12/17/2020

articles/stream-analytics/data-error-codes.md

Lines changed: 4 additions & 4 deletions
@@ -1,8 +1,8 @@
 ---
 title: Data error codes - Azure Stream Analytics
 description: Troubleshoot Azure Stream Analytics issues with data error codes, which occur when there's bad data in the stream.
-ms.author: sidram
-author: sidramadoss
+author: ahartoon
+ms.author: anboisve
 ms.topic: troubleshooting
 ms.date: 05/25/2022
 ms.service: stream-analytics
@@ -19,7 +19,7 @@ You can use activity logs and resource logs to help debug unexpected behaviors f
 
 ## InputEventTimestampNotFound
 
-* **Cause**: Stream Analytics is unable to get a timestamp for a resource.
+* **Cause**: Stream Analytics is unable to get a time stamp for a resource.
 
 ## InputEventTimestampByOverValueNotFound
 
@@ -31,7 +31,7 @@ You can use activity logs and resource logs to help debug unexpected behaviors f
 
 ## InputEventEarlyBeyondThreshold
 
-* **Cause**: An input event arrival time is earlier than the input event application timestamp threshold.
+* **Cause**: An input event arrival time is earlier than the input event application time stamp threshold.
 
 ## AzureFunctionMessageSizeExceeded
 

articles/stream-analytics/data-errors.md

Lines changed: 2 additions & 2 deletions
@@ -1,8 +1,8 @@
 ---
 title: Azure Stream Analytics resource log data errors
 description: This article explains the different input and output data errors that can occur when using Azure Stream Analytics.
-author: sidramadoss
-ms.author: sidram
+author: ahartoon
+ms.author: anboisve
 ms.service: stream-analytics
 ms.topic: troubleshooting
 ms.date: 08/07/2020

articles/stream-analytics/data-protection.md

Lines changed: 2 additions & 2 deletions
@@ -1,8 +1,8 @@
 ---
 title: Data protection in Azure Stream Analytics
 description: This article explains how to encrypt your private data used by an Azure Stream Analytics job.
-author: sidramadoss
-ms.author: sidram
+author: ahartoon
+ms.author: anboisve
 ms.service: stream-analytics
 ms.topic: how-to
 ms.date: 05/20/2022
