Commit 509cf7f

Merge pull request #265270 from Saglodha/saglodha-LMS
Adding content for Upcoming feature - Large Message Support
2 parents 3b4acb8 + 4828b3f commit 509cf7f

File tree

5 files changed (+60, -0 lines changed)

articles/event-hubs/TOC.yml

Lines changed: 2 additions & 0 deletions
@@ -58,6 +58,8 @@
- name: Send events using Data Generator
  href: send-and-receive-events-using-data-generator.md
  displayName: No Code, Sample data
- name: Stream large messages
  href: event-hubs-quickstart-stream-large-messages.md
- name: Capture events
  items:
  - name: Use the Azure portal to enable Event Hubs Capture

articles/event-hubs/event-hubs-about.md

Lines changed: 4 additions & 0 deletions
@@ -70,6 +70,10 @@ You can experience flexible and cost-efficient event streaming through Event Hub
### Scalable
With Event Hubs, you can start with data streams in megabytes, and grow to gigabytes or terabytes. The [Auto inflate](event-hubs-auto-inflate.md) feature is one of the many options available to scale the number of throughput units or processing units to meet your usage needs.

### Supports streaming large messages
In most streaming scenarios, data is lightweight, typically less than 1 MB, and flows at high throughput. However, there are cases where messages can't be divided into smaller segments. Azure Event Hubs can accommodate events up to 20 MB with self-serve scalable [dedicated clusters](event-hubs-dedicated-overview.md) at no extra charge. This capability allows Event Hubs to handle a wide range of message sizes and helps ensure uninterrupted business operations. For more information, see [Stream large messages](event-hubs-quickstart-stream-large-messages.md).
### Capture streaming data for long term retention and batch analytics
Capture your data in near-real time in an Azure Blob storage or Azure Data Lake Storage for long-term retention or micro-batch processing. You can achieve this behavior on the same stream you use for deriving real-time analytics. Setting up capture of event data is fast.

articles/event-hubs/event-hubs-dedicated-overview.md

Lines changed: 3 additions & 0 deletions
@@ -34,6 +34,9 @@ The Dedicated cluster offers self-serve scaling capabilities that allow you to a
### High-end features and generous quotas
Dedicated clusters include all features of the Premium tier and more. The service also manages load balancing, operating system updates, security patches, and partitioning. So, you can spend less time on infrastructure maintenance and more time on building your event streaming applications.

### Supports streaming large messages
In most streaming scenarios, data is lightweight, typically less than 1 MB, and requires high throughput. However, there are cases where messages can't be divided into smaller segments. Self-serve dedicated clusters can accommodate events up to 20 MB in size at no additional cost. This capability allows Event Hubs to handle a wide range of message sizes and helps ensure uninterrupted business operations. For more information, see [Stream large messages](event-hubs-quickstart-stream-large-messages.md).
## Capacity Units(CU)
Dedicated clusters are provisioned and billed by capacity units (CUs), a pre-allocated amount of CPU and memory resources.

articles/event-hubs/event-hubs-quickstart-stream-large-messages.md

Lines changed: 51 additions & 0 deletions
@@ -0,0 +1,51 @@
---
title: Azure Quickstart - Send and receive large messages with Azure Event Hubs (Preview)
description: In this quickstart, you learn how to send and receive large messages with Azure Event Hubs.
ms.topic: quickstart
ms.author: Saglodha
ms.date: 5/6/2024
---

# Quickstart: Send and receive large messages with Azure Event Hubs (Preview)

In this quickstart, you learn how to send and receive large messages (up to 20 MB) using Azure Event Hubs. If you're new to Azure Event Hubs, see the [Event Hubs overview](event-hubs-about.md) before you go through this quickstart.
## Prerequisites

To complete this quickstart, you need the following prerequisites:
- A Microsoft Azure subscription. To use Azure services, including Azure Event Hubs, you need a subscription. If you don't have an existing Azure account, you can sign up for a [free trial](https://azure.microsoft.com/free/?WT.mc_id=A261C142F) or use your MSDN subscriber benefits when you [create an account](https://azure.microsoft.com/).
- A [self-serve scalable dedicated cluster](event-hubs-dedicated-cluster-create-portal.md), an Event Hubs namespace, and an event hub. The first step is to use the Azure portal to create a dedicated cluster and a namespace inside that cluster. To create an event hub, see [Quickstart: Create an event hub using the Azure portal](event-hubs-create.md). You can skip this step if you already have a self-serve scalable dedicated cluster.
> [!NOTE]
> Large Message Support, currently in Public Preview, is exclusively available with certain Event Hubs self-serve dedicated clusters. Streaming large messages with these clusters incurs no extra charges.
## Configure the Event Hubs dedicated cluster

To stream large messages, you must first configure your self-serve scalable dedicated cluster. Follow these steps:
- In the Azure portal, navigate to the **Settings** section of your dedicated cluster and select the **Quota** tab.
:::image type="content" source="./media/event-hubs-quickstart-stream-large-messages/large-message-configuration-for-dedicated-cluster.png" alt-text="Screenshot showing the Quota blade for Dedicated Cluster.":::
- Validate that the value of the read-only key **supportslargemessages** is set to true.
- Update the key **eventhubmaxmessagesizeinbytes** to a suitable value in bytes. The acceptable range for this value is 1048576 to 20971520 bytes, as shown in the sketch after this list.
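For reference, the acceptable range corresponds to 1 MB through 20 MB expressed in bytes. The following is a minimal, illustrative sketch of that conversion (it isn't part of the configuration itself; the quota key is set in the portal as described above):

```python
# Illustrative helper: map a size in MB to a candidate value for the
# eventhubmaxmessagesizeinbytes quota key, validating the documented range.
MIN_BYTES = 1 * 1024 * 1024    # 1048576  (1 MB, lower bound)
MAX_BYTES = 20 * 1024 * 1024   # 20971520 (20 MB, upper bound)

def max_message_size_bytes(megabytes: int) -> int:
    value = megabytes * 1024 * 1024
    if not MIN_BYTES <= value <= MAX_BYTES:
        raise ValueError(f"value must be between {MIN_BYTES} and {MAX_BYTES} bytes")
    return value

print(max_message_size_bytes(20))  # 20971520
```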
Once the configuration is saved, you're all set to stream large messages with Event Hubs.
> [!IMPORTANT]
> Large message streaming is only supported with self-serve scalable dedicated clusters built on the latest infrastructure. This capability is reflected by the **supportslargemessages** key.
> If its value is false, the cluster doesn't support large message streaming. To enable this feature, you must recreate the cluster.
## Stream large messages with Azure Event Hubs
Azure Event Hubs allows streaming of large messages up to 20 MB, both in batches and as individual events. Streaming large messages requires no client code changes apart from the change in the message or event itself. You can continue to send and receive messages using any existing Event Hubs SDK or Kafka API, in the same way you would for messages smaller than 1 MB.
To learn more, see the [.NET send and receive quickstart](event-hubs-dotnet-standard-getstarted-send.md).
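As an illustration, here's a minimal sketch using the `azure-eventhub` Python package. It assumes an event hub in a namespace that's inside a cluster already configured for large messages; the connection string and event hub name are placeholders you supply through environment variables:

```python
import os
from azure.eventhub import EventHubProducerClient, EventData

# Placeholders: supply your own namespace connection string and event hub name.
connection_str = os.environ["EVENT_HUB_CONNECTION_STR"]
eventhub_name = os.environ["EVENT_HUB_NAME"]

producer = EventHubProducerClient.from_connection_string(
    conn_str=connection_str, eventhub_name=eventhub_name
)

# A ~5 MB payload. On a cluster configured for large messages, the maximum
# event size negotiated with the service is larger than the usual 1 MB.
large_payload = b"x" * (5 * 1024 * 1024)

with producer:
    batch = producer.create_batch()
    batch.add(EventData(large_payload))  # raises ValueError if the event exceeds the allowed size
    producer.send_batch(batch)
```

Receiving works the same way as with smaller events; no special handling is required on the consumer side.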
> [!TIP]
> Make sure to review any Event Hubs AMQP client or Kafka client configuration that could be limiting the maximum message size that you stream into Event Hubs. Update the client timeout to a higher value to be able to stream large messages. By default, the AMQP client prefetch count is 300; lower this value to avoid client-side memory issues when dealing with large messages.
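For example, a Kafka producer's default client-side cap on request size is typically 1 MB and needs to be raised. The following is a hedged sketch using the `kafka-python` package against the Event Hubs Kafka endpoint; the namespace, event hub name, and connection string are placeholders:

```python
from kafka import KafkaProducer  # pip install kafka-python

# Placeholders: replace the namespace, connection string, and event hub name.
producer = KafkaProducer(
    bootstrap_servers="<namespace>.servicebus.windows.net:9093",
    security_protocol="SASL_SSL",
    sasl_mechanism="PLAIN",
    sasl_plain_username="$ConnectionString",
    sasl_plain_password="<event hubs namespace connection string>",
    max_request_size=20 * 1024 * 1024,  # raise the client-side 1 MB default
    request_timeout_ms=60000,           # allow more time for large payloads
)

producer.send("<event hub name>", b"x" * (5 * 1024 * 1024))
producer.flush()
```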
For the complete .NET library reference, see the [SDK documentation](/dotnet/api/overview/azure/event-hubs).