
Commit 273ee7f

edit pass: event-hubs-articles-batch1
1 parent 5873b52 commit 273ee7f

File tree: 5 files changed, +209 -177 lines changed

Lines changed: 70 additions & 56 deletions

---
title: 'Azure Event Hubs: Data streaming platform with Kafka support'
description: Learn about Azure Event Hubs, a real-time data streaming platform with native Apache Kafka support.
ms.topic: overview
ms.date: 01/24/2024
---

# Azure Event Hubs: A real-time data streaming platform with native Apache Kafka support

Azure Event Hubs is a native data-streaming service in the cloud that can stream millions of events per second, with low latency, from any source to any destination. Event Hubs is compatible with Apache Kafka. It enables you to run existing Kafka workloads without any code changes.

Businesses can use Event Hubs to ingest and store streaming data. By using streaming data, businesses can gain valuable insights, drive real-time analytics, and respond to events as they happen. They can use this data to enhance their overall efficiency and customer experience.

:::image type="content" source="./media/event-hubs-about/event-streaming-platform.png" alt-text="Diagram that shows how Azure Event Hubs fits in an event streaming platform." lightbox="./media/event-hubs-about/event-streaming-platform.png":::

Event Hubs is the preferred event ingestion layer of any event streaming solution that you build on top of Azure. It integrates with data and analytics services inside and outside Azure to build a complete data streaming pipeline that serves the following use cases:

- [Process real-time analytics with Azure Stream Analytics](./process-data-azure-stream-analytics.md) to generate real-time insights from streaming data.
- [Analyze and explore streaming data with Azure Data Explorer](/azure/data-explorer/ingest-data-event-hub-overview).
- Create your own cloud-native applications, functions, or microservices that run on streaming data from Event Hubs.
- [Stream events with schema validation by using a built-in schema registry to ensure quality and compatibility of streaming data](schema-registry-overview.md).

## Key capabilities

Learn about the key capabilities of Azure Event Hubs in the following sections.

### Apache Kafka on Azure Event Hubs

Event Hubs is a multi-protocol event streaming engine that natively supports the Advanced Message Queuing Protocol (AMQP), Apache Kafka, and HTTPS protocols. Because it supports Apache Kafka, you can bring Kafka workloads to Event Hubs without making any code changes. You don't need to set up, configure, or manage your own Kafka clusters or use a Kafka-as-a-service offering that's not native to Azure.

Event Hubs is built as a cloud-native broker engine. For this reason, you can run Kafka workloads with better performance, better cost efficiency, and no operational overhead.

For more information, see [Azure Event Hubs for Apache Kafka](azure-event-hubs-kafka-overview.md).
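
Connecting an existing Kafka client to Event Hubs mostly comes down to client configuration. The following sketch builds confluent-kafka-style settings for the Event Hubs Kafka endpoint; the namespace name and connection string are placeholders, not real credentials.

```python
# Sketch: Kafka client settings (confluent-kafka style keys) that point an
# existing Kafka application at an Event Hubs namespace. The namespace name
# and connection string below are illustrative placeholders.
def kafka_config_for_event_hubs(namespace: str, connection_string: str) -> dict:
    """Return Kafka client settings for the Event Hubs Kafka endpoint."""
    return {
        # Event Hubs exposes its Kafka endpoint at <namespace>.servicebus.windows.net:9093.
        "bootstrap.servers": f"{namespace}.servicebus.windows.net:9093",
        # Kafka clients authenticate over TLS with SASL PLAIN: the username is the
        # literal string "$ConnectionString"; the password is the connection string.
        "security.protocol": "SASL_SSL",
        "sasl.mechanism": "PLAIN",
        "sasl.username": "$ConnectionString",
        "sasl.password": connection_string,
    }

config = kafka_config_for_event_hubs("contoso", "Endpoint=sb://contoso.servicebus.windows.net/;...")
print(config["bootstrap.servers"])  # contoso.servicebus.windows.net:9093
```

Because only the configuration changes, the producer and consumer code paths of the Kafka application stay exactly as they are.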

### Schema Registry in Azure Event Hubs

Azure Schema Registry in Event Hubs provides a centralized repository for managing schemas of event streaming applications. Azure Schema Registry comes free with every Event Hubs namespace. It integrates with your Kafka applications or Event Hubs SDK-based applications.

:::image type="content" source="./media/event-hubs-about/schema-registry.png" alt-text="Diagram that shows Schema Registry and Azure Event Hubs integration.":::

Schema Registry ensures data compatibility and consistency across event producers and consumers. It enables schema evolution, validation, and governance, and it promotes efficient data exchange and interoperability.

Schema Registry integrates with your existing Kafka applications and supports multiple schema formats, including Avro and JSON schemas.

For more information, see [Azure Schema Registry in Event Hubs](schema-registry-overview.md).
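
The validation that a schema registry enables can be pictured as a producer-side check that rejects events that don't match an agreed shape. The following is an illustrative plain-Python sketch only, not the Schema Registry API; the order schema and field names are hypothetical.

```python
# Illustrative sketch: a producer-side schema check, standing in for the
# validation that a schema registry enables. Schema and fields are hypothetical.
ORDER_SCHEMA = {
    "order_id": str,
    "amount": float,
    "currency": str,
}

def validate_event(event: dict, schema: dict) -> list:
    """Return a list of problems; an empty list means the event matches the schema."""
    problems = []
    for field, expected_type in schema.items():
        if field not in event:
            problems.append(f"missing field: {field}")
        elif not isinstance(event[field], expected_type):
            problems.append(f"wrong type for {field}: {type(event[field]).__name__}")
    return problems

good = {"order_id": "A-100", "amount": 12.5, "currency": "USD"}
bad = {"order_id": "A-101", "amount": "12.5"}

print(validate_event(good, ORDER_SCHEMA))  # []
print(validate_event(bad, ORDER_SCHEMA))   # ['wrong type for amount: str', 'missing field: currency']
```

Running the same check on both producers and consumers is what keeps the two sides compatible as schemas evolve.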

### Real-time processing of streaming events with Stream Analytics

Event Hubs integrates with Azure Stream Analytics to enable real-time stream processing. With the built-in no-code editor, you can develop a Stream Analytics job by using drag-and-drop functionality, without writing any code.

:::image type="content" source="./media/event-hubs-about/process-data.png" alt-text="Screenshot that shows the Process data page with the Enable real-time insights from events tile." lightbox="./media/event-hubs-about/process-data.png":::

Alternatively, developers can use the SQL-based Stream Analytics query language to perform real-time stream processing and take advantage of a wide range of functions for analyzing streaming data.

For more information, see the articles in [the Azure Stream Analytics integration section](../stream-analytics/no-code-build-power-bi-dashboard.md) of the table of contents.
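
A common stream-processing pattern behind such queries is the tumbling window: counting or aggregating events per fixed, non-overlapping interval. The following Python sketch illustrates the idea only; the event timestamps and the 10-second window size are made up for the example.

```python
from collections import defaultdict

# Sketch of the tumbling-window aggregation that a stream-processing job might
# perform (conceptually, a count grouped by a fixed window). Timestamps are in
# seconds and are illustrative.
def tumbling_window_counts(events, window_seconds):
    """Count events per non-overlapping window; events are (timestamp, payload) pairs."""
    counts = defaultdict(int)
    for timestamp, _payload in events:
        # Each event belongs to exactly one window, keyed by the window's start time.
        window_start = (timestamp // window_seconds) * window_seconds
        counts[window_start] += 1
    return dict(counts)

events = [(1, "a"), (4, "b"), (12, "c"), (13, "d"), (27, "e")]
print(tumbling_window_counts(events, 10))  # {0: 2, 10: 2, 20: 1}
```

In a real Stream Analytics job, this grouping is expressed declaratively in the query language rather than written by hand.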

### Explore streaming data with Azure Data Explorer

Azure Data Explorer is a fully managed platform for big data analytics that delivers high performance and allows for the analysis of large volumes of data in near real time. By integrating Event Hubs with Azure Data Explorer, you can perform near real-time analytics and exploration of streaming data.

:::image type="content" source="./media/event-hubs-about/data-explorer-integration.png" alt-text="Diagram that shows Azure Data Explorer query and output.":::

For more information, see [Ingest data from an event hub into Azure Data Explorer](/azure/data-explorer/ingest-data-event-hub-overview).

### Azure Functions, SDKs, and the Kafka ecosystem

With Event Hubs, you can ingest, buffer, store, and process your stream in real time to get actionable insights. Event Hubs uses a partitioned consumer model. It enables multiple applications to process the stream concurrently and lets you control the speed of processing. Event Hubs also integrates with Azure Functions for serverless architectures.

A broad ecosystem is available for the industry-standard AMQP 1.0 protocol. SDKs are available in languages like .NET, Java, Python, and JavaScript, so you can start processing your streams from Event Hubs. All supported client languages provide low-level integration.

The ecosystem also allows you to integrate with Azure Functions, Azure Spring Apps, Kafka connectors, and other data analytics platforms and technologies, such as Apache Spark and Apache Flink.
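
The partitioned consumer model mentioned above rests on a simple idea: events that share a partition key always land in the same partition, so independent consumers can each own a subset of partitions. The sketch below illustrates the concept; the hashing scheme, four-partition count, and device keys are illustrative, not the service's actual algorithm.

```python
import hashlib

# Sketch of the partitioned consumer model: events with the same key map to the
# same partition (like a lane in a freeway), so consumers can process partitions
# concurrently. The hash scheme and partition count are illustrative.
def partition_for_key(key: str, partition_count: int) -> int:
    """Map a partition key to a stable partition index."""
    digest = hashlib.sha256(key.encode("utf-8")).digest()
    return int.from_bytes(digest[:4], "big") % partition_count

keys = ["device-1", "device-2", "device-1", "device-3"]
assignments = [partition_for_key(k, 4) for k in keys]

# The same key always maps to the same partition, preserving per-key ordering:
print(assignments[0] == assignments[2])  # True
```

This per-key stickiness is what lets each partition preserve event order while the stream as a whole is processed in parallel.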

### Flexible and cost-efficient event streaming

You can experience flexible and cost-efficient event streaming through the Standard, Premium, or Dedicated tiers for Event Hubs. These options cater to data streaming needs that range from a few MB/s to several GB/s. You can choose the tier that's appropriate for your requirements.

### Scalable

With Event Hubs, you can start with data streams in megabytes and grow to gigabytes or terabytes. The [auto-inflate](event-hubs-auto-inflate.md) feature is one of the options available to scale the number of throughput units or processing units to meet your usage needs.
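
The idea behind auto-inflate can be sketched as a periodic check: when observed ingress exceeds what the current throughput units can carry, the unit count grows toward a configured maximum (auto-inflate scales up but doesn't scale back down automatically). The 1-MB/s-per-unit figure and the traffic samples below are illustrative numbers, not the service's exact quotas.

```python
# Sketch of auto-inflate's scaling behavior: grow throughput units toward a
# configured maximum when ingress exceeds capacity; never scale down.
# The capacity-per-unit figure and ingress samples are illustrative.
def inflate(current_units: int, max_units: int, ingress_mb_per_sec: float,
            mb_per_unit: float = 1.0) -> int:
    """Return the throughput-unit count after one scaling check."""
    needed = -(-ingress_mb_per_sec // mb_per_unit)  # ceiling division
    return min(max_units, max(current_units, int(needed)))

units = 1
for ingress in [0.5, 1.8, 3.2, 2.0]:  # sampled ingress in MB/s
    units = inflate(units, max_units=10, ingress_mb_per_sec=ingress)
print(units)  # 4
```

Capping at `max_units` is what keeps the scale-up bounded by the limit you configure on the namespace.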

### Supports streaming large messages

In most streaming scenarios, data is characterized by being lightweight, typically less than 1 MB, and having a high throughput. There are also instances where messages can't be divided into smaller segments. Event Hubs can accommodate events of up to 20 MB with self-serve scalable [dedicated clusters](event-hubs-dedicated-overview.md) at no extra charge. This capability allows Event Hubs to handle a wide range of message sizes to ensure uninterrupted business operations. For more information, see [Send and receive large messages with Azure Event Hubs](event-hubs-quickstart-stream-large-messages.md).

### Capture streaming data for long-term retention and batch analytics

Capture your data in near real time in Azure Blob Storage or Azure Data Lake Storage for long-term retention or micro-batch processing. You can achieve this behavior on the same stream that you use for deriving real-time analytics. Setting up capture of event data is fast.

:::image type="content" source="./media/event-hubs-capture-overview/event-hubs-capture-msi.png" alt-text="Diagram that shows capturing Event Hubs data into Azure Storage or Azure Data Lake Storage by using a managed identity.":::

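The micro-batch pattern behind capture can be pictured as a buffer that accumulates the live stream and flushes it to long-term storage once a size or time window is reached. The sketch below illustrates the pattern only; the `write_blob` callback and the three-event threshold are hypothetical stand-ins for Blob Storage or Data Lake Storage writes and the real size/time settings.

```python
# Illustrative sketch of the capture pattern: buffer the live stream and flush
# to long-term storage in micro-batches. write_blob and the threshold are
# hypothetical stand-ins for real storage writes and capture settings.
class CaptureBuffer:
    def __init__(self, max_events: int, write_blob):
        self.max_events = max_events
        self.write_blob = write_blob  # callback that persists one batch
        self.pending = []

    def add(self, event):
        self.pending.append(event)
        if len(self.pending) >= self.max_events:
            self.flush()

    def flush(self):
        if self.pending:
            self.write_blob(list(self.pending))  # one capture file per batch
            self.pending.clear()

batches = []
buffer = CaptureBuffer(max_events=3, write_blob=batches.append)
for i in range(7):
    buffer.add({"seq": i})
buffer.flush()  # persist the final partial batch
print([len(b) for b in batches])  # [3, 3, 1]
```

Because the buffer drains the same stream that real-time consumers read, batch analytics and real-time analytics run off one ingestion path.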

## How it works

Event Hubs provides a unified event streaming platform with a time-retention buffer, decoupling event producers from event consumers. The producer and consumer applications can perform large-scale data ingestion through multiple protocols.

The following diagram shows the main components of Event Hubs architecture.

:::image type="content" source="./media/event-hubs-about/components.png" alt-text="Diagram that shows the main components of Event Hubs.":::

The key functional components of Event Hubs include:

- **Producer applications**: Can ingest data to an event hub by using Event Hubs SDKs or any Kafka producer client.
- **Namespace**: The management container for one or more event hubs or Kafka topics. Management tasks such as allocating streaming capacity, configuring network security, and enabling geo-disaster recovery are handled at the namespace level.
- **Event hub/Kafka topic**: In Event Hubs, you can organize events into an event hub or a Kafka topic. It's an append-only distributed log, which can comprise one or more partitions.
- **Partitions**: Used to scale an event hub. They're like lanes in a freeway. If you need more streaming throughput, you can add more partitions.
- **Consumer applications**: Can consume data by seeking through the event log and maintaining consumer offsets. Consumers can be Kafka consumer clients or Event Hubs SDK clients.
- **Consumer group**: A logical group of consumer instances that reads data from an event hub or Kafka topic. It enables multiple consumers to read the same streaming data in an event hub independently, at their own pace and with their own offsets.
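
The consumer-group idea in the list above can be sketched in a few lines: each group keeps its own offset into the same append-only log, so a slow group never holds back a fast one. The group names below are illustrative.

```python
# Sketch of consumer groups reading one append-only log independently:
# each group advances only its own offset. Group names are illustrative.
log = [f"event-{i}" for i in range(5)]   # one event hub partition as an append-only log
offsets = {"analytics": 0, "archiver": 0}

def read(group: str, count: int):
    """Read up to `count` events for a group and advance only that group's offset."""
    start = offsets[group]
    batch = log[start:start + count]
    offsets[group] = start + len(batch)
    return batch

read("analytics", 5)          # the fast group drains the log
first = read("archiver", 2)   # the slow group is unaffected by the fast one
print(first, offsets)         # ['event-0', 'event-1'] {'analytics': 5, 'archiver': 2}
```

Because offsets live per group rather than per log, adding a new consumer group never disturbs existing readers.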

## Related content

To get started using Event Hubs, see the following quickstarts.

### Stream data by using Event Hubs SDK (AMQP)

You can use any of the following samples to stream data to Event Hubs by using SDKs.

- [.NET Core](event-hubs-dotnet-standard-getstarted-send.md)
- [Java](event-hubs-java-get-started-send.md)
- [Spring](/azure/developer/java/spring-framework/configure-spring-cloud-stream-binder-java-app-azure-event-hub?toc=/azure/event-hubs/TOC.json)
- [C](event-hubs-c-getstarted-send.md) (send only)
- [Apache Storm](event-hubs-storm-getstarted-receive.md) (receive only)
112124

113-
### Stream data using Apache Kafka
114-
You can use following samples to stream data from your Kafka applications to Event Hubs.
115-
- [Using Event Hubs with Kafka applications](event-hubs-java-get-started-send.md)
125+
### Stream data by using Apache Kafka
126+
127+
You can use the following samples to stream data from your Kafka applications to Event Hubs.
128+
129+
- [Use Event Hubs with Kafka applications](event-hubs-java-get-started-send.md)
116130

117-
### Schema validation with Schema Registry
118-
You can use Event Hubs Schema Registry to perform schema validation for your event streaming applications.
131+
### Schema validation with Schema Registry
119132

120-
- [Schema validation for Kafka applications](schema-registry-kafka-java-send-receive-quickstart.md)
133+
You can use Event Hubs Schema Registry to perform schema validation for your event streaming applications.
121134

135+
- [Schema validation for Kafka applications](schema-registry-kafka-java-send-receive-quickstart.md)
