title: 'Azure Event Hubs: Data streaming platform with Kafka support'
description: Learn about Azure Event Hubs, which is a real-time data streaming platform with native Apache Kafka support.
ms.topic: overview
ms.date: 01/24/2024
---

# Azure Event Hubs: A real-time data streaming platform with native Apache Kafka support

Azure Event Hubs is a native data-streaming service in the cloud that can stream millions of events per second, with low latency, from any source to any destination. Event Hubs is compatible with Apache Kafka. It enables you to run existing Kafka workloads without any code changes.

Businesses can use Event Hubs to ingest and store streaming data. By using streaming data, businesses can gain valuable insights, drive real-time analytics, and respond to events as they happen. They can use this data to enhance their overall efficiency and customer experience.

:::image type="content" source="./media/event-hubs-about/event-streaming-platform.png" alt-text="Diagram that shows how Azure Event Hubs fits in an event streaming platform." lightbox="./media/event-hubs-about/event-streaming-platform.png":::

Event Hubs is the preferred event ingestion layer of any event streaming solution that you build on top of Azure. It integrates with data and analytics services inside and outside Azure to build a complete data streaming pipeline to serve the following use cases:

- [Process data from your event hub by using Azure Stream Analytics](./process-data-azure-stream-analytics.md) to generate real-time insights.
- [Analyze and explore streaming data with Azure Data Explorer](/azure/data-explorer/ingest-data-event-hub-overview).
- Create your own cloud native applications, functions, or microservices that run on streaming data from Event Hubs.
- [Stream events with schema validation by using the built-in Azure Schema Registry to ensure quality and compatibility of streaming data](schema-registry-overview.md).

## Key capabilities

Learn about the key capabilities of Azure Event Hubs in the following sections.

### Apache Kafka on Azure Event Hubs

Event Hubs is a multi-protocol event streaming engine that natively supports Advanced Message Queuing Protocol (AMQP), Apache Kafka, and HTTPS protocols. Because it supports Apache Kafka, you can bring Kafka workloads to Event Hubs without making any code changes. You don't need to set up, configure, or manage your own Kafka clusters or use a Kafka-as-a-service offering that's not native to Azure.

Event Hubs is built as a cloud native broker engine. For this reason, you can run Kafka workloads with better performance, better cost efficiency, and no operational overhead.
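
As a sketch of what this compatibility looks like in practice, the following Python snippet (using the `confluent-kafka` package) points an existing Kafka producer at an Event Hubs namespace by changing only its connection settings; the namespace, event hub name, and connection string are placeholders rather than values from this article. Kafka consumers connect the same way, by pointing their bootstrap servers at the namespace's Kafka endpoint.

```python
# Minimal sketch (placeholder values): an existing Kafka producer pointed at
# an Event Hubs namespace. Only the connection settings change.
from confluent_kafka import Producer

producer = Producer({
    # Event Hubs exposes its Kafka endpoint on port 9093 of the namespace.
    "bootstrap.servers": "<NAMESPACE>.servicebus.windows.net:9093",
    "security.protocol": "SASL_SSL",
    "sasl.mechanism": "PLAIN",
    # With SASL PLAIN, the username is the literal string "$ConnectionString"
    # and the password is the namespace connection string.
    "sasl.username": "$ConnectionString",
    "sasl.password": "Endpoint=sb://<NAMESPACE>.servicebus.windows.net/;SharedAccessKeyName=...;SharedAccessKey=...",
})

# The Kafka topic name maps to the event hub name.
producer.produce("my-event-hub", key="device-1", value='{"temperature": 21.5}')
producer.flush()
```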
For more information, see [Azure Event Hubs for Apache Kafka](azure-event-hubs-kafka-overview.md).

### Schema Registry in Event Hubs

Azure Schema Registry in Event Hubs provides a centralized repository for managing schemas of event streaming applications. Schema Registry comes free with every Event Hubs namespace. It integrates with your Kafka applications or Event Hubs SDK-based applications.

:::image type="content" source="./media/event-hubs-about/schema-registry.png" alt-text="Diagram that shows Schema Registry and Event Hubs integration.":::

Schema Registry ensures data compatibility and consistency across event producers and consumers. It enables schema evolution, validation, and governance and promotes efficient data exchange and interoperability.

Schema Registry integrates with your existing Kafka applications and supports multiple schema formats, including Avro and JSON schemas.
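
As an illustration of how schema validation can look on the producer side, the following sketch uses the Schema Registry Avro encoder from the Azure SDK for Python (`azure-schemaregistry-avroencoder`); the namespace, schema group, and schema definition are assumptions for the example, not values from this article.

```python
# Minimal sketch (assumed namespace, schema group, and schema): validate and
# encode an event against a schema in Azure Schema Registry before sending it.
from azure.identity import DefaultAzureCredential
from azure.schemaregistry import SchemaRegistryClient
from azure.schemaregistry.encoder.avroencoder import AvroEncoder

SCHEMA_DEFINITION = """
{
    "type": "record",
    "name": "TemperatureReading",
    "namespace": "contoso.telemetry",
    "fields": [
        {"name": "device_id", "type": "string"},
        {"name": "temperature", "type": "double"}
    ]
}
"""

registry_client = SchemaRegistryClient(
    fully_qualified_namespace="<NAMESPACE>.servicebus.windows.net",
    credential=DefaultAzureCredential(),
)

# auto_register adds the schema to the group if it isn't registered yet.
encoder = AvroEncoder(
    client=registry_client, group_name="<SCHEMA_GROUP>", auto_register=True
)

# The encoded payload carries the schema ID, so consumers that use the same
# registry can look up the schema to decode and validate the event.
message = encoder.encode(
    {"device_id": "device-1", "temperature": 21.5}, schema=SCHEMA_DEFINITION
)
print(message["content_type"])
```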
For more information, see [Azure Schema Registry in Event Hubs](schema-registry-overview.md).

### Real-time processing of streaming events with Stream Analytics

Event Hubs integrates with Azure Stream Analytics to enable real-time stream processing. With the built-in no-code editor, you can develop a Stream Analytics job by using drag-and-drop functionality, without writing any code.

:::image type="content" source="./media/event-hubs-about/process-data.png" alt-text="Screenshot that shows the Process data page with the Enable real-time insights from events tile." lightbox="./media/event-hubs-about/process-data.png":::

Alternatively, developers can use the SQL-based Stream Analytics query language to perform real-time stream processing and take advantage of a wide range of functions for analyzing streaming data.

For more information, see articles in [the Azure Stream Analytics integration section](../stream-analytics/no-code-build-power-bi-dashboard.md) of the table of contents.

### Explore streaming data with Azure Data Explorer

Azure Data Explorer is a fully managed platform for big data analytics that delivers high performance and allows for the analysis of large volumes of data in near real time. By integrating Event Hubs with Azure Data Explorer, you can perform near real-time analytics and exploration of streaming data.

:::image type="content" source="./media/event-hubs-about/data-explorer-integration.png" alt-text="Diagram that shows Azure Data Explorer query and output.":::

For more information, see [Ingest data from an event hub into Azure Data Explorer](/azure/data-explorer/ingest-data-event-hub-overview).

### Azure Functions, SDKs, and the Kafka ecosystem

With Event Hubs, you can ingest, buffer, store, and process your stream in real time to get actionable insights. Event Hubs uses a partitioned consumer model. It enables multiple applications to process the stream concurrently and lets you control the speed of processing. Event Hubs also integrates with Azure Functions for serverless architectures.

A broad ecosystem is available for the industry-standard AMQP 1.0 protocol. SDKs are available for .NET, Java, Python, and JavaScript, so you can start processing your streams from Event Hubs. All supported client languages provide low-level integration.

The ecosystem also allows you to integrate with Azure Functions, Azure Spring Apps, Kafka Connectors, and other data analytics platforms and technologies, such as Apache Spark and Apache Flink.
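
For example, publishing a batch of events with the Event Hubs SDK for Python (`azure-eventhub`) takes only a few lines; the connection string and event hub name here are placeholders.

```python
# Minimal sketch (placeholder connection string and event hub name): publish
# a batch of events with the azure-eventhub package.
from azure.eventhub import EventData, EventHubProducerClient

producer = EventHubProducerClient.from_connection_string(
    conn_str="Endpoint=sb://<NAMESPACE>.servicebus.windows.net/;SharedAccessKeyName=...;SharedAccessKey=...",
    eventhub_name="my-event-hub",
)

with producer:
    batch = producer.create_batch()  # the batch object enforces the size limit
    batch.add(EventData('{"temperature": 21.5}'))
    batch.add(EventData('{"temperature": 22.1}'))
    producer.send_batch(batch)       # one call sends the whole batch
```

Batching is the usual pattern here because it amortizes the cost of a network round trip across many events, and the batch object rejects additions that would exceed the size limit.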
### Flexible and cost-efficient event streaming

You can experience flexible and cost-efficient event streaming through the Standard, Premium, or Dedicated tiers for Event Hubs. These options cater to data streaming needs that range from a few MB/sec to several GB/sec. You can choose the match that's appropriate for your requirements.

### Scalable

With Event Hubs, you can start with data streams in megabytes and grow to gigabytes or terabytes. The [auto-inflate](event-hubs-auto-inflate.md) feature is one of the options available to scale the number of throughput units or processing units to meet your usage needs.
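
As an illustrative sketch only, the following snippet uses the Azure management SDK for Python (`azure-mgmt-eventhub`) to create a Standard namespace with auto-inflate enabled and a throughput-unit ceiling; the subscription, resource group, namespace name, region, and ceiling are all assumptions you'd replace with your own values.

```python
# Illustrative sketch only (assumed resource names and values): create a
# Standard namespace with auto-inflate enabled by using azure-mgmt-eventhub.
from azure.identity import DefaultAzureCredential
from azure.mgmt.eventhub import EventHubManagementClient
from azure.mgmt.eventhub.models import EHNamespace, Sku

client = EventHubManagementClient(DefaultAzureCredential(), "<SUBSCRIPTION_ID>")

namespace = client.namespaces.begin_create_or_update(
    "<RESOURCE_GROUP>",
    "<NAMESPACE>",
    EHNamespace(
        location="eastus",
        sku=Sku(name="Standard", tier="Standard", capacity=1),
        # Auto-inflate scales throughput units up to the specified ceiling.
        is_auto_inflate_enabled=True,
        maximum_throughput_units=10,
    ),
).result()

print(namespace.name, namespace.maximum_throughput_units)
```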
### Supports streaming large messages

In most streaming scenarios, data is lightweight, typically less than 1 MB, and has a high throughput. There are also instances where messages can't be divided into smaller segments. Event Hubs can accommodate events up to 20 MB with self-serve scalable [dedicated clusters](event-hubs-dedicated-overview.md) at no extra charge. This capability allows Event Hubs to handle a wide range of message sizes to ensure uninterrupted business operations. For more information, see [Send and receive large messages with Azure Event Hubs](event-hubs-quickstart-stream-large-messages.md).

### Capture streaming data for long-term retention and batch analytics

Capture your data in near real time in Azure Blob Storage or Azure Data Lake Storage for long-term retention or micro-batch processing. You can achieve this behavior on the same stream that you use for deriving real-time analytics. Setting up capture of event data is fast.

:::image type="content" source="./media/event-hubs-capture-overview/event-hubs-capture-msi.png" alt-text="Diagram that shows capturing Event Hubs data into Azure Storage or Azure Data Lake Storage by using Managed Identity.":::
## How it works

Event Hubs provides a unified event streaming platform with a time-retention buffer, decoupling event producers from event consumers. The producer and consumer applications can perform large-scale data ingestion through multiple protocols.

The following diagram shows the main components of Event Hubs architecture.

:::image type="content" source="./media/event-hubs-about/components.png" alt-text="Diagram that shows the main components of Event Hubs.":::

The key functional components of Event Hubs include:

- **Producer applications**: These applications can ingest data to an event hub by using Event Hubs SDKs or any Kafka producer client.
- **Namespace**: The management container for one or more event hubs or Kafka topics. Management tasks such as allocating streaming capacity, configuring network security, and enabling geo-disaster recovery are handled at the namespace level.
- **Event hub/Kafka topic**: In Event Hubs, you can organize events into an event hub or a Kafka topic. It's an append-only distributed log, which can comprise one or more partitions.
- **Partitions**: They're used to scale an event hub. They're like lanes in a freeway. If you need more streaming throughput, you can add more partitions.
- **Consumer applications**: These applications consume data by seeking through the event log and maintaining the consumer offset. Consumers can be Kafka consumer clients or Event Hubs SDK clients.
- **Consumer group**: This logical group of consumer instances reads data from an event hub or Kafka topic. It enables multiple consumers to read the same streaming data in an event hub independently, at their own pace, and with their own offsets. For a minimal consumer sketch, see the example after this list.
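
The following minimal sketch shows the consumer side with the Event Hubs SDK for Python (`azure-eventhub`), reading through the default consumer group; the connection string and event hub name are placeholders.

```python
# Minimal sketch (placeholder connection string and event hub name): read
# events by using the default consumer group with the azure-eventhub package.
from azure.eventhub import EventHubConsumerClient

consumer = EventHubConsumerClient.from_connection_string(
    conn_str="Endpoint=sb://<NAMESPACE>.servicebus.windows.net/;SharedAccessKeyName=...;SharedAccessKey=...",
    consumer_group="$Default",
    eventhub_name="my-event-hub",
)

def on_event(partition_context, event):
    # The partition context identifies the partition; the offset tracks the
    # consumer's position in that partition's log.
    print(partition_context.partition_id, event.body_as_str())

with consumer:
    # starting_position="-1" reads each partition from the beginning.
    consumer.receive(on_event=on_event, starting_position="-1")
```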
## Related content

To get started using Event Hubs, see the following quickstarts.

### Stream data by using the Event Hubs SDK (AMQP)

You can use any of the following samples to stream data to Event Hubs by using SDKs.