| [AWS Simple Notification Service](https://aws.amazon.com/sns/) | Automation integration: [AWS Simple Notification Service](/docs/platform-services/automation-service/app-central/integrations/aws-simple-notification-service/) |
|
| [AWS WAF](https://aws.amazon.com/waf/) | Apps:
| [Axonius](https://www.axonius.com/) | Automation integration: [Axonius](/docs/platform-services/automation-service/app-central/integrations/axonius/) |
-|
| [Azure](https://azure.microsoft.com/en-us) | Apps:
2. **Create a consumer group**.
1. Go to your **Event Hub**.
 2. Select **Consumer groups** in the left panel.
@@ -51,13 +51,13 @@ After choosing one of the above two strategies, you will now have an event hub n
Creating **Consumer Groups** is needed only for customers using the older event hub namespace; see the [Existing event hub namespace](#strategy-a-existing-event-hub-namespaces) section in step 1. The default consumer group is already in use by the Azure function, so we need to create a new one.
:::
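
If you prefer the command line, the same consumer group can be created with the Azure CLI. This is a sketch; the resource group, namespace, and event hub names below are placeholders for your own values:

```shell
# Create a dedicated consumer group for the new source,
# leaving the $Default consumer group free for the existing Azure function.
az eventhubs eventhub consumer-group create \
  --resource-group <your-resource-group> \
  --namespace-name <your-eventhub-namespace> \
  --eventhub-name <your-eventhub-name> \
  --name <your-consumer-group-name>
```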
-
+
+After completing the above steps, you will have **Azure Event Hubs Namespace**, **Event Hubs Instance Name**, **Shared Access Policy**, and **Consumer Group Name**. All four parameters are required for creating an event hub source.
-## Step 3. Create event hub cloud-to-cloud sources
+## Step 3. Create event hub sources
-For each of the event hubs present in your namespace, you need to create a cloud-to-cloud source. For more information, see [Creating Azure Event Hub Source](/docs/send-data/hosted-collectors/cloud-to-cloud-integration-framework/azure-event-hubs-source/#vendor-configuration) section.
+For each of the event hubs present in your namespace, you need to create an Azure Event Hubs source. For more information, refer to [Creating Azure Event Hub Source](/docs/send-data/collect-from-other-data-sources/azure-monitoring/ms-azure-event-hubs-source).
:::note
We recommend using the same source category so that your custom dashboards or apps require no changes. You can verify whether the data comes from your source using the `_source` metadata field.
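For example, a log search along these lines (the source category below is a placeholder for your own value) shows which sources are contributing events:

```sql
_sourceCategory=<your-source-category>
| count by _source
```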
@@ -71,13 +71,13 @@ After verifying that all the log types are ingesting in your new source, follow
If your resource group contains only resources created by the older ARM template, as shown below, and you have created a new namespace in a different resource group, see [Creating new event hub namespace](#strategy-b-creating-new-event-hub-namespaces) section in step 1.
- 
+
1. **Stopping the data flow in the older Sumo Logic source**. To stop the logs export to the older event hub namespace, delete the older diagnostic settings. You can delete them by following the steps below for each of your Azure services that are sending logs to Sumo Logic.
1. Go to **Azure Portal**.
 2. Search for **Diagnostic Settings** in the **Search bar**. This takes you to a page with all the resources that have diagnostic settings.
- 3. Select your **subscription**, **resource group** (for the azure service whose logs you are ingesting into sumo), and whose diagnostics status is enabled.
+ 4. Select the resource name (whose logs you are ingesting into Sumo Logic); it shows a list of diagnostic settings.
 5. Select the setting whose event hub column matches your older event hub namespace. Go to **Edit settings** corresponding to that setting and delete it.
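
   The same diagnostic settings can also be inspected and removed with the Azure CLI. This is a sketch; the resource ID and setting name below are placeholders for your own values:

   ```shell
   # List diagnostic settings on a resource to find the one
   # pointing at the older event hub namespace.
   az monitor diagnostic-settings list --resource <resource-id>

   # Delete the setting that exports to the older namespace.
   az monitor diagnostic-settings delete \
     --resource <resource-id> \
     --name <diagnostic-setting-name>
   ```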
2. **Wait for all data to be ingested into Sumo Logic**. The Azure function is draining all the logs from the older event hub namespace and sending them to Sumo Logic; we will need to wait until it finishes. You can run a query in Sumo Logic with your older source name `(_source=
:::note
If you see more resources than the ones shown in the above screenshot, you can delete all six individual resources (the ones with the prefix sumo) one by one by selecting each resource and clicking the **Delete** button in the top bar.
@@ -103,7 +103,7 @@ If your resource group contains only resources created by the older ARM template
1. **Stopping the data flow in the older Sumo Logic source**. The newer source will start collecting data from the point you created the source. You can verify that by running `(_source=
2. **Verify the new source is ingesting logs without any delay**. You can run the query below to verify the latency.
```sql
_source=
:::note
Before deleting resources, make sure your new source is working without any latency.
@@ -127,7 +127,7 @@ If your resource group contains only resources created by the older ARM template
## FAQ
-#### After migrating to Cloud-to-Cloud, will the acquired data volume increase as compared to when configured with the previous ARM Template?
+#### After migrating to Azure Event Hubs source, will the acquired data volume increase as compared to when configured with the previous ARM Template?
There won't be any change in data volume. These are the same logs; only the collection method is changing.
diff --git a/docs/send-data/collect-from-other-data-sources/azure-monitoring/ms-azure-event-hubs-source.md b/docs/send-data/collect-from-other-data-sources/azure-monitoring/ms-azure-event-hubs-source.md
index f27a2bbdd8..353396ef6d 100644
--- a/docs/send-data/collect-from-other-data-sources/azure-monitoring/ms-azure-event-hubs-source.md
+++ b/docs/send-data/collect-from-other-data-sources/azure-monitoring/ms-azure-event-hubs-source.md
@@ -8,8 +8,7 @@ import useBaseUrl from '@docusaurus/useBaseUrl';
import ForwardToSiem from '/docs/reuse/forward-to-siem.md';
:::note
-- For higher data ingestion speed and scalability, this collection method is preferred over our similar [Azure Event Hubs cloud-to-cloud source collection method](/docs/send-data/hosted-collectors/cloud-to-cloud-integration-framework/azure-event-hubs-source).
-- Azure Event Hubs for Logs does not support IP restrictions. We recommend using the [Azure Event Hubs cloud-to-cloud source collection method](/docs/send-data/hosted-collectors/cloud-to-cloud-integration-framework/azure-event-hubs-source) if you require higher throughput and have IP address restrictions on Event Hubs.
+Azure Event Hubs for Logs is preferred for higher throughput but does not support IP restrictions. We recommend using the [Azure Event Hubs cloud-to-cloud source collection method](/docs/send-data/hosted-collectors/cloud-to-cloud-integration-framework/azure-event-hubs-source) if you have IP address restrictions on Event Hubs. If you require higher throughput and have IP address restrictions on Event Hubs, consider splitting your Event Hubs into smaller namespaces, each staying within the 1MB/s (86GB/day) limit, and creating a Cloud-to-Cloud source for each namespace.
:::
The Azure Event Hubs Source provides a secure endpoint to receive data from Azure Event Hubs. It securely stores the required authentication, scheduling, and state tracking information.
diff --git a/docs/send-data/hosted-collectors/cloud-to-cloud-integration-framework/azure-event-hubs-source.md b/docs/send-data/hosted-collectors/cloud-to-cloud-integration-framework/azure-event-hubs-source.md
index 8bddb1d833..e5e7674f84 100644
--- a/docs/send-data/hosted-collectors/cloud-to-cloud-integration-framework/azure-event-hubs-source.md
+++ b/docs/send-data/hosted-collectors/cloud-to-cloud-integration-framework/azure-event-hubs-source.md
@@ -15,7 +15,7 @@ import ForwardToSiem from '/docs/reuse/forward-to-siem.md';
import useBaseUrl from '@docusaurus/useBaseUrl';
:::note
-- Collecting data from Azure Event Hubs using this Cloud-to-Cloud collection method supports a throughput limit of 1MB/s (86GB/day) per named Event Hub egress rate. If you require higher throughput, we recommend using the [Azure Event Hubs Source for Logs](/docs/send-data/collect-from-other-data-sources/azure-monitoring/ms-azure-event-hubs-source).
+- Collecting data from Azure Event Hubs using this Cloud-to-Cloud collection method supports a throughput limit of 1MB/s (86GB/day) per named Event Hub egress rate. If you require higher throughput, we recommend using the [Azure Event Hubs Source for Logs](/docs/send-data/collect-from-other-data-sources/azure-monitoring/ms-azure-event-hubs-source).
- The only caveat is this Cloud-to-Cloud collection method supports IP restrictions and the [Azure Event Hubs Source for Logs](/docs/send-data/collect-from-other-data-sources/azure-monitoring/ms-azure-event-hubs-source/) does not. If you require higher throughput and have IP address restrictions on Event Hubs, consider splitting your Event Hubs into smaller namespaces, each staying within the 1MB/s (86GB/day) limit, and create a Cloud-to-Cloud collection method for each namespace.
:::
@@ -24,7 +24,7 @@ import useBaseUrl from '@docusaurus/useBaseUrl';
This cloud-to-cloud Azure Event Hubs Source provides a secure endpoint to receive data from Azure Event Hubs. It securely stores the required authentication, scheduling, and state tracking information.
:::tip Migrating to C2C
-See [Migrating from Azure Function-Based Collection to Event Hub Cloud-to-Cloud Source](/docs/send-data/hosted-collectors/cloud-to-cloud-integration-framework/azure-event-hubs-cloud-to-cloud-source-migration).
+See [Migrating from ARM based Azure Monitor Logs Collection](/docs/send-data/collect-from-other-data-sources/azure-monitoring/azure-event-hubs-source-migration).
:::
## Data collected
diff --git a/docs/send-data/hosted-collectors/cloud-to-cloud-integration-framework/index.md b/docs/send-data/hosted-collectors/cloud-to-cloud-integration-framework/index.md
index 8fe46164fd..5086ae8e7f 100644
--- a/docs/send-data/hosted-collectors/cloud-to-cloud-integration-framework/index.md
+++ b/docs/send-data/hosted-collectors/cloud-to-cloud-integration-framework/index.md
@@ -121,12 +121,6 @@ In this section, we'll introduce the following concepts:
Provides a secure endpoint to receive data from Azure Event Hubs.
-This source is available in all deployments, including FedRAMP.
-