
Commit dbf092d

Integration-Pages (#104)
* added integration for VPC Flow logs
* modifying image dimensions
* Added integration page for Amazon RDS
* resolved duplication
* added Integration page for Cloudfront
* added Integration page for AWS API Gateway
* added Integration page for AWS cloudwatch metrics
* Added integration page for aws route 53
* added integration page for aws network firewall
* added integration page for aws waf

---------

Co-authored-by: simranquirky <simranquirky>
1 parent 01edc29 commit dbf092d

32 files changed: +624 −1 lines

docs/integration/aws/.pages

Lines changed: 9 additions & 1 deletion
@@ -4,6 +4,14 @@ nav:
   - Application Load Balancer(ALB) : alb.md
   - Amazon Virtual Private Cloud : vpc-flow.md
   - Amazon Cognito : cognito.md
+  - AWS Network Firewall : network-firewall.md
   - AWS Cloudwatch logs: cloudwatch-logs.md
+  - AWS Cloudwatch metrics: cloudwatch-metrics.md
+  - Amazon Relational Database Service (RDS) : rds.md
+  - AWS Route 53 : route-53.md
+  - AWS Web Application Firewall (WAF): waf.md
+  - API Gateway logs: api-gateway.md
+  - Amazon CloudFront : cdn.md
   - Amazon EventBridge : eventbridge.md

docs/integration/aws/api-gateway.md

Lines changed: 80 additions & 0 deletions
@@ -0,0 +1,80 @@
---
title: API Gateway Logs Integration Guide
description: Stream AWS API Gateway access logs to OpenObserve using Kinesis Firehose for real-time API observability.
---

# Integration with API Gateway Access Logs via Kinesis Firehose

This guide explains how to stream AWS API Gateway access logs into OpenObserve using Amazon Kinesis Data Firehose and the HTTP endpoint ingestion method.

## Overview

API Gateway can send access logs directly to a Kinesis Firehose delivery stream. OpenObserve ingests these logs in real time via an HTTP endpoint, allowing you to monitor API request rates, status codes, latency, and failure patterns.

## Steps to Integrate

??? "Prerequisites"
- OpenObserve account ([Cloud](https://cloud.openobserve.ai/web/) or [Self-Hosted](../../../quickstart/#self-hosted-installation))
- AWS API Gateway already configured
- IAM permissions to manage API Gateway, Firehose, and IAM roles
- An S3 bucket for Firehose backup (optional but recommended)

??? "Step 1: Get OpenObserve Ingestion URL and Access Key"

1. In OpenObserve, go to **Data Sources → Recommended → AWS**
2. Copy the ingestion URL and Access Key

![Get OpenObserve Ingestion URL and Access Key](../images/aws-integrations/vpc-flow/fetch-url.png)

> Update the URL to include your desired stream name:
```
https://<your-openobserve-domain>/aws/default/<stream_name>/_kinesis_firehose
```

??? "Step 2: Create Kinesis Firehose Delivery Stream"

1. Go to **Amazon Kinesis → Firehose → Create delivery stream**
2. Choose:
- **Source**: `Direct PUT`
- **Destination**: `HTTP Endpoint`
3. Provide:
- **HTTP Endpoint URL**: the OpenObserve Firehose ingestion URL
- **Access Key / Secret Key**: from OpenObserve
4. Optional: configure backup to an S3 bucket (recommended)
5. Name the stream and complete its creation
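If you prefer the AWS CLI, a minimal sketch of the same delivery stream is shown below. The stream name, role, bucket, and endpoint values are placeholders you must replace; note that API Gateway requires a Firehose access-log destination whose name starts with `amazon-apigateway-`.

```bash
# Sketch only: create a Direct PUT Firehose stream that posts to OpenObserve.
# API Gateway access logging requires the delivery stream name to start with
# "amazon-apigateway-"; all ARNs, keys, and bucket names below are placeholders.
aws firehose create-delivery-stream \
  --delivery-stream-name amazon-apigateway-openobserve-logs \
  --delivery-stream-type DirectPut \
  --http-endpoint-destination-configuration '{
    "EndpointConfiguration": {
      "Url": "https://<your-openobserve-domain>/aws/default/<stream_name>/_kinesis_firehose",
      "Name": "OpenObserve",
      "AccessKey": "<openobserve-access-key>"
    },
    "S3BackupMode": "FailedDataOnly",
    "S3Configuration": {
      "RoleARN": "arn:aws:iam::<account-id>:role/<firehose-delivery-role>",
      "BucketARN": "arn:aws:s3:::<backup-bucket>"
    }
  }'
```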
??? "Step 3: Enable Access Logging in API Gateway"
48+
49+
1. Go to your API → **Stages → [Stage Name] → Logs/Tracing**
50+
2. Select **Error and Info logs** (collects all events) and Enable: **Custom Access Logging**
51+
3. Set **Log Destination ARN**: paste the ARN of the Firehose stream
52+
4. Set **Log Format**, for example:
53+
```
54+
RequestId: $context.requestId, SourceIP: $context.identity.sourceIp, Method: $context.httpMethod, ResourcePath: $context.resourcePath, StatusCode: $context.status, ResponseLength: $context.responseLength, RequestTime: $context.requestTime
55+
```
56+
5. Save changes
57+
58+
![Enable Access Logging in API Gateway](../images/aws-integrations/api-gateway/enable-access-log.png)
59+
60+
61+
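The same stage settings can also be applied from the CLI for a REST API. This is a sketch with placeholder IDs; the format string is a simplified variant of the one above (commas are avoided because of CLI shorthand parsing):

```bash
# Sketch only: point a REST API stage's access logs at the Firehose stream.
# <rest-api-id>, <stage-name>, <region>, and <account-id> are placeholders.
aws apigateway update-stage \
  --rest-api-id <rest-api-id> \
  --stage-name <stage-name> \
  --patch-operations \
    op=replace,path=/accessLogSettings/destinationArn,value=arn:aws:firehose:<region>:<account-id>:deliverystream/amazon-apigateway-openobserve-logs \
    'op=replace,path=/accessLogSettings/format,value=RequestId: $context.requestId Method: $context.httpMethod StatusCode: $context.status'
```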
??? "Step 4: Generate API Traffic (Optional)"
62+
63+
Use curl, Postman, or your app to trigger API requests:
64+
65+
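For example, a single request against your deployed stage; the invoke URL below is a placeholder, so substitute your API ID, region, stage, and resource path:

```bash
# Placeholder invoke URL: replace <api-id>, <region>, <stage>, and <resource>.
curl -i "https://<api-id>.execute-api.<region>.amazonaws.com/<stage>/<resource>"
```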
??? "Step 5: Verify Logs in OpenObserve"
66+
67+
1. Go to **Logs** in Openobserve → select **stream** → set **time range** → **Run Query** to check for EC2 logs.
68+
69+
![Verify Logs in OpenObserve](../images/aws-integrations/api-gateway/verify-logs.png)
70+
71+
??? "Troubleshooting"
72+
73+
**No logs?**
74+
75+
- Confirm IAM role is attached and has correct permissions
76+
- Verify the log format and destination ARN in API Gateway
77+
- Check the Firehose delivery stream status for delivery failures
78+
- Inspect your S3 bucket for failed log records
79+
- Confirm the OpenObserve URL and credentials are valid
80+

docs/integration/aws/cdn.md

Lines changed: 76 additions & 0 deletions
@@ -0,0 +1,76 @@
---
title: AWS CloudFront Logs Integration Guide
description: Stream Amazon CloudFront access logs to OpenObserve using Kinesis Streams and Firehose for real-time visibility.
---

# Integration with Amazon CloudFront Logs

This guide explains how to ingest Amazon CloudFront access logs into OpenObserve using Amazon Kinesis Data Streams and Kinesis Firehose.

## Overview

CloudFront access logs contain detailed request-level data. By streaming them to OpenObserve, you can monitor CDN performance, detect anomalies, and build custom dashboards in real time.

## Steps to Integrate

??? "Prerequisites"
- OpenObserve account ([Cloud](https://cloud.openobserve.ai/web/) or [Self-Hosted](../../../quickstart/#self-hosted-installation))
- Amazon CloudFront distribution
- S3 bucket to store CloudFront logs
- IAM role with the necessary permissions
- AWS Kinesis Data Stream and Firehose access

??? "Step 1: Get OpenObserve Ingestion URL and Access Key"

1. In OpenObserve, go to **Data Sources → Recommended → AWS**
2. Copy the ingestion URL and Access Key

![Get OpenObserve Ingestion URL and Access Key](../images/aws-integrations/vpc-flow/fetch-url.png)

> Update the URL to include the stream name of your choice:
```
https://<your-openobserve-domain>/aws/default/<stream_name>/_kinesis_firehose
```
??? "Step 2: Create a Kinesis Data Stream"
36+
37+
1. Navigate to **Kinesis → Data streams → Create data stream**
38+
2. Enter stream name: `cloudfront-stream`
39+
3. Choose number of shards (default 1 is fine for most use cases)
40+
4. Click **Create data stream**
41+
42+
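Equivalently, as a CLI sketch (the stream name and shard count mirror the console values above):

```bash
# Create the data stream that CloudFront real-time logs will write to.
aws kinesis create-stream --stream-name cloudfront-stream --shard-count 1

# Optional: confirm the stream is ACTIVE before wiring up CloudFront.
aws kinesis describe-stream-summary --stream-name cloudfront-stream
```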
??? "Step 3: Enable Real-Time Log Configuration in CloudFront"
43+
44+
1. Go to **CloudFront → Telemetry → Logs → Real-time log configurations → Create configuration**
45+
2. Set configuration name: e.g., `cloudfront-realtime-logs`
46+
3. Choose fields to include in the logs, such as:
47+
- `timestamp`, `c-ip`, `cs-uri-stem`, `sc-status`, `cs(User-Agent)` (and others as needed)
48+
4. Set sampling rate: e.g., `100` (for 100% of requests)
49+
5. Select the **Kinesis data stream** created earlier as Endpoint
50+
6. Select the cloudfront distribution to which you want to apply this configuration
51+
7. Create the real-time log configuration
52+
53+
![Enable Real-Time Log Configuration in CloudFront](../images/aws-integrations/rds/create-config.png)
54+
55+
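A CLI sketch of the same configuration is below. The role and stream ARNs are placeholders, and the role must allow CloudFront to put records into the data stream; attaching the configuration to a distribution is a separate step (console or distribution update).

```bash
# Sketch only: create the real-time log configuration.
# The role must grant CloudFront permission to write to the Kinesis stream.
aws cloudfront create-realtime-log-config \
  --name cloudfront-realtime-logs \
  --sampling-rate 100 \
  --fields timestamp c-ip cs-uri-stem sc-status cs-user-agent \
  --end-points 'StreamType=Kinesis,KinesisStreamConfig={RoleARN=arn:aws:iam::<account-id>:role/<cloudfront-realtime-log-role>,StreamARN=arn:aws:kinesis:<region>:<account-id>:stream/cloudfront-stream}'
```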
??? "Step 4: Create a Firehose Delivery Stream to OpenObserve"
56+
57+
1. Go to **Kinesis → Delivery streams → Create delivery stream**
58+
2. Choose:
59+
- Source: `Kinesis Data Stream`
60+
- Stream name: the stream created in step 2
61+
- Destination: `HTTP Endpoint`
62+
3. In HTTP endpoint settings, provide OpenObserve's HTTP Endpoint URL and Access Key, and set an S3 backup bucket.
63+
4. Give the stream a meaningful name and Click **Create delivery stream**
64+
65+
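As a CLI sketch, the only difference from a Direct PUT stream is the Kinesis source configuration; all ARNs, keys, and role names are placeholders:

```bash
# Sketch only: Firehose that reads from the CloudFront data stream and
# posts records to OpenObserve's HTTP endpoint. Replace all placeholders.
aws firehose create-delivery-stream \
  --delivery-stream-name cloudfront-to-openobserve \
  --delivery-stream-type KinesisStreamAsSource \
  --kinesis-stream-source-configuration \
    'KinesisStreamARN=arn:aws:kinesis:<region>:<account-id>:stream/cloudfront-stream,RoleARN=arn:aws:iam::<account-id>:role/<firehose-source-role>' \
  --http-endpoint-destination-configuration '{
    "EndpointConfiguration": {
      "Url": "https://<your-openobserve-domain>/aws/default/<stream_name>/_kinesis_firehose",
      "Name": "OpenObserve",
      "AccessKey": "<openobserve-access-key>"
    },
    "S3BackupMode": "FailedDataOnly",
    "S3Configuration": {
      "RoleARN": "arn:aws:iam::<account-id>:role/<firehose-delivery-role>",
      "BucketARN": "arn:aws:s3:::<backup-bucket>"
    }
  }'
```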
??? "Step 5: Verify Logs in OpenObserve"
66+
67+
1. Go to **Logs** → select your log stream → Set time range → Click **Run Query**
68+
![Verify Logs](https://openobserve-prod-website.s3.us-west-2.amazonaws.com/assets/8_view_logs_in_o2_93b009c027.gif)
69+
70+
??? "Troubleshooting"
71+
72+
**No logs visible?**
73+
74+
- Check Firehose delivery status
75+
- Verify OpenObserve ingestion URL and access key
76+
- Check S3 backup for failed logs
docs/integration/aws/cloudwatch-metrics.md

Lines changed: 71 additions & 0 deletions
@@ -0,0 +1,71 @@
---
title: AWS CloudWatch Metrics Integration Guide
description: Stream AWS CloudWatch metrics to OpenObserve via Metric Streams and Kinesis Data Firehose for real-time infrastructure monitoring.
---

# Integration with AWS CloudWatch Metrics via Metric Stream

This guide explains how to stream AWS CloudWatch metrics (e.g., EC2, RDS, Lambda) into OpenObserve using CloudWatch Metric Streams and Kinesis Data Firehose.

## Overview

CloudWatch Metric Streams provide efficient, near real-time delivery of metrics to a Kinesis Data Firehose delivery stream. Firehose forwards these metrics to OpenObserve over an HTTP endpoint, enabling unified, real-time observability across your AWS infrastructure and applications.

## Steps to Integrate

??? "Prerequisites"
- OpenObserve account ([Cloud](https://cloud.openobserve.ai/web/) or [Self-Hosted](../../../quickstart/#self-hosted-installation))
- AWS IAM permissions for:
- CloudWatch: `ListMetrics`, `GetMetricData`, `GetMetricStatistics`
- Kinesis Data Firehose: `CreateDeliveryStream`, `PutRecord`
- OpenObserve ingestion URL and Access Key

??? "Step 1: Get OpenObserve Ingestion URL and Access Key"

1. In OpenObserve, go to **Data Sources → Recommended → AWS**
2. Copy the ingestion HTTP URL and Access Key

![Fetch OpenObserve Ingestion URL](../images/aws-integrations/vpc-flow/fetch-url.png)

> Example format:
> ```
> https://<your-openobserve-domain>/aws/default/cloudwatch-metrics/_kinesis_firehose
> ```
??? "Step 2: Create a Kinesis Firehose Delivery Stream"
36+
37+
1. In AWS Kinesis Firehose, Create delivery stream.
38+
2. Set Source: `Direct PUT` and Destination: `HTTP Endpoint`.
39+
3. Provide OpenObserve's HTTP Endpoint URL and Access Key, and set an S3 backup bucket.
40+
4. Give the stream a meaningful name and Create it.
41+
42+
![Kinesis Firehose Delivery Stream](../images/aws-integrations/cloudwatch-metrics/firehose-stream.png){: style="height:800px"}
43+
44+
??? "Step 3: Create CloudWatch Metric Stream"
45+
46+
1. Navigate to **CloudWatch → Metric Streams → Create**
47+
2. Choose **Custom** setup
48+
![Create Metric Stream](../images/aws-integrations/cloudwatch-metrics/custom-setup.png)
49+
3. For **Destination**, select **Kinesis Data Stream** → Select your newly created stream
50+
4. Choose AWS namespaces to monitor (e.g., `AWS/EC2`, `AWS/RDS`, `AWS/Lambda`)
51+
5. Optional: Filter specific metrics
52+
6. Name your stream and click **Create Stream**
53+
54+
![Create Metric Stream](../images/aws-integrations/cloudwatch-metrics/metric-stream.png){: style="height:500px"}
55+
56+
57+
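The same metric stream can be sketched with the CLI. The Firehose and role ARNs are placeholders; the role must allow CloudWatch to write to the delivery stream, and you should check which output format your OpenObserve setup expects before choosing one.

```bash
# Sketch only: stream EC2, RDS, and Lambda metrics to the Firehose created above.
# Replace the ARNs; the role needs firehose:PutRecord/PutRecordBatch on the stream.
aws cloudwatch put-metric-stream \
  --name openobserve-metric-stream \
  --firehose-arn arn:aws:firehose:<region>:<account-id>:deliverystream/<your-firehose-name> \
  --role-arn arn:aws:iam::<account-id>:role/<metric-stream-role> \
  --output-format json \
  --include-filters Namespace=AWS/EC2 Namespace=AWS/RDS Namespace=AWS/Lambda
```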
??? "Step 4: Verify Metrics in OpenObserve"
58+
59+
1. Go to **Logs** → select your stream → Set time range → Click **Run Query**
60+
61+
![Verify Metrics](https://openobserve-prod-website.s3.us-west-2.amazonaws.com/assets%2Frds_metrics_logs_d773f2b4ef.gif)
62+
63+
??? "Troubleshooting"
64+
65+
**No metrics appearing?**
66+
67+
- Confirm Kinesis Data Stream exists and is active
68+
- Verify Metric Stream configuration in CloudWatch
69+
- Check IAM permissions for CloudWatch and Kinesis
70+
- Validate ingestion URL and Access Key in OpenObserve
71+
docs/integration/aws/network-firewall.md

Lines changed: 103 additions & 0 deletions
@@ -0,0 +1,103 @@
---
title: AWS Network Firewall Logs Integration Guide
description: Stream AWS Network Firewall flow and alert logs to OpenObserve via Kinesis Data Firehose for real-time network traffic monitoring.
---

# Integration with AWS Network Firewall Logs via Firehose

This guide explains how to configure AWS Network Firewall to stream flow and alert logs to OpenObserve using Kinesis Data Firehose.

## Overview

AWS Network Firewall allows you to log network traffic data, including both flow logs and alert logs, and stream it to destinations such as Amazon S3 and Amazon Kinesis. This integration forwards logs to OpenObserve via Firehose, enabling real-time monitoring of your VPC traffic.

## Steps to Integrate

??? "Prerequisites"
- OpenObserve account ([Cloud](https://cloud.openobserve.ai/web/) or [Self-Hosted](../../../quickstart/#self-hosted-installation))
- AWS IAM permissions to manage Network Firewall, Kinesis Firehose, and IAM roles
- S3 bucket for backup (recommended)
- A configured AWS Network Firewall deployment

??? "Step 1: Get OpenObserve Ingestion URL and Access Key"

1. In OpenObserve, go to **Data Sources → Recommended → AWS**
2. Copy the HTTP ingestion URL and Access Key

![Get OpenObserve Ingestion URL and Access Key](../images/aws-integrations/vpc-flow/fetch-url.png)

> Example ingestion URL:
> ```
> https://<your-openobserve-domain>/aws/default/<stream_name>/_kinesis_firehose
> ```
??? "Step 2: Create a Kinesis Data Firehose Delivery Stream"
35+
36+
1. Go to **Kinesis → Firehose → Create delivery stream**
37+
2. Choose:
38+
- **Source**: `Direct PUT`
39+
- **Destination**: `HTTP Endpoint`
40+
3. Provide:
41+
- **Endpoint URL**: OpenObserve Firehose ingestion URL
42+
- **Access Key / Secret Key**: from OpenObserve
43+
4. Optionally configure an S3 bucket as a backup
44+
5. Name the delivery stream (e.g., `network-firewall-logs`)
45+
6. Complete stream creation
46+
47+
> NOTE: You can create multiple streams for different types of logs
48+
49+
![Create a Kinesis Data Firehose Delivery Stream](../images/aws-integrations/firewall/firehose-streams.png)
50+
51+
??? "Step 3: Enable Logging in AWS Network Firewall"
52+
53+
1. Go to **VPC → Network Firewall → Firewalls**
54+
2. Select your firewall → Navigate to the **Monitoring and Observability** tab → Enable Detailed Monitoring
55+
3. Click **Configure** button under **logging configuration**
56+
57+
![Enable Logging in AWS Network Firewall](../images/aws-integrations/firewall/enable-monitoring.png)
58+
59+
4. Enable logging for alerts and Flow logs
60+
5. Set the **Log destination type** to **Kinesis Data Firehose**
61+
6. Select the delivery stream you created earlier
62+
> NOTE: In case you created multiple firehose streams, select respective streams for each log type
63+
7. Save the configuration
64+
65+
![Enable Logging in AWS Network Firewall](../images/aws-integrations/firewall/firewall-logging-config.png)
66+
67+
68+
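If you prefer the CLI, a minimal sketch of the logging configuration is below. The firewall name and delivery stream are placeholders, and AWS accepts only one log-destination change per call, so run the command again with `"LogType": "ALERT"` (and your alert stream) to add alert logging.

```bash
# Sketch only: send FLOW logs to the Firehose stream created in Step 2.
# Repeat with "LogType": "ALERT" to add alert logging in a second call.
aws network-firewall update-logging-configuration \
  --firewall-name <your-firewall-name> \
  --logging-configuration '{
    "LogDestinationConfigs": [
      {
        "LogType": "FLOW",
        "LogDestinationType": "KinesisDataFirehose",
        "LogDestination": { "deliveryStream": "network-firewall-logs" }
      }
    ]
  }'
```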
??? "Step 4: Generate Traffic to Create Logs (Optional)"
69+
70+
1. Deploy an EC2 instance into a subnet protected by the Network Firewall
71+
2. Allow outbound internet access (via NAT or Internet Gateway)
72+
3. From the instance, generate traffic:
73+
```bash
74+
curl -I https://google.com
75+
```
76+
77+
This will help trigger flow and alert logs.
78+
79+
??? "Step 5: Verify Log Ingestion in OpenObserve"
80+
81+
1. Go to **Logs** → select your log stream → Set time range → Click **Run Query**
82+
83+
![AWS Network Firewall alert logs in Openobserve](../images/aws-integrations/firewall/alert-logs.png)
84+
![AWS Network Firewall flow logs in Openobserve](../images/aws-integrations/firewall/flow-logs.png)
85+
86+
87+
??? "Troubleshooting"
88+
89+
- No logs visible in OpenObserve
90+
- Ensure logging is enabled in Network Firewall and that network traffic is actually flowing through it.
91+
- Also, confirm that your delivery stream is properly configured.
92+
93+
- Firehose delivery failures
94+
- Verify that the OpenObserve ingestion URL and access credentials (Access Key and Secret Key) are correct.
95+
- Check if the Firehose role has the necessary permissions.
96+
97+
- Log fields are missing or incomplete
98+
- Check your S3 backup (if enabled) for any malformed or truncated records. This can help identify issues with the Firehose delivery configuration or log formatting.
99+
100+
- No alert logs are received
101+
- Confirm that your firewall policies include rules with the alert action. Only these rules generate alert logs. Without such rules, only flow logs will be produced.
102+
103+
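For reference, a minimal Suricata-compatible rule with the alert action might look like the following; the message and SID are arbitrary examples, and the rule would live in a stateful rule group attached to your firewall policy:

```
alert http any any -> any any (msg:"Test HTTP alert for OpenObserve integration"; sid:1000001; rev:1;)
```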
