From d422eb68c302a4728398480f68143fb54b554475 Mon Sep 17 00:00:00 2001 From: Igor Rodionov Date: Wed, 20 Aug 2025 17:15:16 +0200 Subject: [PATCH 1/2] chore: Update README.yaml wiht AI --- .gitignore | 2 + AGENTS.md | 35 +++++++++++++ Makefile | 8 --- README.md | 116 +++++++++++++++++++++++++++++------------- README.yaml | 137 ++++++-------------------------------------------- atmos.yaml | 11 ++++ src/README.md | 33 ++++++------ 7 files changed, 163 insertions(+), 179 deletions(-) create mode 100644 AGENTS.md delete mode 100644 Makefile create mode 100644 atmos.yaml diff --git a/.gitignore b/.gitignore index 6964514..edeabaf 100644 --- a/.gitignore +++ b/.gitignore @@ -7,6 +7,8 @@ aws-assumed-role/ *.iml .direnv .envrc +.cache +.atmos # Compiled and auto-generated files # Note that the leading "**/" appears necessary for Docker even if not for Git diff --git a/AGENTS.md b/AGENTS.md new file mode 100644 index 0000000..8deadc1 --- /dev/null +++ b/AGENTS.md @@ -0,0 +1,35 @@ +# Repository Guidelines + +## Project Structure & Module Organization +- `src/`: Terraform component (`main.tf`, `variables.tf`, `outputs.tf`, `providers.tf`, `versions.tf`, `context.tf`). This is the source of truth. +- `test/`: Go Terratest suite using Atmos fixtures (`component_test.go`, `fixtures/`, `test_suite.yaml`). Tests deploy/destroy real AWS resources. +- `README.yaml`: Source for the generated `README.md` (via atmos + terraform-docs). +- `.github/`: CI/CD, Renovate/Dependabot, labels, and automerge settings. +- `docs/`: Project docs (if any). Keep lightweight and current. + +## Build, Test, and Development Commands +- To install atmos read this docs https://github.com/cloudposse/atmos +- `atmos docs generate readme`: Regenerate `README.md` from `README.yaml` and terraform source. +- `atmos docs generate readme-simple`: Regenerate `src/README.md` from `README.yaml` and terraform source. +- `atmos test run`: Run Terratest suite in `test/` (uses Atmos fixtures; creates and destroys AWS resources). +- Pre-commit locally: `pre-commit install && pre-commit run -a` (runs `terraform_fmt`, `terraform_docs`, `tflint`). +- TFLint plugin setup: `tflint --init` (uses `.tflint.hcl`). + +## Coding Style & Naming Conventions +- Indentation: Terraform 2 spaces; YAML/Markdown 2 spaces. +- Terraform: prefer lower_snake_case for variables/locals; keep resources/data sources descriptive and aligned with Cloud Posse null-label patterns. +- Lint/format: `terraform fmt -recursive`, TFLint rules per `.tflint.hcl`. Do not commit formatting or lint violations. + +## Testing Guidelines +- Framework: Go Terratest with `github.com/cloudposse/test-helpers` and `atmos` fixtures. +- Location/naming: put tests in `test/` and name files `*_test.go`. Add scenarios under `test/fixtures/stacks/catalog/usecase/`. +- Run: `atmos test run`. Ensure AWS credentials are configured; tests may incur AWS costs and will clean up after themselves. + +## Commit & Pull Request Guidelines +- Commits: follow Conventional Commits (e.g., `feat:`, `fix:`, `chore(deps):`, `docs:`). Keep messages concise and scoped. +- PRs: include a clear description, linked issues, and any behavioral changes. Update `README.yaml` when inputs/outputs change and run `atmos docs generate readme`. +- CI: ensure pre-commit, TFLint, and tests pass. Avoid unrelated changes in the same PR. + +## Security & Configuration Tips +- Never commit secrets. 
Configure AWS credentials/role assumption externally; the provider setup in `src/providers.tf` supports role assumption via the `iam_roles` module. +- Global quotas must be applied in `us-east-1`; place in the `gbl` stack and set `region: us-east-1` in `vars`. diff --git a/Makefile b/Makefile deleted file mode 100644 index 8a6d902..0000000 --- a/Makefile +++ /dev/null @@ -1,8 +0,0 @@ --include $(shell curl -sSL -o .build-harness "https://cloudposse.tools/build-harness"; echo .build-harness) - -all: init readme - -test:: - @echo "🚀 Starting tests..." - ./test/run.sh - @echo "✅ All tests passed." diff --git a/README.md b/README.md index 2024b4f..8ad3bf7 100644 --- a/README.md +++ b/README.md @@ -2,8 +2,11 @@ Project Banner
-[badges: Latest Release, Slack Community]
+[badges: Latest Release, Slack Community, Get Support]
-This component is responsible for provision all the necessary infrastructure to deploy -[Datadog Lambda forwarders](https://github.com/DataDog/datadog-serverless-functions/tree/master/aws/logs_monitoring). It -depends on the `datadog-configuration` component to get the Datadog API keys. +This component provisions all infrastructure required to deploy +[Datadog Lambda forwarders](https://github.com/DataDog/datadog-serverless-functions/tree/master/aws/logs_monitoring). +It depends on the `datadog-configuration` component to obtain the Datadog API keys. + + +> [!TIP] +> #### 👽 Use Atmos with Terraform +> Cloud Posse uses [`atmos`](https://atmos.tools) to easily orchestrate multiple environments using Terraform.
+> Works with [Github Actions](https://atmos.tools/integrations/github-actions/), [Atlantis](https://atmos.tools/integrations/atlantis), or [Spacelift](https://atmos.tools/integrations/spacelift).
+>
+> [demo: Watch demo of using Atmos with Terraform. Example of running atmos to manage infrastructure from our Quick Start tutorial.]
+ + + + ## Usage @@ -70,10 +89,10 @@ components: filter_pattern: "" ``` -Note for other regions, you need to deploy the `datadog-configuration` component in the respective region - the datadog -configuration will be moving to a regional implementation. +Note for other regions: you need to deploy the `datadog-configuration` component in the respective region — the Datadog +configuration is moving to a regional implementation. -For example if you usually deploy to us-west-2 (and DD Configuration is `gbl`), deploy it to the new region and then +For example, if you usually deploy to `us-west-2` (and DD Configuration is `gbl`), deploy it to the new region and then deploy the lambda forwarder. ```yaml @@ -96,8 +115,20 @@ components: datadog_configuration_environment: "use1" ``` - - +> [!IMPORTANT] +> In Cloud Posse's examples, we avoid pinning modules to specific versions to prevent discrepancies between the documentation +> and the latest released versions. However, for your own projects, we strongly advise pinning each module to the exact version +> you're using. This practice ensures the stability of your infrastructure. Additionally, we recommend implementing a systematic +> approach for updating versions to avoid unexpected changes. + + + + + + + + + ## Requirements | Name | Version | @@ -203,43 +234,28 @@ components: | [lambda\_forwarder\_rds\_function\_arn](#output\_lambda\_forwarder\_rds\_function\_arn) | Datadog Lambda forwarder RDS Enhanced Monitoring function ARN | | [lambda\_forwarder\_vpc\_log\_function\_arn](#output\_lambda\_forwarder\_vpc\_log\_function\_arn) | Datadog Lambda forwarder VPC Flow Logs function ARN | | [lambda\_forwarder\_vpc\_log\_function\_name](#output\_lambda\_forwarder\_vpc\_log\_function\_name) | Datadog Lambda forwarder VPC Flow Logs function name | - - - -## References - -- Datadog's [documentation about provisioning keys](https://docs.datadoghq.com/account_management/api-app-keys) -- [cloudposse/terraform-aws-components](https://github.com/cloudposse/terraform-aws-components/tree/main/modules/datadog-lambda-forwarder) - - Cloud Posse's upstream component - - -> [!TIP] -> #### 👽 Use Atmos with Terraform -> Cloud Posse uses [`atmos`](https://atmos.tools) to easily orchestrate multiple environments using Terraform.
-> Works with [Github Actions](https://atmos.tools/integrations/github-actions/), [Atlantis](https://atmos.tools/integrations/atlantis), or [Spacelift](https://atmos.tools/integrations/spacelift).
->
-> [demo: Watch demo of using Atmos with Terraform. Example of running atmos to manage infrastructure from our Quick Start tutorial.]
+ +## Related Projects +Check out these related projects. +- [Cloud Posse Terraform Modules](https://docs.cloudposse.com/modules/) - Our collection of reusable Terraform modules used by our reference architectures. +- [Atmos](https://atmos.tools) - Atmos is like docker-compose but for your infrastructure +## References -## Related Projects +For additional context, refer to some of these links. -Check out these related projects. +- [Datadog's documentation about provisioning keys](https://docs.datadoghq.com/account_management/api-app-keys) - +- [cloudposse/terraform-aws-components](https://github.com/cloudposse/terraform-aws-components/tree/main/modules/datadog-lambda-forwarder) - Cloud Posse's upstream component -- [Cloud Posse Terraform Modules](https://docs.cloudposse.com/modules/) - Our collection of reusable Terraform modules used by our reference architectures. -- [Atmos](https://atmos.tools) - Atmos is like docker-compose but for your infrastructure > [!TIP] @@ -306,6 +322,38 @@ In general, PRs are welcome. We follow the typical "fork-and-pull" Git workflow. **NOTE:** Be sure to merge the latest changes from "upstream" before making a pull request! + +## Running Terraform Tests + +We use [Atmos](https://atmos.tools) to streamline how Terraform tests are run. It centralizes configuration and wraps common test workflows with easy-to-use commands. + +All tests are located in the [`test/`](test) folder. + +Under the hood, tests are powered by Terratest together with our internal [Test Helpers](https://github.com/cloudposse/test-helpers) library, providing robust infrastructure validation. + +Setup dependencies: +- Install Atmos ([installation guide](https://atmos.tools/install/)) +- Install Go [1.24+ or newer](https://go.dev/doc/install) +- Install Terraform or OpenTofu + +To run tests: + +- Run all tests: + ```sh + atmos test run + ``` +- Clean up test artifacts: + ```sh + atmos test clean + ``` +- Explore additional test options: + ```sh + atmos test --help + ``` +The configuration for test commands is centrally managed. To review what's being imported, see the [`atmos.yaml`](https://raw.githubusercontent.com/cloudposse/.github/refs/heads/main/.github/atmos/terraform-module.yaml) file. + +Learn more about our [automated testing in our documentation](https://docs.cloudposse.com/community/contribute/automated-testing/) or implementing [custom commands](https://atmos.tools/core-concepts/custom-commands/) with atmos. + ### 🌎 Slack Community Join our [Open Source Community](https://cpco.io/slack?utm_source=github&utm_medium=readme&utm_campaign=cloudposse-terraform-components/aws-datadog-lambda-forwarder&utm_content=slack) on Slack. It's **FREE** for everyone! Our "SweetOps" community is where you get to talk with others who share a similar vision for how to rollout and manage infrastructure. This is the best place to talk shop, ask questions, solicit feedback, and work together as a community to build totally *sweet* infrastructure. diff --git a/README.yaml b/README.yaml index f1062d0..e11c8a7 100644 --- a/README.yaml +++ b/README.yaml @@ -3,12 +3,11 @@ name: "aws-datadog-lambda-forwarder" github_repo: "cloudposse-terraform-components/aws-datadog-lambda-forwarder" # Short description of this project description: |- - This component is responsible for provision all the necessary infrastructure to deploy - [Datadog Lambda forwarders](https://github.com/DataDog/datadog-serverless-functions/tree/master/aws/logs_monitoring). 
It - depends on the `datadog-configuration` component to get the Datadog API keys. - - ## Usage + This component provisions all infrastructure required to deploy + [Datadog Lambda forwarders](https://github.com/DataDog/datadog-serverless-functions/tree/master/aws/logs_monitoring). + It depends on the `datadog-configuration` component to obtain the Datadog API keys. +usage: |- **Stack Level**: Regional Here's an example snippet for how to use this component: @@ -46,10 +45,10 @@ description: |- filter_pattern: "" ``` - Note for other regions, you need to deploy the `datadog-configuration` component in the respective region - the datadog - configuration will be moving to a regional implementation. + Note for other regions: you need to deploy the `datadog-configuration` component in the respective region — the Datadog + configuration is moving to a regional implementation. - For example if you usually deploy to us-west-2 (and DD Configuration is `gbl`), deploy it to the new region and then + For example, if you usually deploy to `us-west-2` (and DD Configuration is `gbl`), deploy it to the new region and then deploy the lambda forwarder. ```yaml @@ -72,121 +71,13 @@ description: |- datadog_configuration_environment: "use1" ``` - - - ## Requirements - - | Name | Version | - |------|---------| - | [terraform](#requirement\_terraform) | >= 1.0.0 | - | [aws](#requirement\_aws) | >= 4.0 | - | [datadog](#requirement\_datadog) | >= 3.3.0 | - - ## Providers - - | Name | Version | - |------|---------| - | [datadog](#provider\_datadog) | >= 3.3.0 | - - ## Modules - - | Name | Source | Version | - |------|--------|---------| - | [datadog-integration](#module\_datadog-integration) | cloudposse/stack-config/yaml//modules/remote-state | 1.5.0 | - | [datadog\_configuration](#module\_datadog\_configuration) | ../datadog-configuration/modules/datadog_keys | n/a | - | [datadog\_lambda\_forwarder](#module\_datadog\_lambda\_forwarder) | cloudposse/datadog-lambda-forwarder/aws | 1.5.3 | - | [iam\_roles](#module\_iam\_roles) | ../account-map/modules/iam-roles | n/a | - | [log\_group\_prefix](#module\_log\_group\_prefix) | cloudposse/label/null | 0.25.0 | - | [this](#module\_this) | cloudposse/label/null | 0.25.0 | - - ## Resources - - | Name | Type | - |------|------| - | [datadog_integration_aws_lambda_arn.log_collector](https://registry.terraform.io/providers/datadog/datadog/latest/docs/resources/integration_aws_lambda_arn) | resource | - | [datadog_integration_aws_lambda_arn.rds_collector](https://registry.terraform.io/providers/datadog/datadog/latest/docs/resources/integration_aws_lambda_arn) | resource | - | [datadog_integration_aws_lambda_arn.vpc_logs_collector](https://registry.terraform.io/providers/datadog/datadog/latest/docs/resources/integration_aws_lambda_arn) | resource | - | [datadog_integration_aws_log_collection.main](https://registry.terraform.io/providers/datadog/datadog/latest/docs/resources/integration_aws_log_collection) | resource | - - ## Inputs - - | Name | Description | Type | Default | Required | - |------|-------------|------|---------|:--------:| - | [additional\_tag\_map](#input\_additional\_tag\_map) | Additional key-value pairs to add to each map in `tags_as_list_of_maps`. Not added to `tags` or `id`.
This is for some rare cases where resources want additional configuration of tags
and therefore take a list of maps with tag key, value, and additional configuration. | `map(string)` | `{}` | no | - | [attributes](#input\_attributes) | ID element. Additional attributes (e.g. `workers` or `cluster`) to add to `id`,
in the order they appear in the list. New attributes are appended to the
end of the list. The elements of the list are joined by the `delimiter`
and treated as a single ID element. | `list(string)` | `[]` | no | - | [cloudwatch\_forwarder\_event\_patterns](#input\_cloudwatch\_forwarder\_event\_patterns) | Map of title to CloudWatch Event patterns to forward to Datadog. Event structure from here: https://docs.aws.amazon.com/AmazonCloudWatch/latest/events/CloudWatchEventsandEventPatterns.html#CloudWatchEventsPatterns
Example:
hcl
cloudwatch_forwarder_event_rules = {
"guardduty" = {
source = ["aws.guardduty"]
detail-type = ["GuardDuty Finding"]
}
"ec2-terminated" = {
source = ["aws.ec2"]
detail-type = ["EC2 Instance State-change Notification"]
detail = {
state = ["terminated"]
}
}
}
|
map(object({
version = optional(list(string))
id = optional(list(string))
detail-type = optional(list(string))
source = optional(list(string))
account = optional(list(string))
time = optional(list(string))
region = optional(list(string))
resources = optional(list(string))
detail = optional(map(list(string)))
}))
| `{}` | no | - | [cloudwatch\_forwarder\_log\_groups](#input\_cloudwatch\_forwarder\_log\_groups) | Map of CloudWatch Log Groups with a filter pattern that the Lambda forwarder will send logs from. For example: { mysql1 = { name = "/aws/rds/maincluster", filter\_pattern = "" } | `map(map(string))` | `{}` | no | - | [context](#input\_context) | Single object for setting entire context at once.
See description of individual variables for details.
Leave string and numeric variables as `null` to use default value.
Individual variable settings (non-null) override settings in context object,
except for attributes, tags, and additional\_tag\_map, which are merged. | `any` |
{
"additional_tag_map": {},
"attributes": [],
"delimiter": null,
"descriptor_formats": {},
"enabled": true,
"environment": null,
"id_length_limit": null,
"label_key_case": null,
"label_order": [],
"label_value_case": null,
"labels_as_tags": [
"unset"
],
"name": null,
"namespace": null,
"regex_replace_chars": null,
"stage": null,
"tags": {},
"tenant": null
}
| no | - | [context\_tags](#input\_context\_tags) | List of context tags to add to each monitor | `set(string)` |
[
"namespace",
"tenant",
"environment",
"stage"
]
| no | - | [context\_tags\_enabled](#input\_context\_tags\_enabled) | Whether to add context tags to add to each monitor | `bool` | `true` | no | - | [datadog\_configuration\_environment](#input\_datadog\_configuration\_environment) | AWS region where the Datadog configuration is deployed, useful for multi region setups, null uses default (gbl) | `string` | `null` | no | - | [datadog\_forwarder\_lambda\_environment\_variables](#input\_datadog\_forwarder\_lambda\_environment\_variables) | Map of environment variables to pass to the Lambda Function | `map(string)` | `{}` | no | - | [dd\_api\_key\_kms\_ciphertext\_blob](#input\_dd\_api\_key\_kms\_ciphertext\_blob) | CiphertextBlob stored in environment variable DD\_KMS\_API\_KEY used by the lambda function, along with the KMS key, to decrypt Datadog API key | `string` | `""` | no | - | [dd\_artifact\_filename](#input\_dd\_artifact\_filename) | The Datadog artifact filename minus extension | `string` | `"aws-dd-forwarder"` | no | - | [dd\_forwarder\_version](#input\_dd\_forwarder\_version) | Version tag of Datadog lambdas to use. https://github.com/DataDog/datadog-serverless-functions/releases | `string` | `"3.66.0"` | no | - | [dd\_module\_name](#input\_dd\_module\_name) | The Datadog GitHub repository name | `string` | `"datadog-serverless-functions"` | no | - | [dd\_tags\_map](#input\_dd\_tags\_map) | A map of Datadog tags to apply to all logs forwarded to Datadog | `map(string)` | `{}` | no | - | [delimiter](#input\_delimiter) | Delimiter to be used between ID elements.
Defaults to `-` (hyphen). Set to `""` to use no delimiter at all. | `string` | `null` | no | - | [descriptor\_formats](#input\_descriptor\_formats) | Describe additional descriptors to be output in the `descriptors` output map.
Map of maps. Keys are names of descriptors. Values are maps of the form
`{
format = string
labels = list(string)
}`
(Type is `any` so the map values can later be enhanced to provide additional options.)
`format` is a Terraform format string to be passed to the `format()` function.
`labels` is a list of labels, in order, to pass to `format()` function.
Label values will be normalized before being passed to `format()` so they will be
identical to how they appear in `id`.
Default is `{}` (`descriptors` output will be empty). | `any` | `{}` | no | - | [enabled](#input\_enabled) | Set to false to prevent the module from creating any resources | `bool` | `null` | no | - | [environment](#input\_environment) | ID element. Usually used for region e.g. 'uw2', 'us-west-2', OR role 'prod', 'staging', 'dev', 'UAT' | `string` | `null` | no | - | [forwarder\_lambda\_debug\_enabled](#input\_forwarder\_lambda\_debug\_enabled) | Whether to enable or disable debug for the Lambda forwarder | `bool` | `false` | no | - | [forwarder\_log\_artifact\_url](#input\_forwarder\_log\_artifact\_url) | The URL for the code of the Datadog forwarder for Logs. It can be a local file, URL or git repo | `string` | `null` | no | - | [forwarder\_log\_enabled](#input\_forwarder\_log\_enabled) | Flag to enable or disable Datadog log forwarder | `bool` | `false` | no | - | [forwarder\_log\_layers](#input\_forwarder\_log\_layers) | List of Lambda Layer Version ARNs (maximum of 5) to attach to Datadog log forwarder lambda function | `list(string)` | `[]` | no | - | [forwarder\_log\_retention\_days](#input\_forwarder\_log\_retention\_days) | Number of days to retain Datadog forwarder lambda execution logs. One of [0 1 3 5 7 14 30 60 90 120 150 180 365 400 545 731 1827 3653] | `number` | `14` | no | - | [forwarder\_rds\_artifact\_url](#input\_forwarder\_rds\_artifact\_url) | The URL for the code of the Datadog forwarder for RDS. It can be a local file, url or git repo | `string` | `null` | no | - | [forwarder\_rds\_enabled](#input\_forwarder\_rds\_enabled) | Flag to enable or disable Datadog RDS enhanced monitoring forwarder | `bool` | `false` | no | - | [forwarder\_rds\_filter\_pattern](#input\_forwarder\_rds\_filter\_pattern) | Filter pattern for Lambda forwarder RDS | `string` | `""` | no | - | [forwarder\_rds\_layers](#input\_forwarder\_rds\_layers) | List of Lambda Layer Version ARNs (maximum of 5) to attach to Datadog RDS enhanced monitoring lambda function | `list(string)` | `[]` | no | - | [forwarder\_vpc\_logs\_artifact\_url](#input\_forwarder\_vpc\_logs\_artifact\_url) | The URL for the code of the Datadog forwarder for VPC Logs. It can be a local file, url or git repo | `string` | `null` | no | - | [forwarder\_vpc\_logs\_enabled](#input\_forwarder\_vpc\_logs\_enabled) | Flag to enable or disable Datadog VPC flow log forwarder | `bool` | `false` | no | - | [forwarder\_vpc\_logs\_layers](#input\_forwarder\_vpc\_logs\_layers) | List of Lambda Layer Version ARNs (maximum of 5) to attach to Datadog VPC flow log forwarder lambda function | `list(string)` | `[]` | no | - | [forwarder\_vpclogs\_filter\_pattern](#input\_forwarder\_vpclogs\_filter\_pattern) | Filter pattern for Lambda forwarder VPC Logs | `string` | `""` | no | - | [id\_length\_limit](#input\_id\_length\_limit) | Limit `id` to this many characters (minimum 6).
Set to `0` for unlimited length.
Set to `null` for keep the existing setting, which defaults to `0`.
Does not affect `id_full`. | `number` | `null` | no | - | [kms\_key\_id](#input\_kms\_key\_id) | Optional KMS key ID to encrypt Datadog Lambda function logs | `string` | `null` | no | - | [label\_key\_case](#input\_label\_key\_case) | Controls the letter case of the `tags` keys (label names) for tags generated by this module.
Does not affect keys of tags passed in via the `tags` input.
Possible values: `lower`, `title`, `upper`.
Default value: `title`. | `string` | `null` | no | - | [label\_order](#input\_label\_order) | The order in which the labels (ID elements) appear in the `id`.
Defaults to ["namespace", "environment", "stage", "name", "attributes"].
You can omit any of the 6 labels ("tenant" is the 6th), but at least one must be present. | `list(string)` | `null` | no | - | [label\_value\_case](#input\_label\_value\_case) | Controls the letter case of ID elements (labels) as included in `id`,
set as tag values, and output by this module individually.
Does not affect values of tags passed in via the `tags` input.
Possible values: `lower`, `title`, `upper` and `none` (no transformation).
Set this to `title` and set `delimiter` to `""` to yield Pascal Case IDs.
Default value: `lower`. | `string` | `null` | no | - | [labels\_as\_tags](#input\_labels\_as\_tags) | Set of labels (ID elements) to include as tags in the `tags` output.
Default is to include all labels.
Tags with empty values will not be included in the `tags` output.
Set to `[]` to suppress all generated tags.
**Notes:**
The value of the `name` tag, if included, will be the `id`, not the `name`.
Unlike other `null-label` inputs, the initial setting of `labels_as_tags` cannot be
changed in later chained modules. Attempts to change it will be silently ignored. | `set(string)` |
[
"default"
]
| no | - | [lambda\_arn\_enabled](#input\_lambda\_arn\_enabled) | Enable adding the Lambda Arn to this account integration | `bool` | `true` | no | - | [lambda\_policy\_source\_json](#input\_lambda\_policy\_source\_json) | Additional IAM policy document that can optionally be passed and merged with the created policy document | `string` | `""` | no | - | [lambda\_reserved\_concurrent\_executions](#input\_lambda\_reserved\_concurrent\_executions) | Amount of reserved concurrent executions for the lambda function. A value of 0 disables Lambda from being triggered and -1 removes any concurrency limitations. Defaults to Unreserved Concurrency Limits -1 | `number` | `-1` | no | - | [lambda\_runtime](#input\_lambda\_runtime) | Runtime environment for Datadog Lambda | `string` | `"python3.8"` | no | - | [log\_collection\_services](#input\_log\_collection\_services) | List of log collection services to enable | `list(string)` |
[
"apigw-access-logs",
"apigw-execution-logs",
"elbv2",
"elb",
"cloudfront",
"lambda",
"redshift",
"s3"
]
| no | - | [name](#input\_name) | ID element. Usually the component or solution name, e.g. 'app' or 'jenkins'.
This is the only ID element not also included as a `tag`.
The "name" tag is set to the full `id` string. There is no tag with the value of the `name` input. | `string` | `null` | no | - | [namespace](#input\_namespace) | ID element. Usually an abbreviation of your organization name, e.g. 'eg' or 'cp', to help ensure generated IDs are globally unique | `string` | `null` | no | - | [regex\_replace\_chars](#input\_regex\_replace\_chars) | Terraform regular expression (regex) string.
Characters matching the regex will be removed from the ID elements.
If not set, `"/[^a-zA-Z0-9-]/"` is used to remove all characters other than hyphens, letters and digits. | `string` | `null` | no | - | [region](#input\_region) | AWS Region | `string` | n/a | yes | - | [s3\_bucket\_kms\_arns](#input\_s3\_bucket\_kms\_arns) | List of KMS key ARNs for s3 bucket encryption | `list(string)` | `[]` | no | - | [s3\_buckets](#input\_s3\_buckets) | The names of S3 buckets to forward logs to Datadog | `list(string)` | `[]` | no | - | [s3\_buckets\_with\_prefixes](#input\_s3\_buckets\_with\_prefixes) | The names S3 buckets and prefix to forward logs to Datadog | `map(object({ bucket_name : string, bucket_prefix : string }))` | `{}` | no | - | [security\_group\_ids](#input\_security\_group\_ids) | List of security group IDs to use when the Lambda Function runs in a VPC | `list(string)` | `null` | no | - | [stage](#input\_stage) | ID element. Usually used to indicate role, e.g. 'prod', 'staging', 'source', 'build', 'test', 'deploy', 'release' | `string` | `null` | no | - | [subnet\_ids](#input\_subnet\_ids) | List of subnet IDs to use when deploying the Lambda Function in a VPC | `list(string)` | `null` | no | - | [tags](#input\_tags) | Additional tags (e.g. `{'BusinessUnit': 'XYZ'}`).
Neither the tag keys nor the tag values will be modified by this module. | `map(string)` | `{}` | no | - | [tenant](#input\_tenant) | ID element \_(Rarely used, not included by default)\_. A customer identifier, indicating who this instance of a resource is for | `string` | `null` | no | - | [tracing\_config\_mode](#input\_tracing\_config\_mode) | Can be either PassThrough or Active. If PassThrough, Lambda will only trace the request from an upstream service if it contains a tracing header with 'sampled=1'. If Active, Lambda will respect any tracing header it receives from an upstream service | `string` | `"PassThrough"` | no | - | [vpclogs\_cloudwatch\_log\_group](#input\_vpclogs\_cloudwatch\_log\_group) | The name of the CloudWatch Log Group for VPC flow logs | `string` | `null` | no | - - ## Outputs - - | Name | Description | - |------|-------------| - | [lambda\_forwarder\_log\_function\_arn](#output\_lambda\_forwarder\_log\_function\_arn) | Datadog Lambda forwarder CloudWatch/S3 function ARN | - | [lambda\_forwarder\_log\_function\_name](#output\_lambda\_forwarder\_log\_function\_name) | Datadog Lambda forwarder CloudWatch/S3 function name | - | [lambda\_forwarder\_rds\_enhanced\_monitoring\_function\_name](#output\_lambda\_forwarder\_rds\_enhanced\_monitoring\_function\_name) | Datadog Lambda forwarder RDS Enhanced Monitoring function name | - | [lambda\_forwarder\_rds\_function\_arn](#output\_lambda\_forwarder\_rds\_function\_arn) | Datadog Lambda forwarder RDS Enhanced Monitoring function ARN | - | [lambda\_forwarder\_vpc\_log\_function\_arn](#output\_lambda\_forwarder\_vpc\_log\_function\_arn) | Datadog Lambda forwarder VPC Flow Logs function ARN | - | [lambda\_forwarder\_vpc\_log\_function\_name](#output\_lambda\_forwarder\_vpc\_log\_function\_name) | Datadog Lambda forwarder VPC Flow Logs function name | - - - - ## References - - - Datadog's [documentation about provisioning keys](https://docs.datadoghq.com/account_management/api-app-keys) - - [cloudposse/terraform-aws-components](https://github.com/cloudposse/terraform-aws-components/tree/main/modules/datadog-lambda-forwarder) - - Cloud Posse's upstream component +references: + - name: "Datadog's documentation about provisioning keys" + description: "" + url: "https://docs.datadoghq.com/account_management/api-app-keys" + - name: "cloudposse/terraform-aws-components" + description: "Cloud Posse's upstream component" + url: "https://github.com/cloudposse/terraform-aws-components/tree/main/modules/datadog-lambda-forwarder" tags: - component/datadog-lambda-forwarder - layer/datadog diff --git a/atmos.yaml b/atmos.yaml new file mode 100644 index 0000000..481c199 --- /dev/null +++ b/atmos.yaml @@ -0,0 +1,11 @@ +# Atmos Configuration — powered by https://atmos.tools +# +# This configuration enables centralized, DRY, and consistent project scaffolding using Atmos. 
+# +# Included features: +# - Organizational custom commands: https://atmos.tools/core-concepts/custom-commands +# - Automated README generation: https://atmos.tools/cli/commands/docs/generate +# +# Import shared configuration used by all modules +import: + - https://raw.githubusercontent.com/cloudposse-terraform-components/.github/refs/heads/main/.github/atmos/terraform-component.yaml diff --git a/src/README.md b/src/README.md index 5a075bb..ebdd484 100644 --- a/src/README.md +++ b/src/README.md @@ -8,10 +8,9 @@ tags: # Component: `datadog-lambda-forwarder` -This component is responsible for provision all the necessary infrastructure to deploy -[Datadog Lambda forwarders](https://github.com/DataDog/datadog-serverless-functions/tree/master/aws/logs_monitoring). It -depends on the `datadog-configuration` component to get the Datadog API keys. - +This component provisions all infrastructure required to deploy +[Datadog Lambda forwarders](https://github.com/DataDog/datadog-serverless-functions/tree/master/aws/logs_monitoring). +It depends on the `datadog-configuration` component to obtain the Datadog API keys. ## Usage **Stack Level**: Regional @@ -51,10 +50,10 @@ components: filter_pattern: "" ``` -Note for other regions, you need to deploy the `datadog-configuration` component in the respective region - the datadog -configuration will be moving to a regional implementation. +Note for other regions: you need to deploy the `datadog-configuration` component in the respective region — the Datadog +configuration is moving to a regional implementation. -For example if you usually deploy to us-west-2 (and DD Configuration is `gbl`), deploy it to the new region and then +For example, if you usually deploy to `us-west-2` (and DD Configuration is `gbl`), deploy it to the new region and then deploy the lambda forwarder. 
```yaml @@ -77,8 +76,8 @@ components: datadog_configuration_environment: "use1" ``` - - + + ## Requirements | Name | Version | @@ -184,13 +183,19 @@ components: | [lambda\_forwarder\_rds\_function\_arn](#output\_lambda\_forwarder\_rds\_function\_arn) | Datadog Lambda forwarder RDS Enhanced Monitoring function ARN | | [lambda\_forwarder\_vpc\_log\_function\_arn](#output\_lambda\_forwarder\_vpc\_log\_function\_arn) | Datadog Lambda forwarder VPC Flow Logs function ARN | | [lambda\_forwarder\_vpc\_log\_function\_name](#output\_lambda\_forwarder\_vpc\_log\_function\_name) | Datadog Lambda forwarder VPC Flow Logs function name | - - + + + ## References -- Datadog's [documentation about provisioning keys](https://docs.datadoghq.com/account_management/api-app-keys) -- [cloudposse/terraform-aws-components](https://github.com/cloudposse/terraform-aws-components/tree/main/modules/datadog-lambda-forwarder) - - Cloud Posse's upstream component + +- [Datadog's documentation about provisioning keys](https://docs.datadoghq.com/account_management/api-app-keys) - + +- [cloudposse/terraform-aws-components](https://github.com/cloudposse/terraform-aws-components/tree/main/modules/datadog-lambda-forwarder) - Cloud Posse's upstream component + + + [](https://cpco.io/homepage?utm_source=github&utm_medium=readme&utm_campaign=cloudposse-terraform-components/aws-datadog-lambda-forwarder&utm_content=) + From d73d21d50a79d25a071130e683117be4856238f3 Mon Sep 17 00:00:00 2001 From: Igor Rodionov Date: Thu, 21 Aug 2025 23:36:23 +0300 Subject: [PATCH 2/2] Update README.yaml --- README.yaml | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/README.yaml b/README.yaml index e11c8a7..d81bb87 100644 --- a/README.yaml +++ b/README.yaml @@ -75,9 +75,9 @@ references: - name: "Datadog's documentation about provisioning keys" description: "" url: "https://docs.datadoghq.com/account_management/api-app-keys" - - name: "cloudposse/terraform-aws-components" + - name: cloudposse-terraform-components + url: https://github.com/orgs/cloudposse-terraform-components/repositories description: "Cloud Posse's upstream component" - url: "https://github.com/cloudposse/terraform-aws-components/tree/main/modules/datadog-lambda-forwarder" tags: - component/datadog-lambda-forwarder - layer/datadog