Commit 095d66b

Warn about input errors with object inputs, remove DNS section (#606)
1 parent: f6b441e

12 files changed: +144, -64 lines


.editorconfig

Lines changed: 5 additions & 0 deletions
@@ -11,3 +11,8 @@ indent_size = 4
 
 [*.sh]
 indent_style = tab
+
+[*.py]
+indent_style = space
+indent_size = 4
+

content/docs/fundamentals/introduction.md

Lines changed: 1 addition & 1 deletion
@@ -34,7 +34,7 @@ How does it differentiate from these solutions?
 
 1. It's 100% Open Source: SweetOps [is on GitHub](https://github.com/cloudposse) and is free to use with no strings attached under Apache 2.0.
 1. It's comprehensive: SweetOps is not only about Terraform. It provides patterns and conventions for building cloud native platforms that are security focused, Kubernetes-based, and driven by continuous delivery.
-1. It's community focused: SweetOps has [over 3400 users in Slack](https://sweetops.com/slack/), well-attended weekly office hours, and a [budding community forum](https://ask.sweetops.com/).
+1. It's community focused: SweetOps has [over 9000 users in Slack](https://sweetops.com/slack/), well-attended weekly office hours, and a [budding community forum](https://ask.sweetops.com/).
 
 
 ## How is this documentation structured?

content/docs/intro.md

Lines changed: 2 additions & 2 deletions
@@ -12,7 +12,7 @@ Start with getting familiar with the [geodesic](/reference/tools.mdx#geodesic).
 
 Get intimately familiar with docker inheritance and [multi-stage docker builds](/reference/best-practices/docker-best-practices.md#multi-stage-builds). We use this pattern extensively.
 
-Check out our [terraform-aws-components](https://github.com/cloudposse/terraform-aws-components) for reference architectures to easily provision infrastructure
+Check out our [terraform-aws-components](https://github.com/cloudposse/terraform-aws-components) for reference architectures to easily provision infrastructure.
 
 ## Tools
 
@@ -70,7 +70,7 @@ Review our [glossary](/category/glossary/) if there are any terms that are confu
 
 File issues anywhere you find the documentation lacking by going to our [docs repo](https://github.com/cloudposse/docs).
 
-Join our [Slack Community](https://cloudposse.com/slack/) and speak directly with the maintainers
+Join our [Slack Community](https://cloudposse.com/slack/) and speak directly with the maintainers.
 
 We provide "white glove" DevOps support. [Get in touch](/contact-us.md) with us today!
 
content/docs/reference/best-practices/docker-best-practices.md

Lines changed: 3 additions & 3 deletions
@@ -15,10 +15,10 @@ Try to leverage the same base image in as many of your images as possible for fa
 
 ## Multi-stage Builds
 
-There are two ways to leverage multi-stage builds.
+There are two ways to leverage multi-stage builds:
 
-1. *Build-time Environments* The most common application of multi-stage builds is for using a build-time environment for compiling apps, and then a minimal image (E.g. `alpine` or `scratch`) for distributing the resultant artifacts (e.g. statically-linked go binaries).
-2. *Multiple-Inheritance* We like to think of "multi-stage builds" as a mechanism for "multiple inheritance" as it relates to docker images. While not technically the same thing, using mult-stage images, it's possible `COPY --from=other-image` to keep things very DRY.
+1. *Build-time Environments* The most common application of multi-stage builds is for using a build-time environment for compiling apps, and then a minimal image (E.g. `alpine` or `scratch`) for distributing the resultant artifacts (e.g. statically-linked `go` binaries).
+2. *Multiple-Inheritance* We like to think of "multi-stage builds" as a mechanism for "multiple inheritance" as it relates to docker images. While not technically the same thing, using multi-stage images makes it possible to `COPY --from=other-image` to keep things very DRY.
 
 :::info
 - <https://docs.docker.com/develop/develop-images/multistage-build/>

content/docs/reference/best-practices/terraform-best-practices.md

Lines changed: 47 additions & 22 deletions
@@ -100,6 +100,23 @@ conversion errors, and allows for future expansion without breaking changes.
 Make as many fields as possible optional, provide defaults at every level of
 nesting, and use `nullable = false` if possible.
 
+:::caution Extra (or Misspelled) Fields in Object Inputs Will Be Silently Ignored
+
+If you use an object with defaults as an input, Terraform will not give any
+indication if the user provides extra fields in the input object. This is
+particularly a problem if they misspelled an optional field name, because
+the misspelled field will be silently ignored, and the default value the
+user intended to override will silently be used. This is
+[a limitation of Terraform](https://github.com/hashicorp/terraform/issues/29204#issuecomment-1989579801).
+Furthermore, there is no way to add any checks for this situation, because
+the input will have already been transformed (unexpected fields removed) by
+the time any validation code runs. This makes using an object a trade-off
+versus using separate inputs, which do not have this problem, or `type = any`
+which allows you to write validation code to catch this problem and
+additional code to supply defaults for missing fields.
+
+:::
+
 Reserve `type = any` for exceptional cases where the input is highly
 variable and/or complex, and the module is designed to handle it. For
 example, the configuration of a [Datadog synthetic test](https://registry.terraform.io/providers/DataDog/datadog/latest/docs/resources/synthetics_test)
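
To make the failure mode described in the new caution concrete, here is a minimal, hypothetical sketch (the variable, field names, and module path are invented for illustration):

```hcl
variable "eip_config" {
  type = object({
    enabled        = optional(bool, true)
    delete_timeout = optional(string, "3m")
  })
  default     = {}
  nullable    = false
  description = "Hypothetical example of an object input with optional fields."
}

# A caller that misspells an optional field gets no warning; the typo is
# silently dropped during type conversion and the default is used instead:
#
#   module "example" {
#     source     = "./modules/eip"   # hypothetical module path
#     eip_config = {
#       enabled         = true
#       delete_timeeout = "10m"      # typo: silently ignored, default "3m" applies
#     }
#   }
```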
@@ -117,8 +134,9 @@ often use a large number of input variables of simple types. This is because
 in the early development of Terraform, there was no good way to define
 complex objects with defaults. However, now that Terraform supports complex
 objects with field-level defaults, we recommend using a single object input
-variable with such defaults to group related configuration. This makes the
-interface easier to understand and use.
+variable with such defaults to group related configuration, taking into consideration
+the trade-offs listed in the [above caution](#use-objects-with-optional-fields-for-complex-inputs).
+This makes the interface easier to understand and use.
 
 For example, prefer:
 
@@ -151,6 +169,27 @@ variable "eip_delete_timeout" {
 }
 ```
 
+However, using an object with defaults versus multiple simple inputs is not
+without trade-offs, as explained in the [above caution](#use-objects-with-optional-fields-for-complex-inputs).
+
+
+There are a few ways to mitigate this problem besides using separate inputs:
+
+- If all the defaults are null or empty, you can use a `map(string)` input
+variable and use the `keys` function to check for unexpected fields. This
+catches errors, but has the drawback that it does not provide
+documentation of what fields are expected.
+- You can use `type = any` for inputs, but then you have to write the extra
+code to validate the input and supply defaults for missing fields. You
+should also document the expected fields in the input description.
+- If all you are worried about is misspelled field names, you can make the
+correctly spelled field names required, ensuring they are supplied.
+Alternatively, if the misspelling is predictable, such as you have a field
+named `minsize` but people are likely to try to supply `min_size`, you can
+make the misspelled field name optional with a sentinel value and then
+check for that value in the validation code.
+
+
 ### Use custom validators to enforce custom constraints
 
 Use the `validation` block to enforce custom constraints on input variables.
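
As a sketch of the first mitigation in the list above (a `map(string)` input whose keys are checked with `keys` inside a `validation` block), assuming hypothetical key names:

```hcl
variable "timeouts" {
  type        = map(string)
  default     = {}
  description = "Optional timeouts. Allowed keys: create, update, delete."

  validation {
    # Reject any key that is not in the documented allow-list.
    condition     = length(setsubtract(keys(var.timeouts), ["create", "update", "delete"])) == 0
    error_message = "Unexpected key(s) in `timeouts`. Allowed keys are: create, update, delete."
  }
}
```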
@@ -277,7 +316,12 @@ configuration to a separate template file.
 
 Linting helps to ensure a consistent code formatting, improves code quality and catches common errors with syntax.
 
-Run `terraform fmt` before committing all code. Use a `pre-commit` hook to do this automatically. See [Terraform Tips & Tricks](/reference/best-practices/terraform-tips-tricks.md)
+Run `terraform fmt` before committing all code. Use a `pre-commit` hook to
+do this automatically. See [Terraform Tips &
+Tricks](/reference/best-practices/terraform-tips-tricks.md)
+
+Consider using [`tflint`](https://github.com/terraform-linters/tflint) with
+the `aws` plugin.
+
 ### Use CIDR math interpolation functions for network calculations
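
For the `tflint` recommendation added in the hunk above, a minimal `.tflint.hcl` enabling the AWS ruleset might look like the following sketch (the pinned version is only a placeholder):

```hcl
# .tflint.hcl (sketch)
plugin "aws" {
  enabled = true
  version = "0.31.0" # placeholder: pin to a current tflint-ruleset-aws release
  source  = "github.com/terraform-linters/tflint-ruleset-aws"
}
```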

@@ -412,25 +456,6 @@ To enforce consistency, we require that all modules use the [`terraform-null-lab
 With this module, users have the ability to change the way resource names are generated such as by changing the order of parameters or the delimiter.
 While the module is opinionated on the parameters, it's proved invaluable as a mechanism for generating consistent resource names.
 
-## DNS Infrastructure
-
-### Use lots of DNS zones
-
-Never mingle DNS from different stages or environments in the same zone.
-
-### Delegate DNS zones across account boundaries
-
-Delegate each AWS account a DNS zone for which it is authoritative.
-
-### Distinguish between branded domains and service discovery domains
-
-Service discovery domains are what services use to discover each other. These are seldom if ever used by end-users. There should only
-be one service discovery domain, but there may be many zones delegated from that domain.
-
-Branded domains are the domains that users use to access the services. These are determined by products, marketing, and business use-cases.
-There may be many branded domains pointing to a single service discovery domain. The architecture of the branded domains won't mirror the
-service discovery domains.
-
 ## Module Design
 
 ### Small Opinionated Modules

content/docs/reference/terraform-in-depth/terraform-count-vs-for-each.md

Lines changed: 13 additions & 1 deletion
@@ -303,7 +303,7 @@ If you are unlucky (or if you run `terraform apply` 3 times), the change
 will go through, and user "Dick" will be renamed user "Tom", meaning that
 whatever access Dick had, Tom now gets. Likewise, user Dick is renamed Harry,
 getting Harry's access, and Harry get the newly created user. For example, Tom
-can now log in with user name Tom using Dick's password, while Harry will be
+can now log in with user name "Tom" using Dick's password, while Harry will be
 locked out as a new user. This nightmare scenario has a lot to do with
 peculiarities of the implementation of IAM principals, but gives you an idea
 of what can happen when you use `count` with a list of resource configurations.
@@ -322,6 +322,8 @@ affected. The answer to this is `for_each`, but that is not without its own
 limitations.
 :::
 
+### For Each is Stable, But Not Always Feasible to Use
+
 #### The Stability of `for_each`
 
 In large part to address the instability of `count`, Terraform introduced
@@ -399,6 +401,16 @@ this has all the same problems as `count`, in which case using `count` is
 better because it is simpler and all of the issues with `count` are already
 understood.
 
+::: note
+Another limitation, though not frequently encountered, is that "sensitive"
+values, such as sensitive input variables, sensitive outputs, or sensitive
+resource attributes, cannot be used as arguments to `for_each`. As stated
+previously, the value supplied to `for_each` is used as part of the resource
+address, and as such, it will always be disclosed in UI output, which is why
+sensitive values are not allowed. Attempts to use sensitive values as
+`for_each` arguments will result in an error.
+:::
+
 Ideally, as we saw with IAM users in the examples above, the user would
 supply static keys in the initial configuration, and then they would always
 be known and usable in `for_each`, while allowing the user to add or remove
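
A minimal sketch of the limitation described in the new note, using a hypothetical sensitive map and an illustrative resource:

```hcl
variable "service_api_keys" {
  type      = map(string)
  sensitive = true
}

resource "aws_ssm_parameter" "api_key" {
  # Terraform rejects this at plan time: sensitive values cannot be used
  # in for_each because each key becomes part of the visible resource address.
  for_each = var.service_api_keys

  name  = "/service/${each.key}/api-key"
  type  = "SecureString"
  value = each.value
}
```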

content/docs/tutorials/geodesic-getting-started.md

Lines changed: 4 additions & 4 deletions
@@ -27,9 +27,9 @@ Before we jump in, it's important to note that Geodesic is built around some adv
 
 Let's talk about a few of the ways that one can run Geodesic. Our toolbox has been built to satisfy many use-cases, and each result in a different pattern of invocation:
 
-1. You can **run standalone** Geodesic as a standard docker container using `docker run`. This enables you to get started quickly, to avoid fiddling with configuration or run one-off commands using some of the built-in tools.
+1. You can **run standalone** Geodesic as a standard docker container using `docker run`. This enables you to quickly use most of the built-in tools. (Some tools require installing the wrapper script first, as explained in the next step.)
 1. Example: `docker run -it --rm --volume $HOME:/localhost cloudposse/geodesic:latest-debian --login` opens a bash login shell (`--login` is our Docker `CMD` here; it's actually just [the arguments passed to the `bash` shell](https://www.gnu.org/software/bash/manual/html_node/Bash-Startup-Files.html) which is our `ENTRYPOINT`) in our Geodesic container.
-1. Example: `docker run -it --rm --volume $HOME:/localhost cloudposse/geodesic:latest-debian -c "terraform version"` executes the `terraform version` command as a one off and outputs the result.
+1. Example: `docker run --rm cloudposse/geodesic:latest-debian -c "terraform version"` executes the `terraform version` command as a one-off and outputs the result.
 1. You can **install** Geodesic onto your local machine using what we call the docker-bash pattern (e.g. `docker run ... | bash`). Similar to above, this enables a quickstart process but supports longer lived usage as it creates a callable script on your machine that enables reuse any time you want to start a shell.
 1. Example: `docker run --rm cloudposse/geodesic:latest-debian init | bash -s latest-debian` installs `/usr/local/bin/geodesic` on your local machine which you can execute repeatedly via simply typing `geodesic`. In this example, we're pinning the script to use the `cloudposse/geodesic:latest-debian` docker image, but we could also pin to our own image or to a specific version.
 1. Lastly, you can **build your own toolbox** on top of Geodesic. This is what SweetOps generally recommends to practitioners. We do this when we want to provide additional packages or customization to our team while building on the foundation that geodesic provides. This is simple to do by using Geodesic as your base image (e.g. `FROM cloudposse/geodesic:latest-debian`) in your own `Dockerfile`, adding your own Docker `RUN` commands or overriding environment variables, and then using `docker build` to create a new image that you distribute to your team. This is more advanced usage and we'll cover how to do this in a future how-to article.
@@ -90,9 +90,9 @@ terraform init
 terraform apply -auto-approve
 ```
 
-Sweet, you should see a successful `terraform apply` with some detailed `output` info on the original star wars hero! 😎
+Sweet, you should see a successful `terraform apply` with some detailed `output` data on the original star wars hero! 😎
 
-Just to show some simple usage of another tool in the toolbox, how about we pull apart that info and get that hero's name?
+Just to show some simple usage of another tool in the toolbox, how about we parse that data and get that hero's name?
 
 ### 4. Read some data from our Outputs
 
scripts/docs-collator/AbstractFetcher.py

Lines changed: 2 additions & 2 deletions
@@ -19,8 +19,8 @@ def __init__(self, github_provider, download_dir):
     def _fetch_readme_yaml(self, repo, module_download_dir):
         self.github_provider.fetch_file(repo, README_YAML, module_download_dir)
 
-    def _fetch_docs(self, repo, module_download_dir):
-        remote_files = self.github_provider.list_repo_dir(repo, DOCS_DIR)
+    def _fetch_docs(self, repo, module_download_dir, submodule_dir=""):
+        remote_files = self.github_provider.list_repo_dir(repo, os.path.join(submodule_dir, DOCS_DIR))
 
         for remote_file in remote_files:
             if os.path.basename(remote_file) == TARGETS_MD:  # skip targets.md

scripts/docs-collator/AbstractRenderer.py

Lines changed: 10 additions & 7 deletions
@@ -15,20 +15,23 @@ def __init__(self, message):
 
 
 class AbstractRenderer:
-    def _pre_rendering_fixes(self, repo, module_download_dir):
-        readme_yaml_file = os.path.join(module_download_dir, README_YAML)
+    def _pre_rendering_fixes(self, repo, module_download_dir, submodule_dir=""):
+        readme_yaml_file = os.path.join(module_download_dir, submodule_dir, README_YAML)
         content = io.read_file_to_string(readme_yaml_file)
         content = rendering.remove_targets_md(content)
-        content = rendering.rename_name(repo, content)
+        if submodule_dir == "":
+            content = rendering.rename_name(repo.name, content)
+        else:
+            content = rendering.rename_name("pre-fix-" + os.path.basename(submodule_dir), content)
         io.save_string_to_file(readme_yaml_file, content)
 
-    def _post_rendering_fixes(self, repo, readme_md_file):
+    def _post_rendering_fixes(self, repo, readme_md_file, submodule_dir=""):
         content = io.read_file_to_string(readme_md_file)
         content = rendering.fix_self_non_closing_br_tags(content)
         content = rendering.fix_custom_non_self_closing_tags_in_pre(content)
-        content = rendering.fix_github_edit_url(content, repo)
-        content = rendering.fix_sidebar_label(content, repo)
-        content = rendering.replace_relative_links_with_github_links(repo, content)
+        content = rendering.fix_github_edit_url(content, repo, submodule_dir)
+        content = rendering.fix_sidebar_label(content, repo, os.path.basename(submodule_dir))
+        content = rendering.replace_relative_links_with_github_links(repo, content, submodule_dir)
        io.save_string_to_file(readme_md_file, content)
 
     def _copy_extra_resources_for_docs(self, module_download_dir, module_docs_dir):

scripts/docs-collator/ModuleFetcher.py

Lines changed: 13 additions & 4 deletions
@@ -40,9 +40,18 @@ def __fetch_images(self, repo, module_download_dir):
 
     def __fetch_submodules(self, repo, module_download_dir):
         remote_files = self.github_provider.list_repo_dir(repo, SUBMODULES_DIR)
+        readme_files = {}
 
         for remote_file in remote_files:
-            if os.path.basename(remote_file) != README_MD:
-                continue
-
-            self.github_provider.fetch_file(repo, remote_file, module_download_dir)
+            base_name = os.path.basename(remote_file)
+            dir_name = os.path.dirname(remote_file)
+
+            if base_name == README_YAML:
+                readme_files[dir_name] = remote_file
+            elif base_name == README_MD and dir_name not in readme_files:
+                readme_files[dir_name] = remote_file
+
+        for readme_file in readme_files.values():
+            self.github_provider.fetch_file(repo, readme_file, module_download_dir)
+            if os.path.basename(readme_file) == README_YAML:
+                self._fetch_docs(repo, module_download_dir, submodule_dir=os.path.dirname(readme_file))
