Commit 59c1be2

Merge branch 'master' into remove-top-nav
2 parents 28335d3 + 12656ce
File tree: 11 files changed, +300 −5 lines
Lines changed: 65 additions & 0 deletions

```yaml
name: Update Persistence Docs
on:
  schedule:
    - cron: 0 5 * * MON
  workflow_dispatch:
    inputs:
      targetBranch:
        required: false
        type: string
        default: 'master'

jobs:
  update-persistence-docs:
    name: Update Parity Docs
    runs-on: ubuntu-latest
    steps:
      - name: Checkout docs
        uses: actions/checkout@v4
        with:
          fetch-depth: 0
          path: docs
          ref: ${{ github.event.inputs.targetBranch || 'master' }}

      - name: Set up Python 3.11
        id: setup-python
        uses: actions/setup-python@v5
        with:
          python-version: "3.11"

      - name: Update Coverage Docs with Persistence Coverage
        working-directory: docs
        run: |
          cd scripts/persistence
          python3 -m venv .venv
          source .venv/bin/activate
          pip3 install -r requirements.txt
          python3 create_persistence_docs.py
        env:
          NOTION_TOKEN: ${{ secrets.NOTION_TOKEN }}

      - name: Check for changes
        id: check-for-changes
        working-directory: docs
        run: |
          # Check whether any files changed, against the PR branch if it exists,
          # otherwise against the target branch.
          # Store the result in resources/diff-check.log and the diff count in
          # the GitHub Actions output "diff-count".
          mkdir -p resources
          (git diff --name-only origin/persistence-auto-updates src/data/persistence/ 2>/dev/null || git diff --name-only origin/${{ github.event.inputs.targetBranch || 'master' }} src/data/persistence/ 2>/dev/null) | tee -a resources/diff-check.log
          echo "diff-count=$(wc -l < resources/diff-check.log)" >> $GITHUB_OUTPUT
          cat resources/diff-check.log

      - name: Create PR
        uses: peter-evans/create-pull-request@v7
        if: ${{ success() && steps.check-for-changes.outputs.diff-count != '0' && steps.check-for-changes.outputs.diff-count != '' }}
        with:
          path: docs
          title: "Update Persistence Docs"
          body: "Updating Persistence Coverage Documentation based on the [Persistence Catalog](https://www.notion.so/localstack/Persistence-Catalog-a9e0e5cb89df4784adb4a1ed377b3c23) on Notion."
          branch: "persistence-auto-updates"
          author: "LocalStack Bot <[email protected]>"
          committer: "LocalStack Bot <[email protected]>"
          commit-message: "update generated persistence docs"
          token: ${{ secrets.PRO_ACCESS_TOKEN }}
          reviewers: giograno
```
Lines changed: 50 additions & 0 deletions

```yaml
name: Update function coverage docs

on:
  schedule:
    # “At 00:00 on Sunday.”
    - cron: "0 0 * * 0"
  workflow_dispatch:

jobs:
  update-function-coverage:
    name: Update function coverage docs
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v4
        with:
          path: localstack-docs

      - name: Setup Python
        uses: actions/setup-python@v4
        with:
          python-version: '3.10'

      - name: Checkout private tools
        uses: actions/checkout@v4
        with:
          repository: localstack/snowflake
          path: snowflake
          token: ${{ secrets.GH_TOKEN }}

      - name: Run the script
        run: |
          cd localstack-docs
          pip install localstack lxml requests
          python ../snowflake/etc/coverage.py

      - name: Move the generated files
        run: |
          cd localstack-docs
          mv coverage-features.md src/content/docs/snowflake/features/index.md
          mv coverage-functions.md src/content/docs/snowflake/sql-functions.md

      - name: Commit changes
        uses: EndBug/add-and-commit@v9
        with:
          author_name: 'LocalStack Bot'
          author_email: [email protected]
          message: 'Updated function coverage docs'
          cwd: localstack-docs
          add: 'src/content/'
```

.gitignore

Lines changed: 10 additions & 1 deletion

```diff
@@ -20,4 +20,13 @@ pnpm-debug.log*
 # macOS-specific files
 .DS_Store
 .kiro
-.vscode
+.vscode
+
+# Python
+.venv/
+__pycache__/
+*.pyc
+*.pyo
+*.pyd
+*.pyw
+*.pyz
```
4.21 KB (binary file not shown)
Lines changed: 138 additions & 0 deletions

```python
import os
import json
from io import StringIO
from pathlib import Path

import notion_client as n_client
import frontmatter
from ruamel.yaml import YAML
from frontmatter.default_handlers import YAMLHandler, DEFAULT_POST_TEMPLATE

from notion.catalog import PersistenceCatalog

token = os.getenv("NOTION_TOKEN")
markdown_path = "../../src/content/docs/aws/services"
persistence_path = "../../src/data/persistence"
persistence_data = os.path.join(persistence_path, "coverage.json")


class CustomYAMLHandler(YAMLHandler):

    def load(self, fm: str, **kwargs: object):
        yaml = YAML()
        yaml.default_flow_style = False
        yaml.preserve_quotes = True
        return yaml.load(fm, **kwargs)  # type: ignore[arg-type]

    def export(self, metadata: dict[str, object], **kwargs: object) -> str:
        yaml = YAML()
        yaml.default_flow_style = False
        stream = StringIO()
        yaml.dump(metadata, stream)
        return stream.getvalue().rstrip()

    def format(self, post, **kwargs):
        """Simple customization to avoid removing the last line."""
        start_delimiter = kwargs.pop("start_delimiter", self.START_DELIMITER)
        end_delimiter = kwargs.pop("end_delimiter", self.END_DELIMITER)

        metadata = self.export(post.metadata, **kwargs)

        return DEFAULT_POST_TEMPLATE.format(
            metadata=metadata,
            content=post.content,
            start_delimiter=start_delimiter,
            end_delimiter=end_delimiter,
        ).lstrip()


def lookup_full_name(shortname: str) -> str:
    """Given the short default name of a service, look up its full name."""
    service_lookup = Path("../../src/data/coverage/service_display_name.json")
    service_info = {}
    if service_lookup.exists() and service_lookup.is_file():
        with open(service_lookup, "r") as f:
            service_info = json.load(f)

    service_name_title = shortname

    if service_name_details := service_info.get(shortname, {}):
        service_name_title = service_name_details.get("long_name", shortname)
        if service_name_title and (short_name := service_name_details.get("short_name")):
            service_name_title = f"{short_name} ({service_name_title})"
    return service_name_title


def collect_status() -> dict:
    """Reads the catalog on Notion and returns the persistence status for each service"""
    if not token:
        raise SystemExit("Aborting, please provide a NOTION_TOKEN in the env")
    notion_client = n_client.Client(auth=token)

    catalog_db = PersistenceCatalog(notion_client=notion_client)
    statuses = {}
    for item in catalog_db:
        # we do not want some services to be mentioned in the docs (for instance, not yet released)
        if item.exclude:
            continue

        # skip entries with empty or placeholder names
        if not item.name or not item.name.strip():
            continue

        # skip template/placeholder entries
        if item.name.strip().lower() in ['new service page', 'template', 'placeholder']:
            continue

        service = item.name.replace('_', '-')
        status = item.status.lower()
        statuses[service] = {
            "service": service,
            "full_name": lookup_full_name(service),
            "support": status,
            "test_suite": item.has_test or False,
            # we collect limitations notes only for the services explicitly marked with limitations
            "limitations": item.limitations if "limit" in status else ""
        }
    statuses = dict(sorted(statuses.items()))

    # save the data
    if not os.path.exists(persistence_path):
        os.mkdir(persistence_path)
    with open(persistence_data, 'w') as f:
        json.dump(statuses, f, indent=2)
    return statuses


def update_frontmatter(statuses: dict):
    """Updates the frontmatter of each service page in the user guide Markdown files"""
    for service, values in statuses.items():

        # a bunch of special cases
        if "cognito" in service:
            service = "cognito"
        if service == "kafka":
            service = "msk"

        _path = os.path.join(markdown_path, f"{service}.mdx")
        if not os.path.exists(_path):
            continue

        support_value = values.get("support")
        is_supported = support_value in ("supported", "supported with limitations")
        if not is_supported:
            # we don't want to modify the frontmatter for services that do not support persistence
            continue

        # open the markdown file and read the content
        content = frontmatter.load(_path, handler=CustomYAMLHandler())
        desc = content.metadata["description"]
        content.metadata["description"] = desc.strip()
        content.metadata["persistence"] = values.get("support", "unknown")
        frontmatter.dump(content, _path)


if __name__ == "__main__":
    data = collect_status()
    update_frontmatter(statuses=data)
```
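The display-name lookup in `lookup_full_name` falls back through three cases: `"SHORT (Long Name)"` when both names are known, the plain long name when only that is known, and the input itself otherwise. A self-contained sketch of that fallback chain (the JSON shape follows the script; the sample entries are made up, not taken from the real `service_display_name.json`):

```python
import json

# hypothetical contents of service_display_name.json
SERVICE_INFO = json.loads("""
{
  "sqs": {"long_name": "Simple Queue Service", "short_name": "SQS"},
  "lambda": {"long_name": "Lambda"}
}
""")


def lookup_full_name(shortname: str, service_info: dict = SERVICE_INFO) -> str:
    """Same fallback chain as the script, with the lookup table injected."""
    title = shortname
    if details := service_info.get(shortname, {}):
        title = details.get("long_name", shortname)
        if title and (short := details.get("short_name")):
            title = f"{short} ({title})"
    return title
```

For example, `lookup_full_name("sqs")` yields `"SQS (Simple Queue Service)"`, while an unknown key is returned unchanged.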

scripts/persistence/notion/__init__.py

Whitespace-only changes.
Lines changed: 30 additions & 0 deletions

```python
"""Models for the notion service catalog https://www.notion.so/localstack/3c0f615e7ffc4ae2a034f1ed9c444bd2"""

from notion_client import Client as NotionClient

from notion_objects import (
    Page,
    TitlePlainText,
    Status,
    Database,
    Checkbox,
    PeopleProperty,
    Text
)

DEFAULT_CATALOG_DATABASE_ID = "3c0f615e7ffc4ae2a034f1ed9c444bd2"


class PersistenceServiceItem(Page):
    name = TitlePlainText("Name")
    status = Status("Persistence")
    has_test = Checkbox("Persistence Tests")
    primary_owner = PeopleProperty("Primary Owner")
    secondary_owner = PeopleProperty("Secondary Owner(s)")
    limitations = Text("Limitations (synced with docs)")
    exclude = Checkbox("Exclude from docs")


class PersistenceCatalog(Database[PersistenceServiceItem]):
    def __init__(self, notion_client: NotionClient, database_id: str | None = None):
        super().__init__(PersistenceServiceItem, database_id or DEFAULT_CATALOG_DATABASE_ID, notion_client)
```
Original file line numberDiff line numberDiff line change
@@ -0,0 +1,4 @@
1+
notion-client==2.2.1
2+
notion-objects==0.6.2
3+
python-frontmatter==1.1.0
4+
ruamel.yaml==0.18.6

src/content/docs/aws/services/pinpoint.mdx

Lines changed: 0 additions & 1 deletion

```diff
@@ -3,7 +3,6 @@ title: "Pinpoint"
 description: Get started with Pinpoint on LocalStack
 tags: ["Ultimate"]
 persistence: supported
-
 ---

 import FeatureCoverage from "../../../../components/feature-coverage/FeatureCoverage";
```

src/content/docs/snowflake/features/stages.mdx

Lines changed: 2 additions & 2 deletions

````diff
@@ -18,7 +18,7 @@ In this guide, you will create a database and a table for storing data. You will

 ### Download the sample data

-You can download the sample data by [right-clicking on this link](./getting-started.zip) and downloading this in your machine. Unzip the file and save the contents to a directory on your local machine.
+You can download the sample data by [clicking on this link](/artifacts/getting-started.zip) and saving it to your machine. Unzip the file and save the contents to a directory on your local machine.

 ### Create a database & table

@@ -87,7 +87,7 @@ The expected output is:

 ## Loading files from S3

-You can also load data from an S3 bucket using the `CREATE STAGE` command. Create a new S3 bucket named `testbucket` and upload the [employees CSV files](./getting-started.zip) to the bucket. You can use LocalStack's `awslocal` CLI to create the S3 bucket and upload the files.
+You can also load data from an S3 bucket using the `CREATE STAGE` command. Create a new S3 bucket named `testbucket` and upload the [employees CSV files](/artifacts/getting-started.zip) to the bucket. You can use LocalStack's `awslocal` CLI to create the S3 bucket and upload the files.

 ```bash
 awslocal s3 mb s3://testbucket
````
