Commit 00ea70f

Merge branch 'main' into feat-49796-azure_virtual_machines_operator

2 parents 2244400 + 19fdbe4

File tree

360 files changed: +6462 −2866 lines


.asf.yaml

Lines changed: 1 addition & 0 deletions

@@ -171,6 +171,7 @@ github:
     - gyli
     - jroachgolf84
     - Dev-iL
+    - kacpermuda

 notifications:
   jobs: jobs@airflow.apache.org

.codespellignorelines

Lines changed: 1 addition & 0 deletions

@@ -4,3 +4,4 @@
 The platform supports **C**reate, **R**ead, **U**pdate, and **D**elete operations on most resources.
 <pre><code>Code block\ndoes not\nrespect\nnewlines\n</code></pre>
 "trough",
+assert "task_instance_id" in route.dependant.path_param_names, (
Lines changed: 266 additions & 0 deletions

@@ -0,0 +1,266 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements. See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership. The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License. You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied. See the License for the
+# specific language governing permissions and limitations
+# under the License.
+#
+---
+name: Registry Backfill
+on:  # yamllint disable-line rule:truthy
+  workflow_dispatch:
+    inputs:
+      destination:
+        description: >
+          Publish to live or staging S3 bucket
+        required: true
+        type: choice
+        options:
+          - staging
+          - live
+        default: staging
+      providers:
+        description: >
+          Space-separated provider IDs
+          (e.g. 'amazon google databricks')
+        required: true
+        type: string
+      versions:
+        description: >
+          Space-separated versions to backfill
+          (e.g. '9.15.0 9.14.0'). Applied to ALL providers.
+        required: true
+        type: string
+
+permissions:
+  contents: read
+
+jobs:
+  prepare:
+    runs-on: ubuntu-latest
+    outputs:
+      matrix: ${{ steps.matrix.outputs.matrix }}
+      bucket: ${{ steps.destination.outputs.bucket }}
+    steps:
+      - name: "Build provider matrix"
+        id: matrix
+        env:
+          PROVIDERS: ${{ inputs.providers }}
+        run: |
+          MATRIX=$(echo "${PROVIDERS}" \
+            | tr ' ' '\n' | jq -R . \
+            | jq -cs '{"provider": .}')
+          echo "matrix=${MATRIX}" >> "${GITHUB_OUTPUT}"
+
+      - name: "Determine S3 destination"
+        id: destination
+        env:
+          DESTINATION: ${{ inputs.destination }}
+        run: |
+          if [[ "${DESTINATION}" == "live" ]]; then
+            URL="s3://live-docs-airflow-apache-org"
+          else
+            URL="s3://staging-docs-airflow-apache-org"
+          fi
+          echo "bucket=${URL}/registry/" \
+            >> "${GITHUB_OUTPUT}"
+
+  backfill:
+    needs: prepare
+    runs-on: ubuntu-latest
+    timeout-minutes: 60
+    strategy:
+      fail-fast: false
+      matrix: ${{ fromJSON(needs.prepare.outputs.matrix) }}
+    name: "Backfill ${{ matrix.provider }}"
+    if: >
+      contains(fromJSON('[
+        "ashb",
+        "bugraoz93",
+        "eladkal",
+        "ephraimbuddy",
+        "jedcunningham",
+        "jscheffl",
+        "kaxil",
+        "pierrejeambrun",
+        "shahar1",
+        "potiuk",
+        "utkarsharma2",
+        "vincbeck"
+      ]'), github.event.sender.login)
+    steps:
+      - name: "Checkout repository"
+        uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd  # v6.0.2
+        with:
+          persist-credentials: false
+          fetch-depth: 0
+
+      - name: "Fetch provider tags"
+        env:
+          VERSIONS: ${{ inputs.versions }}
+          PROVIDER: ${{ matrix.provider }}
+        run: |
+          for VERSION in ${VERSIONS}; do
+            TAG="providers-${PROVIDER}/${VERSION}"
+            echo "Fetching tag: ${TAG}"
+            git fetch origin tag "${TAG}" \
+              2>/dev/null || echo "Tag not found"
+          done
+
+      - name: "Install uv"
+        uses: astral-sh/setup-uv@bd01e18f51369d5765a7df3681d34498e332e27e  # v6.3.1
+
+      - name: "Install Breeze"
+        uses: ./.github/actions/breeze
+        with:
+          python-version: "3.12"
+
+      - name: "Install AWS CLI v2"
+        run: |
+          curl -sSf \
+            "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" \
+            -o /tmp/awscliv2.zip
+          unzip -q /tmp/awscliv2.zip -d /tmp
+          rm /tmp/awscliv2.zip
+          sudo /tmp/aws/install --update
+          rm -rf /tmp/aws/
+
+      - name: "Configure AWS credentials"
+        uses: aws-actions/configure-aws-credentials@8df5847569e6427dd6c4fb1cf565c83acfa8afa7  # v6.0.0
+        with:
+          aws-access-key-id: ${{ secrets.DOCS_AWS_ACCESS_KEY_ID }}
+          aws-secret-access-key: ${{ secrets.DOCS_AWS_SECRET_ACCESS_KEY }}
+          aws-region: us-east-2
+
+      - name: "Download existing providers.json"
+        env:
+          S3_BUCKET: ${{ needs.prepare.outputs.bucket }}
+        run: |
+          aws s3 cp \
+            "${S3_BUCKET}api/providers.json" \
+            dev/registry/providers.json || true
+
+      - name: "Extract version metadata from git tags"
+        env:
+          VERSIONS: ${{ inputs.versions }}
+          PROVIDER: ${{ matrix.provider }}
+        run: |
+          VERSION_ARGS=""
+          for VERSION in ${VERSIONS}; do
+            VERSION_ARGS="${VERSION_ARGS} --version ${VERSION}"
+          done
+          uv run python dev/registry/extract_versions.py \
+            --provider "${PROVIDER}" ${VERSION_ARGS} || true
+
+      - name: "Run breeze registry backfill"
+        env:
+          VERSIONS: ${{ inputs.versions }}
+          PROVIDER: ${{ matrix.provider }}
+        run: |
+          VERSION_ARGS=""
+          for VERSION in ${VERSIONS}; do
+            VERSION_ARGS="${VERSION_ARGS} --version ${VERSION}"
+          done
+          breeze registry backfill \
+            --provider "${PROVIDER}" ${VERSION_ARGS}
+
+      - name: "Download data files from S3 for build"
+        env:
+          S3_BUCKET: ${{ needs.prepare.outputs.bucket }}
+        run: |
+          aws s3 cp \
+            "${S3_BUCKET}api/providers.json" \
+            registry/src/_data/providers.json
+          aws s3 cp \
+            "${S3_BUCKET}api/modules.json" \
+            registry/src/_data/modules.json
+
+      - name: "Setup pnpm"
+        uses: pnpm/action-setup@41ff72655975bd51cab0327fa583b6e92b6d3061  # v4.2.0
+        with:
+          version: 9
+
+      - name: "Setup Node.js"
+        uses: actions/setup-node@53b83947a5a98c8d113130e565377fae1a50d02f  # v6.3.0
+        with:
+          node-version: 20
+          cache: 'pnpm'
+          cache-dependency-path: 'registry/pnpm-lock.yaml'
+
+      - name: "Install Node.js dependencies"
+        working-directory: registry
+        run: pnpm install --frozen-lockfile
+
+      - name: "Build registry site"
+        working-directory: registry
+        env:
+          REGISTRY_PATH_PREFIX: "/registry/"
+        run: pnpm build
+
+      - name: "Sync backfilled version pages to S3"
+        env:
+          S3_BUCKET: ${{ needs.prepare.outputs.bucket }}
+          CACHE_CONTROL: "public, max-age=300"
+          VERSIONS: ${{ inputs.versions }}
+          PROVIDER: ${{ matrix.provider }}
+        run: |
+          for VERSION in ${VERSIONS}; do
+            echo "Syncing ${PROVIDER}/${VERSION}..."
+            aws s3 sync \
+              "registry/_site/providers/${PROVIDER}/${VERSION}/" \
+              "${S3_BUCKET}providers/${PROVIDER}/${VERSION}/" \
+              --cache-control "${CACHE_CONTROL}"
+            aws s3 sync \
+              "registry/_site/api/providers/${PROVIDER}/${VERSION}/" \
+              "${S3_BUCKET}api/providers/${PROVIDER}/${VERSION}/" \
+              --cache-control "${CACHE_CONTROL}"
+          done
+
+  publish-versions:
+    needs: [prepare, backfill]
+    runs-on: ubuntu-latest
+    name: "Publish versions.json"
+    steps:
+      - name: "Checkout repository"
+        uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd  # v6.0.2
+        with:
+          persist-credentials: false
+
+      - name: "Install Breeze"
+        uses: ./.github/actions/breeze
+        with:
+          python-version: "3.12"
+
+      - name: "Install AWS CLI v2"
+        run: |
+          curl -sSf \
+            "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" \
+            -o /tmp/awscliv2.zip
+          unzip -q /tmp/awscliv2.zip -d /tmp
+          rm /tmp/awscliv2.zip
+          sudo /tmp/aws/install --update
+          rm -rf /tmp/aws/
+
+      - name: "Configure AWS credentials"
+        uses: aws-actions/configure-aws-credentials@8df5847569e6427dd6c4fb1cf565c83acfa8afa7  # v6.0.0
+        with:
+          aws-access-key-id: ${{ secrets.DOCS_AWS_ACCESS_KEY_ID }}
+          aws-secret-access-key: ${{ secrets.DOCS_AWS_SECRET_ACCESS_KEY }}
+          aws-region: us-east-2
+
+      - name: "Publish version metadata"
+        env:
+          S3_BUCKET: ${{ needs.prepare.outputs.bucket }}
+        run: >
+          breeze registry publish-versions
+          --s3-bucket "${S3_BUCKET}"
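The "Build provider matrix" step in the workflow above converts the space-separated `providers` input into a JSON matrix using jq. Run standalone (assuming jq is installed; the `PROVIDERS` value here is just an illustrative input), the pipeline behaves like this:

```shell
# Standalone run of the matrix-building pipeline from the workflow:
# split on spaces, quote each token as a JSON string, slurp into an array.
PROVIDERS='amazon google databricks'
MATRIX=$(echo "${PROVIDERS}" | tr ' ' '\n' | jq -R . | jq -cs '{"provider": .}')
echo "${MATRIX}"
# → {"provider":["amazon","google","databricks"]}
```

The resulting object is what `fromJSON(...)` in the `backfill` job's `strategy.matrix` expands into one job per provider.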

.github/workflows/registry-build.yml

Lines changed: 8 additions & 1 deletion

@@ -239,7 +239,14 @@ jobs:
           S3_BUCKET: ${{ steps.destination.outputs.bucket }}
         run: |
           aws s3 sync registry/_site/ "${S3_BUCKET}" \
-            --cache-control "${REGISTRY_CACHE_CONTROL}"
+            --cache-control "${REGISTRY_CACHE_CONTROL}" \
+            --exclude "pagefind/*"
+          # Pagefind generates content-hashed filenames (e.g. en_181da6f.pf_index).
+          # Each rebuild produces new hashes, so --delete is needed to remove stale
+          # index files. This is separate from the main sync which intentionally
+          # omits --delete to preserve files written by other steps (publish-versions).
+          aws s3 sync registry/_site/pagefind/ "${S3_BUCKET}pagefind/" \
+            --cache-control "${REGISTRY_CACHE_CONTROL}" --delete

       - name: "Publish version metadata"
         env:
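The comment in the hunk above can be illustrated with a toy model: content-hashed filenames change on every rebuild, so a copy-only sync accumulates stale files unless deletion is enabled. A minimal Python sketch (the `sync` helper and the second hash are hypothetical stand-ins, not the AWS CLI):

```python
def sync(source: dict, dest: dict, delete: bool = False) -> dict:
    """Toy model of `aws s3 sync`: copy source keys into dest;
    with delete=True, also remove dest keys missing from source."""
    result = dict(dest)
    result.update(source)
    if delete:
        result = {k: v for k, v in result.items() if k in source}
    return result

# Build 1 produced one hashed index file; build 2 produced a different hash.
build1 = {"pagefind/en_181da6f.pf_index": b"v1"}
build2 = {"pagefind/en_9c2ab44.pf_index": b"v2"}  # hypothetical new hash

bucket = sync(build1, {})
bucket_no_delete = sync(build2, bucket)           # stale v1 index survives
bucket_with_delete = sync(build2, bucket, True)   # stale v1 index removed

assert len(bucket_no_delete) == 2
assert list(bucket_with_delete) == ["pagefind/en_9c2ab44.pf_index"]
```

This is why the pagefind prefix gets its own sync with `--delete` while the main sync leaves existing objects alone.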

Dockerfile

Lines changed: 1 addition & 1 deletion

@@ -1201,7 +1201,7 @@ function install_from_external_spec() {
         installation_command_flags="apache-airflow[${AIRFLOW_EXTRAS}]${AIRFLOW_VERSION_SPECIFICATION}"
     else
         echo
-        echo "${COLOR_RED}The '${INSTALLATION_METHOD}' installation method is not supported${COLOR_RESET}"
+        echo "${COLOR_RED}The '${AIRFLOW_INSTALLATION_METHOD}' installation method is not supported${COLOR_RESET}"
         echo
         echo "${COLOR_YELLOW}Supported methods are ('.', 'apache-airflow')${COLOR_RESET}"
         echo
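The one-line fix above (applied identically to Dockerfile.ci below) corrects a classic shell pitfall: the error message interpolated a variable name that is never set, and an unset variable silently expands to an empty string. A minimal reproduction (the value is hypothetical):

```shell
AIRFLOW_INSTALLATION_METHOD="helm-chart"
# Bug: ${INSTALLATION_METHOD} was never set, so it expands to nothing.
echo "The '${INSTALLATION_METHOD}' installation method is not supported"
# Fix: the correct variable name prints the actual value.
echo "The '${AIRFLOW_INSTALLATION_METHOD}' installation method is not supported"
```

Before the fix the user saw `The '' installation method is not supported`, which hides the very value the message is meant to report.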

Dockerfile.ci

Lines changed: 1 addition & 1 deletion

@@ -936,7 +936,7 @@ function install_from_external_spec() {
         installation_command_flags="apache-airflow[${AIRFLOW_EXTRAS}]${AIRFLOW_VERSION_SPECIFICATION}"
     else
         echo
-        echo "${COLOR_RED}The '${INSTALLATION_METHOD}' installation method is not supported${COLOR_RESET}"
+        echo "${COLOR_RED}The '${AIRFLOW_INSTALLATION_METHOD}' installation method is not supported${COLOR_RESET}"
         echo
         echo "${COLOR_YELLOW}Supported methods are ('.', 'apache-airflow')${COLOR_RESET}"
         echo

airflow-core/docs/administration-and-deployment/listeners.rst

Lines changed: 1 addition & 1 deletion

@@ -24,7 +24,7 @@ You can write listeners to enable Airflow to notify you when events happen.
 .. warning::

     Listeners are an advanced feature of Airflow. They are not isolated from the Airflow components they run in, and
-    can slow down or in come cases take down your Airflow instance. As such, extra care should be taken when writing listeners.
+    can slow down or in some cases take down your Airflow instance. As such, extra care should be taken when writing listeners.

 Airflow supports notifications for the following events:

airflow-core/docs/administration-and-deployment/logging-monitoring/callbacks.rst

Lines changed: 10 additions & 0 deletions

@@ -163,3 +163,13 @@ Here's an example of using a custom notifier:

 For a list of community-managed Notifiers, see :doc:`apache-airflow-providers:core-extensions/notifications`.
 For more information on writing a custom Notifier, see the :doc:`Notifiers <../../howto/notifications>` how-to page.
+
+Deadline Alert Callbacks
+^^^^^^^^^^^^^^^^^^^^^^^^
+
+In addition to the Dag/task lifecycle callbacks above, Airflow supports **Deadline Alert** callbacks which
+trigger when a Dag run exceeds a configured time threshold. Deadline Alert callbacks use
+:class:`~airflow.sdk.AsyncCallback` (runs in the Triggerer) or :class:`~airflow.sdk.SyncCallback`
+(runs in the executor) and are configured on the Dag via the ``deadline`` parameter.
+
+For full details, see :doc:`/howto/deadline-alerts`.
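The doc addition above describes callbacks that fire when a run exceeds a time threshold. Stripped of Airflow specifics, the mechanism reduces to comparing elapsed time against a deadline and invoking a callback; a self-contained toy sketch (all names hypothetical, not the `airflow.sdk` API):

```python
from datetime import datetime, timedelta


def check_deadline(start: datetime, now: datetime,
                   threshold: timedelta, callback) -> bool:
    """Fire `callback` once the run has been going longer than `threshold`."""
    if now - start > threshold:
        callback()
        return True
    return False


fired = []
start = datetime(2025, 1, 1, 0, 0)

# Within the deadline: nothing fires.
check_deadline(start, start + timedelta(minutes=30),
               timedelta(hours=1), lambda: fired.append("alert"))
assert fired == []

# Deadline exceeded: the callback runs.
check_deadline(start, start + timedelta(hours=2),
               timedelta(hours=1), lambda: fired.append("alert"))
assert fired == ["alert"]
```

In Airflow the equivalent check is evaluated by the scheduler/Triggerer against the Dag's configured ``deadline``, and the sync/async distinction controls where the callback executes.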
