Commit 400898d
✏️ Update docs
1 parent a2a6e5c commit 400898d

File tree

2 files changed: +17 −17 lines changed

dags/kids_first/example_study.py

Lines changed: 0 additions & 2 deletions

```diff
@@ -1,6 +1,4 @@
-import os
 from airflow.sdk import Variable
-from datetime import datetime
 
 from cosmos import (
     DbtDag,
```

docs/guides/running-models-in-airflow.md

Lines changed: 17 additions & 15 deletions

````diff
@@ -43,8 +43,7 @@ sections. The purpose of this DAG is to run models in the study
 `kf_example_study` every day.
 
 ```python
-import os
-from datetime import datetime
+from airflow.sdk import Variable
 
 from cosmos import (
     DbtDag,
@@ -53,30 +52,31 @@ from cosmos import (
     ExecutionConfig,
     RenderConfig,
 )
+from cosmos.profiles import PostgresUserPasswordProfileMapping
 
 profile_config = ProfileConfig(
-    profile_name=os.environ["DBT_PROFILE_NAME"],
-    profiles_yml_filepath=os.environ["DBT_PROFILES_YML_PATH"],
+    profile_name=Variable.get("DBT_PROFILE_NAME"),
     target_name="prd",
+    profile_mapping=PostgresUserPasswordProfileMapping(
+        conn_id="postgres_dev_svc",
+        profile_args={"schema": "prd"},
+    ),
 )
 
 example_study_dag = DbtDag(
     project_config=ProjectConfig(
-        "/opt/airflow/dbt/deidentified_etl",
+        Variable.get("DBT_PROJECT_DIR"),
         install_dbt_deps=True,
     ),
     profile_config=profile_config,
     execution_config=ExecutionConfig(
-        dbt_executable_path=os.environ["DBT_EXECUTABLE_PATH"],
+        dbt_executable_path=Variable.get("DBT_EXECUTABLE_PATH"),
     ),
     render_config=RenderConfig(select=["config.meta.study:kf_example_study"]),
     # normal dag parameters
     schedule="@daily",
-    start_date=datetime(2026, 1, 1),
-    catchup=False,
     dag_id="kf_example_study",
     tags=["POC", "Kids First"],
-    default_args={"retries": 2},
 )
 ```
 
@@ -98,18 +98,20 @@ from cosmos import (
     ExecutionConfig,
     RenderConfig,
 )
+from cosmos.profiles import PostgresUserPasswordProfileMapping
 ```
 
-In the example, the `import os` is used to extract environment variables on the
-machine used to run dbt and `import datetime` is used to identify when the
-DAG should start being run.
+In the example, `from airflow.sdk import Variable` is used to read Airflow
+Variables in the Airflow environment used by commands in the DAG.
 
 ### `profile_config`
 
 The `profile_config` is used by cosmos to identify the profile to be used by dbt
-commands. The values for `profile_name` and `profiles_yml_filepath` should use
-the indicated environment variables. Acceptable values for `target_name` are
-`qa` and `prd`
+commands. The value for `profile_name` should use the indicated Variable. Acceptable values for `target_name` are `qa` and `prd`.
+
+The `profile_mapping` takes advantage of an Airflow connection object to connect
+to the warehouse. Pass it the indicated connection id and make sure to set the
+`schema` appropriately for the `target_name`.
 
 ### the model DAG
````
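For context on what the new `profile_mapping` replaces: instead of pointing dbt at a static file via `profiles_yml_filepath`, Cosmos now renders a dbt profile from the `postgres_dev_svc` Airflow connection at runtime. A rough, illustrative sketch of the equivalent `profiles.yml` (field names follow the dbt-postgres profile format; the profile name and the angle-bracket placeholders are hypothetical stand-ins for values stored on the connection):

```yaml
# Sketch of the profile the mapping generates at runtime -- illustrative only.
# Host, user, password, port, and dbname are resolved from the
# `postgres_dev_svc` Airflow connection; only `schema` is pinned explicitly
# via profile_args={"schema": "prd"} in the DAG code.
kf_example_profile:        # actual name comes from the DBT_PROFILE_NAME Variable
  target: prd
  outputs:
    prd:
      type: postgres
      host: <connection host>
      user: <connection login>
      password: <connection password>
      port: 5432
      dbname: <connection database>
      schema: prd
```

Because everything except `schema` is resolved from the connection, credentials stay out of the repository and out of the rendered DAG file.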
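The `render_config` selector only picks up models that declare the study in their dbt config. As a hedged sketch of how a model opts in (the file path and model name are hypothetical; only the `meta` key is implied by the selector `config.meta.study:kf_example_study`):

```yaml
# models/schema.yml (hypothetical path)
# Setting config.meta.study is what makes a model match the DAG's
# RenderConfig(select=["config.meta.study:kf_example_study"]).
models:
  - name: example_model
    config:
      meta:
        study: kf_example_study
```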