Open your IDE or source code editor and select the option to clone the repository.
Paste the repository link in the URL field and submit.
This requires the `airflow-env` virtual environment configured locally.
- Configuring Airflow database connection
Airflow is configured to use an SQLite database by default. The configuration can be seen on the local machine in `~/airflow/airflow.cfg` under `sql_alchemy_conn`. Install the required dependency for a MySQL connection in `airflow-env` on the local machine:

```bash
$ pyenv activate airflow-env
$ pip install PyMySQL
```

Now set `sql_alchemy_conn = mysql+pymysql://root:@127.0.0.1:23306/airflow?charset=utf8mb4` in the file `~/airflow/airflow.cfg` on the local machine.
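As a quick sanity check before restarting Airflow, the connection URI above can be parsed with Python's standard library to confirm each component (dialect, driver, host, port, database, charset). This is only an illustrative sketch, not part of the Airflow setup itself:

```python
from urllib.parse import urlsplit, parse_qs

# The SQLAlchemy connection URI configured in ~/airflow/airflow.cfg
conn = "mysql+pymysql://root:@127.0.0.1:23306/airflow?charset=utf8mb4"

parts = urlsplit(conn)
dialect, driver = parts.scheme.split("+")   # "mysql", "pymysql"
database = parts.path.lstrip("/")           # "airflow"
charset = parse_qs(parts.query)["charset"][0]

print(dialect, driver, parts.hostname, parts.port, database, charset)
# → mysql pymysql 127.0.0.1 23306 airflow utf8mb4
```

If any component prints unexpectedly (for example a wrong port), fix the URI in `airflow.cfg` before going further.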
- Debugging an example DAG
Add an Interpreter to PyCharm, pointing the interpreter path to `~/.pyenv/versions/airflow-env/bin/python`, which is the `airflow-env` virtual environment created with pyenv earlier. To add an Interpreter, go to `File -> Settings -> Project: airflow -> Python Interpreter`.

In the PyCharm IDE open the airflow project. The directory `/files/dags` of the local machine is mounted to the docker machine by default when breeze airflow is started, so any DAG file present in this directory will be picked up automatically by the scheduler running in the docker machine, and the same can be seen on `http://127.0.0.1:28080`.

Copy any example DAG present in the `/airflow/example_dags` directory to `/files/dags/`.

Add a `__main__` block at the end of your DAG file to make it runnable. It will run a backfill job:

```python
if __name__ == "__main__":
    dag.clear()
    dag.run()
```
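Putting the pieces together, a complete DAG file dropped into `/files/dags/` might look like the sketch below. The DAG id and task are made up for illustration (any copied example DAG works the same way), and it assumes Airflow 2.x is installed in the `airflow-env` environment:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Hypothetical DAG for illustration only; the dag_id and task_id are
# arbitrary names, not part of the Airflow example DAGs.
dag = DAG(
    dag_id="debug_me",
    schedule_interval=None,
    start_date=datetime(2021, 1, 1),
    catchup=False,
)

task = BashOperator(task_id="hello", bash_command="echo hello", dag=dag)

if __name__ == "__main__":
    # Clears previous runs and executes a backfill job in-process,
    # so breakpoints set in the IDE are actually hit.
    dag.clear()
    dag.run()
```

Running this file directly (rather than through the scheduler) is what makes single-process debugging with the IDE possible.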
Add `AIRFLOW__CORE__EXECUTOR=DebugExecutor` to the Environment variables of the Run Configuration.
Click on Add configuration.
Add the Script Path and Environment Variable to the new Python configuration.
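The variable name follows Airflow's `AIRFLOW__<SECTION>__<KEY>` convention, where each such environment variable overrides one option in `airflow.cfg`. A small sketch of how the name maps onto a config section and key:

```python
import os

# Setting this in the Run Configuration makes Airflow use the DebugExecutor,
# which runs tasks in a single process so the IDE debugger can attach.
os.environ["AIRFLOW__CORE__EXECUTOR"] = "DebugExecutor"

# Airflow env vars follow AIRFLOW__<SECTION>__<KEY>; splitting on the
# double underscore recovers the config location ([core] executor).
name = "AIRFLOW__CORE__EXECUTOR"
_, section, key = name.split("__")
print(f"[{section.lower()}] {key.lower()} = {os.environ[name]}")
# → [core] executor = DebugExecutor
```

The same convention works for any other option you need to override just for the debug run, without editing `airflow.cfg`.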
Now debug an example DAG and view the entries in tables such as `dag_run`, `xcom`, etc. in MySQL Workbench.
Click on the branch symbol in the status bar
Give the branch a name and check it out.
Follow the Quick start for typical development tasks.






