This project sets up Apache Airflow in a Docker environment using Docker Compose. It facilitates the import of predefined users, connections, and variables to simplify the deployment and initial setup of Airflow.
Before starting, ensure that Docker and Docker Compose are installed on your machine. The instructions below assume you have access to your operating system's command line.
Airflow Initialization:
- Run the following command to initialize the database and prepare the Airflow environment:
docker-compose run --rm airflow-init
User Import:
- To import Airflow users from a JSON file, execute:
docker-compose run --rm airflow-cli users import /opt/airflow/config/users.json
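The users.json file itself is not shown here; as a sketch, the structure below follows the shape produced by the airflow users export command (a JSON list of user objects with username, email, firstname, lastname, and roles). The user shown is a hypothetical example; adjust the values to your deployment:

```json
[
  {
    "username": "admin",
    "email": "admin@example.com",
    "firstname": "Admin",
    "lastname": "User",
    "roles": ["Admin"]
  }
]
```

Note that exported user records do not carry passwords, so imported users may still need passwords set separately (for example with airflow users create or a password reset).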
Connection Import:
- To import connections from a YAML file, use the command:
docker-compose run --rm airflow-cli connections import /opt/airflow/config/connections.yml
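As a sketch of what connections.yml might contain, the format below matches what airflow connections export produces: a YAML mapping from connection ID to its fields (conn_type, host, login, and so on). The connection name and credentials here are hypothetical placeholders:

```yaml
my_postgres:
  conn_type: postgres
  host: postgres
  schema: airflow
  login: airflow
  password: airflow
  port: 5432
```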
Variable Import:
- To import variables from a JSON file, execute:
docker-compose run --rm airflow-cli variables import /opt/airflow/config/variables.json
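The variables.json file is a flat JSON object mapping variable names to values, matching the format produced by airflow variables export. The keys below are hypothetical examples:

```json
{
  "environment": "dev",
  "data_bucket": "s3://my-bucket/raw"
}
```

Inside a DAG, these values can then be read with Variable.get("environment"), for example.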
Run Airflow:
- To start all Airflow services in detached mode (running in the background), use:
docker-compose up -d
Once these steps complete, the Airflow webserver listens on port 8080 of your local machine. Open http://localhost:8080 in your browser to access the Airflow web interface.
- Logs: Service logs can be monitored through Docker Compose with the command:
docker-compose logs -f
- Stopping Services:
- To stop all Airflow services, execute:
docker-compose down
For more information on how to use Apache Airflow, consult the official Airflow documentation.