# docker-airflow

[](https://circleci.com/gh/puckel/docker-airflow)
[](https://hub.docker.com/r/puckel/docker-airflow/)

|
7 | | -This repository contains **Dockerfile** of [airflow](https://github.com/apache/incubator-airflow) for [Docker](https://www.docker.com/)'s [automated build](https://registry.hub.docker.com/u/puckel/docker-airflow/) published to the public [Docker Hub Registry](https://registry.hub.docker.com/). |
8 | | - |
9 | | -## Informations |
10 | | - |
11 | | -* Based on Debian Jessie official Image [debian:jessie](https://registry.hub.docker.com/_/debian/) and uses the official [Postgres](https://hub.docker.com/_/postgres/) as backend and [RabbitMQ](https://hub.docker.com/_/rabbitmq/) as queue |
12 | | -* Install [Docker](https://www.docker.com/) |
13 | | -* Install [Docker Compose](https://docs.docker.com/compose/install/) |
14 | | -* Following the Airflow release from [Python Package Index](https://pypi.python.org/pypi/airflow) |
15 | | - |
## Installation

Pull the image from the Docker repository:

    docker pull puckel/docker-airflow

## Build

For example, if you need to install [Extra Packages](http://pythonhosted.org/airflow/installation.html#extra-package), edit the Dockerfile and then build it:

    docker build --rm -t puckel/docker-airflow .
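
For instance, to bake the `crypto` and `postgres` extras into the image, the `pip install` line in the Dockerfile could be extended along these lines before building (a sketch; the extras shown are example choices and the exact line in this repository's Dockerfile may differ):

    # Dockerfile (excerpt) -- the extras in brackets are illustrative
    RUN pip install airflow[crypto,celery,postgres]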

## Usage

By default, docker-airflow runs Airflow with the **SequentialExecutor**:

    docker run -d -p 8080:8080 puckel/docker-airflow

To run another executor, use one of the docker-compose files provided in this repository.

For the **LocalExecutor**:

    docker-compose -f docker-compose-LocalExecutor.yml up -d

For the **CeleryExecutor**:

    docker-compose -f docker-compose-CeleryExecutor.yml up -d
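
For orientation, the CeleryExecutor compose file wires together roughly the following services (a simplified sketch, not a drop-in replacement for the docker-compose-CeleryExecutor.yml shipped in this repository):

    # simplified sketch of docker-compose-CeleryExecutor.yml
    rabbitmq:
        image: rabbitmq:3-management
    postgres:
        image: postgres
    webserver:
        image: puckel/docker-airflow
        ports:
            - "8080:8080"
    flower:
        image: puckel/docker-airflow
        command: flower
    scheduler:
        image: puckel/docker-airflow
        command: scheduler
    worker:
        image: puckel/docker-airflow
        command: worker

Every worker container runs the same image and pulls tasks from the shared RabbitMQ queue, which is what makes scaling the `worker` service straightforward.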

NB: If you don't want the example DAGs loaded (default=True), set the following environment variable:

`LOAD_EX=n`

    docker run -d -p 8080:8080 -e LOAD_EX=n puckel/docker-airflow

To use the Ad Hoc Query feature, make sure you have configured your connections: go to Admin -> Connections, edit "mysql_default", and set these values (matching those in airflow.cfg / docker-compose.yml):

- Host: mysql
- Schema: airflow
- Login: airflow
- Password: airflow
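
These connection values mirror a database service in the compose file; such a service block might look roughly like this (an illustrative sketch, with credentials matching the connection above; the official `mysql` image reads these `MYSQL_*` environment variables):

    # illustrative mysql service block for a docker-compose file
    mysql:
        image: mysql
        environment:
            - MYSQL_ROOT_PASSWORD=airflow
            - MYSQL_DATABASE=airflow
            - MYSQL_USER=airflow
            - MYSQL_PASSWORD=airflow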

See the [Airflow Documentation](http://pythonhosted.org/airflow/) for more details.

## UI Links

- Airflow: [localhost:8080](http://localhost:8080/)
- Flower: [localhost:5555](http://localhost:5555/)
- RabbitMQ: [localhost:15672](http://localhost:15672/)

When using OS X with boot2docker, use: `open http://$(boot2docker ip):8080`

## Scale the number of workers

Easy scaling using docker-compose:

    docker-compose -f docker-compose-CeleryExecutor.yml scale worker=5

This can be used to scale to a multi-node setup using Docker Swarm.

## Links

- Airflow on Kubernetes: [kube-airflow](https://github.com/mumoshu/kube-airflow)

# Wanna help?

Fork, improve and PR. ;-)