This repository was archived by the owner on May 14, 2025. It is now read-only.

Commit 8bde8c1

Add docker compose-up for Local-server
resolves #2040

1 parent: ca848af

4 files changed: +241 −3 lines

spring-cloud-dataflow-docs/src/main/asciidoc/getting-started.adoc
197 additions & 2 deletions
The diff first removes a stray blank line ahead of the `[[getting-started-system-requirements]]` anchor, then inserts a new section after the paragraph recommending installation of the Spring Cloud Skipper server for upgrading and rolling back streams at runtime:

[[getting-started-deploying-spring-cloud-dataflow-docker]]
== Getting Started with Docker Compose

As of Spring Cloud Data Flow 1.4, a Docker Compose file is provided to quickly bring up Spring Cloud Data Flow and its dependencies without having to obtain them manually.
When running, a composed system includes the latest GA release of the Spring Cloud Data Flow Local Server, using the Kafka binder for communication.
Docker Compose is required, and we recommend using the link:https://docs.docker.com/compose/install/[latest version].
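
Before starting, you can confirm that both prerequisites are installed. These are the standard version flags of the two CLIs, shown here purely as a sanity check:

[source,bash]
----
$ docker --version
$ docker-compose --version
----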

. Download the Spring Cloud Data Flow Local Server Docker Compose file:
+
[source,bash,subs=attributes]
----
wget https://raw.githubusercontent.com/spring-cloud/spring-cloud-dataflow/master/spring-cloud-dataflow-server-local/docker-compose.yml
----
+
. Start Docker Compose
+
In the directory where you downloaded `docker-compose.yml`, start the system, as follows:
+
[source,bash,subs=attributes]
----
$ docker-compose up
----
+
NOTE: By default, Docker Compose uses locally available images. For example, when using the `latest` tag, execute `docker-compose pull` prior to `docker-compose up` to ensure the latest image is downloaded.
+
. Launch the Spring Cloud Data Flow Dashboard
+
Spring Cloud Data Flow will be ready for use once the `docker-compose` command stops emitting log messages.
At that point, navigate in your browser to the link:http://localhost:9393/dashboard[Spring Cloud Data Flow Dashboard].
By default, the latest GA releases of the Stream and Task applications are imported automatically.
+
. Create a Stream
+
To create a stream, first navigate to the "Streams" menu link and then click the "Create Stream" link.
Enter `time | log` into the "Create Stream" text area and then click the "CREATE STREAM" button.
Enter "ticktock" for the stream name and click the "Deploy Stream(s)" checkbox, as shown in the following image:
+
.Creating a Stream
image::{dataflow-asciidoc}/images/dataflow-stream-create.png[Creating a Stream, scaledwidth="60%"]
+
Then click "OK", which returns you to the Definitions page.
The stream will be in "deploying" status and move to "deployed" when finished.
You may need to refresh your browser to see the updated status.
+
. View Stream Logs
+
To view the stream logs, navigate to the "Runtime" menu link and click the "ticktock.log" link.
Copy the path in the "stdout" text box on the dashboard and, in another console, type:
+
[source,bash,subs=attributes]
----
$ docker exec -it dataflow-server tail -f /path/from/stdout/textbox/in/dashboard
----
+
You should now see the output of the log sink, printing a timestamp once per second.
Press CTRL+C to end the `tail`.
+
. Delete a Stream
+
To delete the stream, first navigate to the "Streams" menu link in the dashboard and then click the checkbox on the "ticktock" row.
Click the "DESTROY ALL 1 SELECTED STREAMS" button and then "YES" to destroy the stream.
+
. Destroy the Quick Start environment
+
To destroy the Quick Start environment, open another console in the directory where `docker-compose.yml` is located and type the following:
+
[source,bash,subs=attributes]
----
$ docker-compose down
----
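
Before tearing the environment down, you can also exercise the server from the command line. The sketch below is illustrative only; it relies on the `9393:9393` port mapping from the Compose file and the same `/apps` endpoint that the `app-import` service posts to:

[source,bash]
----
# List the stream and task applications that app-import registered at startup
$ wget -qO- http://localhost:9393/apps

# Re-run the same bulk registration that app-import performs
$ wget -qO- http://localhost:9393/apps --post-data='uri=http://bit.ly/Celsius-SR1-stream-applications-kafka-10-maven&force=true'
----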

[[getting-started-customizing-spring-cloud-dataflow-docker]]
=== Docker Compose Customization

Out of the box, Spring Cloud Data Flow uses the embedded H2 database for storing state, Kafka for communication, and no analytics.
You can customize these components by editing the `docker-compose.yml` file, as described below.

. To use MySQL rather than the embedded H2 database, add the following configuration under the `services` section:
+
[source,yaml,subs=attributes]
----
mysql:
  image: mariadb:10.2
  environment:
    MYSQL_DATABASE: dataflow
    MYSQL_USER: root
    MYSQL_ROOT_PASSWORD: rootpw
  expose:
    - 3306
----
+
Add the following entries to the `environment` block of the `dataflow-server` service definition:
+
[source,yaml,subs=attributes]
----
- spring_datasource_url=jdbc:mysql://mysql:3306/dataflow
- spring_datasource_username=root
- spring_datasource_password=rootpw
- spring_datasource_driver-class-name=org.mariadb.jdbc.Driver
----
+
Finally, update the `depends_on` attribute of the `dataflow-server` service definition to include the new service (see the consolidated sketch after this list):
+
[source,yaml,subs=attributes]
----
- mysql
----

. To use RabbitMQ instead of Kafka for communication, replace the following configuration under the `services` section:
+
[source,yaml,subs=attributes]
----
kafka:
  image: wurstmeister/kafka:0.10.1.0
  expose:
    - "9092"
  environment:
    - KAFKA_ADVERTISED_PORT=9092
    - KAFKA_ZOOKEEPER_CONNECT=zookeeper:2181
  depends_on:
    - zookeeper
zookeeper:
  image: wurstmeister/zookeeper
  expose:
    - "2181"
  environment:
    - KAFKA_ADVERTISED_HOST_NAME=zookeeper
----
+
With:
+
[source,yaml,subs=attributes]
----
rabbitmq:
  image: rabbitmq:3.7
  expose:
    - "5672"
----
+
In the `dataflow-server` service's configuration block, add the following `environment` entry:
+
[source,yaml,subs=attributes]
----
- spring.cloud.dataflow.applicationProperties.stream.spring.rabbitmq.host=rabbitmq
----
+
Then replace:
+
[source,yaml,subs=attributes]
----
depends_on:
  - kafka
----
+
With:
+
[source,yaml,subs=attributes]
----
depends_on:
  - rabbitmq
----
+
Finally, modify the `app-import` service definition's `command` attribute to replace `http://bit.ly/Celsius-SR1-stream-applications-kafka-10-maven` with `http://bit.ly/Celsius-SR1-stream-applications-rabbit-maven`.

. To enable analytics, with Redis as the backend, add the following configuration under the `services` section:
+
[source,yaml,subs=attributes]
----
redis:
  image: redis:2.8
  expose:
    - "6379"
----
+
Update the `depends_on` attribute of the `dataflow-server` service definition to include:
+
[source,yaml,subs=attributes]
----
- redis
----
+
Then add the following entries to the `environment` block of the `dataflow-server` service definition:
+
[source,yaml,subs=attributes]
----
- spring.cloud.dataflow.applicationProperties.stream.spring.redis.host=redis
- spring_redis_host=redis
----
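
To make the fragment-by-fragment MySQL instructions concrete, the edited `dataflow-server` definition could end up looking roughly like the following sketch. It merely merges the snippets shown above into the stock service definition and is not itself part of the commit:

[source,yaml]
----
dataflow-server:
  image: springcloud/spring-cloud-dataflow-server-local:latest
  container_name: dataflow-server
  ports:
    - "9393:9393"
  environment:
    # existing Kafka binder settings from the stock docker-compose.yml
    - spring.cloud.dataflow.applicationProperties.stream.spring.cloud.stream.kafka.binder.brokers=kafka:9092
    - spring.cloud.dataflow.applicationProperties.stream.spring.cloud.stream.kafka.binder.zkNodes=zookeeper:2181
    # MySQL datasource entries from the customization above
    - spring_datasource_url=jdbc:mysql://mysql:3306/dataflow
    - spring_datasource_username=root
    - spring_datasource_password=rootpw
    - spring_datasource_driver-class-name=org.mariadb.jdbc.Driver
  depends_on:
    - kafka
    - mysql
----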

Farther down, under the existing `[[getting-started-deploying-spring-cloud-dataflow]]` anchor, the heading is renamed:

-== Installation
+== Getting Started with Manual Installation
(Binary image file, 210 KB, not rendered — presumably the `dataflow-stream-create.png` screenshot referenced above.)

spring-cloud-dataflow-docs/src/main/asciidoc/index.adoc
1 addition & 1 deletion

Chris Schaefer is appended to the author line of the reference guide:

-Sabby Anandan; Marius Bogoevici; Eric Bottard; Mark Fisher; Ilayaperumal Gopinathan; Gunnar Hillert; Mark Pollack; Patrick Peralta; Glenn Renfro; Thomas Risberg; Dave Syer; David Turanski; Janne Valkealahti; Oleg Zhurakousky; Jay Bryant; Vinicius Carvalho
+Sabby Anandan; Marius Bogoevici; Eric Bottard; Mark Fisher; Ilayaperumal Gopinathan; Gunnar Hillert; Mark Pollack; Patrick Peralta; Glenn Renfro; Thomas Risberg; Dave Syer; David Turanski; Janne Valkealahti; Oleg Zhurakousky; Jay Bryant; Vinicius Carvalho; Chris Schaefer
spring-cloud-dataflow-server-local/docker-compose.yml (new file)
43 additions & 0 deletions

version: '3'

services:
  kafka:
    image: wurstmeister/kafka:0.10.1.0
    expose:
      - "9092"
    environment:
      - KAFKA_ADVERTISED_PORT=9092
      - KAFKA_ZOOKEEPER_CONNECT=zookeeper:2181
    depends_on:
      - zookeeper
  zookeeper:
    image: wurstmeister/zookeeper
    expose:
      - "2181"
    environment:
      - KAFKA_ADVERTISED_HOST_NAME=zookeeper
  dataflow-server:
    image: springcloud/spring-cloud-dataflow-server-local:latest
    container_name: dataflow-server
    ports:
      - "9393:9393"
    environment:
      - spring.cloud.dataflow.applicationProperties.stream.spring.cloud.stream.kafka.binder.brokers=kafka:9092
      - spring.cloud.dataflow.applicationProperties.stream.spring.cloud.stream.kafka.binder.zkNodes=zookeeper:2181
    depends_on:
      - kafka
  app-import:
    image: alpine:3.7
    depends_on:
      - dataflow-server
    command: >
      /bin/sh -c "
        while ! nc -z dataflow-server 9393;
        do
          sleep 1;
        done;
        wget -qO- 'http://dataflow-server:9393/apps' --post-data='uri=http://bit.ly/Celsius-SR1-stream-applications-kafka-10-maven&force=true';
        echo 'Stream apps imported'
        wget -qO- 'http://dataflow-server:9393/apps' --post-data='uri=http://bit.ly/Clark-GA-task-applications-maven&force=true';
        echo 'Task apps imported'"
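
Taken together, the file gives a one-command local environment. A typical session uses only stock Docker Compose commands; the flags below are standard and not specific to this commit:

[source,bash]
----
$ docker-compose up -d                     # start all four services in the background
$ docker-compose logs -f dataflow-server   # follow the server logs until startup completes
$ docker-compose down                      # stop and remove the containers when finished
----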
