
Commit b7b7de5

Updated readmes to improve copy-pasteability
Also added instructions for native compilation for the apps where it is useful. Added documentation for native compilation when using Single Step Batch Processing. Updated documentation for native compilation for single app processing.
1 parent 704667d commit b7b7de5

File tree

13 files changed (+144, -49 lines)


docs/src/main/asciidoc/batch-starter.adoc

Lines changed: 26 additions & 0 deletions
@@ -365,6 +365,32 @@ itself by setting the following properties:
 
 See the https://docs.spring.io/spring-batch/docs/4.3.x/api/org/springframework/batch/item/kafka/KafkaItemReader.html[`KafkaItemReader` documentation].
 
+[[nativeCompilation]]
+=== Native Compilation
+The advantage of Single Step Batch Processing is that it lets you dynamically select which reader and writer beans to use at runtime when you run on the JVM.
+However, when you use native compilation, you must determine the reader and writer at build time instead of at runtime.
+The following example shows how to do so:
+
+[source,xml]
+<plugin>
+    <groupId>org.springframework.boot</groupId>
+    <artifactId>spring-boot-maven-plugin</artifactId>
+    <executions>
+        <execution>
+            <id>process-aot</id>
+            <goals>
+                <goal>process-aot</goal>
+            </goals>
+            <configuration>
+                <jvmArguments>
+                    -Dspring.batch.job.flatfileitemreader.name=fooReader
+                    -Dspring.batch.job.flatfileitemwriter.name=fooWriter
+                </jvmArguments>
+            </configuration>
+        </execution>
+    </executions>
+</plugin>
+
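With the reader and writer fixed at AOT-processing time as above, building and running the native image follows the same pattern the samples in this repository use. A minimal sketch, assuming the samples' `native` Maven profile and an executable named `single-step-batch-job` (the name is illustrative):

[source,shell]
----
# AOT processing picks up the reader/writer names from the plugin configuration,
# and the native profile then builds the image (requires a local GraalVM toolchain).
mvn -Pnative clean package

# Run the resulting native executable.
./target/single-step-batch-job
----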
 [[item-processors]]
 == ItemProcessor Configuration

docs/src/main/asciidoc/features.adoc

Lines changed: 2 additions & 3 deletions
@@ -373,9 +373,8 @@ NOTE: The exit code for the application will be 1 if the task fails because this
 is enabled and another task is running with the same task name.
 
 ==== Single Instance Usage for Spring AOT And Native Compilation
-To utilize Spring Cloud Task's single instance feature when creating a natively compiled app, the feature needs to enabled at build time.
-This is done by adding the `process-aot` execution and setting the `spring.cloud.task.single-step-instance-enabled=true` as a JVM argument as shown below:
-
+To use Spring Cloud Task's single-instance feature when creating a natively compiled app, you need to enable the feature at build time.
+To do so, add the `process-aot` execution and set `spring.cloud.task.single-step-instance-enabled=true` as a JVM argument, as follows:
 [source,xml]
 <plugin>
 <groupId>org.springframework.boot</groupId>
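Once the app has been compiled natively with this setting baked in, the single-instance behavior can be checked directly: while one instance is still running, a second launch with the same task name fails and exits with code 1, as described in the NOTE above. A rough sketch, assuming a native executable named `timestamp-task` (illustrative name):

[source,shell]
----
# Start one instance in the background, then a second with the same task name.
# (Assumes the first instance is still running when the second starts.)
./target/timestamp-task &
./target/timestamp-task

# The second launch fails because another task with the same name is running.
echo $?   # prints 1
----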

spring-cloud-task-samples/batch-events/README.adoc

Lines changed: 6 additions & 6 deletions
@@ -18,22 +18,22 @@ Note: More information on this topic is available https://docs.spring.io/spring-
 
 == Build:
 
-[source,shell,indent=2]
+[source,shell]
 ----
-$ ./mvnw clean install
+./mvnw clean install
 ----
 
 == Execution:
 
-[source,shell,indent=2]
+[source,shell]
 ----
-$ java -jar target/batch-events-3.0.0.jar
+java -jar target/batch-events-3.0.0.jar
 ----
 
-For example you can listen for specific job execution events on a specified channel with a Spring Cloud Stream Sink
+For example, you can listen for specific job-execution events on a specified channel with a Spring Cloud Stream Sink
 like the https://github.com/spring-cloud/stream-applications/tree/main/applications/sink/log-sink[log sink] using the following:
 
-[source,shell,indent=2]
+[source,shell]
 ----
 $ java -jar <PATH_TO_LOG_SINK_JAR>/log-sink-rabbit-3.1.1.jar --server.port=9090
 --spring.cloud.stream.bindings.input.destination=job-execution-events
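The same pattern applies to the other batch-event channels. A sketch of a second log sink bound to the step-execution events channel (the exact channel name, `step-execution-events`, is an assumption to verify against the batch-events documentation linked above):

[source,shell]
----
# Hypothetical second sink on another port, listening to step execution events.
java -jar <PATH_TO_LOG_SINK_JAR>/log-sink-rabbit-3.1.1.jar --server.port=9091 \
  --spring.cloud.stream.bindings.input.destination=step-execution-events
----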

spring-cloud-task-samples/batch-job/README.adoc

Lines changed: 18 additions & 4 deletions
@@ -13,14 +13,28 @@ This is a Spring Cloud Task application that executes two simple Spring Batch Jo
 
 == Build:
 
-[source,shell,indent=2]
+[source,shell]
 ----
-$ mvn clean package
+mvn clean package
 ----
 
 == Run:
 
-[source,shell,indent=2]
+[source,shell]
 ----
-$ java -jar target/batch-job-3.0.0.jar
+java -jar target/batch-job-3.0.0.jar
+----
+
+== Native Build:
+
+[source,shell]
+----
+mvn -Pnative clean package
+----
+
+== Native Run:
+
+[source,shell]
+----
+./target/batch-job
 ----
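The `-Pnative` builds in this and the other native samples assume a local GraalVM toolchain with the `native-image` tool available (an assumption about your environment, not something the README states). A quick pre-flight check:

[source,shell]
----
# Confirm a GraalVM JDK with native-image is on the PATH before building.
java -version
native-image --version

# Then build and run the native sample as shown above.
mvn -Pnative clean package
./target/batch-job
----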

spring-cloud-task-samples/jpa-sample/README.adoc

Lines changed: 18 additions & 4 deletions
@@ -15,14 +15,28 @@ a data store.
 
 == Build:
 
-[source,shell,indent=2]
+[source,shell]
 ----
-$ mvn clean package
+mvn clean package
 ----
 
 == Run:
 
-[source,shell,indent=2]
+[source,shell]
 ----
-$ java -jar target/jpa-sample-3.0.0.jar
+java -jar target/jpa-sample-3.0.0.jar
+----
+
+== Native Build:
+
+[source,shell]
+----
+mvn -Pnative clean package
+----
+
+== Native Run:
+
+[source,shell]
+----
+./target/jpa-sample
 ----

spring-cloud-task-samples/multiple-datasources/README.adoc

Lines changed: 18 additions & 4 deletions
@@ -18,16 +18,30 @@ which one to be used for the Spring Cloud Task repository.
 
 == Build:
 
-[source,shell,indent=2]
+[source,shell]
 ----
-$ mvn clean package
+mvn clean package
 ----
 
 == Execute sample using 2 embedded databases (default):
 
-[source,shell,indent=2]
+[source,shell]
+----
+java -jar target/multiple-datasources-3.0.0.jar
+----
+
+== Native Build:
+
+[source,shell]
+----
+mvn -Pnative clean package
+----
+
+== Run sample using 2 embedded databases (default) with native app:
+
+[source,shell]
 ----
-$ java -jar target/multiple-datasources-3.0.0.jar
+./target/multiple-datasources
 ----
 
 == Execute sample using 2 external databases:

spring-cloud-task-samples/partitioned-batch-job/README.adoc

Lines changed: 3 additions & 3 deletions
@@ -9,14 +9,14 @@ An example of the usage of the `DeployerPartitionHandler` and
 
 == Build:
 
-[source,shell,indent=2]
+[source,shell]
 ----
-$ ./mvnw clean install
+./mvnw clean install
 ----
 
 == Execute:
 
-[source,shell,indent=2]
+[source,shell]
 ----
 export SPRING_APPLICATION_JSON='{"spring.datasource.url":"jdbc:mariadb://localhost:3306/<your database>","spring.datasource.password":"<your password>","spring.datasource.username":"<your username>","spring.datasource.driverClassName":"org.mariadb.jdbc.Driver"}'
 java -jar target/partitioned-batch-job-3.0.0.jar

spring-cloud-task-samples/single-step-batch-job/README.adoc

Lines changed: 4 additions & 4 deletions
@@ -23,16 +23,16 @@ The profiles that are available are:
 
 == Build:
 
-[source,shell,indent=2]
+[source,shell]
 ----
-$ mvn clean package
+mvn clean package
 ----
 
 == Run:
 
-[source,shell,indent=2]
+[source,shell]
 ----
-$ java -jar target/single-step-batch-job-3.0.0-SNAPSHOT.jar --spring.config.name=<property file containing batch, reader, and writer properties>
+java -jar target/single-step-batch-job-3.0.0-SNAPSHOT.jar --spring.config.name=<property file containing batch, reader, and writer properties>
 ----
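As a variation on the run command above, the reader and writer selection can also be supplied directly as command-line properties instead of via `--spring.config.name`. A sketch using the FlatFileItemReader/FlatFileItemWriter property names from the batch starter documentation (the remaining job, reader, and writer properties for your chosen implementations still need to be provided):

[source,shell]
----
java -jar target/single-step-batch-job-3.0.0-SNAPSHOT.jar \
  --spring.batch.job.flatfileitemreader.name=fooReader \
  --spring.batch.job.flatfileitemwriter.name=fooWriter
----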
 
 == Examples

spring-cloud-task-samples/task-events/README.adoc

Lines changed: 6 additions & 6 deletions
@@ -8,24 +8,24 @@ This is a task application that emits events on a channel named `task-events`
 
 == Build:
 
-[source,shell,indent=2]
+[source,shell]
 ----
-$ ./mvnw clean install
+./mvnw clean install
 ----
 
 == Execution:
 
-[source,shell,indent=2]
+[source,shell]
 ----
-$ java -jar target/task-events-3.0.0.RELEASE.jar
+java -jar target/task-events-3.0.0.RELEASE.jar
 ----
 
 You can listen for the events on the task-events channel with a Spring Cloud Stream Sink
 like the https://github.com/spring-cloud/stream-applications/tree/main/applications/sink/log-sink[log sink] using the following:
 
-[source,shell,indent=2]
+[source,shell]
 ----
-$ java -jar <PATH_TO_LOG_SINK_JAR>/log-sink-rabbit-3.1.1.jar --server.port=9090 --spring.cloud.stream.bindings.input.destination=task-events
+java -jar <PATH_TO_LOG_SINK_JAR>/log-sink-rabbit-3.1.1.jar --server.port=9090 --spring.cloud.stream.bindings.input.destination=task-events
 ----
 
 == Dependencies:

spring-cloud-task-samples/task-observations/README.adoc

Lines changed: 19 additions & 5 deletions
@@ -9,18 +9,32 @@ metrics at the end of the application using the SimpleMeterRegistry.
 
 == Classes:
 
-* TaskMetricsApplication - the Spring Boot Main Application
+* TaskObservationsApplication - the Spring Boot Main Application
 
 == Build:
 
-[source,shell,indent=2]
+[source,shell]
 ----
-$ mvn clean package
+mvn clean package
 ----
 
 == Run:
 
-[source,shell,indent=2]
+[source,shell]
 ----
-$ java -jar target/task-metrics-3.0.0-SNAPSHOT.jar
+java -jar target/task-observations-3.0.0.jar
+----
+
+== Native Build:
+
+[source,shell]
+----
+mvn -Pnative clean package
+----
+
+== Native Run:
+
+[source,shell]
+----
+./target/task-observations
 ----
