Commit afbecd2

garyrussell authored and artembilan committed
GH-1690: Polish Quick Start Docs
Resolves #1690

**cherry-pick to 2.6.x (fix kafka-clients version to 2.6.1)**

(cherry picked from commit 5e46e02)
1 parent e978ae4 commit afbecd2

1 file changed: 94 additions and 146 deletions
@@ -1,13 +1,13 @@
 [[quick-tour]]
-=== Quick Tour for the Impatient
-
-This is the five-minute tour to get started with Spring Kafka.
+=== Quick Tour
 
 Prerequisites: You must install and run Apache Kafka.
-Then you must grab the spring-kafka JAR and all of its dependencies.
+Then you must put the spring-kafka JAR and all of its dependencies on your class path.
 The easiest way to do that is to declare a dependency in your build tool.
-The following example shows how to do so with Maven:
 
+If you are not using Spring Boot, declare the `spring-kafka` jar as a dependency in your project.
+
+.Maven
 ====
 [source,xml,subs="+attributes"]
 ----
@@ -19,17 +19,17 @@ The following example shows how to do so with Maven:
 ----
 ====
 
-The following example shows how to do so with Gradle:
-
+.Gradle
 ====
 [source,groovy,subs="+attributes"]
 ----
 compile 'org.springframework.kafka:spring-kafka:{project-version}'
 ----
 ====
 
-IMPORTANT: When using Spring Boot, omit the version and Boot will automatically bring in the correct version that is compatible with your Boot version:
+IMPORTANT: When using Spring Boot (and you haven't used start.spring.io to create your project), omit the version and Boot will automatically bring in the correct version that is compatible with your Boot version:
 
+.Maven
 ====
 [source,xml,subs="+attributes"]
 ----
@@ -40,114 +40,110 @@ IMPORTANT: When using Spring Boot, omit the version and Boot will automatically
 ----
 ====
 
-The following example shows how to do so with Gradle:
-
+.Gradle
 ====
 [source,groovy,subs="+attributes"]
 ----
 compile 'org.springframework.kafka:spring-kafka'
 ----
 ====
 
+However, the quickest way to get started is to use https://start.spring.io[start.spring.io] (or the wizards in Spring Tool Suite and IntelliJ IDEA) and create a project, selecting 'Spring for Apache Kafka' as a dependency.
 
 [[compatibility]]
 ==== Compatibility
 
 This quick tour works with the following versions:
 
-* Apache Kafka Clients 2.4.1
+* Apache Kafka Clients 2.6.1
 * Spring Framework 5.3.x
 * Minimum Java version: 8
 
-==== A Very, Very Quick Example
+==== Getting Started
 
-As the following example shows, you can use plain Java to send and receive a message:
+The simplest way to get started is to use https://start.spring.io[start.spring.io] (or the wizards in Spring Tool Suite and IntelliJ IDEA) and create a project, selecting 'Spring for Apache Kafka' as a dependency.
+Refer to the https://docs.spring.io/spring-boot/docs/current/reference/html/spring-boot-features.html#boot-features-kafka[Spring Boot documentation] for more information about its opinionated auto configuration of the infrastructure beans.
 
+Here is a minimal consumer application.
+
+===== Spring Boot Consumer App
+
+.Application
 ====
-[source,java]
+[source, java]
 ----
-@Test
-public void testAutoCommit() throws Exception {
-    logger.info("Start auto");
-    ContainerProperties containerProps = new ContainerProperties("topic1", "topic2");
-    final CountDownLatch latch = new CountDownLatch(4);
-    containerProps.setMessageListener(new MessageListener<Integer, String>() {
-
-        @Override
-        public void onMessage(ConsumerRecord<Integer, String> message) {
-            logger.info("received: " + message);
-            latch.countDown();
-        }
-
-    });
-    KafkaMessageListenerContainer<Integer, String> container = createContainer(containerProps);
-    container.setBeanName("testAuto");
-    container.start();
-    Thread.sleep(1000); // wait a bit for the container to start
-    KafkaTemplate<Integer, String> template = createTemplate();
-    template.setDefaultTopic("topic1");
-    template.sendDefault(0, "foo");
-    template.sendDefault(2, "bar");
-    template.sendDefault(0, "baz");
-    template.sendDefault(2, "qux");
-    template.flush();
-    assertTrue(latch.await(60, TimeUnit.SECONDS));
-    container.stop();
-    logger.info("Stop auto");
+@SpringBootApplication
+public class Application {
+
+    public static void main(String[] args) {
+        SpringApplication.run(Application.class, args);
+    }
+
+    @Bean
+    public NewTopic topic() {
+        return TopicBuilder.name("topic1")
+                .partitions(10)
+                .replicas(1)
+                .build();
+    }
+
+    @KafkaListener(id = "myId", topics = "topic1")
+    public void listen(String in) {
+        System.out.println(in);
+    }
 
 }
 ----
+====
 
-[source, java]
+.application.properties
+====
+[source, properties]
 ----
-private KafkaMessageListenerContainer<Integer, String> createContainer(
-        ContainerProperties containerProps) {
-    Map<String, Object> props = consumerProps();
-    DefaultKafkaConsumerFactory<Integer, String> cf =
-            new DefaultKafkaConsumerFactory<Integer, String>(props);
-    KafkaMessageListenerContainer<Integer, String> container =
-            new KafkaMessageListenerContainer<>(cf, containerProps);
-    return container;
-}
+spring.kafka.consumer.auto-offset-reset=earliest
+----
+====
 
-private KafkaTemplate<Integer, String> createTemplate() {
-    Map<String, Object> senderProps = senderProps();
-    ProducerFactory<Integer, String> pf =
-            new DefaultKafkaProducerFactory<Integer, String>(senderProps);
-    KafkaTemplate<Integer, String> template = new KafkaTemplate<>(pf);
-    return template;
-}
+The `NewTopic` bean causes the topic to be created on the broker; it is not needed if the topic already exists.
 
-private Map<String, Object> consumerProps() {
-    Map<String, Object> props = new HashMap<>();
-    props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
-    props.put(ConsumerConfig.GROUP_ID_CONFIG, group);
-    props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, true);
-    props.put(ConsumerConfig.AUTO_COMMIT_INTERVAL_MS_CONFIG, "100");
-    props.put(ConsumerConfig.SESSION_TIMEOUT_MS_CONFIG, "15000");
-    props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, IntegerDeserializer.class);
-    props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
-    return props;
-}
+===== Spring Boot Producer App
+
+.Application
+====
+[source, java]
+----
+@SpringBootApplication
+public class Application {
+
+    public static void main(String[] args) {
+        SpringApplication.run(Application.class, args);
+    }
+
+    @Bean
+    public NewTopic topic() {
+        return TopicBuilder.name("topic1")
+                .partitions(10)
+                .replicas(1)
+                .build();
+    }
+
+    @Bean
+    public ApplicationRunner runner(KafkaTemplate<String, String> template) {
+        return args -> {
+            template.send("topic1", "test");
+        };
+    }
 
-private Map<String, Object> senderProps() {
-    Map<String, Object> props = new HashMap<>();
-    props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
-    props.put(ProducerConfig.RETRIES_CONFIG, 0);
-    props.put(ProducerConfig.BATCH_SIZE_CONFIG, 16384);
-    props.put(ProducerConfig.LINGER_MS_CONFIG, 1);
-    props.put(ProducerConfig.BUFFER_MEMORY_CONFIG, 33554432);
-    props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, IntegerSerializer.class);
-    props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
-    return props;
 }
 ----
 ====
 
-==== With Java Configuration
+==== With Java Configuration (No Spring Boot)
+
+IMPORTANT: Spring for Apache Kafka is designed to be used in a Spring Application Context.
+For example, if you create the listener container yourself outside of a Spring context, not all functions will work unless you satisfy all of the `...Aware` interfaces that the container implements.
 
-You can do the same work as appears in the previous example with Spring configuration in Java.
-The following example shows how to do so:
+Here is an example of an application that does not use Spring Boot.
 
 ====
 [source,java]
@@ -180,13 +176,15 @@ public class Config {
 
     @Bean
     public ConsumerFactory<Integer, String> consumerFactory() {
-        return new DefaultKafkaConsumerFactory<>(consumerConfigs());
+        return new DefaultKafkaConsumerFactory<>(consumerProps());
     }
 
-    @Bean
-    public Map<String, Object> consumerConfigs() {
+    private Map<String, Object> consumerProps() {
         Map<String, Object> props = new HashMap<>();
-        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, embeddedKafka.getBrokersAsString());
+        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
+        props.put(ConsumerConfig.GROUP_ID_CONFIG, group);
+        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, IntegerDeserializer.class);
+        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
         ...
         return props;
     }
@@ -198,13 +196,15 @@ public class Config {
 
     @Bean
     public ProducerFactory<Integer, String> producerFactory() {
-        return new DefaultKafkaProducerFactory<>(producerConfigs());
+        return new DefaultKafkaProducerFactory<>(senderProps());
    }
 
-    @Bean
-    public Map<String, Object> producerConfigs() {
+    private Map<String, Object> senderProps() {
         Map<String, Object> props = new HashMap<>();
-        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, embeddedKafka.getBrokersAsString());
+        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
+        props.put(ProducerConfig.LINGER_MS_CONFIG, 10);
+        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, IntegerSerializer.class);
+        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
         ...
         return props;
     }
@@ -216,7 +216,9 @@ public class Config {
 
 }
 ----
+====
 
+====
 [source, java]
 ----
 public class Listener {
@@ -232,58 +234,4 @@ public class Listener {
 ----
 ====
 
-==== Even Quicker, with Spring Boot
-
-Spring Boot can make things even simpler.
-The following Spring Boot application sends three messages to a topic, receives them, and stops:
-
-====
-[source, java]
-----
-@SpringBootApplication
-public class Application implements CommandLineRunner {
-
-    public static Logger logger = LoggerFactory.getLogger(Application.class);
-
-    public static void main(String[] args) {
-        SpringApplication.run(Application.class, args).close();
-    }
-
-    @Autowired
-    private KafkaTemplate<String, String> template;
-
-    private final CountDownLatch latch = new CountDownLatch(3);
-
-    @Override
-    public void run(String... args) throws Exception {
-        this.template.send("myTopic", "foo1");
-        this.template.send("myTopic", "foo2");
-        this.template.send("myTopic", "foo3");
-        latch.await(60, TimeUnit.SECONDS);
-        logger.info("All received");
-    }
-
-    @KafkaListener(topics = "myTopic")
-    public void listen(ConsumerRecord<?, ?> cr) throws Exception {
-        logger.info(cr.toString());
-        latch.countDown();
-    }
-
-}
-----
-====
-
-Boot takes care of most of the configuration.
-When we use a local broker, the only properties we need are the following:
-
-.application.properties
-====
-[source]
-----
-spring.kafka.consumer.group-id=foo
-spring.kafka.consumer.auto-offset-reset=earliest
-----
-====
-
-We need the first property because we are using group management to assign topic partitions to consumers, so we need a group.
-The second property ensures the new consumer group gets the messages we sent, because the container might start after the sends have completed.
+As you can see, you have to define several infrastructure beans when not using Spring Boot.
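
For reference, here is a sketch (not part of the commit above) of what those infrastructure beans typically look like once the parts elided by the diff context are filled in: a listener container factory and a `KafkaTemplate` wired to consumer and producer factories like the ones shown in the diff. The broker address `localhost:9092` and the group id `myGroup` are assumed values for a local setup.

.Illustrative non-Boot configuration (sketch)
====
[source, java]
----
import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.IntegerDeserializer;
import org.apache.kafka.common.serialization.IntegerSerializer;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.annotation.EnableKafka;
import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.core.ProducerFactory;

// Sketch only: the container factory and template beans a non-Boot @Configuration
// typically declares alongside the consumer and producer factories shown in the diff.
// "localhost:9092" and "myGroup" are assumed values for a local broker.
@Configuration
@EnableKafka
public class Config {

    // Creates the listener containers behind @KafkaListener methods.
    @Bean
    public ConcurrentKafkaListenerContainerFactory<Integer, String> kafkaListenerContainerFactory() {
        ConcurrentKafkaListenerContainerFactory<Integer, String> factory =
                new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory());
        return factory;
    }

    @Bean
    public ConsumerFactory<Integer, String> consumerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "myGroup");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, IntegerDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        return new DefaultKafkaConsumerFactory<>(props);
    }

    // Template used by application code to send records.
    @Bean
    public KafkaTemplate<Integer, String> kafkaTemplate() {
        return new KafkaTemplate<>(producerFactory());
    }

    @Bean
    public ProducerFactory<Integer, String> producerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, IntegerSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        return new DefaultKafkaProducerFactory<>(props);
    }

}
----
====

With beans like these in place, a `Listener` class such as the one in the diff can receive records through `@KafkaListener`, and application code can inject the `KafkaTemplate` to send them.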
