
Commit 51a672a

Clean up S4K guide; Added pages to nav
1 parent 5e5fc40 commit 51a672a

File tree

4 files changed: +71 -35 lines

82.6 KB
Lines changed: 2 additions & 0 deletions
@@ -0,0 +1,2 @@
+* xref:starlight/index.adoc[]
+** xref:starlight/kafka/index.adoc[]
Lines changed: 20 additions & 5 deletions
@@ -1,11 +1,26 @@
-= DataStax Starlight Suite of APIs
+= DataStax Starlight Suite of Pulsar Extensions
 :description:
 :title: Get started with the DataStax Starlight suite of APIs
 :page-aliases: starlight::index.adoc,starlight-suite::index.adoc
-:navtitle: DataStax Starlight Suite of APIs
+:navtitle: Starlight Extensions
+:page-aliases: docs@luna-streaming::starlight.adoc,luna-streaming:components:starlight.adoc
 
-The Starlight suite of APIs are a collection of Apache Pulsar protocol handlers that extend an existing Pulsar cluster. Some establish wire level protocol XXXX while others follow an interface specification. The goal of all the APIs is to create seamless interact with a Pulsar cluster. There are three APIs within the Starlight suite.
+The Starlight suite of extensions is a collection of Apache Pulsar protocol handlers that extend an existing Pulsar cluster. The goal of all the extensions is to create a native, seamless interaction with a Pulsar cluster using existing tooling and clients.
 
 == Starlight for Kafka
-About the API +
-xref:use-cases-architectures:starlight/kafka/index.adoc[Get started now]
+
+Starlight for Kafka brings native Apache Kafka® protocol support to Apache Pulsar by introducing a Kafka protocol handler on Pulsar brokers.
+
+xref:use-cases-architectures:starlight/kafka/index.adoc[Get started now] | xref:starlight-for-kafka:ROOT:index.adoc[Configuring] | https://github.com/datastax/starlight-for-kafka[Source Code^]
+
+// == Starlight for RabbitMQ
+//
+// Starlight for RabbitMQ™ combines the industry-standard AMQP 0.9.1 (RabbitMQ) API with the cloud-native and horizontally scalable Pulsar streaming platform, providing a powerful way to modernize your RabbitMQ infrastructure, improve performance, and reduce costs.
+//
+// xref:use-cases-architectures:starlight/rabbitmq/index.adoc[Get started now] | xref:starlight-for-rabbitmq:ROOT:index.adoc[Configuring] | https://github.com/datastax/starlight-for-rabbitmq[Source Code^]
+//
+// == Starlight for JMS
+//
+// Starlight for JMS allows enterprises to take advantage of the scalability and resiliency of a modern streaming platform to run their existing JMS applications. Because Pulsar is open-source and cloud-native, Starlight for JMS enables enterprises to move their JMS applications to run on-premises and in any cloud environment.
+//
+// xref:use-cases-architectures:starlight/jms/index.adoc[Get started now] | xref:starlight-for-jms:ROOT:index.adoc[Configuring] | https://github.com/datastax/starlight-for-jms[Source Code^]

modules/use-cases-architectures/pages/starlight/kafka/index.adoc

Lines changed: 49 additions & 30 deletions
@@ -1,24 +1,29 @@
-= DataStax Starlight for Kafka Pulsar extension
+= Getting started with the Starlight for Kafka extension
+
+:description: Learn how to get started using the Starlight for Kafka extension with Pulsar and get hands-on with a Kafka producer and consumer interacting with a topic.
+:title: Getting started with the Starlight for Kafka extension
+:navtitle: Kafka
 
 Starlight for Kafka brings the native Apache Kafka protocol support to Apache Pulsar by introducing a Kafka protocol handler on Pulsar brokers. By adding the Starlight for Kafka protocol handler to your existing Pulsar cluster, you can migrate your existing Kafka applications and services to Pulsar without modifying the code.
 
-Visit the project's full documentation xref:starlight-for-kafka:ROOT:index.adoc[here].
+If source code is your thing, visit the https://github.com/datastax/starlight-for-kafka[project's repo on GitHub^]{external-link-icon}.
 
 == Architecture reference
 
-|===
-a|image:s4k-architecture.png[Starlight for Kafka Architecture]
-|===
+image:s4k-architecture.png[Starlight for Kafka Architecture]
 
 == Establishing the Kafka protocol handler
 
-Before you can use an existing Kafka client with Pulsar you are going to need the Starlight for Kafka protocol handler installed in the Pulsar cluster. There are 3 popular ways to complete this:
+Before a Kafka client can interact with your Pulsar cluster, you need the Starlight for Kafka protocol handler installed in the cluster. Installation looks a bit different depending on where your Pulsar cluster is running. Choose the option that best fits your needs.
 
 [tabs]
 ====
 Astra Streaming::
 +
 --
+
+If you want a working Kafka extension as quickly as possible, this is your best bet. This is also a good option for those who already have a streaming tenant and are looking to extend it.
+
 . Sign in to your Astra account and navigate to your streaming tenant.
 +
 TIP: Don't have a streaming tenant? Follow our "xref:astra-streaming:getting-started:index.adoc[]" guide.
@@ -41,7 +46,7 @@ Your Astra Streaming tenant is ready for prime time! Continue to the next sectio
 Luna Streaming::
 +
 --
-The Starlight for Kafka extension is included in the `luna-streaming-all` image used to deploy your Luna cluster. The Luna helm chart makes deploying extensions quite easy. Follow the "xref:luna-streaming:components:starlight-for-kafka.adoc[]" guide to create a simple Pulsar cluster with the Starlight for Kafka extension ready for use.
+The Starlight for Kafka extension is included in the `luna-streaming-all` image used to deploy a Luna cluster. The Luna Helm chart makes deploying the Kafka extension quite easy. Follow the "xref:luna-streaming:components:starlight-for-kafka.adoc[]" guide to create a simple Pulsar cluster with the Starlight for Kafka extension ready for use.
 --
 Self Managed::
 +
@@ -52,41 +57,50 @@ Already got your own Pulsar Cluster? Or maybe your using a standalone cluster? S
 
 == Messaging with Starlight for Kafka
 
-Starlight for Kafka supports quite a few different use cases. Remember there is a Pulsar cluster between the producer client and consumer client. Which means can interchange the type of producer and consumer that best fits your needs.
+Starlight for Kafka supports quite a few different use cases. With a Pulsar cluster between producers and consumers, you can interchange the type of producer and consumer to fit your needs. *The examples below use an Astra Streaming tenant as the Kafka bootstrap server.* If you are using Luna or a self-managed cluster, substitute your own bootstrap server URL.
+
+=== Retrieve Kafka connection properties in Astra Streaming
+
+While on the "Connect" tab in the Astra Streaming portal, the "kafka" area provides important connection information. You will need it to create a working Kafka client or to use the CLI.
 
-The below examples are using an Astra Streaming tenant as the Kafka bootstrap server. They assume you have completed the enablement steps above in the "Astra Streaming" tab.
+image:kafka-client-settings.png[Astra Streaming kafka settings]
+
+TIP: While reviewing the Kafka connection settings in the Astra portal, click the clipboard icon to copy those values, along with a working token to paste into code.
+
+=== Produce and consume a message
 
 [tabs]
 ====
 Kafka CLI::
 +
 --
-
-Download the latest Kafka dist https://www.apache.org/dyn/closer.cgi?path=/kafka/3.3.1/kafka_2.13-3.3.1.tgz[here]. With the tar ball extracted, the producer cli is in the 'bin' folder.
+Download the latest Kafka distribution https://www.apache.org/dyn/closer.cgi?path=/kafka/3.3.1/kafka_2.13-3.3.1.tgz[here^]{external-link-icon}. With the tarball extracted, the producer and consumer CLIs are in the 'bin' folder.
 
 . To get started, let's set a few variables. If you've completed our "xref:astra-streaming:getting-started:index.adoc[Getting started with Astra Streaming]" guide, the below values will be a perfect fit for your existing tenant.
 +
 [source,shell]
 ----
-SERVICE_URL="pulsar+ssl://pulsar-gcp-uscentral1.streaming.datastax.com:9951"
-TENANT="my-stream-<rand>"
-NAMESPACE="my-namespace"
-TOPIC="my-topic"
+SERVICE_URL="<REPLACE_WITH_BOOTSTRAP_SERVER_URL>"
+TENANT="<REPLACE_WITH_TENANT_NAME>"
+NAMESPACE="<REPLACE_WITH_NAMESPACE>"
+TOPIC="<REPLACE_WITH_TOPIC>"
 ----
+
+. Now let's use those variables to enter Kafka's producer shell.
 +
 [source,shell]
 ----
 # cd kafka_2.13-3.3.1
 ./bin/kafka-console-producer.sh --topic "$TENANT/$NAMESPACE/$TOPIC" --bootstrap-server "$SERVICE_URL"
 ----
 
-. If all goes as planned you will be in Kafka's producer shell. Type in a super memorable message, hit 'enter' to send, and then 'Ctrl-C' to exit the shell.
+. Type in a super memorable message and hit 'enter' to send. Press 'Ctrl-C' to exit the shell.
 +
 [source,shell]
 ----
 > This is my first S4K message.
 ----
-+
+
 A new message has been produced in the provided tenant/namespace/topic and is ready for consumption.
 
 . Start the Kafka consumer shell.
@@ -97,11 +111,11 @@ A new message has been produced in the provided tenant/namespace/topic and is re
 ./bin/kafka-console-consumer.sh --topic "$TENANT/$NAMESPACE/$TOPIC" --from-beginning --bootstrap-server "$SERVICE_URL"
 ----
 
-. The consumer should immediately find the new message added before and output its value.
+. The consumer should immediately find the new message and output its value.
 +
 [source,shell]
 ----
-This my first message
+This is my first S4K message.
 ----
 
 . Press 'Ctrl-C' to exit the consumer shell.
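The CLI steps above rely on the connection values from the "Retrieve Kafka connection properties in Astra Streaming" section. As a rough sketch only, here is how those values typically end up in a Kafka client configuration. The property names are standard Kafka client settings, but the SASL_SSL/PLAIN layout, the tenant-as-username convention, and the `token:` password prefix are assumptions about a typical Astra Streaming setup rather than values taken from this guide; the settings shown in your own portal are authoritative.

[source,java]
----
import java.util.Properties;

// Hypothetical helper that gathers the values copied from the Astra Streaming
// "Connect" tab into a Kafka client configuration. Nothing here is generated
// by the portal; it only illustrates where each copied value usually goes.
public final class S4kClientConfig {

    public static Properties fromAstra(String bootstrapUrl, String tenant, String token) {
        Properties props = new Properties();
        // The Kafka bootstrap server URL shown for the tenant.
        props.put("bootstrap.servers", bootstrapUrl);
        // Starlight for Kafka endpoints are commonly secured with SASL_SSL + PLAIN,
        // using the tenant name as the username and a "token:<jwt>" password (assumed).
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "PLAIN");
        props.put("sasl.jaas.config",
                "org.apache.kafka.common.security.plain.PlainLoginModule required "
                        + "username=\"" + tenant + "\" password=\"token:" + token + "\";");
        return props;
    }

    private S4kClientConfig() {}
}
----

The resulting `Properties` object can be passed directly to a `KafkaProducer` or `KafkaConsumer` constructor, or written out as a properties file for the CLI tools' `--producer.config` and `--consumer.config` options.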
@@ -111,13 +125,11 @@ Wow, you did it! Kafka producer and consumer with an Apache Pulsar cluster. How
 Kafka Client (Java)::
 +
 --
-While on the "Connect" tab in the Astra Streaming portal, the "kafka" area will provide important connection information. You will need that to create a working Kafka client.
+This example uses Maven for the project structure. If you prefer Gradle or another build tool, this code should still be a good fit.
 
-image:kafka-client-settings.png[Astra Streaming kafka settings]
+TIP: Visit our https://github.com/datastax/astra-streaming-examples[examples repo^]{external-link-icon} to see the complete source of this example.
 
-TIP: While reviewing the Kafka connection settings in the Astra portal, if you click the clipboard icon you will get those values as well as a working token to paste in code.
-
-. Create a new java project. We use maven in this example, but you can choose other flavors.
+. Create a new Maven project.
 +
 [source,shell]
 ----
135147
</dependency>
136148
----
137149
138-
. Open the file "src/main/java/org/example/App.java" and replace the entire contents with the below code. Notice there are variable values that need replacing. This is where you can use those Kafka connection values retrieved previously.
150+
. Open the file "src/main/java/org/example/App.java" and replace the entire contents with the below code. Notice there are class variables that need replacing. Apply the values previously retrieved in Astra Streaming.
139151
+
140152
[source,java]
141153
----
@@ -144,21 +156,21 @@ include::{astra-streaming-examples-repo}/java/starlight-for-kafka/kafka-client/S
144156
+
145157
NOTE: Don't worry if your editor shows errors, this isn't a complete program... yet.
146158
147-
. Next bring in the following code to build the configuration that will be used by both the producer and consumer.
159+
. Bring in the following code to build the configuration that will be used by both the producer and consumer.
148160
+
149161
[source,java]
150162
----
151163
include::{astra-streaming-examples-repo}/java/starlight-for-kafka/kafka-client/StarlightForKafkaClient/src/main/java/org/example/App.java[tag=build-config]
152164
----
153165
154-
. Now paste the producer code below into the file. This is a very simple flow that sends a single message and awaits acknowledgment.
166+
. Now paste the producer code into the file. This is a very simple flow that sends a single message and awaits acknowledgment.
155167
+
156168
[source,java]
157169
----
158170
include::{astra-streaming-examples-repo}/java/starlight-for-kafka/kafka-client/StarlightForKafkaClient/src/main/java/org/example/App.java[tag=build-producer]
159171
----
160172
161-
. Finally past the consume code below into the file. This creates a basic subscription and retrieves the latest messages on the topic.
173+
. Past the consumer code into the file. This creates a basic subscription and retrieves the latest messages on the topic.
162174
+
163175
[source,java]
164176
----
@@ -180,9 +192,16 @@ java -jar target/StarlightForKafkaClient-1.0-SNAPSHOT-jar-with-dependencies.jar
180192
Successfully sent message
181193
182194
Found 1 total record(s)
183-
ConsumerRecord(topic = persistent://my-tenant-007/my-namespace/my-topic, partition = 0, leaderEpoch = null, offset = 22, CreateTime = 1673545962124, serialized key size = 8, serialized value size = 11, headers = RecordHeaders(headers = [], isReadOnly = false), key = ���h, value = Hello World)
195+
ConsumerRecord(topic = persistent://my-tenant-007/my-namespace/my-topic, partition = 0, leaderEpoch = null, offset = 22, CreateTime = 1673545962124, serialized key size = 8, serialized value size = 11, headers = RecordHeaders(headers = [], isReadOnly = false), key = xxxxx, value = Hello World)
184196
----
185197
186198
Congrats! You have just used the Kafka client to send and receive messages in Pulsar. Next stop is the moon!
187199
--
188-
====
200+
====
201+
202+
The Starlight for Kafka documentation provides more specifics about the below topics and more. Visit those for more detail.
203+
204+
* xref:starlight-for-kafka:operations:starlight-kafka-kstreams.adoc[]
205+
* xref:starlight-for-kafka:operations:starlight-kafka-implementation.adoc[]
206+
* xref:starlight-for-kafka:operations:starlight-kafka-monitor.adoc[Monitoring]
207+
* xref:starlight-for-kafka:operations:starlight-kafka-security.adoc[]

0 commit comments