
Commit 93a5fab

docs: Draft a new guide for service contexts (#528)
* Draft new guide for service contexts
* Update attributes
* SME feedback
* Update caption per SME feedback
* Peer review feedback
1 parent 8d8e5ae commit 93a5fab

File tree

1 file changed: +391, -0 lines changed

Lines changed: 391 additions & 0 deletions
@@ -0,0 +1,391 @@
1+
////
2+
START GENERATED ATTRIBUTES
3+
WARNING: This content is generated by running npm --prefix .build run generate:attributes
4+
////
5+
6+
//All OpenShift Application Services
7+
:org-name: Application Services
8+
:product-long-rhoas: OpenShift Application Services
9+
:community:
10+
:imagesdir: ./images
11+
:property-file-name: app-services.properties
12+
:samples-git-repo: https://github.com/redhat-developer/app-services-guides
13+
:base-url: https://github.com/redhat-developer/app-services-guides/tree/main/docs/
14+
15+
//OpenShift Application Services CLI
16+
:base-url-cli: https://github.com/redhat-developer/app-services-cli/tree/main/docs/
17+
:command-ref-url-cli: commands
18+
:installation-guide-url-cli: rhoas/rhoas-cli-installation/README.adoc
19+
20+
//OpenShift Streams for Apache Kafka
21+
:product-long-kafka: OpenShift Streams for Apache Kafka
22+
:product-kafka: Streams for Apache Kafka
23+
:product-version-kafka: 1
24+
:service-url-kafka: https://console.redhat.com/application-services/streams/
25+
:getting-started-url-kafka: kafka/getting-started-kafka/README.adoc
26+
:kafka-bin-scripts-url-kafka: kafka/kafka-bin-scripts-kafka/README.adoc
27+
:kafkacat-url-kafka: kafka/kcat-kafka/README.adoc
28+
:quarkus-url-kafka: kafka/quarkus-kafka/README.adoc
29+
:nodejs-url-kafka: kafka/nodejs-kafka/README.adoc
30+
:getting-started-rhoas-cli-url-kafka: kafka/rhoas-cli-getting-started-kafka/README.adoc
31+
:topic-config-url-kafka: kafka/topic-configuration-kafka/README.adoc
32+
:consumer-config-url-kafka: kafka/consumer-configuration-kafka/README.adoc
33+
:access-mgmt-url-kafka: kafka/access-mgmt-kafka/README.adoc
34+
:metrics-monitoring-url-kafka: kafka/metrics-monitoring-kafka/README.adoc
35+
:service-binding-url-kafka: kafka/service-binding-kafka/README.adoc
36+
:message-browsing-url-kafka: kafka/message-browsing-kafka/README.adoc
37+
38+
//OpenShift Service Registry
39+
:product-long-registry: OpenShift Service Registry
40+
:product-registry: Service Registry
41+
:registry: Service Registry
42+
:product-version-registry: 1
43+
:service-url-registry: https://console.redhat.com/application-services/service-registry/
44+
:getting-started-url-registry: registry/getting-started-registry/README.adoc
45+
:quarkus-url-registry: registry/quarkus-registry/README.adoc
46+
:getting-started-rhoas-cli-url-registry: registry/rhoas-cli-getting-started-registry/README.adoc
47+
:access-mgmt-url-registry: registry/access-mgmt-registry/README.adoc
48+
:content-rules-registry: https://access.redhat.com/documentation/en-us/red_hat_openshift_service_registry/1/guide/9b0fdf14-f0d6-4d7f-8637-3ac9e2069817[Supported Service Registry content and rules]
49+
:service-binding-url-registry: registry/service-binding-registry/README.adoc
50+
51+
//OpenShift Connectors
52+
:product-long-connectors: OpenShift Connectors
53+
:product-connectors: Connectors
54+
:product-version-connectors: 1
55+
:service-url-connectors: https://console.redhat.com/application-services/connectors
56+
:getting-started-url-connectors: connectors/getting-started-connectors/README.adoc
57+
58+
//OpenShift API Designer
59+
:product-long-api-designer: OpenShift API Designer
60+
:product-api-designer: API Designer
61+
:product-version-api-designer: 1
62+
:service-url-api-designer: https://console.redhat.com/application-services/api-designer/
63+
:getting-started-url-api-designer: api-designer/getting-started-api-designer/README.adoc
64+
65+
//OpenShift API Management
66+
:product-long-api-management: OpenShift API Management
67+
:product-api-management: API Management
68+
:product-version-api-management: 1
69+
:service-url-api-management: https://console.redhat.com/application-services/api-management/
70+
71+
////
72+
END GENERATED ATTRIBUTES
73+
////
74+
75+
[id="chap-connecting-client-applications-rhoas-cli"]
76+
= Connecting client applications to {product-long-rhoas} using the rhoas CLI
77+
:context: connecting-client-applications-rhoas-cli
78+
79+
[role="_abstract"]
80+
As a developer of applications and services, you might need to connect client applications to instances in cloud services such as {product-long-kafka} and {product-long-registry}.
81+
82+
For example, suppose you have the following applications running locally on your computer:
83+
84+
* One application that is designed to publish a stream of stock price updates
85+
* A second application that is designed to consume the stock price updates and chart them on a dashboard
86+
87+
In addition, suppose you have the following instances in {product-long-rhoas}:
88+
89+
* A Kafka instance in {product-kafka}
90+
* A {registry} instance in {product-registry}
91+
92+
Each time the first application produces a stock price update, you want to use the Kafka instance to forward the update as an event to the second, consuming application. In addition, you want each of the applications to use a schema in the {registry} instance to validate that messages conform to a particular format.
93+
94+
In this scenario, you need to connect your applications to your Kafka and {registry} instances. To achieve this, you can use the {product-long-rhoas} (`rhoas`) command-line interface (CLI) to create a _context_ for your service instances. You can then use another CLI command to generate the required connection information for each service instance in the context.
95+
96+
This guide describes how to use the `rhoas` CLI to create contexts and then generate the connection configuration information that client applications need to connect to service instances in those contexts.
97+
98+
//Additional line break to resolve mod docs generation error
99+
100+
[id="con-about-service-contexts_{context}"]
101+
== About service contexts
102+
103+
In {product-long-rhoas}, a service context is a defined set of instances running in cloud services such as {product-long-kafka} and {product-long-registry}. You might create different contexts for specific use cases, projects, or environments.
104+
105+
To create a context, you can use the {product-long-rhoas} (`rhoas`) command-line interface (CLI). New service instances that you create are automatically added to the context that is currently in use. You can switch between different contexts and add or remove service instances as required. You can include the same service instance in multiple contexts.
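
For example, after you create contexts, you might switch between them as shown in the following sketch. The `rhoas context use` command is also used later in this guide; the `rhoas context list` subcommand for listing your contexts is an assumption here, so verify it with `rhoas context --help`.

[source,shell]
----
# List the contexts that you have created (assumed subcommand)
$ rhoas context list

# Make a different context the current context (example context name)
$ rhoas context use --name my-other-context
----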
106+
107+
When you have created a service context, you can use a single CLI command to generate the configuration information that client applications need to connect to the instances in that context. You can generate connection configuration information in various formats such as an environment variables file (.env), a JSON file, a Java properties file, and a Kubernetes secret.
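
For example, based on the `rhoas generate-config` command and flags used later in this guide, you could generate the same connection information as a JSON file by changing the `--type` value. This is a sketch; the output file name is arbitrary, and the exact set of supported `--type` values can vary by CLI version.

[source,shell]
----
$ rhoas generate-config --type json --output-file ./rhoas-config.json
----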
108+
109+
[id="proc-creating-new-service-contexts_{context}"]
110+
== Creating new service contexts
111+
112+
The following example shows how to use the CLI to create contexts in {product-long-rhoas} and then manage the service instances that are defined in those contexts.
113+
114+
115+
.Prerequisites
116+
* You have installed the {product-long-rhoas} (`rhoas`) CLI. For more information, see https://access.redhat.com/documentation/en-us/red_hat_openshift_application_services/1/guide/bb30ee92-9e0a-4fd6-a67f-aed8910d7da3#proc-installing-rhoas_installing-rhoas-cli[Installing the rhoas CLI^].
117+
118+
.Procedure
119+
120+
. Log in to the CLI.
121+
+
122+
[source,shell]
123+
----
124+
$ rhoas login
125+
----
126+
+
127+
The login command opens a sign-in process in your web browser.
128+
129+
. Use the CLI to create a new service context.
130+
+
131+
[source,shell]
132+
----
133+
$ rhoas context create --name development-context
134+
----
135+
+
136+
The new context becomes the _current_ (that is, active) context by default.
137+
138+
. Create a new Kafka instance in {product-long-kafka}.
139+
+
140+
[source,shell]
141+
----
142+
$ rhoas kafka create --name my-kafka-instance
143+
----
144+
+
145+
The CLI automatically adds the Kafka instance to the current context.
146+
147+
. Create a new {registry} instance in {product-long-registry}.
148+
+
149+
[source,shell]
150+
----
151+
$ rhoas service-registry create --name my-registry-instance
152+
----
153+
+
154+
The CLI also automatically adds the {registry} instance to the current context.
155+
156+
. Confirm that the Kafka and {registry} instances are in the current context and are running.
157+
+
158+
.Command to view status of service instances in current context
159+
[source,shell]
160+
----
161+
$ rhoas context status
162+
----
163+
+
164+
You see output that looks like the following example:
165+
+
166+
.Status of example context with single Kafka and {registry} instances
167+
[source,shell,subs="+quotes",options="nowrap"]
168+
----
169+
Service Context Name: development-context
170+
Context File Location: /home/_<user-name>_/.config/rhoas/contexts.json
171+
172+
Kafka
173+
-----------------------------------------------------------------------------
174+
ID: cafkr2jma40lhulbl1c0
175+
Name: my-kafka-instance
176+
Status: ready
177+
Bootstrap URL: kafka-inst-cafkr-jma--lhulbl-ca.bf2.kafka.rhcloud.com:443
178+
179+
Service Registry
180+
------------------------------------------------------------------------------
181+
ID: 0aa1dd8b-63d5-466c-9de8-7c03320a81c2
182+
Name: my-registry-instance
183+
Status: ready
184+
Registry URL: https://bu98.serviceregistry.rhcloud.com/t/0aa1dd8b-63d5-466c-9de8-7c03320a81c2
185+
----
186+
187+
. Create another service context.
188+
+
189+
[source,shell]
190+
----
191+
$ rhoas context create --name production-context
192+
----
193+
+
194+
The new service context becomes the current context.
195+
196+
. Add the Kafka instance that you created earlier in the procedure to the new service context.
197+
+
198+
[source,shell]
199+
----
200+
$ rhoas context set-kafka --name my-kafka-instance
201+
----
202+
+
203+
The Kafka instance is now part of both of the service contexts that you created.
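+
To confirm this, you can make the first context the current context again and view its status. The `rhoas context use` command is also used later in this guide.
+
[source,shell]
----
$ rhoas context use --name development-context
$ rhoas context status
----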
204+
205+
[NOTE]
206+
====
207+
To remove a particular service from a context, use the `context unset` command with the `--services` flag and specify the service name, for example, `kafka` or `service-registry`, as shown in the following example:
208+
209+
[source,shell]
210+
----
211+
$ rhoas context unset --name development-context --services kafka
212+
----
213+
====
214+
215+
[role="_additional-resources"]
216+
.Additional resources
217+
* To learn more about the `context` commands that you can use to manage service contexts, see https://access.redhat.com/documentation/en-us/red_hat_openshift_application_services/1/guide/8bd088a6-b7b7-4e5d-832a-b0f0494f9070[CLI command reference (rhoas)^]
218+
219+
[id="proc-generating-connection-information-quarkus_{context}"]
220+
== Generating connection configuration information for a Quarkus application
221+
The following example shows how to connect an example Quarkus application to the service instances defined in a context in {product-long-rhoas}. https://quarkus.io/[Quarkus^] is a Kubernetes-native Java framework that is optimized for serverless, cloud, and Kubernetes environments.
222+
223+
The Quarkus application uses a topic in a Kafka instance to produce and consume a stream of quote values and display these on a web page. The application consists of two components:
224+
225+
* A producer component that periodically produces a new quote value and publishes this to a Kafka topic called `quotes`.
226+
* A consumer component that streams quote values from the Kafka topic. This component also has a minimal front end that uses Server-Sent Events to show the quote values on a web page.
227+
228+
In addition, the producer and consumer components serialize and deserialize Kafka messages using an Avro schema stored in {registry}. Use of the schema ensures that message values conform to a defined format.
229+
230+
.Prerequisites
231+
232+
ifndef::community[]
233+
* You have a Red Hat account.
234+
endif::[]
235+
* You have a service context that contains Kafka and {registry} instances. For an example of how to create one, see xref:proc-creating-new-service-contexts_{context}[Creating new service contexts].
236+
* https://github.com/git-guides/[Git^] is installed.
237+
* You have an IDE such as https://www.jetbrains.com/idea/download/[IntelliJ IDEA^], https://www.eclipse.org/downloads/[Eclipse^], or https://code.visualstudio.com/Download[VSCode^].
238+
* https://adoptopenjdk.net/[OpenJDK^] 11 or later is installed on Linux or macOS. (The latest LTS version of OpenJDK is recommended.)
239+
* https://maven.apache.org/[Apache Maven^] 3.8.x or later is installed (for Quarkus 2.2.x).
240+
241+
242+
.Procedure
243+
244+
. On the command line, clone the {product-long-rhoas} https://github.com/redhat-developer/app-services-guides[Guides and Samples^] repository from GitHub.
245+
+
246+
[source,shell]
247+
----
248+
$ git clone https://github.com/redhat-developer/app-services-guides app-services-guides
249+
----
250+
251+
. In your IDE, open the `code-examples/quarkus-service-registry-quickstart` directory from the repository that you cloned.
252+
+
253+
You see that the sample Quarkus application has two components: a producer component and a consumer component. The producer publishes a stream of quote values to a Kafka topic, and the consumer reads those values and displays them on a web page.
254+
255+
. On the command line, create the `quotes` topic required by the Quarkus application.
256+
+
257+
[source,shell]
258+
----
259+
$ rhoas kafka topic create --name quotes
260+
----
261+
262+
. Ensure that you are using the service context that includes your Kafka and {registry} instances, as shown in the following example:
263+
+
264+
[source,shell]
265+
----
266+
$ rhoas context use --name development-context
267+
----
268+
269+
. In the guides and samples repository that you cloned, navigate to the directory for the Quarkus application.
270+
+
271+
[source,shell]
272+
----
273+
$ cd ~/app-services-guides/code-examples/quarkus-service-registry-quickstart/
274+
----
275+
276+
. Generate an environment variables file that contains the connection configuration information required by the producer component.
277+
+
278+
[source,shell]
279+
----
280+
$ rhoas generate-config --type env --output-file ./producer/.env
281+
----
282+
283+
. Copy the `.env` file to the directory for the consumer component, as shown in the following Linux example:
284+
+
285+
[source,shell]
286+
----
287+
$ cp ./producer/.env ./consumer/.env
288+
----
289+
+
290+
For a service context with single Kafka and {registry} instances, the `.env` file looks like the following example:
291+
+
292+
.Example environment variables file for connection configuration information
293+
[source,shell]
294+
----
295+
## Generated by rhoas cli
296+
## Kafka Configuration
297+
KAFKA_HOST=kafka-inst-cafkr-jma--lhulbl-ca.bf2.kafka.rhcloud.com:443
298+
## Service Registry Configuration
299+
SERVICE_REGISTRY_URL=https://bu98.serviceregistry.rhcloud.com/t/0aa1dd8b-63d5-466c-9de8-7c03320a81c2
300+
SERVICE_REGISTRY_CORE_PATH=/apis/registry/v2
301+
SERVICE_REGISTRY_COMPAT_PATH=/apis/ccompat/v6
302+
303+
## Authentication Configuration
304+
RHOAS_CLIENT_ID=srvc-acct-14295e3c-f72d-4bae-876c-3172a96eb7eb
305+
RHOAS_CLIENT_SECRET=5c3a20e0-d946-4edf-94d2-35db41f3b2ad
306+
RHOAS_OAUTH_TOKEN_URL=https://identity.api.openshift.com/auth/realms/rhoas/protocol/openid-connect/token
307+
----
308+
+
309+
As shown in the example, the file that you generate contains the endpoints for your service instances and the credentials required to connect to those instances. The CLI automatically created a service account (its client ID is the value of the `RHOAS_CLIENT_ID` environment variable) that client applications can use to authenticate with the Kafka and {registry} instances.
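+
If you want to review the service account that the CLI created, you can list your service accounts. The following command is a sketch; verify the exact subcommand with `rhoas service-account --help`.
+
[source,shell]
----
$ rhoas service-account list
----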
310+
311+
. Set Access Control List (ACL) permissions to enable the new service account to access resources in the Kafka instance.
312+
+
313+
.Example command for granting access to Kafka instance
314+
[source,shell]
315+
----
316+
$ rhoas kafka acl grant-access --producer --consumer --service-account srvc-acct-14295e3c-f72d-4bae-876c-3172a96eb7eb --topic quotes --group all
317+
----
318+
+
319+
The command that you entered allows applications that authenticate with the service account to produce and consume messages in the `quotes` topic, using any consumer group and any producer.
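+
To verify the permissions that you granted, you can list the ACL rules for the Kafka instance. The following command is a sketch; verify the exact subcommand with `rhoas kafka acl --help`.
+
[source,shell]
----
$ rhoas kafka acl list
----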
320+
321+
. Use Role-Based Access Control (RBAC) to enable the new service account to access the {registry} instance and the artifacts (such as schemas) that it contains.
322+
+
323+
.Example command for granting access to {registry} instance
324+
[source,shell]
325+
----
326+
$ rhoas service-registry role add --role manager --service-account srvc-acct-14295e3c-f72d-4bae-876c-3172a96eb7eb
327+
----
328+
329+
. In the guides and samples repository, navigate to the directory for the producer component. Use Apache Maven to run the producer component in developer mode.
330+
+
331+
[source,shell,options="nowrap"]
332+
----
333+
$ cd ~/app-services-guides/code-examples/quarkus-service-registry-quickstart/producer
334+
$ mvn quarkus:dev
335+
----
336+
+
337+
The producer component starts generating quote values and publishing them to the `quotes` topic in the Kafka instance.
338+
+
339+
The Quarkus application also creates an Avro schema called `quotes-value` in the {registry} instance. The producer and consumer components use this schema to ensure that message values conform to a defined format.
340+
+
341+
To view the contents of the `quotes-value` schema, run the following command:
342+
+
343+
[source,shell]
344+
----
345+
$ rhoas service-registry artifact get --artifact-id quotes-value
346+
----
347+
+
348+
You see output that looks like the following example:
349+
+
350+
.Example Avro schema in {registry}
351+
[source,shell]
352+
----
353+
{
354+
"type": "record",
355+
"name": "Quote",
356+
"namespace": "org.acme.kafka.quarkus",
357+
"fields": [
358+
{
359+
"name": "id",
360+
"type": {
361+
"type": "string",
362+
"avro.java.string": "String"
363+
}
364+
},
365+
{
366+
"name": "price",
367+
"type": "int"
368+
}
369+
]
370+
}
371+
----
372+
373+
. With the producer component still running, open a second command-line window or tab. In the guides and samples repository, navigate to the directory for the consumer component and run the component in developer mode.
374+
+
375+
[source,shell,options="nowrap"]
376+
----
377+
$ cd ~/app-services-guides/code-examples/quarkus-service-registry-quickstart/consumer
378+
$ mvn quarkus:dev
379+
----
380+
+
381+
The consumer component starts to consume the stream of quote values from the `quotes` topic.
382+
383+
. In a web browser, go to http://localhost:8080/quotes.html[^].
384+
+
385+
You see that the consumer component displays the stream of quote values on the web page. This output shows that the Quarkus application used the connection configuration information that you generated to connect to the Kafka and {registry} instances in your service context.
386+
387+
[role="_additional-resources"]
388+
.Additional resources
389+
* https://access.redhat.com/documentation/en-us/red_hat_openshift_application_services/1/guide/8bd088a6-b7b7-4e5d-832a-b0f0494f9070#_b7f033ec-6f0c-4b3c-89b0-cb1801de19f9[CLI command reference (rhoas)^]
390+
* https://access.redhat.com/documentation/en-us/red_hat_openshift_streams_for_apache_kafka/1/guide/2f4bf7cf-5de2-4254-8274-6bf71673f407[Managing account access in {product-long-kafka}^]
391+
* https://access.redhat.com/documentation/en-us/red_hat_openshift_service_registry/1/guide/7717db0b-9fad-4fff-91b7-b311b63290a4[Managing account access in {product-long-registry}^]
