
Commit f23ebce

SRVKE-695: Updated event source descriptions
1 parent cf21ccf commit f23ebce

10 files changed: +23 -37 lines changed

_topic_map.yml

Lines changed: 1 addition & 1 deletion

@@ -3083,7 +3083,7 @@ Topics:
 - Name: Event sources
   Dir: event_sources
   Topics:
-  - Name: Getting started with event sources
+  - Name: Understanding event sources
     File: knative-event-sources
   - Name: Listing event sources and event source types
     File: serverless-listing-event-sources

modules/serverless-functions-func-yaml-fields.adoc

Lines changed: 1 addition & 1 deletion

@@ -134,4 +134,4 @@ The `runtime` field specifies the language runtime for your function, for exampl
 
 == template
 
-The `template` field specifies the type of the invocation event that triggers your function. You can set it to `http` for triggering with plain HTTP requests or to `events` for triggering with CloudEvents.
+The `template` field specifies the type of the invocation event that triggers your function. You can set it to `http` for triggering with plain HTTP requests or to `events` for triggering with cloud events.
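
For context, the `http` and `events` settings correspond to different Go handler shapes, as the Go modules elsewhere in this commit show. A minimal sketch of the two shapes, assuming the CloudEvents Go SDK; the handler names here are illustrative, not part of the commit:

[source,go]
----
package function

import (
	"context"
	"fmt"
	"net/http"

	cloudevents "github.com/cloudevents/sdk-go/v2"
)

// Shape of a handler for `template: http` (plain HTTP requests).
func HandleHTTP(ctx context.Context, res http.ResponseWriter, req *http.Request) {
	fmt.Fprintln(res, "invoked by a plain HTTP request")
}

// Shape of a handler for `template: events` (cloud events).
func HandleEvent(ctx context.Context, event cloudevents.Event) error {
	fmt.Println("invoked by a cloud event of type", event.Type())
	return nil
}
----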

modules/serverless-go-function-return-values.adoc

Lines changed: 1 addition & 1 deletion

@@ -22,7 +22,7 @@ func Handle(ctx context.Context, res http.ResponseWriter, req *http.Request) {
 }
 ----
 
-Functions triggered by a CloudEvent might return nothing, `error`, or `CloudEvent` in order to push events into the Knative Eventing system. In this case, you must set a unique `ID`, proper `Source`, and a `Type` for the CloudEvent. The data can be populated from a defined structure, or from a `map`.
+Functions triggered by a cloud event might return nothing, `error`, or `CloudEvent` in order to push events into the Knative Eventing system. In this case, you must set a unique `ID`, proper `Source`, and a `Type` for the cloud event. The data can be populated from a defined structure, or from a `map`.
 
 .Example CloudEvent response
 [source,go]
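
The hunk ends just before the module's own `.Example CloudEvent response` listing, which this diff does not reproduce. A minimal sketch of a handler that meets the requirements in the changed line (a unique `ID`, a proper `Source`, and a `Type`), assuming the CloudEvents Go SDK v2; the attribute values and the `Payload` struct are illustrative:

[source,go]
----
package function

import (
	"context"

	cloudevents "github.com/cloudevents/sdk-go/v2"
)

// Payload is an illustrative response body; any JSON-serializable
// structure or map works here.
type Payload struct {
	Message string `json:"message"`
}

// Handle replies to the incoming event with a new CloudEvent that is
// pushed back into the Knative Eventing system.
func Handle(ctx context.Context, event cloudevents.Event) (*cloudevents.Event, error) {
	response := cloudevents.NewEvent()
	response.SetID("unique-response-id") // must be unique; a UUID is typical
	response.SetSource("function/example-responder")
	response.SetType("example.response")
	if err := response.SetData(cloudevents.ApplicationJSON, Payload{Message: "hello"}); err != nil {
		return nil, err
	}
	return &response, nil
}
----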

modules/serverless-invoking-go-functions-cloudevent.adoc

Lines changed: 5 additions & 9 deletions

@@ -1,11 +1,7 @@
-// Module included in the following assemblies
-//
-// * /serverless/functions/serverless-developing-go-functions.adoc
-
 [id="serverless-invoking-go-functions-cloudevent_{context}"]
-= Functions triggered by a CloudEvent
+= Functions triggered by a cloud event
 
-When an incoming CloudEvent is received, the event is invoked by the link:https://cloudevents.github.io/sdk-go/[CloudEvents Golang SDK] and the `Event` type as a parameter.
+When an incoming cloud event is received, the event is invoked by the link:https://cloudevents.github.io/sdk-go/[CloudEvents Golang SDK] and the `Event` type as a parameter.
 
 You can leverage the Golang link:https://golang.org/pkg/context/[Context] as an optional parameter in the function contract, as shown in the list of supported function signatures:
 
@@ -29,7 +25,7 @@ Handle(context.Context, cloudevents.Event) (*cloudevents.Event, error)
 [id="serverless-invoking-go-functions-cloudevent-example_{context}"]
 == CloudEvent trigger example
 
-A CloudEvent is received which contains a JSON string in the data property:
+A cloud event is received which contains a JSON string in the data property:
 
 [source,json]
 ----
@@ -39,7 +35,7 @@ A CloudEvent is received which contains a JSON string in the data property:
 }
 ----
 
-To access this data, a structure must be defined which maps properties in the CloudEvent data, and retrieves the data from the incoming event. The following example uses the `Purchase` structure:
+To access this data, a structure must be defined which maps properties in the cloud event data, and retrieves the data from the incoming event. The following example uses the `Purchase` structure:
 
 [source,go]
 ----
@@ -58,7 +54,7 @@ func Handle(ctx context.Context, event cloudevents.Event) (err error) {
 }
 ----
 
-Alternatively, a Golang `encoding/json` package could be used to access the CloudEvent directly as JSON in the form of a bytes array:
+Alternatively, a Golang `encoding/json` package could be used to access the cloud event directly as JSON in the form of a bytes array:
 
 [source,go]
 ----
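
The changed lines refer to the module's `Purchase` example and the `encoding/json` alternative without reproducing either listing. A minimal sketch of both access patterns, assuming the CloudEvents Go SDK v2; the `Purchase` field names are illustrative, not taken from the commit:

[source,go]
----
package function

import (
	"context"
	"encoding/json"
	"fmt"

	cloudevents "github.com/cloudevents/sdk-go/v2"
)

// Purchase maps the properties expected in the incoming event data.
// The field names are illustrative; match them to your event schema.
type Purchase struct {
	CustomerID string `json:"customerId"`
	ProductID  string `json:"productId"`
}

func Handle(ctx context.Context, event cloudevents.Event) error {
	// Option 1: unmarshal the event data into a defined structure.
	var purchase Purchase
	if err := event.DataAs(&purchase); err != nil {
		return err
	}
	fmt.Printf("customer %s purchased %s\n", purchase.CustomerID, purchase.ProductID)

	// Option 2: work with the raw payload as a byte slice via encoding/json.
	var generic map[string]interface{}
	if err := json.Unmarshal(event.Data(), &generic); err != nil {
		return err
	}
	fmt.Println("raw keys:", len(generic))
	return nil
}
----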

modules/serverless-invoking-go-functions-http.adoc

Lines changed: 0 additions & 4 deletions

@@ -1,7 +1,3 @@
-// Module included in the following assemblies
-//
-// * /serverless/functions/serverless-developing-go-functions.adoc
-
 [id="serverless-invoking-go-functions-http_{context}"]
 = Functions triggered by an HTTP request
 

modules/serverless-invoking-python-functions.adoc

Lines changed: 1 addition & 5 deletions

@@ -1,14 +1,10 @@
-// Module included in the following assemblies
-//
-// * /serverless/functions/serverless-developing-python-functions.adoc
-
 [id="serverless-invoking-python-functions_{context}"]
 = About invoking Python functions
 
 Python functions can be invoked with a simple HTTP request. When an incoming request is received, functions are invoked with a `context` object as the first parameter. The `context` object is a Python class with two attributes:
 
 * The `request` attribute is always present, and contains the Flask `request` object.
-* The second attribute, `cloud_event`, is populated if the incoming request is a `CloudEvent`.
+* The second attribute, `cloud_event`, is populated if the incoming request is a `CloudEvent` object.
 
 Developers can access any `CloudEvent` data from the context object.
 

modules/serverless-invoking-quarkus-functions.adoc

Lines changed: 1 addition & 1 deletion

@@ -5,7 +5,7 @@
 [id="serverless-invoking-quarkus-functions_{context}"]
 = About invoking Quarkus functions
 
-You can create a Quarkus project that responds to CloudEvents, or one that responds to simple HTTP requests. CloudEvents in Knative are transported over HTTP as a POST request, so either function type can listen and respond to incoming HTTP requests.
+You can create a Quarkus project that responds to cloud events, or one that responds to simple HTTP requests. Cloud events in Knative are transported over HTTP as a POST request, so either function type can listen and respond to incoming HTTP requests.
 
 When an incoming request is received, Quarkus functions are invoked with an instance of a permitted type.
 
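
The changed line rests on the fact that cloud events in Knative arrive as HTTP POST requests. A minimal sketch of such a request from the sender's side, written in Go to match the other examples in this commit and assuming the CloudEvents binary content mode (`ce-*` headers); the URL and attribute values are placeholders:

[source,go]
----
package main

import (
	"bytes"
	"fmt"
	"net/http"
)

func main() {
	// A cloud event in HTTP binary content mode: the event attributes
	// travel as ce-* headers and the payload is the POST body.
	body := bytes.NewBufferString(`{"message": "Hello"}`)
	req, err := http.NewRequest(http.MethodPost, "http://example-function.example.com", body)
	if err != nil {
		panic(err)
	}
	req.Header.Set("Content-Type", "application/json")
	req.Header.Set("Ce-Id", "say-hello-001") // placeholder values
	req.Header.Set("Ce-Specversion", "1.0")
	req.Header.Set("Ce-Source", "/example/sender")
	req.Header.Set("Ce-Type", "example.sayhello")

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()
	fmt.Println("status:", resp.Status)
}
----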

serverless/admin_guide/serverless-cluster-admin-eventing.adoc

Lines changed: 1 addition & 1 deletion

@@ -11,7 +11,7 @@ You can create Knative Eventing components with {ServerlessProductName} in the *
 // Event sources
 include::modules/serverless-creating-event-source-admin-web-console.adoc[leveloffset=+1]
 
-See xref:../../serverless/event_sources/knative-event-sources.adoc#knative-event-sources[Getting started with event sources] for more information on which event source types are supported and can be created using by {ServerlessProductName}.
+See xref:../../serverless/event_sources/knative-event-sources.adoc#knative-event-sources[Understanding event sources] for more information on which event source types are supported and can be created using by {ServerlessProductName}.
 
 // Brokers
 include::modules/serverless-creating-broker-admin-web-console.adoc[leveloffset=+1]

serverless/event_sources/knative-event-sources.adoc

Lines changed: 11 additions & 13 deletions

@@ -1,25 +1,23 @@
 include::modules/serverless-document-attributes.adoc[]
 [id="knative-event-sources"]
-= Getting started with event sources
+= Understanding event sources
 include::modules/common-attributes.adoc[]
 :context: knative-event-sources
 
 toc::[]
 
-An _event source_ is an object that links an event producer with an event _sink_, or consumer. A sink can be a Knative service, channel, or broker that receives events from an event source.
+A Knative _event source_ can be any Kubernetes object that generates or imports cloud events, and relays those events to another endpoint, known as a xref:../../serverless/knative_eventing/serverless-event-sinks.adoc#serverless-event-sinks[_sink_]. Sourcing events is critical to developing a distributed system that reacts to events.
+
+You can create and manage Knative event sources by using the *Developer* perspective in the {product-title} web console, the `kn` CLI, or by applying YAML files.
 
 Currently, {ServerlessProductName} supports the following event source types:
 
-API server source:: Connects a sink to the Kubernetes API server.
-Ping source:: Periodically sends ping events with a constant payload. It can be used as a timer.
-Sink binding:: Connects core Kubernetes resource objects, such as `Deployment`, `Job`, or `StatefulSet` objects, with a sink.
-Container source:: Creates a custom event source by using an image.
-Knative Kafka source:: Connects a Kafka cluster to a sink as an event source.
+xref:../../serverless/event_sources/serverless-apiserversource.adoc#serverless-apiserversource[API server source]:: Brings Kubernetes API server events into Knative. The API server source fires a new event each time a Kubernetes resource is created, updated or deleted.
+
+xref:../../serverless/event_sources/serverless-pingsource.adoc#serverless-pingsource[Ping source]:: Produces events with a fixed payload on a specified cron schedule.
+
+xref:../../serverless/event_sources/serverless-sinkbinding.adoc#serverless-sinkbinding[Sink binding]:: Connects core Kubernetes resource objects, such as `Deployment`, `Job`, or `StatefulSet` objects, with a sink.
 
-You can create and manage Knative event sources using the *Developer* perspective in the {product-title} web console, the `kn` CLI, or by applying YAML files.
+xref:../../serverless/event_sources/serverless-containersource.adoc#serverless-containersource[Container source]:: Starts a container image that generates cloud events and sends them to a sink. Container sources can also be used to support your own custom event sources in Knative.
 
-* Create an xref:../../serverless/event_sources/serverless-apiserversource.adoc#serverless-apiserversource[API server source].
-* Create an xref:../../serverless/event_sources/serverless-pingsource.adoc#serverless-pingsource[ping source].
-* Create a xref:../../serverless/event_sources/serverless-sinkbinding.adoc#serverless-sinkbinding[sink binding].
-* Create a xref:../../serverless/event_sources/serverless-containersource.adoc#serverless-containersource[container source].
-* Create a xref:../../serverless/event_sources/serverless-kafka-source.adoc#serverless-kafka-source[Kafka source].
+xref:../../serverless/event_sources/serverless-kafka-source.adoc#serverless-kafka-source[Kafka source]:: Connects a Kafka cluster to a sink as an event source.

serverless/event_sources/serverless-kafka-source.adoc

Lines changed: 1 addition & 1 deletion

@@ -24,6 +24,6 @@ include::modules/serverless-kafka-source-yaml.adoc[leveloffset=+1]
 [id="additional-resources_serverless-kafka-source"]
 == Additional resources
 
-* See xref:../../serverless/event_sources/knative-event-sources.adoc#knative-event-sources[Getting started with event sources].
+* See xref:../../serverless/event_sources/knative-event-sources.adoc#knative-event-sources[Understanding event sources].
 * See xref:../../serverless/knative_eventing/serverless-kafka.adoc#serverless-kafka[Knative Kafka].
 * See the link:https://access.redhat.com/documentation/en-us/red_hat_amq/7.6/html/amq_streams_on_openshift_overview/kafka-concepts_str#kafka-concepts-key_str[Red Hat AMQ Streams] documentation for more information about Kafka concepts.
