Commit b508030

README.md: Add description about record_key

Signed-off-by: Takuro Ashie <[email protected]>

1 parent 00db3cf commit b508030

File tree

1 file changed: README.md (+34, -0 lines changed)
@@ -337,6 +337,40 @@ For example, `$.source.ip` can be extracted with config `headers_from_record` an
> Using this config to remove unused fields is discouraged. A [filter plugin](https://docs.fluentd.org/v/0.12/filter) can be used for this purpose.

#### Send only a sub field as a message payload

If `record_key` is provided, the plugin sends only the sub field specified by that key.
The key is given in jsonpath format.

For example, given the following configuration and incoming record:

configuration:

    <match **>
      @type kafka2
      [...]
      record_key '$.data'
    </match>

record:

    {
      "specversion" : "1.0",
      "type" : "com.example.someevent",
      "id" : "C234-1234-1234",
      "time" : "2018-04-05T17:31:00Z",
      "datacontenttype" : "application/json",
      "data" : {
        "appinfoA" : "abc",
        "appinfoB" : 123,
        "appinfoC" : true
      },
      ...
    }

only the `data` field will be serialized by the formatter and sent to Kafka.
The toplevel `data` key will be removed.
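The sub-field lookup described above can be sketched in plain Ruby. This is an illustrative sketch, not the plugin's actual implementation: `extract_record_key` is a hypothetical helper, and it handles only simple dotted jsonpath expressions like `$.data.appinfoA`.

```ruby
# Sketch (NOT the plugin's real code) of resolving a jsonpath-style
# record_key such as '$.data' against an incoming record hash.
# Only simple dotted paths are handled here.
def extract_record_key(record, record_key)
  # '$.data.appinfoA' -> ['data', 'appinfoA']
  path = record_key.sub(/\A\$\.?/, '').split('.')
  # Walk down the record one key at a time; nil if a segment is missing.
  path.reduce(record) { |node, key| node.is_a?(Hash) ? node[key] : nil }
end

record = {
  'specversion' => '1.0',
  'type' => 'com.example.someevent',
  'data' => { 'appinfoA' => 'abc', 'appinfoB' => 123, 'appinfoC' => true }
}

payload = extract_record_key(record, '$.data')
# payload is the bare sub-hash; the wrapping 'data' key is gone,
# matching the behavior described above.
```

The extracted value (here the `data` sub-hash) is what would then be handed to the formatter, which is why the toplevel `data` key no longer appears in the message payload.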
### Buffered output plugin

This plugin uses ruby-kafka producer for writing data. This plugin is for v0.12. If you use v1, see `kafka2`.
