See the [Amazon ECR Public official documentation](https://docs.aws.amazon.com/AmazonECR/latest/public/get-set-up-for-amazon-ecr.html) for more details.

For more information, see the [AWS for Fluent Bit GitHub repo](https://github.com/aws/aws-for-fluent-bit#public-images).
## Advanced usage
### Use Apache Arrow for in-memory data processing
With Fluent Bit v1.8 or greater, the Amazon S3 plugin includes support for [Apache Arrow](https://arrow.apache.org/). Support isn't enabled by default, and has a dependency on a shared version of `libarrow`.

To use this feature, `FLB_ARROW` must be turned on at compile time. Use the following commands:

```text
cd build/
cmake -DFLB_ARROW=On ..
cmake --build .
```

Once compiled, Fluent Bit can upload incoming data to S3 in Apache Arrow format.

For example:

```python
[INPUT]
...
  Compression arrow
```
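
The configuration above is abbreviated. As a sketch only, a fuller S3 output enabling Arrow might look like the following; the input plugin and bucket name are illustrative placeholders, not taken from the original:

```python
[INPUT]
  Name cpu

[OUTPUT]
  Name         s3
  Match        *
  bucket       my-example-bucket
  Compression  arrow
```
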
Setting `Compression` to `arrow` makes Fluent Bit convert the payload into Apache Arrow format.

Load, analyze, and process stored data using popular data processing tools such as Python pandas, Apache Spark, and TensorFlow.

The following example uses `pyarrow` to analyze the uploaded data: