`docs/en/tutorials/load/automating_json_log_loading_with_vector.md` (89 additions, 3 deletions)
@@ -8,6 +8,92 @@ In this tutorial, we'll simulate generating logs locally, collect them using [Vector](https://vector.dev) ...
Before you start, ensure you have the following prerequisites in place:
- **Amazon S3 Bucket**: An S3 bucket where logs collected by Vector will be stored. [Learn how to create an S3 bucket](https://docs.aws.amazon.com/AmazonS3/latest/userguide/create-bucket-overview.html).
- **AWS Credentials**: AWS Access Key ID and Secret Access Key with sufficient permissions for accessing your S3 bucket. [Manage your AWS credentials](https://docs.aws.amazon.com/general/latest/gr/aws-sec-cred-types.html#access-keys-and-secret-access-keys).
- **AWS CLI**: Ensure that the [AWS CLI](https://aws.amazon.com/cli/) is installed and configured with the necessary permissions to access your S3 bucket (a quick check follows this list).
- **Docker**: Ensure that [Docker](https://www.docker.com/) is installed on your local machine, as it will be used to set up Vector.
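A quick way to confirm the AWS CLI prerequisite is in place is to list the bucket used in this tutorial. This is a convenience sketch, not a step from the tutorial itself, and it assumes your default AWS CLI profile holds the credentials described above:

```bash
# List the tutorial bucket; an AccessDenied or NoSuchBucket error means the
# credentials or bucket need attention before continuing.
aws s3 ls s3://databend-doc/
```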
## Step 1: Create Target Folder in S3 Bucket
To store the logs collected by Vector, create a folder named `logs` in your S3 bucket. In this tutorial, we use `s3://databend-doc/logs/` as the target location.
This command creates an empty folder named `logs` in the `databend-doc` bucket:
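The tutorial's exact invocation is not shown in this excerpt; one standard way to create the folder with the AWS CLI (a sketch, assuming your default profile has write access to the bucket) is:

```bash
# Create a zero-byte object whose key ends with "/"; S3 tools and the console
# display it as an empty "logs" folder inside the databend-doc bucket.
aws s3api put-object --bucket databend-doc --key logs/
```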
1. Create a Vector configuration file named `vector.yaml` on your local machine. In this tutorial, we create it at `/Users/eric/Documents/vector.yaml` with the following content:
```yaml title='vector.yaml'
sources:
  logs:
    type: file
    include:
      - "/logs/app.log"
    read_from: beginning

transforms:
  extract_message:
    type: remap
    inputs:
      - "logs"
    source: |
      . = parse_json(.message) ?? {}

sinks:
  s3:
    type: aws_s3
    inputs:
      - "extract_message"
    bucket: databend-doc
    region: us-east-2
    key_prefix: "logs/"
    content_type: "text/plain"
    encoding:
      codec: "native_json"
    auth:
      access_key_id: "<your-access-key-id>"
      secret_access_key: "<your-secret-access-key>"
```
2. Start Vector using Docker, mapping the configuration file and local logs directory:
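   The exact command is not included in this excerpt; a minimal sketch, assuming the official `timberio/vector` image and a host logs directory at `/Users/eric/Documents/logs` (an assumed path; the config path comes from the step above), could look like:

```bash
# Mount the config where the Vector image expects it, and mount the host logs
# directory at /logs so it matches the "include" path in vector.yaml.
docker run -d --name vector \
  -v /Users/eric/Documents/vector.yaml:/etc/vector/vector.yaml:ro \
  -v /Users/eric/Documents/logs:/logs \
  timberio/vector:latest-alpine
```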