190 changes: 98 additions & 92 deletions pipeline/outputs/stackdriver_special_fields.md
# Stackdriver special fields

When the [google-logging-agent](https://cloud.google.com/logging/docs/agent) receives a structured log record, it treats [some fields](https://cloud.google.com/logging/docs/agent/configuration#special-fields) specially, allowing users to set specific fields in the LogEntry object that get written to the Logging API.

## LogEntry fields

Fluent Bit supports some special fields for setting fields on the LogEntry object:

| JSON log field | [LogEntry](https://cloud.google.com/logging/docs/reference/v2/rest/v2/LogEntry) field | Type | Description |
| :--- | :--- | :--- | :--- |
| `logging.googleapis.com/logName` | `logName` | `string` | The log name to write this log to. |
| `logging.googleapis.com/labels` | `labels` | `object<string, string>` | The labels for this log. |
| `logging.googleapis.com/severity` | `severity` | [`LogSeverity` Enum](https://cloud.google.com/logging/docs/reference/v2/rest/v2/LogEntry#LogSeverity) | The severity of this log. |
| `logging.googleapis.com/monitored_resource` | `resource` | [`MonitoredResource`](https://cloud.google.com/logging/docs/reference/v2/rest/v2/MonitoredResource) (without `type`) | Resource labels for this log. |
| `logging.googleapis.com/operation` | `operation` | [`LogEntryOperation`](https://cloud.google.com/logging/docs/reference/v2/rest/v2/LogEntry#LogEntryOperation) | Additional information about a potentially long-running operation. |
| `logging.googleapis.com/insertId` | `insertId` | `string` | A unique identifier for the log entry. It's used to order `logEntries`. |
| `logging.googleapis.com/sourceLocation` | `sourceLocation` | [`LogEntrySourceLocation`](https://cloud.google.com/logging/docs/reference/v2/rest/v2/LogEntry#LogEntrySourceLocation) | Additional information about the source code location that produced the log entry. |
| `logging.googleapis.com/http_request` | `httpRequest` | [`HttpRequest`](https://cloud.google.com/logging/docs/reference/v2/rest/v2/LogEntry#HttpRequest) | A common proto for logging HTTP requests. |
| `logging.googleapis.com/trace` | `trace` | `string` | Resource name of the trace associated with the log entry. |
| `logging.googleapis.com/traceSampled` | `traceSampled` | `boolean` | The sampling decision associated with this log entry. |
| `logging.googleapis.com/spanId` | `spanId` | `string` | The ID of the trace span associated with this log entry. |
| `timestamp` | `timestamp` | `object` ([protobuf `Timestamp` object format](https://protobuf.dev/reference/protobuf/google.protobuf/#timestamp)) | An object including the seconds and nanoseconds fields that represent the time. |
| `timestampSeconds` and `timestampNanos` | `timestamp` | `int` | The seconds and nanoseconds that represent the time. |

## Other special fields

| JSON log field | Description |
|:---------------|:------------|
| `logging.googleapis.com/projectId` | Changes the project ID that this log will be written to. Ensure that you are authenticated to write logs to this project. |
| `logging.googleapis.com/local_resource_id` | Overrides the [configured `local_resource_id`](./stackdriver.md#resource-labels). |

## Use special fields

To use a special field, you must add a field with the right name and value to your log. Given an example structured log (internally in MessagePack but shown in JSON for demonstration):

```json
{
  ...
}
```

To use the `logging.googleapis.com/logName` special field, add it to your structured log as follows:

```json
{
  ...
  "logging.googleapis.com/logName": "my_log"
}
```

For the special fields that map to `LogEntry` prototypes, add them as objects with field names that match the proto. For example, to use the `logging.googleapis.com/operation` field:

```json
{
  ...
  "logging.googleapis.com/operation": {
    "id": "test_id",
    "producer": "test_producer"
  }
}
```

Adding special fields to logs is best done through the [`modify` filter](https://docs.fluentbit.io/manual/pipeline/filters/modify) for simple fields, or [a Lua script using the `lua` filter](https://docs.fluentbit.io/manual/pipeline/filters/lua) for more complex fields.
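For a simple field, a single `modify` filter rule is enough. This is a minimal sketch; the match pattern and log name are placeholders:

```text
[FILTER]
    Name   modify
    Match  *
    Add    logging.googleapis.com/logName my_log
```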

## Simple type special fields

Special fields with simple types (except for the [`logging.googleapis.com/insertId` field](#insert-id)) will follow this pattern (demonstrated with the `logging.googleapis.com/logName` field):

1. If the special field matches the type, it will be moved to the corresponding `LogEntry` field. For example:

   ```text
   {
     ...
     "logging.googleapis.com/logName": "my_log"
     ...
   }
   ```

   the `logEntry` will be:

   ```text
   {
     "jsonPayload": {
       ...
     },
     "logName": "my_log"
     ...
   }
   ```

1. If the field is non-empty but invalid, it will be left in the `jsonPayload`. For example:

   ```text
   {
     ...
     "logging.googleapis.com/logName": 12345
     ...
   }
   ```

   the `logEntry` will be:

   ```text
   {
     "jsonPayload": {
       "logging.googleapis.com/logName": 12345,
       ...
     }
   }
   ```
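The rule above can be sketched in Python. This is a hypothetical helper for illustration, not the plugin's actual implementation:

```python
def promote_log_name(record):
    """Move a valid logName special field onto the LogEntry; else leave it."""
    entry = {"jsonPayload": dict(record)}
    value = entry["jsonPayload"].get("logging.googleapis.com/logName")
    if isinstance(value, str) and value:
        # Valid string: promote it out of the payload onto the LogEntry.
        entry["logName"] = entry["jsonPayload"].pop("logging.googleapis.com/logName")
    # Non-string (invalid) values stay in jsonPayload untouched.
    return entry
```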

### Exceptions

#### Insert ID

If the `logging.googleapis.com/insertId` field has an invalid type, the log will be rejected by the plugin and not sent to Cloud Logging.

#### Trace sampled

If the `autoformat_stackdriver_trace` plugin configuration option is set to `true`, the value provided in the `trace` field will be formatted into the form that Cloud Logging expects, along with the detected project ID (from the Google metadata server, configured in the plugin, or provided using a special field).

For example, if `autoformat_stackdriver_trace` is enabled, this:

```text
{
  ...
  "logging.googleapis.com/trace": "my-trace-id"
}
```

Will become this:

```text
{
  ...
  "logging.googleapis.com/trace": "projects/my-project-id/traces/my-trace-id"
}
```
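That rewrite can be sketched as follows. This is a hypothetical helper, not the plugin's source; it assumes the `projects/PROJECT_ID/traces/TRACE_ID` resource-name form Cloud Logging uses:

```python
def autoformat_trace(trace, project_id):
    """Expand a bare trace ID into the projects/<id>/traces/<trace> form."""
    if trace.startswith("projects/"):
        return trace  # already fully qualified, leave it alone
    return f"projects/{project_id}/traces/{trace}"
```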

The `timestampSeconds` and `timestampNanos` fields don't map directly to the `timestamp` field in `LogEntry`, so the parsing behavior deviates from other special fields. Read more in the [Timestamp section](#timestamp).

## Proto special fields

Special fields that expect the format of a prototype from `LogEntry` (except for the `logging.googleapis.com/monitored_resource` field) follow this pattern (demonstrated with the `logging.googleapis.com/operation` field):

If any sub-fields of the proto are empty or of an incorrect type, the plugin will set these sub-fields to empty. For example:

```text
{
  ...
  "logging.googleapis.com/operation": {
    "id": 123,
    "producer": "test_producer",
    "first": true,
    "last": true
  }
}
```

the `logEntry` will be:

```text
{
  "jsonPayload": {
    ...
  },
  "operation": {
    "id": "",
    "producer": "test_producer",
    "first": true,
    "last": true
  }
}
```
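The sub-field handling can be sketched as follows. This is a simplified, hypothetical model of the behavior described in this section, not the plugin's C source:

```python
# Recognized LogEntryOperation sub-fields and their expected Python types.
RECOGNIZED = {"id": str, "producer": str, "first": bool, "last": bool}

def extract_operation(payload):
    """Pull recognized operation sub-fields out of the payload.

    Wrong-typed sub-fields become empty; unknown sub-fields stay in
    jsonPayload; a non-map value is left untouched.
    """
    raw = payload.pop("logging.googleapis.com/operation", None)
    if not isinstance(raw, dict):
        if raw is not None:
            # Not a map: leave the field untouched in the payload.
            payload["logging.googleapis.com/operation"] = raw
        return payload, None
    operation, extras = {}, {}
    for key, value in raw.items():
        expected = RECOGNIZED.get(key)
        if expected is None:
            extras[key] = value          # unknown sub-field: keep in jsonPayload
        elif isinstance(value, expected):
            operation[key] = value       # recognized and well-typed
        else:
            operation[key] = expected()  # wrong type: set an empty value
    if extras:
        payload["logging.googleapis.com/operation"] = extras
    return payload, operation
```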

If the field itself isn't a map, the plugin will leave this field untouched. For example:

```text
{
  ...
  "logging.googleapis.com/operation": "some string"
}
```

the `logEntry` will be:

```text
{
  "jsonPayload": {
    "logging.googleapis.com/operation": "some string",
    ...
  }
}
```

If there are extra sub-fields, the plugin will add the recognized fields to the corresponding field in the `LogEntry`, and preserve the extra sub-fields in `jsonPayload`. For example:

```text
{
  ...
  "logging.googleapis.com/operation": {
    "id": "test_id",
    "producer": "test_producer",
    "first": true,
    "last": true,
    "extra1": "some string",
    "extra2": 123
  }
}
```

the `logEntry` will be:

```text
{
  "jsonPayload": {
    "logging.googleapis.com/operation": {
      "extra1": "some string",
      "extra2": 123
    },
    ...
  },
  "operation": {
    "id": "test_id",
    "producer": "test_producer",
    "first": true,
    "last": true
  }
}
```

The `logging.googleapis.com/monitored_resource` field is parsed in a special way, meaning it has some important exceptions:

The `type` field from the `MonitoredResource` proto isn't parsed out of the special field. It's read from the [`resource` plugin configuration option](https://docs.fluentbit.io/manual/pipeline/outputs/stackdriver#configuration-parameters). If it's supplied in the `logging.googleapis.com/monitored_resource` special field, it won't be recognized.

The `labels` field is expected to be an `object<string, string>`. If any fields have a value that's not a string, the value is ignored and not preserved. The plugin logs an error and drops the field.

If no valid `labels` field is found, or if all entries in the `labels` object provided are invalid, the `logging.googleapis.com/monitored_resource` field is dropped in favor of automatically setting resource labels using other available information based on the configured `resource` type.
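The labels-validation rules can be sketched as follows. This is a hypothetical helper mirroring the behavior described above, not the plugin's code:

```python
def clean_resource_labels(labels):
    """Keep only string-valued label entries.

    Non-string values are dropped (the plugin also logs an error). Returning
    None means the field is discarded so resource labels fall back to
    autodetection.
    """
    if not isinstance(labels, dict):
        return None
    kept = {key: value for key, value in labels.items() if isinstance(value, str)}
    # If every entry was invalid, drop the field entirely.
    return kept or None
```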

## Timestamp

Fluent Bit supports the following formats of time-related fields:

- `timestamp`

  Log body contains a `timestamp` field that includes the seconds and nanoseconds fields.

  ```text
  {
    "timestamp": {
      "seconds": CURRENT_SECONDS,
      "nanos": CURRENT_NANOS
    }
  }
  ```

- `timestampSeconds`/`timestampNanos`

  Log body contains both the `timestampSeconds` and `timestampNanos` fields.

  ```text
  {
    "timestampSeconds": CURRENT_SECONDS,
    "timestampNanos": CURRENT_NANOS
  }
  ```

If one of these JSON timestamp representations is present in a structured record, the plugin collapses it into a single representation in the `timestamp` field of the `LogEntry` object.

If no time-related fields are present, the plugin sets the current time as the timestamp.
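The collapsing logic can be sketched as follows. This is a hypothetical helper mirroring the behavior described above, not the plugin's code:

```python
import time

def collapse_timestamp(payload):
    """Collapse either timestamp form into a (seconds, nanos) pair.

    Falls back to the current time when neither form is present or valid.
    """
    ts = payload.get("timestamp")
    if isinstance(ts, dict) and isinstance(ts.get("seconds"), int):
        payload.pop("timestamp")
        return ts["seconds"], ts.get("nanos", 0)
    secs = payload.get("timestampSeconds")
    nanos = payload.get("timestampNanos")
    if isinstance(secs, int) and isinstance(nanos, int):
        payload.pop("timestampSeconds")
        payload.pop("timestampNanos")
        return secs, nanos
    # No valid time-related fields: use the current time.
    now = time.time_ns()
    return now // 1_000_000_000, now % 1_000_000_000
```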

### Format 1

Set the input log as follows:

```text
{
  "timestamp": {
    "seconds": 1596149787,
    "nanos": 123456789
  },
  ...
}
```

the `logEntry` will be:

```text
{
  "jsonPayload": {
    ...
  },
  "timestamp": "2020-07-30T22:56:27.123456789Z"
}
```

### Format 2

Set the input log as follows:

```text
{
  "timestampSeconds": 1596149787,
  "timestampNanos": 123456789,
  ...
}
```

the `logEntry` will be:

```text
{
  "jsonPayload": {
    ...
  },
  "timestamp": "2020-07-30T22:56:27.123456789Z"
}
```
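The rendered `timestamp` string in these examples can be reproduced with a short sketch using only the standard library; the fractional part keeps full nanosecond precision:

```python
from datetime import datetime, timezone

def to_rfc3339(seconds, nanos):
    """Format a seconds/nanos pair as an RFC 3339 UTC timestamp string."""
    base = datetime.fromtimestamp(seconds, tz=timezone.utc)
    return base.strftime("%Y-%m-%dT%H:%M:%S") + f".{nanos:09d}Z"
```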

If the `timestamp` object or the `timestampSeconds` and `timestampNanos` fields end
up being invalid, they will remain in the `jsonPayload` untouched.
1 change: 1 addition & 0 deletions vale-styles/FluentBit/Headings.yml

Adds `LogEntry` to the heading-case exceptions.

3 changes: 3 additions & 0 deletions vale-styles/FluentBit/Spelling-exceptions.txt

Adds `Enum`, `proto`, and `protobuf` to the spelling exceptions.