crowdsec-docs/docs/log_processor/intro.mdx
34 additions & 10 deletions
@@ -4,13 +4,13 @@ title: Introduction
 sidebar_position: 1
 ---
 
-The Log Processor is one of the core component of the Security Engine to:
+The Log Processor is a core component of the Security Engine. It:
 
-- Read logs from [Data Sources](log_processor/data_sources/introduction.md) in the form of Acquistions.
-- Parse the logs and extract relevant information using [Parsers](log_processor/parsers/introduction.mdx).
-- Enrich the parsed information with additional context such as GEOIP, ASN using [Enrichers](log_processor/parsers/enricher.md).
-- Monitor the logs for patterns of interest known as [Scenarios](log_processor/scenarios/introduction.mdx).
-- Push alerts to the Local API (LAPI) for alert/decisions to be stored within the database.
+- Reads logs from [Data Sources](log_processor/data_sources/introduction.md) via Acquisitions.
+- Parses logs and extracts relevant information using [Parsers](log_processor/parsers/introduction.mdx).
+- Enriches the parsed information with additional context, such as GEOIP and ASN, using [Enrichers](log_processor/parsers/enricher.md).
+- Monitors patterns of interest via [Scenarios](log_processor/scenarios/introduction.mdx).
+- Pushes alerts to the Local API (LAPI), where alerts/decisions are stored.
 
 !TODO: Add diagram of the log processor pipeline
 - Read logs from datasources
@@ -21,7 +21,7 @@ The Log Processor is one of the core component of the Security Engine to:
 
 ## Introduction
 
-The Log Processor is an internal core component of the Security Engine in charge of reading logs from Data Sources, parsing them, enriching them, and monitoring them for patterns of interest.
+The Log Processor reads logs from Data Sources, parses and enriches them, and monitors them for patterns of interest.
 
 Once a pattern of interest is detected, the Log Processor will push alerts to the Local API (LAPI) for alert/decisions to be stored within the database.
 
@@ -35,10 +35,10 @@ Data Sources are individual modules that can be loaded at runtime by the Log Pro
 
 Acquisitions are the configuration files that define how the Log Processor should read logs from a Data Source. Acquisitions are defined in YAML format and are loaded by the Log Processor at runtime.
 
-We have two ways to define Acquisitions within the [configuration directory](/u/troubleshooting/security_engine#where-is-configuration-stored):
+We support two ways to define Acquisitions in the [configuration directory](/u/troubleshooting/security_engine#where-is-configuration-stored):
 
-- `acquis.yaml` file: This used to be only place to define Acquisitions prior to `1.5.0`. This file is still supported for backward compatibility.
-- `acquis.d` folder: This is a directory where you can define multiple Acquisitions in separate files. This is useful when you want to auto generate files using an external application such as ansible.
+- `acquis.yaml` file: the legacy, single-file configuration (still supported)
+- `acquis.d` directory: a directory of multiple acquisition files (since v1.5.0, recommended for any non-trivial setup)
 
 ```yaml title="Example Acquisition Configuration"
 ## /etc/crowdsec/acquis.d/file.yaml
@@ -50,8 +50,32 @@ labels:
   type: syslog
 ```
 
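To illustrate the `acquis.d` layout recommended above, each file in the directory declares one source, and you can add as many files as you have applications. The snippet below is only a sketch of a second acquisition file using the journald data source; the file name and the `ssh.service` unit are illustrative assumptions, not part of the diff:

```yaml
# /etc/crowdsec/acquis.d/sshd-journald.yaml  (hypothetical file name)
# Read sshd logs from the systemd journal instead of a flat log file
source: journalctl
journalctl_filter:
  - "_SYSTEMD_UNIT=ssh.service"
labels:
  type: syslog
```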
+When CrowdSec is installed via a package manager on a fresh system, the post-install step may run `cscli setup` in **unattended** mode.
+It detects installed services and common log file locations, installs the related Hub collections, and generates acquisition files under `acquis.d/setup.<service>.yaml` (e.g. `setup.linux.yaml`).
+
+Generated files are meant to be managed by CrowdSec; don't edit them in place. If you need changes, delete the generated file and create your own.
+
+When upgrading or reinstalling CrowdSec, the setup step detects non-generated or modified files and won't overwrite your custom acquisitions.
+
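For orientation, a generated `setup.<service>.yaml` is typically just a regular file acquisition pointing at the logs detected for that service. The exact contents depend on what `cscli setup` finds on your machine; the block below is a hypothetical sketch, not the literal generated output:

```yaml
# Hypothetical sketch of a generated acquisition
# (the real output of `cscli setup` may differ)
# /etc/crowdsec/acquis.d/setup.linux.yaml
filenames:
  - /var/log/syslog
  - /var/log/auth.log
  - /var/log/kern.log
labels:
  type: syslog
```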
+:::caution
+
+Make sure the same data sources aren't ingested more than once: duplicating inputs can artificially increase scenario sensitivity.
+
+:::
+
+Examples:
+
+- If an application logs to both `journald` and `/var/log/*`, you usually only need one of them.
+
+- If an application writes to `/var/log/syslog` or `/var/log/messages`, it's already acquired by `setup.linux.yaml` (since 1.7) or `acquis.yaml`. You don't need to add a separate acquisition for the same logs.
+
+For config-managed deployments (e.g., Ansible), set the environment variable `CROWDSEC_SETUP_UNATTENDED_DISABLE` to any non-empty value to skip the automated setup.
+In that case, ensure you configure at least one data source and install the OS collection (e.g., `crowdsecurity/linux`).
+
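As a sketch of how this could look under configuration management (an assumption, not an official playbook), an Ansible task can set the variable for the package installation step:

```yaml
# Hypothetical Ansible task: install CrowdSec with the unattended
# `cscli setup` step disabled via the documented environment variable.
- name: Install CrowdSec without unattended setup
  ansible.builtin.apt:
    name: crowdsec
    state: present
    update_cache: true
  environment:
    CROWDSEC_SETUP_UNATTENDED_DISABLE: "1"   # any non-empty value disables it
```

A follow-up task would then place at least one acquisition file in `acquis.d` and install the OS collection, as the paragraph above requires.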
 For more information on Data Sources and Acquisitions, see the [Data Sources](log_processor/data_sources/introduction.md) documentation.
 
+For more information on the automated configuration, see the `cscli setup` command.
+
 ## Collections
 
 Collections are used to group together Parsers, Scenarios, and Enrichers that are related to a specific application. For example the `crowdsecurity/nginx` collection contains all the Parsers, Scenarios, and Enrichers that are needed to parse logs from an NGINX web server and detect patterns of interest.
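To make the grouping concrete, a collection is itself a small Hub YAML file that lists the items it pulls in. The block below is only a hypothetical illustration of that shape; the item names are assumptions for illustration, not the actual contents of `crowdsecurity/nginx`:

```yaml
# Hypothetical collection manifest (illustrative shape only)
description: "Parse NGINX logs and detect common HTTP attacks"
parsers:
  - crowdsecurity/nginx-logs        # assumed parser name
scenarios:
  - crowdsecurity/http-probing      # assumed scenario name
collections:
  - crowdsecurity/base-http-scenarios   # assumed nested collection
```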