
Commit 9ce5d50

document post-install behavior of "cscli setup unattended"
1 parent 511f92a commit 9ce5d50

File tree

1 file changed: +34 −10 lines

crowdsec-docs/docs/log_processor/intro.mdx

Lines changed: 34 additions & 10 deletions
@@ -4,13 +4,13 @@ title: Introduction
 sidebar_position: 1
 ---
 
-The Log Processor is one of the core component of the Security Engine to:
+The Log Processor is a core component of the Security Engine. It:
 
-- Read logs from [Data Sources](log_processor/data_sources/introduction.md) in the form of Acquistions.
-- Parse the logs and extract relevant information using [Parsers](log_processor/parsers/introduction.mdx).
-- Enrich the parsed information with additional context such as GEOIP, ASN using [Enrichers](log_processor/parsers/enricher.md).
-- Monitor the logs for patterns of interest known as [Scenarios](log_processor/scenarios/introduction.mdx).
-- Push alerts to the Local API (LAPI) for alert/decisions to be stored within the database.
+- Reads logs from [Data Sources](log_processor/data_sources/introduction.md) via Acquisitions.
+- Parses logs and extracts relevant information using [Parsers](log_processor/parsers/introduction.mdx).
+- Enriches the parsed information with additional context such as GeoIP and ASN using [Enrichers](log_processor/parsers/enricher.md).
+- Monitors patterns of interest via [Scenarios](log_processor/scenarios/introduction.mdx).
+- Pushes alerts to the Local API (LAPI), where alerts/decisions are stored.
 
 !TODO: Add diagram of the log processor pipeline
 - Read logs from datasources
@@ -21,7 +21,7 @@ The Log Processor is one of the core component of the Security Engine to:
 ## Introduction
 
-The Log Processor is an internal core component of the Security Engine in charge of reading logs from Data Sources, parsing them, enriching them, and monitoring them for patterns of interest.
+The Log Processor reads logs from Data Sources, parses and enriches them, and monitors them for patterns of interest.
 
 Once a pattern of interest is detected, the Log Processor will push alerts to the Local API (LAPI) for alert/decisions to be stored within the database.
 

@@ -35,10 +35,10 @@ Data Sources are individual modules that can be loaded at runtime by the Log Pro
 
 Acquisitions are the configuration files that define how the Log Processor should read logs from a Data Source. Acquisitions are defined in YAML format and are loaded by the Log Processor at runtime.
 
-We have two ways to define Acquisitions within the [configuration directory](/u/troubleshooting/security_engine#where-is-configuration-stored) :
+We support two ways to define Acquisitions in the [configuration directory](/u/troubleshooting/security_engine#where-is-configuration-stored):
 
-- `acquis.yaml` file: This used to be only place to define Acquisitions prior to `1.5.0`. This file is still supported for backward compatibility.
-- `acquis.d` folder: This is a directory where you can define multiple Acquisitions in separate files. This is useful when you want to auto generate files using an external application such as ansible.
+- `acquis.yaml` file: the legacy, single-file configuration (still supported)
+- `acquis.d` directory: a directory of multiple acquisition files (since v1.5.0, recommended for any non-trivial setup)
 
 ```yaml title="Example Acquisition Configuration"
 ## /etc/crowdsec/acquis.d/file.yaml
@@ -50,8 +50,32 @@ labels:
   type: syslog
 ```
+When CrowdSec is installed via a package manager on a fresh system, the post-install step may run `cscli setup` in **unattended** mode.
+It detects installed services and common log file locations, installs the related Hub collections, and generates acquisition files under `acquis.d/setup.<service>.yaml` (e.g. `setup.linux.yaml`).
+
+Generated files are meant to be managed by CrowdSec; don’t edit them in place. If you need changes, delete the generated file and create your own.
+
+When upgrading or reinstalling CrowdSec, the setup step detects non-generated or modified files and won’t overwrite your custom acquisitions.
+
+:::caution
+
+Make sure the same data sources aren’t ingested more than once: duplicating inputs can artificially increase scenario sensitivity.
+
+:::
+
+Examples:
+
+- If an application logs to both `journald` and `/var/log/*`, you usually only need one of them.
+
+- If an application writes to `/var/log/syslog` or `/var/log/messages`, it’s already acquired by `setup.linux.yaml` (since 1.7) or `acquis.yaml`. You don’t need to add a separate acquisition for the same logs.
+
+For config-managed deployments (e.g., Ansible), set the environment variable `CROWDSEC_SETUP_UNATTENDED_DISABLE` to any non-empty value to skip the automated setup.
+In that case, ensure you configure at least one data source and install the OS collection (e.g., `crowdsecurity/linux`).
+
 For more information on Data Sources and Acquisitions, see the [Data Sources](log_processor/data_sources/introduction.md) documentation.
 
+For more information on the automated configuration, see the command `cscli setup`.
+
 ## Collections
 
 Collections are used to group together Parsers, Scenarios, and Enrichers that are related to a specific application. For example the `crowdsecurity/nginx` collection contains all the Parsers, Scenarios, and Enrichers that are needed to parse logs from an NGINX web server and detect patterns of interest.
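The diff's advice to keep a single source per application could look like this in practice: a hypothetical `acquis.d/sshd.yaml` that reads sshd logs only from journald. The file name, systemd unit, and filter below are illustrative assumptions based on the journald data source; such a file would replace, not complement, a file-based acquisition for the same service.

```yaml
## /etc/crowdsec/acquis.d/sshd.yaml (hypothetical example)
source: journald
journalctl_filter:
  - "_SYSTEMD_UNIT=sshd.service"
labels:
  type: syslog
```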
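The `CROWDSEC_SETUP_UNATTENDED_DISABLE` toggle added in this diff can be sketched as below. This mirrors only the documented semantics (any non-empty value skips the automated setup); it is not CrowdSec's own packaging code.

```shell
# Sketch of the documented toggle: unattended setup is skipped whenever
# CROWDSEC_SETUP_UNATTENDED_DISABLE holds any non-empty value.
if [ -n "${CROWDSEC_SETUP_UNATTENDED_DISABLE:-}" ]; then
  echo "cscli setup unattended: skipped"
else
  echo "cscli setup unattended: would run"
fi
```

In a config-managed install (e.g., Ansible), the variable would typically be injected into the environment of the package-install step.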
