**crowdsec-docs/docs/getting_started/intro.md** (+5 −1)
## Why is my Security Engine classed as a "log processor" within the console?
The `Security Engine` comes compiled with a number of core components that can be enabled or disabled at runtime.
One of these features is called the "LAPI" (Local API). If this feature is disabled at runtime, the Security Engine will be classed as a "log processor" within the console as it will only be able to process logs and forward the alerts to the local API you define in the configuration.
Read more about the [Log Processor](log_processor/intro.mdx) and the [Local API](local_api/intro.md).
Most commonly this is the case when you are running in a distributed setup, where you have a central server that is running the LAPI and a number of remote servers that are running the "Log processors".
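As a minimal sketch of what this looks like in practice (the paths shown are the common Linux defaults, and the URL and machine name are placeholders — adjust for your setup), a machine acting purely as a log processor disables the embedded LAPI server in `config.yaml` and points its credentials file at the central LAPI:

```yaml
# /etc/crowdsec/config.yaml (excerpt): disable the embedded Local API server
api:
  server:
    enable: false
```

```yaml
# /etc/crowdsec/local_api_credentials.yaml: point at the central LAPI
url: http://192.168.0.10:8080
login: my-log-processor
password: "<password generated by cscli machines add on the LAPI server>"
```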
**crowdsec-docs/docs/intro.mdx** (+2 −2)
Under the hood, the Security Engine has various components:
- The [Log Processor](log_processor/intro.mdx) is in charge of detection: it analyzes logs from [various data sources](data_sources/intro) or [HTTP requests](appsec/intro) from web servers.
- The [Appsec](appsec/intro) feature is part of the Log Processor and filters HTTP Requests from the compatible web servers.
- The [Local API](local_api/intro.md) acts as a middle man:
- Between the [Log Processors](/docs/data_sources/intro) and the [Remediation Components](/u/bouncers/intro) which are in charge of enforcing decisions.
- And with the [Central API](/central_api/intro.md) to share alerts and receive blocklists.
- The [Remediation Components](/u/bouncers/intro) - also known as bouncers - block malicious IPs at your chosen level—whether via IpTables, firewalls, web servers, or reverse proxies. [See the full list on our CrowdSec Hub.](https://app.crowdsec.net/hub/remediation-components)
---

As the [Log Processor](log_processor/intro.mdx) processes logs, it will detect patterns of interest known as [Scenarios](log_processor/scenarios/introduction.mdx). When a scenario is detected, an alert is generated and sent to the [Local API](local_api/intro.md) (LAPI) for evaluation.
When the alert is generated you can define additional Alert Context that can be sent along with the alert to give you context about the alert. This can be useful when you host multiple applications on the same server and you want to know which application generated the alert.
### Format
Alert Context is a set of key-value pairs that are sent along with the alert. When you install some [Collections](log_processor/collections/intro.md) you will see that they come with Alert Context pre-configured.
For example, if you install the `crowdsecurity/nginx` collection, you will see that the `http_base` context is added:
```yaml
#this context file is intended to provide minimal and useful information about HTTP scenarios.
context:
  target_uri:
    - evt.Meta.http_path
  user_agent:
    - evt.Meta.http_user_agent
  method:
    - evt.Meta.http_verb
  status:
    - evt.Meta.http_status
```
Contexts are stored in the `contexts` directory at the root of the `config` directory; you can find the directory for your OS [here](/u/troubleshooting/security_engine#where-is-configuration-stored).
:::info
As an example, the default configuration directory on Linux is `/etc/crowdsec/`, so the `contexts` directory would be `/etc/crowdsec/contexts/`.
:::
Here is a quick breakdown of the context file:
- `context`: This is the root key of the context file.
- `target_uri`: This is the key that will be used as the "name" of the context.
- `evt.Meta.http_path`: This is the expression that will be evaluated to get the value of the context. In this case it will be the `http_path` field from the event.
The next key value pair would be `user_agent` and so on.
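A context key can also hold several expressions, each of which is evaluated and sent with the alert. As a purely illustrative sketch (the file name and the `http_request` key are hypothetical, reusing the fields from the example above):

```yaml
# hypothetical /etc/crowdsec/contexts/http_request.yaml
context:
  # one key aggregating two expressions from the event
  http_request:
    - evt.Meta.http_verb
    - evt.Meta.http_path
```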
## Next Steps
We have written a full guide on Alert Context that you can find [here](/u/user_guides/alert_context). This guide will show you how to create your own Alert Context and how to use it within your scenarios.
**crowdsec-docs/docs/log_processor/intro.mdx** (+14 −28)
Once a pattern of interest is detected, the Log Processor will push alerts to the Local API (LAPI) for alert/decisions to be stored within the database.
All subcategories below relate to the Log Processor and its functionalities. If you are using a multi-server architecture, you will only need to configure the functionality that you want to use on the Log Processor.
## Data Sources
Data Sources are individual modules that can be loaded at runtime by the Log Processor to read logs from various sources. To use a Data Source, you will need to create an acquisition configuration file.
### Acquisitions
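For orientation, here is a minimal sketch of an acquisition configuration that reads nginx logs from disk. The file path and log locations are assumptions for a typical Linux install, not requirements:

```yaml
# e.g. /etc/crowdsec/acquis.d/nginx.yaml (assumed path)
filenames:
  - /var/log/nginx/access.log
  - /var/log/nginx/error.log
labels:
  type: nginx   # tells the parsers which application these lines come from
```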
The parsing pipeline is broken down into multiple stages:

- `s00-raw`: This is the first stage, which aims to normalize the logs from various [Data Sources](log_processor/data_sources/introduction.md) into a predictable format for `s01-parse` and `s02-enrich` to work on.
- `s01-parse`: This is the second stage, responsible for extracting relevant information from the normalized logs based on the application type, to be used by `s02-enrich` and the [Scenarios](log_processor/scenarios/introduction.mdx).
- `s02-enrich`: This is the third stage, responsible for enriching the extracted information with additional context such as GEOIP, ASN etc.
You can see more information on Parsers in the [documentation](log_processor/parsers/introduction.mdx).
### Scenarios
Scenarios are the patterns of interest that the Log Processor is monitoring for. When a pattern of interest is detected, the Log Processor will push alerts to the Local API (LAPI) for alert/decisions to be stored within the database.
The patterns can be as simple as tracking the number of failed login attempts, or as complex as tracking logins from multiple countries within a short period of time, which can be an indicator of a compromised account or VPN usage.
The community provides a number of scenarios on the [Hub](https://hub.crowdsec.net/) that you can install and use. If you would like to create your own, see the [Scenarios](log_processor/scenarios/introduction.mdx) documentation.
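To make this concrete, here is a minimal sketch of a leaky-bucket scenario, loosely modeled on the hub's ssh bruteforce scenario; the name, filter, and thresholds are illustrative rather than authoritative:

```yaml
# A minimal leaky-bucket scenario sketch (name, filter, and thresholds are illustrative)
type: leaky
name: example/ssh-bf
description: "Detect ssh bruteforce"
filter: "evt.Meta.log_type == 'ssh_failed-auth'"
groupby: evt.Meta.source_ip        # one bucket per source IP
capacity: 5                        # overflow (alert) after 5 events...
leakspeed: 10s                     # ...unless the bucket leaks one event every 10s
blackhole: 1m                      # suppress repeat alerts for 1 minute
labels:
  remediation: true                # ask the LAPI to issue a decision
```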
### Whitelists
Whitelists are used to exclude certain events from being processed by the Log Processor. For example, you may want to exclude certain IP addresses from being processed by the Log Processor.
You can see more information on Whitelists in the [documentation](log_processor/whitelist/introduction.md).
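As a sketch of the format (the name, reason, and addresses below are placeholders), a whitelist file lists the IPs or CIDR ranges to ignore:

```yaml
# A minimal whitelist sketch (name, reason, and addresses are placeholders)
name: example/my-whitelists
description: "Ignore events from our internal ranges"
whitelist:
  reason: "internal networks"
  ip:
    - "203.0.113.7"
  cidr:
    - "192.168.0.0/16"
```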
### Alert Context
Alert Context is additional context that can be sent with an alert to the LAPI. This context can be shown locally via `cscli` or within the [CrowdSec Console](https://app.crowdsec.net/signup) if you opt in to share context when you enroll your instance.
You can read more about Alert Context in the [documentation](log_processor/alert_context/intro.md).
**crowdsec-docs/docs/log_processor/parsers/introduction.mdx** (+27 −5)
</div>
</div>
The parsing pipeline is broken down into multiple stages:
- `s00-raw`: This is the first stage, which aims to normalize the logs from various [Data Sources](log_processor/data_sources/introduction.md) into a predictable format for `s01-parse` and `s02-enrich` to work on.
- `s01-parse`: This is the second stage, responsible for extracting relevant information from the normalized logs based on the application type, to be used by `s02-enrich` and the [Scenarios](log_processor/scenarios/introduction.mdx).
- `s02-enrich`: This is the third stage, responsible for enriching the extracted information with additional context such as GEOIP, ASN etc.
### `s00-raw`
This stage is responsible for normalizing logs from various [Data Sources](log_processor/data_sources/introduction.md) into a predictable format for `s01-parse` and `s02-enrich` to work on.
For example, if you have a `syslog` Data Source and a `container` Data Source emitting the same application log lines, you wouldn't want `s01-parse` to handle this logic twice; `s00-raw` can normalize the logs into a predictable format first.
For most cases we have already created these `s00-raw` parsers for you; they are available to view on the [Hub](https://hub.crowdsec.net/).
### `s01-parse`
This stage is responsible for extracting relevant information from the normalized logs based on the application type.
The application type is defined in different ways based on the Data Source. Please refer to the [Data Sources](log_processor/data_sources/introduction.md) documentation for more information.
We list all applications we support on the [Hub](https://hub.crowdsec.net/), and within each collection's readme we provide an example acquisition configuration.
### `s02-enrich`
The aim of this stage is to enrich the extracted information with additional context such as GEOIP, ASN etc.
This stage can also be used to perform whitelist checks; we have dedicated documentation for this [here](log_processor/whitelist/introduction.md).
Currently we have a few enrichers available on the [Hub](https://hub.crowdsec.net/) that are installed by default, so you don't need to worry about this stage unless you want to create your own.
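Tying the stages together, here is a minimal sketch of an `s01-parse` parser. All names, the grok pattern, and the statics below are illustrative, not a shipped hub parser:

```yaml
# A minimal s01-parse parser sketch (names and pattern are illustrative)
onsuccess: next_stage                         # hand the event to s02-enrich on success
filter: "evt.Parsed.program == 'myapp'"       # only handle lines from this program
name: example/myapp-logs
description: "Parse myapp authentication logs"
grok:
  pattern: "failed login for %{USERNAME:user} from %{IP:source_ip}"
  apply_on: message
statics:
  - meta: log_type
    value: myapp_failed_auth                  # the field scenarios typically filter on
  - meta: source_ip
    expression: evt.Parsed.source_ip
```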
**crowdsec-docs/sidebars.js** (+9 −21)
module.exports={
// By default, Docusaurus generates a sidebar from the docs folder structure