Custom Elasticsearch pipeline not processing any logs #4361
-
I have seen #3547, where the syslog parser is modified directly, but I had avoided doing that as I assumed it was not best practice? Also, in that instance the parser was dedicated to processing only that kind of log, whereas this is not the case for me. I can see logs are definitely being received, as I can see them in Kibana when filtering by syslog; they are just unparsed.
-
The initial pipeline for events is defined either in filebeat.yml or in the Logstash output for the particular type of event.
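For illustration, on the Filebeat side the Elasticsearch output can name an ingest pipeline directly, or select one per event with conditions. This is a minimal sketch only; the hosts, pipeline name "test", and the event.module condition are assumptions for this example, not Security Onion's shipped configuration:

```
# filebeat.yml (illustrative sketch, not Security Onion's actual defaults)
output.elasticsearch:
  hosts: ["https://localhost:9200"]
  # Option 1: send all events through a single ingest pipeline
  pipeline: "test"
  # Option 2: pick a pipeline per event; falls back to "pipeline" above if no rule matches
  pipelines:
    - pipeline: "test"
      when.equals:
        event.module: "syslog"
```

When events flow through Logstash instead, its elasticsearch output plugin has an equivalent pipeline option for naming the ingest pipeline to apply.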
-
Hi, I have the same issue. I've been searching for any clues and stumbled upon this thread. Did you ever figure this out?
-
I have a standalone install of Security Onion 2.3.50 on Ubuntu 18.04.

I am trying to send syslogs from a firewall that is not supported by the default syslog parser, /opt/so/conf/elasticsearch/ingest/syslog. After a lot of reading and banging my head against the keyboard I have got to the point where I have:

1. Created a custom pipeline at /opt/so/saltstack/local/salt/elasticsearch/files/ingest/test.
2. Run sudo so-elasticsearch-restart.
3. Run sudo so-elasticsearch-pipelines-list.

However, sudo so-elasticsearch-pipeline-stats test returns that the pipeline has not processed any entries. I had not really changed anything from the default syslog parser apart from the name and the grok patterns.

What I'm struggling to find any information on is how Elasticsearch knows which pipeline to use when it first receives a log. I understand you can direct input to another pipeline; however, I grep'd the parser folder for syslog and the only results were the syslog parser and the zeek.syslog parser.
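One way to separate "the pipeline definition is broken" from "nothing is being routed to the pipeline" is Elasticsearch's simulate API, which runs a sample document through a named ingest pipeline on demand. A hedged sketch, assuming the pipeline loaded under the name test and that Elasticsearch is reachable on localhost:9200 (host, TLS flags, and any authentication are assumptions; adjust for your deployment):

```
# Run one sample syslog line through the "test" pipeline and print the processed result.
curl -sk -XPOST 'https://localhost:9200/_ingest/pipeline/test/_simulate?pretty' \
  -H 'Content-Type: application/json' -d '
{
  "docs": [
    { "_source": { "message": "<134>Jun 10 12:00:00 fw01 example: firewall log line" } }
  ]
}'
```

If the grok patterns match here but so-elasticsearch-pipeline-stats test still shows zero documents, the problem is routing (nothing upstream names the test pipeline) rather than the pipeline definition itself.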