Description
Another couple of Elasticsearch mapping errors. They look similar to other reported issues, but each involves a new field. In these cases: the filter field warned it can't get text on a START_OBJECT, and startTime failed to parse as an integer (it was a date string).
CloudTrail has such a high number of fields (I see your mapping limit of 1000 being broken, and raise you a mapping limit of 2000 being broken!) that adding custom logic for every problem field feels like a losing battle. But if you're interested, I'm sure I can find another half-dozen errors like this and provide more details to help fix them.
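For reference, the field-count ceiling itself can be lifted (at some memory cost) via the per-index setting that governs it; the index name below is just illustrative:

```json
PUT logstash-cloudtrail-2017.01/_settings
{
  "index.mapping.total_fields.limit": 3000
}
```

That only postpones the mapping explosion, though; it doesn't fix the type-conflict errors above.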
It would be excellent to share a good mapping that dynamically maps the fields that are likely required and maps everything else as a string (or rather a text/keyword multi-field, since Elasticsearch 2.x is long gone).
I have managed to find this template, which looks promising, but I would love to know whether there are any other efforts floating around that come recommended by anybody here?
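A minimal sketch of the kind of template I mean, using the dynamic-templates mechanism (ES 5.x syntax assumed; the mapping type name, field limit, and date formats are my guesses, not taken from the template linked above). Every new string field becomes a text/keyword multi-field, and startTime is pinned to a date type:

```json
PUT _template/logstash-cloudtrail
{
  "template": "logstash-cloudtrail-*",
  "settings": {
    "index.mapping.total_fields.limit": 3000
  },
  "mappings": {
    "cloudtrail": {
      "dynamic_templates": [
        {
          "start_time_as_date": {
            "match": "startTime",
            "mapping": {
              "type": "date",
              "format": "strict_date_optional_time||epoch_millis"
            }
          }
        },
        {
          "strings_as_multifield": {
            "match_mapping_type": "string",
            "mapping": {
              "type": "text",
              "fields": {
                "keyword": {
                  "type": "keyword",
                  "ignore_above": 256
                }
              }
            }
          }
        }
      ]
    }
  }
}
```

This wouldn't resolve the START_OBJECT case on its own (a field that arrives sometimes as a string and sometimes as an object still conflicts), but it would cover the plain string fields without per-field logic.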
- Version: 3.0.0
- Operating System: Ubuntu server
- Config File (if you have sensitive info, please remove it):
input {
  s3 {
    bucket => "xxxxxxxxxxxxxxx"
    type => "cloudtrail"
    tags => ["cloudtrail"]
    region => "eu-west-1"
    prefix => "xxxxxxxxxx"
    interval => 60
    codec => "cloudtrail"
    backup_to_bucket => "xxxxxxxx"
    backup_add_prefix => "ingested/"
    delete => true
    exclude_pattern => ".*/CloudTrail\-Digest/.*"
  }
}
output {
  elasticsearch {
    flush_size => 1000
    hosts => ["xxxxxxxxxxx"]
    index => "logstash-cloudtrail-%{+YYYY.MM}"
    user => "xxxxxxxxx"
    password => "xxxxxxxxxx"
  }
}