I think you would need to re-index the logs-syslog-so index to an index with a different name, or delete it, so that the new data stream can be created. Otherwise, Elasticsearch will keep using the template for the old index, and Logstash will continue writing to it, since it matches the logs-syslog-so name.

This script may work for you, but please test it before using it in production.

https://github.com/weslambert/securityonion-elastic-misc/blob/2.x/so-elasticsearch-reindex

For an index of that size, the reindex may take a very long time.
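
If it helps to see what that boils down to, here is a minimal sketch of the two manual steps, using the Elasticsearch reindex and delete-index APIs. The host (localhost:9200), the lack of authentication, and the destination index name logs-syslog-so-old are assumptions for illustration only; a real Security Onion deployment will need the proper credentials/TLS options, which is one reason the script above is the safer route.

    # Sketch only: adjust host, credentials, and the placeholder destination
    # index name (logs-syslog-so-old) for your deployment before running anything.

    # 1. Copy the documents from the old index into a differently named index:
    curl -X POST "https://localhost:9200/_reindex" \
      -H 'Content-Type: application/json' \
      -d '{ "source": { "index": "logs-syslog-so" }, "dest": { "index": "logs-syslog-so-old" } }'

    # 2. Delete the original index so the logs-syslog-so data stream can be created:
    curl -X DELETE "https://localhost:9200/logs-syslog-so"
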
