New macros:
- `indexes_extraction(1)` - to extract indexes from search logs
Updated reports/alerts:
- `AllSplunkEnterpriseLevel - Splunkd Crash Logs Have Appeared in Production` - updated based on email feedback to use sourcetype (as source matching needed wildcards)
- `IndexerLevel - Slow peer from remote searches` - corrected comment in search only
- `SearchHeadLevel - Search Queries summary exact match`
- `SearchHeadLevel - Search Queries summary non-exact match`
- `SearchHeadLevel - SmartStore cache misses - dashboards`
- `SearchHeadLevel - SmartStore cache misses - savedsearches`
- `SearchHeadLevel - SmartStore cache misses - combined`
- `SearchHeadLevel - Datamodel REST endpoint indexes in use`
- `SearchHeadLevel - indexes per savedsearch`
- `SearchHeadLevel - Indexes for savedsearch without subsearches`
- `SearchHeadLevel - indexes per dashboard`
Updated reports/alerts:
- `AllSplunkEnterpriseLevel - Splunk Scheduler excessive delays in executing search`
- `AllSplunkEnterpriseLevel - sendmodalert errors`
- `SearchHeadLevel - Alerts that have not fired an action in X days`
- `SearchHeadLevel - Scheduled Search Efficiency`
These were updated to extract `savedsearch_name` from the `savedsearch_id` field instead, as I found you can have savedsearches with double quotes in the title; a short illustration follows.
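As a rough sketch of why this works (the log line below is made up; the rex is the one used in the updated searches), the extraction keys off the `user;app;name` structure of `savedsearch_id` rather than a quoted `savedsearch_name` field:

```
| makeresults
| eval _raw="savedsearch_id=\"nobody;search;Alert with \"quotes\" in the title\", status=success"
| rex "savedsearch_id=\"[^;]+;[^;]+;(?P<savedsearch_name>.*?)\","
| table savedsearch_name
```

Because the first two `;`-separated parts (user and app) are matched explicitly, the title itself can contain double quotes without breaking the capture.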
@@ -359,6 +359,31 @@ These appear to be from premium apps but it does imply that there is a mechanism
 Feel free to open an issue on github or use the contact author on the SplunkBase link and I will try to get back to you when possible, thanks!

 ## Release Notes
+
+### 4.0.6
+
+New macros:
+
+- `indexes_extraction(1)` - to extract indexes from search logs
+
+Updated reports/alerts:
+
+- `AllSplunkEnterpriseLevel - Splunkd Crash Logs Have Appeared in Production` - updated based on email feedback to use sourcetype (as source matching needed wildcards)
+- `IndexerLevel - Slow peer from remote searches` - corrected comment in search only
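The definition of `indexes_extraction(1)` is not visible in this diff. A plausible macros.conf sketch, assembled from the rex pair it replaces in the hunks below (treat this as illustrative only, not the app's actual code):

```
# macros.conf - hypothetical sketch built from the deleted rex lines below
[indexes_extraction(1)]
args = field
definition = rex field=$field$ "(?s)(NOT\s+index(\s*=\s*|::)[^ ]+)|(NOT\s+\([^\)]+\))|(index(\s*=\s*|::)\"?(?P<indexregex>[\*A-Za-z0-9-_]+))" max_match=50 | rex field=$field$ "(?s)(NOT\s+index\s+[iI][nN]\s*\([^\)]+)|(index\s+[iI][nN]\s*\((?P<indexin>([^\)\"]+)|\"[^\)\"]+\"))" max_match=50
```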
search = ```Attempt to find alerts that are scheduled but not firing any actions, the alerts may need further review or may no longer be required. The app regex is in here because of some creative alert naming, X:app=Y is a real alert name in my environment!``` \
| rex "savedsearch_id=\"[^;]+;[^;]+;(?P<savedsearch_name>.*?)\"," \

| stats avg(run_time) as average_runtime_in_sec count(savedsearch_name) as num_times_per_week sum(run_time) as total_runtime_sec by savedsearch_name user app host \

```We now deal with cases where search earliest/latest times were not specified, assume all time is about 1 year in the past and latest time was the search run time``` \
| stats values(timestamp) AS _time, values(total_run_time) AS total_run_time, values(event_count) AS event_count, values(scan_count) AS scan_count, values(search_et) AS search_et, values(search_lt) AS search_lt, values(savedsearch_name) AS savedsearch_name, values(multi) AS multi, max(duration_index) AS duration_index, max(duration_rawdata) AS duration_rawdata, max(cache_index_hits) AS cache_index_hits, max(cache_index_miss) AS cache_index_miss, max(cache_index_hit_duration) AS cache_index_hit_duration, max(cache_index_miss_duration) AS cache_index_miss_duration, max(cache_rawdata_hits) AS cache_rawdata_hits, max(cache_rawdata_miss) AS cache_rawdata_miss, max(cache_rawdata_hit_duration) AS cache_rawdata_hit_duration, max(cache_rawdata_miss_duration) AS cache_rawdata_miss_duration, values(provenance) AS provenance by user, type, indexes, search_head_cluster, search_id, app_name \
| eval period=search_lt-search_et \
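The exact fallback is outside this excerpt; a minimal sketch of what the comment describes, using the `search_et`/`search_lt` fields from the stats above (a missing earliest time becomes roughly one year back, a missing latest time becomes the run time):

```
| eval search_lt=coalesce(search_lt, _time)
| eval search_et=coalesce(search_et, search_lt - (365*86400))
| eval period=search_lt-search_et
```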
@@ -4815,10 +4812,7 @@ search = | multisearch \
 | rex field=search "(?P<esstylewildcard>\(\s*index=\*\s+OR\s+index=_\*\s*\))" \
 | rex mode=sed field=search "s/search index=\s*\S+\s+index\s*=/search index=/g" \
 ```Extract out index= or index IN (a,b,c) but avoid NOT index in (...) and NOT index=... and also NOT (...anything) statements``` \
-| rex field=search "(?s)(NOT\s+index(\s*=\s*|::)[^ ]+)|(NOT\s+\([^\)]+\))|(index(\s*=\s*|::)\"?(?P<indexregex>[\*A-Za-z0-9-_]+))" max_match=50 \
-| rex field=search "(?s)(NOT\s+index\s+[iI][nN]\s*\([^\)]+)|(index\s+[iI][nN]\s*\((?P<indexin>([^\)\"]+)|\"[^\)\"]+\"))" max_match=50 \
search = ```This warning when occurring repetitively tends to indicate some kind of issue that will require the file to be manually removed. For example a zero sized metadata file that cannot be reaped by the dispatch reaper``` \

+search = ```This alert attempts to find peers that are relatively slow compared to other peers. I've used a hardcoded time rather than a variable as it appeared to work well.``` \
 index=_internal `indexerhosts` source=*remote_searches.log terminated: OR closed: \
 | regex search!="^(pretypeahead|copybuckets)" \
 | rex "(?s) elapsedTime=(?P<elapsedTime>[0-9\.]+),( cpuTime=\S+,)? search='(?P<search>.*?)(', savedsearch_name|\", drop_count=\d+)" \
@@ -7021,13 +7015,7 @@ invocations_command_search_index_bucketcache_miss>0 OR invocations_command_searc
 | stats latest(mostRecent) AS mostRecent, count as number_of_runs values(host) as host values(total_hours_searched) AS total_hours_searched values(total_days_searched) AS total_days_searched max(run_time) AS max_run_time avg(run_time) AS avg_run_time sum(run_time) AS sum_run_time sum(total_cache_miss) as total_cache_miss max(result_count) AS result_count max(event_count) AS event_count max(searched_buckets) AS searched_buckets values(info) AS info values(numofsearchesinquery) AS numofsearchesinquery, values(app) AS app by users search \
-| rex field=search "(?s)(NOT\s+index(\s*=\s*|::)[^ ]+)|(NOT\s+\([^\)]+\))|(index(\s*=\s*|::)\"?(?P<indexregex>[\*A-Za-z0-9-_]+))" max_match=50 \
-| rex field=search "(?s)(NOT\s+index\s+[iI][nN]\s*\([^\)]+)|(index\s+[iI][nN]\s*\((?P<indexin>([^\)\"]+)|\"[^\)\"]+\"))" max_match=50 \
 | stats latest(mostRecent) AS mostRecent, count as number_of_runs, values(host) as host values(total_hours_searched) AS total_hours_searched values(total_days_searched) AS total_days_searched max(run_time) AS max_run_time avg(run_time) AS avg_run_time sum(run_time) AS sum_run_time sum(total_cache_miss) as total_cache_miss max(result_count) AS result_count max(event_count) AS event_count max(searched_buckets) AS searched_buckets values(info) AS info values(numofsearchesinquery) AS numofsearchesinquery, values(provenance) AS provenance, values(app) AS app by users search \
-| rex field=search "(?s)(NOT\s+index(\s*=\s*|::)[^ ]+)|(NOT\s+\([^\)]+\))|(index(\s*=\s*|::)\"?(?P<indexregex>[\*A-Za-z0-9-_]+))" max_match=50 \
-| rex field=search "(?s)(NOT\s+index\s+[iI][nN]\s*\([^\)]+)|(index\s+[iI][nN]\s*\((?P<indexin>([^\)\"]+)|\"[^\)\"]+\"))" max_match=50 \
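The `+` lines are not visible in these truncated hunks, but given the new macro's name and single argument, each deleted rex pair is presumably replaced with a one-line call along these lines (field name assumed):

```
| `indexes_extraction(search)` \
```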