Commit cd5da0e
New macros:
- `indexes_extraction(1)` - to extract indexes from search logs

Updated reports/alerts:
- `AllSplunkEnterpriseLevel - Splunkd Crash Logs Have Appeared in Production` - updated based on email feedback to use sourcetype (as source matching needed wildcards)
- `IndexerLevel - Slow peer from remote searches` - corrected comment in search only
- `SearchHeadLevel - Search Queries summary exact match`
- `SearchHeadLevel - Search Queries summary non-exact match`
- `SearchHeadLevel - SmartStore cache misses - dashboards`
- `SearchHeadLevel - SmartStore cache misses - savedsearches`
- `SearchHeadLevel - SmartStore cache misses - combined`
- `SearchHeadLevel - Datamodel REST endpoint indexes in use`
- `SearchHeadLevel - indexes per savedsearch`
- `SearchHeadLevel - Indexes for savedsearch without subsearches`
- `SearchHeadLevel - indexes per dashboard`

Also updated reports/alerts:
- `AllSplunkEnterpriseLevel - Splunk Scheduler excessive delays in executing search`
- `AllSplunkEnterpriseLevel - sendmodalert errors`
- `SearchHeadLevel - Alerts that have not fired an action in X days`
- `SearchHeadLevel - Scheduled Search Efficiency`

These were updated to extract savedsearch_name (as I found you can have savedsearches with double quotes in the title).
1 parent bc32a2f commit cd5da0e
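The rex repeated throughout this commit relies on the scheduler's savedsearch_id field having the form user;app;search name, so the search name is whatever sits between the second semicolon and the closing quote-comma, even when the name itself contains double quotes. A minimal sanity check against a synthetic scheduler event (only the rex comes from this commit; the event text is invented):

| makeresults
| eval _raw="savedsearch_id=\"nobody;search;Alert with \"quotes\" in the title\", status=success"
| rex "savedsearch_id=\"[^;]+;[^;]+;(?P<savedsearch_name>.*?)\","
| table savedsearch_name

This should return savedsearch_name as: Alert with "quotes" in the title, which a naive quote-delimited extraction would truncate.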

File tree

4 files changed: +57 additions, -65 deletions


README.md

Lines changed: 25 additions & 0 deletions

@@ -359,6 +359,31 @@ These appear to be from premium apps but it does imply that there is a mecha
 Feel free to open an issue on github or use the contact author on the SplunkBase link and I will try to get back to you when possible, thanks!
 
 ## Release Notes
+### 4.0.6
+New macros:
+- `indexes_extraction(1)` - to extract indexes from search logs
+
+Updated reports/alerts:
+- `AllSplunkEnterpriseLevel - Splunkd Crash Logs Have Appeared in Production` - updated based on email feedback to use sourcetype (as source matching needed wildcards)
+- `IndexerLevel - Slow peer from remote searches` - corrected comment in search only
+- `SearchHeadLevel - Search Queries summary exact match`
+- `SearchHeadLevel - Search Queries summary non-exact match`
+- `SearchHeadLevel - SmartStore cache misses - dashboards`
+- `SearchHeadLevel - SmartStore cache misses - savedsearches`
+- `SearchHeadLevel - SmartStore cache misses - combined`
+- `SearchHeadLevel - Datamodel REST endpoint indexes in use`
+- `SearchHeadLevel - indexes per savedsearch`
+- `SearchHeadLevel - Indexes for savedsearch without subsearches`
+- `SearchHeadLevel - indexes per dashboard`
+
+Also updated reports/alerts:
+- `AllSplunkEnterpriseLevel - Splunk Scheduler excessive delays in executing search`
+- `AllSplunkEnterpriseLevel - sendmodalert errors`
+- `SearchHeadLevel - Alerts that have not fired an action in X days`
+- `SearchHeadLevel - Scheduled Search Efficiency`
+
+These were updated to extract savedsearch_name (as I found you can have savedsearches with double quotes in the title).
+
 ### 4.0.5
 New alerts:
 - `AllSplunkEnterpriseLevel - Splunk servers with resource starvation v2`

default/app.conf

Lines changed: 1 addition & 1 deletion

@@ -14,7 +14,7 @@ supported_themes = light,dark
 [launcher]
 author = Gareth Anderson
 description = Alerts and dashboards as described in the Splunk 2017 conf presentation How did you get so big?
-version = 4.0.5
+version = 4.0.6
 
 [package]
 id = SplunkAdmins

default/macros.conf

Lines changed: 16 additions & 0 deletions

@@ -925,3 +925,19 @@ iseval = 0
 [splunkadmins_events_per_second]
 definition = desc.savedsearch_name IN ("Example")
 iseval = 0
+
+[indexes_extraction(1)]
+args = search
+definition = rex field=$search$ "(?s)(NOT\s+index(\s*=\s*|::)[^ ]+)|(NOT\s+\([^\)]+\))|(index(\s*=\s*|::)\"?(?P<indexregex>[\*A-Za-z0-9-_]+))" max_match=50 \
+| rex field=$search$ "(?s)(NOT\s+index\s+[iI][nN]\s*\([^\)]+)|(index\s+[iI][nN]\s*\((?P<indexinall>[^\)]+))" max_match=50 \
+| rex field=indexinall "(?s)\s*(\"(?P<indexin>[^\", ]+))|(?P<indexin2>[^,\s\"]+)" max_match=50 \
+| makemv tokenizer="([^, ]+)" indexin \
+| eval indexes=mvappend(indexregex,indexin,indexin2) \
+| eval indexes=if(isnotnull(esstylewildcard),mvfilter(NOT match(indexes,"^_?\*$")),indexes) \
+| eval wildcard=mvfilter(match(indexes,"\*")) \
+| where isnull(wildcard) \
+| eval indexes=mvmap(indexes, replace(lower(indexes), "\"", "")) \
+| eval indexes=mvmap(indexes, trim(replace(indexes, "'", ""))) \
+| eval indexes=mvdedup(indexes) \
+| makemv indexes tokenizer="(\S+)"
+iseval = 0
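A quick usage sketch for the new macro (the search string below is invented; the esstylewildcard field the macro checks is only extracted by some of the calling searches and is simply absent here):

| makeresults
| eval search="index=WEB OR index=app NOT index=test index IN (\"sec\", audit)"
| `indexes_extraction(search)`
| table indexes

Assuming the definition above, this should yield indexes = web, app, sec, audit as a multivalue field (lowercased, quotes stripped, deduplicated), while the NOT index=test clause is skipped; any result whose extracted indexes still contain a wildcard is dropped by the macro's where clause.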

default/savedsearches.conf

Lines changed: 15 additions & 64 deletions
@@ -1604,6 +1604,7 @@ index=_internal `splunkenterprisehosts` sourcetype=scheduler app=* scheduled_tim
 | eval time=strftime(_time,"%+") \
 | eval delay_in_start = (dispatch_time - scheduled_time) \
 | where delay_in_start>100\
+| rex "savedsearch_id=\"[^;]+;[^;]+;(?P<savedsearch_name>.*?)\"," \
 | eval scheduled_time=strftime(scheduled_time,"%+") \
 | eval dispatch_time=strftime(dispatch_time,"%+") \
 | rename time AS endTime \
@@ -1970,6 +1971,7 @@ request.ui_dispatch_view = search
 search = ```Attempt to find alerts that are scheduled but not firing any actions, the alerts may need further review or may no longer be required. The app regex is in here because of some creative alert naming, X:app=Y is a real alert name in my environment!```\
 index=_internal source="*scheduler.log" sourcetype=scheduler `searchheadhosts` alert_actions!="" \
 | rex ", app=\"(?P<app>[^\"]+)\","\
+| rex "savedsearch_id=\"[^;]+;[^;]+;(?P<savedsearch_name>.*?)\"," \
 | stats count by savedsearch_name, app \
 | append \
 [| rest `splunkadmins_restmacro` /servicesNS/-/-/saved/searches \
@@ -2245,7 +2247,7 @@ OR "sendmodalert - Invoking modular alert action"\
 `splunkadmins_sendmodalert_errors`\
 | rex field=results_file "[/\\\]dispatch[/\\\](?P<sid>[^/]+)"\
 | eval sid=if(isnull(sid),"NOMATCH",sid)\
-| join sid type=outer [search index=_internal source="*scheduler.log" sourcetype=scheduler `splunkenterprisehosts` | table sid, savedsearch_name, app, user]\
+| join sid type=outer [search index=_internal source="*scheduler.log" sourcetype=scheduler `splunkenterprisehosts` | rex "savedsearch_id=\"[^;]+;[^;]+;(?P<savedsearch_name>.*?)\"," | table sid, savedsearch_name, app, user]\
 | cluster showcount=true\
 | table host, savedsearch_name, app, user, _raw, _time, cluster_count\
 | eval mostRecent = strftime(mostRecent, "%+")\
@@ -4067,6 +4069,7 @@ request.ui_dispatch_view = search
 search = ```This likely came from a Splunk conf presentation but I cannot remember which one so cannot attribute the original author!\
 Determine the length of time a scheduled search takes to run compared to how often it is configured to run, excluding acceleration jobs```\
 index=_internal `searchheadhosts` sourcetype=scheduler source=*scheduler.log (user=*) savedsearch_name!="_ACCELERATE_DM*"\
+| rex "savedsearch_id=\"[^;]+;[^;]+;(?P<savedsearch_name>.*?)\"," \
 | stats avg(run_time) as average_runtime_in_sec count(savedsearch_name) as num_times_per_week sum(run_time) as total_runtime_sec by savedsearch_name user app host\
 | eval ran_every_x_mins=round(60/(num_times_per_week/168))\
 | eval average_runtime_duration=tostring(round(average_runtime_in_sec/60,2), "duration")\
@@ -4712,16 +4715,10 @@ search = | multisearch \
 ```We now deal with cases where search earliest/latest times were not specified, assume all time is about 1 year in the past and latest time was the search run time``` \
 | eval search_lt=if(search_lt=="N/A",timestamp,search_lt), search_et=if(search_et=="N/A",now()-(365*24*60*60),search_et) \
 ```Extract out index= or index IN (a,b,c) but avoid NOT index in (...) and NOT index=... and also NOT (...anything) statements``` \
-| rex field=search "(?s)(NOT\s+index(\s*=\s*|::)[^ ]+)|(NOT\s+\([^\)]+\))|(index(\s*=\s*|::)\"?(?P<indexregex>[\*A-Za-z0-9-_]+))" max_match=50 \
-| rex field=search "(?s)(NOT\s+index\s+[iI][nN]\s*\([^\)]+)|(index\s+[iI][nN]\s*\((?P<indexin>([^\)\"]+)|\"[^\)\"]+\"))" max_match=50 \
-| makemv tokenizer="([^, ]+)" indexin \
-| eval indexes=mvappend(indexregex,indexin) \
+| `indexes_extraction(search)` \
 | eval indexes=if(isnotnull(esstylewildcard),mvfilter(NOT match(indexes,"^_?\*$")),indexes) \
 | eval wildcard=mvfilter(match(indexes,"\*")) \
 | where isnull(wildcard) \
-| eval indexes=mvmap(indexes, replace(lower(indexes), "\"", "")) \
-| eval indexes=mvmap(indexes, trim(replace(indexes, "'", ""))) \
-| eval indexes=mvdedup(indexes) \
 | eval multi=if(mvcount(indexes)>1,"true","false") \
 | stats values(timestamp) AS _time, values(total_run_time) AS total_run_time, values(event_count) AS event_count, values(scan_count) AS scan_count, values(search_et) AS search_et, values(search_lt) AS search_lt, values(savedsearch_name) AS savedsearch_name, values(multi) AS multi, max(duration_index) AS duration_index, max(duration_rawdata) AS duration_rawdata, max(cache_index_hits) AS cache_index_hits, max(cache_index_miss) AS cache_index_miss, max(cache_index_hit_duration) AS cache_index_hit_duration, max(cache_index_miss_duration) AS cache_index_miss_duration, max(cache_rawdata_hits) AS cache_rawdata_hits, max(cache_rawdata_miss) AS cache_rawdata_miss, max(cache_rawdata_hit_duration) AS cache_rawdata_hit_duration, max(cache_rawdata_miss_duration) AS cache_rawdata_miss_duration, values(provenance) AS provenance by user, type, indexes, search_head_cluster, search_id, app_name \
 | eval period=search_lt-search_et \
@@ -4815,10 +4812,7 @@ search = | multisearch \
 | rex field=search "(?P<esstylewildcard>\(\s*index=\*\s+OR\s+index=_\*\s*\))" \
 | rex mode=sed field=search "s/search index=\s*\S+\s+index\s*=/search index=/g" \
 ```Extract out index= or index IN (a,b,c) but avoid NOT index in (...) and NOT index=... and also NOT (...anything) statements``` \
-| rex field=search "(?s)(NOT\s+index(\s*=\s*|::)[^ ]+)|(NOT\s+\([^\)]+\))|(index(\s*=\s*|::)\"?(?P<indexregex>[\*A-Za-z0-9-_]+))" max_match=50 \
-| rex field=search "(?s)(NOT\s+index\s+[iI][nN]\s*\([^\)]+)|(index\s+[iI][nN]\s*\((?P<indexin>([^\)\"]+)|\"[^\)\"]+\"))" max_match=50 \
-| makemv tokenizer="([^, ]+)" indexin \
-| eval indexes=mvappend(indexregex,indexin) \
+| `indexes_extraction(search)` \
 | eval indexes=if(isnotnull(esstylewildcard),mvfilter(NOT match(indexes,"^_?\*$")),indexes) \
 | eval wildcard=mvfilter(match(indexes,"\*")) \
 | where isnotnull(wildcard) OR isnull(indexes) \
@@ -6204,7 +6198,7 @@ quantity = 0
 relation = greater than
 request.ui_dispatch_app = SplunkAdmins
 request.ui_dispatch_view = search
-search = ```This warning when occurring repetitively tends to indicate some kind of issue that will require the file to be manually removed. For example a zero sized metadata file that cannot be reaped by the dispatch reaper``` \
+search = ```This alert attempts to find peers that are relatively slow compared to other peers. I've used a hardcoded time rather than a variable as it appeared to work well.``` \
 index=_internal `indexerhosts` source=*remote_searches.log terminated: OR closed: \
 | regex search!="^(pretypeahead|copybuckets)" \
 | rex "(?s) elapsedTime=(?P<elapsedTime>[0-9\.]+),( cpuTime=\S+,)? search='(?P<search>.*?)(', savedsearch_name|\", drop_count=\d+)" \
@@ -7021,13 +7015,7 @@ invocations_command_search_index_bucketcache_miss>0 OR invocations_command_searc
 | search total_cache_miss>0 \
 | search provenance=*Dashboard* \
 | eval total_hours_searched=round(total_hours_searched,1) \
-| rex field=search "(?s)(NOT\s+index(\s*=\s*|::)[^ ]+)|(NOT\s+\([^\)]+\))|(index(\s*=\s*|::)\"?(?P<indexregex>[\*A-Za-z0-9-_]+))" max_match=50 \
-| rex field=search "(?s)(NOT\s+index\s+[iI][nN]\s*\([^\)]+)|(index\s+[iI][nN]\s*\((?P<indexin>([^\)\"]+)|\"[^\)\"]+\"))" max_match=50 \
-| makemv tokenizer="([^, ]+)" indexin \
-| eval indexes=mvappend(indexregex,indexin) \
-| eval indexes=mvmap(indexes, replace(lower(indexes), "\"", "")) \
-| eval indexes=mvmap(indexes, trim(replace(indexes, "'", ""))) \
-| eval indexes=mvdedup(indexes) \
+| `indexes_extraction(search)` \
 | eval has_pipe=if(match(search,"\|"),"true",null()) \
 | rex field=search "(?P<search>[^\|]+\|)" \
 | eval search = if(isnotnull(has_pipe),search . " ... (trimmed)",search)\
@@ -7076,13 +7064,7 @@ invocations_command_search_index_bucketcache_miss>0 OR invocations_command_searc
 | `base64decode(base64appname)` \
 | eval app3="N/A", app=coalesce(app,app2,base64appname,app3) \
 | stats latest(mostRecent) AS mostRecent, count as number_of_runs values(host) as host values(total_hours_searched) AS total_hours_searched values(total_days_searched) AS total_days_searched max(run_time) AS max_run_time avg(run_time) AS avg_run_time sum(run_time) AS sum_run_time sum(total_cache_miss) as total_cache_miss max(result_count) AS result_count max(event_count) AS event_count max(searched_buckets) AS searched_buckets values(info) AS info values(numofsearchesinquery) AS numofsearchesinquery, values(app) AS app by users search \
-| rex field=search "(?s)(NOT\s+index(\s*=\s*|::[^ ]+)|(NOT\s+\([^\)]+\))|(index(\s*=\s*|::)\"?(?P<indexregex>[\*A-Za-z0-9-_]+))" max_match=50 \
-| rex field=search "(?s)(NOT\s+index\s+[iI][nN]\s*\([^\)]+)|(index\s+[iI][nN]\s*\((?P<indexin>([^\)\"]+)|\"[^\)\"]+\"))" max_match=50 \
-| makemv tokenizer="([^, ]+)" indexin \
-| eval indexes=mvappend(indexregex,indexin) \
-| eval indexes=mvmap(indexes, replace(lower(indexes), "\"", "")) \
-| eval indexes=mvmap(indexes, trim(replace(indexes, "'", ""))) \
-| eval indexes=mvdedup(indexes) \
+| `indexes_extraction(search)` \
 | rex max_match=100 field=search "tag=(?<tags>[^\s+\||\)]+)" \
 | rex max_match=100 field=search "eventtype=(?<eventtypes>[^\s+\||\)]+)" \
 | rex max_match=100 field=search "(?<macros>\`[^\s]+\`)" \
@@ -7131,13 +7113,7 @@ invocations_command_search_index_bucketcache_miss>0 OR invocations_command_searc
 | search total_cache_miss>0 \
 | eval total_hours_searched=round(total_hours_searched,1) \
 | stats latest(mostRecent) AS mostRecent, count as number_of_runs, values(host) as host values(total_hours_searched) AS total_hours_searched values(total_days_searched) AS total_days_searched max(run_time) AS max_run_time avg(run_time) AS avg_run_time sum(run_time) AS sum_run_time sum(total_cache_miss) as total_cache_miss max(result_count) AS result_count max(event_count) AS event_count max(searched_buckets) AS searched_buckets values(info) AS info values(numofsearchesinquery) AS numofsearchesinquery, values(provenance) AS provenance, values(app) AS app by users search \
-| rex field=search "(?s)(NOT\s+index(\s*=\s*|::)[^ ]+)|(NOT\s+\([^\)]+\))|(index(\s*=\s*|::)\"?(?P<indexregex>[\*A-Za-z0-9-_]+))" max_match=50 \
-| rex field=search "(?s)(NOT\s+index\s+[iI][nN]\s*\([^\)]+)|(index\s+[iI][nN]\s*\((?P<indexin>([^\)\"]+)|\"[^\)\"]+\"))" max_match=50 \
-| makemv tokenizer="([^, ]+)" indexin \
-| eval indexes=mvappend(indexregex,indexin) \
-| eval indexes=mvmap(indexes, replace(lower(indexes), "\"", "")) \
-| eval indexes=mvmap(indexes, trim(replace(indexes, "'", ""))) \
-| eval indexes=mvdedup(indexes) \
+| `indexes_extraction(search)` \
 | eval has_pipe=if(match(search,"\|"),"true",null())\
 | rex max_match=100 field=search "tag=(?<tags>[^\s+\||\)]+)" \
 | rex max_match=100 field=search "eventtype=(?<eventtypes>[^\s+\||\)]+)" \
@@ -7363,7 +7339,7 @@ quantity = 0
 relation = greater than
 request.ui_dispatch_app = SplunkAdmins
 request.ui_dispatch_view = search
-search = index=_internal source=*crash.log \
+search = index=_internal sourcetype=splunkd_crash_log \
 | stats count by source, host, sourcetype \
 | eval indexer_cluster=`indexer_cluster_name(host)` \
 | eval search_head=host \
@@ -8228,14 +8204,7 @@ search = | rest `splunkadmins_restmacro` timeout=900 /servicesNS/-/-/data/models
 | `splunkadmins_macro_sub('eai:data')` \
 | regex eai:data="index\s*(=|[iI][nN])" \
 | rex field=eai:data "(?P<esstylewildcard>\(\s*index=\*\s+OR\s+index=_\*\s*\))" \
-| rex field=eai:data "(?sm)(NOT\s+index\s*(=|::)\s*[^ ]+)|(NOT\s+\([^\)]+\))|(index\s*(=|::)\s*(\\\)?\"?(?P<indexregex>[\*A-Za-z0-9-_]+))" max_match=50 \
-| rex field=eai:data "(?sm)(NOT\s+index\s+[iI][nN]\s*\([^\)]+)|(index\s+[iI][nN]\s*\((?P<indexin>([^\)\"]+)|\"[^\)\"]+\"))" max_match=50 \
-| makemv tokenizer="([^, ]+)" indexin \
-| eval indexes=mvappend(indexregex,indexin) \
-| eval indexes=if(isnotnull(esstylewildcard),mvfilter(NOT match(indexes,"^_?\*$")),indexes) \
-| eval indexes=mvmap(indexes, replace(lower(indexes), "\"", "")) \
-| eval indexes=mvmap(indexes, trim(replace(indexes, "'", ""))) \
-| eval indexes=mvdedup(indexes) \
+| `indexes_extraction(eai:data)` \
 | table title, indexes, eai:data, eai:acl.app
 
 [SearchHeadLevel - Job performance data per indexer]
@@ -8475,13 +8444,7 @@ search = | rest /servicesNS/-/-/saved/searches f=next_scheduled_time f=search f=
 | nomv prepipe_subsearch \
 | fillnull prepipe_subsearch value=" " \
 | eval prepipe = prepipe . " " . prepipe_subsearch \
-| rex field=prepipe "(?s)(NOT\s+index(\s*=\s*|::)[^ ]+)|(NOT\s+\([^\)]+\))|(index(\s*=\s*|::)\"?(?P<indexregex>[\*A-Za-z0-9-_]+))" max_match=50 \
-| rex field=prepipe "(?s)(NOT\s+index\s+[iI][nN]\s*\([^\)]+)|(index\s+[iI][nN]\s*\((?P<indexin>([^\)\"]+)|\"[^\)\"]+\"))" max_match=50 \
-| makemv tokenizer="([^, ]+)" indexin \
-| eval indexes=mvappend(indexregex,indexin) \
-| eval indexes=mvmap(indexes, replace(lower(indexes), "\"", "")) \
-| eval indexes=mvmap(indexes, trim(replace(indexes, "'", ""))) \
-| eval indexes=mvdedup(indexes) \
+| `indexes_extraction(prepipe)` \
 | eval count=mvcount(indexes) \
 | rename eai:acl.app AS app, eai:acl.owner AS owner, eai:acl.sharing AS sharing \
 | table title, app, indexes, count, owner, sharing, updated
@@ -8574,16 +8537,10 @@ search = index=_audit savedsearch_name="$savedsearch_name$" host IN ($host$) \
 | regex search="^\s*(\|?)\s*(search|tstats|mstats|mcatalog|multisearch|union|set|summarize|datamodel|from\s*:?\s*datamodel|datamodelsimple)\s+" \
 | regex search!="(\||^)\s*(append|union|multisearch|set|appendcols|appendpipe|join|map)" \
 | rex field=search "(?s)^(?P<prepipe>\s*\|?([^\|]+))" \
-| rex field=prepipe "(?s)(NOT\s+index(\s*=\s*|::)[^ ]+)|(NOT\s+\([^\)]+\))|(index(\s*=\s*|::)\"?(?P<indexregex>[\*A-Za-z0-9-_]+))" max_match=50 \
-| rex field=prepipe "(?s)(NOT\s+index\s+[iI][nN]\s*\([^\)]+)|(index\s+[iI][nN]\s*\((?P<indexin>([^\)\"]+)|\"[^\)\"]+\"))" max_match=50 \
-| makemv tokenizer="([^, ]+)" indexin \
-| eval indexes=mvappend(indexregex,indexin) \
+| `indexes_extraction(prepipe)` \
 | eval indexes=if(isnotnull(esstylewildcard),mvfilter(NOT match(indexes,"^_?\*$")),indexes) \
 | eval wildcard=mvfilter(match(indexes,"\*")) \
 | where isnull(wildcard) \
-| eval indexes=mvmap(indexes, replace(lower(indexes), "\"", "")) \
-| eval indexes=mvmap(indexes, trim(replace(indexes, "'", ""))) \
-| eval indexes=mvdedup(indexes) \
 | eval count=mvcount(indexes) \
 ```| where count==1 \
 | search indexes!=_* ```\
@@ -8881,13 +8838,7 @@ search = | rest `splunkadmins_restmacro` /servicesNS/-/-/data/ui/views f=eai:dat
 | nomv prepipe_subsearch \
 | fillnull prepipe_subsearch value=" " \
 | eval prepipe = prepipe . " " . prepipe_subsearch \
-| rex field=prepipe "(?s)(NOT\s+index(\s*=\s*|::)[^ ]+)|(NOT\s+\([^\)]+\))|(index(\s*=\s*|::)\"?(?P<indexregex>[\*A-Za-z0-9-_]+))" max_match=50 \
-| rex field=prepipe "(?s)(NOT\s+index\s+[iI][nN]\s*\([^\)]+)|(index\s+[iI][nN]\s*\((?P<indexin>([^\)\"]+)|\"[^\)\"]+\"))" max_match=50 \
-| makemv tokenizer="([^, ]+)" indexin \
-| eval indexes=mvappend(indexregex,indexin) \
-| eval indexes=mvmap(indexes, replace(lower(indexes), "\"", "")) \
-| eval indexes=mvmap(indexes, trim(replace(indexes, "'", ""))) \
-| eval indexes=mvdedup(indexes) \
+| `indexes_extraction(prepipe)` \
 | stats values(indexes) AS indexes by title
 
 [AllSplunkEnterpriseLevel - Splunk servers with resource starvation v2]
