Set default_metric for time series on the index level #133464
Conversation
Alternative approach to #132626
I had a couple of questions/nits, but overall LGTM!
@@ -783,13 +785,19 @@ public static AggregateMetricDoubleFieldSupportedMetrics getSupportedMetrics(
                (String) fieldProperties.get(AggregateMetricDoubleFieldMapper.Names.DEFAULT_METRIC),
                defaultMetric
            );
        } else {
            defaultMetric = indexLevelDefaultMetric.name().toLowerCase(Locale.ROOT);
If I'm reading this correctly, `index.time_series.default_metric` will only apply to non-aggregate number fields that are being downsampled into aggregate metrics, and existing aggregate metrics with no specified default_metric will still fall back to "max".
Would it make sense to instead have `index.time_series.default_metric` specify the default metric in both cases?
I think this does make sense. Maybe the `defaultMetric` variable should just be initialized from `indexLevelDefaultMetric` (on line 777 in the main branch)? Given that the default of the new index setting is also max.
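A minimal sketch of the suggested change, reusing the identifiers from the quoted hunk (the exact control flow is an assumption, not the PR's code):

```java
// Sketch only: start from the index-level default instead of a hard-coded "max";
// an explicit per-field default_metric would still take precedence.
String defaultMetric = indexLevelDefaultMetric.name().toLowerCase(Locale.ROOT);
Object explicit = fieldProperties.get(AggregateMetricDoubleFieldMapper.Names.DEFAULT_METRIC);
if (explicit != null) {
    defaultMetric = (String) explicit;
}
```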
I'm not entirely sure I understand the situation. My assumption was that if you don't specify the default metric for an aggregate metric double field, it's automatically assigned one (max, if max is present), which would mean all existing aggregate metric doubles already have a default metric - is that not the case?
I want to avoid a situation where someone already has an aggregate metric double, this setting gets introduced, and it overwrites their field's existing default metric.
I think you're right: in most cases there will already be a default metric specified on the aggregate metric, since aggregate metrics are usually created by downsampling, which adds a default metric using this logic.
But it is technically possible to manually create an aggregate metric field that has no default metric specified. In this case, the logic here is hard-coded to use "max" as the default metric. It might make more sense to instead use the new index-level `index.time_series.default_metric` here.
public enum AggregateMetricDoubleDefaultMetric {
    MIN,
    MAX,
    SUM,
    VALUE_COUNT;
}
nit: It feels a bit smelly to have this enum duplicate the other enum, `AggregateMetricDoubleFieldMapper.Metric`. It makes sense since that second enum only exists in the `mapper-aggregate-metric` plug-in and so can't be referenced here, but I wonder if you could instead create one enum somewhere and have both classes refer to that one enum?
Yeah, we have a few of these enums... there's similarly one on the ES|QL side too. I would love to have just one enum, but I'm not sure where it would live.
Thanks Larisa, this direction looks good to me. My main comment is about trying to move the new setting to the downsample plugin module.
    VALUE_COUNT;
}

public static final Setting<AggregateMetricDoubleDefaultMetric> TIME_SERIES_DEFAULT_METRIC = Setting.enumSetting(
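The quoted hunk ends mid-declaration. A plausible completion might look like the sketch below; the setting key and the max default are taken from this thread, while the property flags are assumptions:

```java
// Sketch of a possible completion; IndexScope/Final are assumptions, while the
// "index.time_series.default_metric" key and MAX default come from the thread above.
public static final Setting<AggregateMetricDoubleDefaultMetric> TIME_SERIES_DEFAULT_METRIC = Setting.enumSetting(
    AggregateMetricDoubleDefaultMetric.class,
    "index.time_series.default_metric",
    AggregateMetricDoubleDefaultMetric.MAX,
    Setting.Property.IndexScope,
    Setting.Property.Final
);
```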
Can this setting be moved to the xpack downsample module? We can just parse it in `TimeseriesFieldTypeHelper` and keep it around as a field there. (The setting can then be registered in the `Downsample` plugin class by overriding the `getSettings()` method.)
If that works, then there is no need to define the enum that Jordan commented on, given that I think we can use `AggregateMetricDoubleFieldMapper.Metric` directly in the downsample module (it seems to have a compile dependency on the `mapper-aggregate-metric` module).
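A minimal sketch of the suggested registration, assuming the `TIME_SERIES_DEFAULT_METRIC` constant is moved into (or made reachable from) the plugin class:

```java
// In the Downsample plugin class (sketch; the constant's final home is an assumption):
@Override
public List<Setting<?>> getSettings() {
    return List.of(TIME_SERIES_DEFAULT_METRIC);
}
```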
I will try it out!
@@ -30,6 +30,10 @@ public class DataStreamFeatures implements FeatureSpecification {

    public static final NodeFeature FAILURE_STORE_IN_LOG_DATA_STREAMS = new NodeFeature("logs_data_streams.failure_store.enabled");

    public static final NodeFeature DOWNSAMPLE_INDEX_LEVEL_DEFAULT_METRIC = new NodeFeature(
Given that this is a downsample feature, I think we should add it to the downsample module. That module doesn't seem to have a class extending `FeatureSpecification` yet, but I think we can add one.
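A minimal sketch of what that could look like (the class name `DownsampleFeatures` and the constant's final home are assumptions):

```java
// Hypothetical FeatureSpecification for the downsample module.
public class DownsampleFeatures implements FeatureSpecification {
    @Override
    public Set<NodeFeature> getFeatures() {
        return Set.of(DOWNSAMPLE_INDEX_LEVEL_DEFAULT_METRIC);
    }
}
```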
This PR adds an index-level setting that lets you specify which metric (min, max, sum, value_count) is used as the default metric when downsampling time series indices that contain gauge fields.
When the setting is not present, it defaults to max.
If an aggregate_metric_double field has a default_metric different from the index-level setting, the field's own setting takes precedence.
Example settings/mappings:
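(The original example was not preserved in this capture; the following is an illustrative sketch. The index name, field names, and the choice of `min` are assumptions.)

```json
PUT /my-ts-index
{
  "settings": {
    "index.mode": "time_series",
    "index.routing_path": ["host"],
    "index.time_series.default_metric": "min"
  },
  "mappings": {
    "properties": {
      "host": { "type": "keyword", "time_series_dimension": true },
      "cpu_usage": { "type": "double", "time_series_metric": "gauge" }
    }
  }
}
```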
becomes this after downsampling:
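(Again an illustrative sketch rather than the PR's actual output: the gauge field becomes an aggregate_metric_double whose default_metric follows the index-level setting.)

```json
"cpu_usage": {
  "type": "aggregate_metric_double",
  "metrics": ["min", "max", "sum", "value_count"],
  "default_metric": "min"
}
```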