
Commit 60f20e5

Authored by LucaCanali, committed by dongjoon-hyun
[SPARK-30060][CORE] Rename metrics enable/disable configs
### What changes were proposed in this pull request?

This proposes to introduce a naming convention for the Spark metrics configuration parameters used to enable/disable metrics source reporting via the Dropwizard metrics library: `spark.metrics.sourceNameCamelCase.enabled`. Two parameters are updated to follow this convention.

### Why are the changes needed?

Spark currently has a few parameters for enabling/disabling metrics reporting, and their naming pattern is not uniform, which can create confusion. Currently we have:
- `spark.metrics.static.sources.enabled`
- `spark.app.status.metrics.enabled`
- `spark.sql.streaming.metricsEnabled`

### Does this PR introduce any user-facing change?

Yes. Two parameters for enabling/disabling metrics reporting, both new in Spark 3.0, are renamed:
- `spark.metrics.static.sources.enabled` -> `spark.metrics.staticSources.enabled`
- `spark.app.status.metrics.enabled` -> `spark.metrics.appStatusSource.enabled`

Note: `spark.sql.streaming.metricsEnabled` is left unchanged, as it is already in use in Spark 2.x.

### How was this patch tested?

Manually tested.

Closes apache#26692 from LucaCanali/uniformNamingMetricsEnableParameters.

Authored-by: Luca Canali <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
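With this change, a user who wants to control these sources would set the renamed keys, for example in `spark-defaults.conf`. This is an illustrative sketch; the defaults shown match the ones stated in this PR's `monitoring.md` changes:

```properties
# Register static sources (e.g. HiveExternalCatalog, CodeGenerator) with the
# metrics system. Renamed from spark.metrics.static.sources.enabled; default: true.
spark.metrics.staticSources.enabled    true

# Report Dropwizard/Codahale metrics for the status of the running app
# (Spark 3.0+). Renamed from spark.app.status.metrics.enabled; default: false.
spark.metrics.appStatusSource.enabled  true
```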
Parent: 196ea93 · Commit: 60f20e5

File tree: 4 files changed, +11 −11 lines


`core/src/main/scala/org/apache/spark/internal/config/Status.scala` (2 additions, 2 deletions)

```diff
@@ -55,8 +55,8 @@ private[spark] object Status {
     .intConf
     .createWithDefault(Int.MaxValue)

-  val APP_STATUS_METRICS_ENABLED =
-    ConfigBuilder("spark.app.status.metrics.enabled")
+  val METRICS_APP_STATUS_SOURCE_ENABLED =
+    ConfigBuilder("spark.metrics.appStatusSource.enabled")
       .doc("Whether Dropwizard/Codahale metrics " +
         "will be reported for the status of the running spark app.")
       .booleanConf
```

`core/src/main/scala/org/apache/spark/internal/config/package.scala` (1 addition, 1 deletion)

```diff
@@ -638,7 +638,7 @@ package object config {
       .createOptional

   private[spark] val METRICS_STATIC_SOURCES_ENABLED =
-    ConfigBuilder("spark.metrics.static.sources.enabled")
+    ConfigBuilder("spark.metrics.staticSources.enabled")
       .doc("Whether to register static sources with the metrics system.")
       .booleanConf
       .createWithDefault(true)
```

`core/src/main/scala/org/apache/spark/status/AppStatusSource.scala` (2 additions, 2 deletions)

```diff
@@ -22,7 +22,7 @@ import AppStatusSource.getCounter
 import com.codahale.metrics.{Counter, Gauge, MetricRegistry}

 import org.apache.spark.SparkConf
-import org.apache.spark.internal.config.Status.APP_STATUS_METRICS_ENABLED
+import org.apache.spark.internal.config.Status.METRICS_APP_STATUS_SOURCE_ENABLED
 import org.apache.spark.metrics.source.Source

 private [spark] class JobDuration(val value: AtomicLong) extends Gauge[Long] {
@@ -71,7 +71,7 @@ private[spark] object AppStatusSource {
   }

   def createSource(conf: SparkConf): Option[AppStatusSource] = {
-    Option(conf.get(APP_STATUS_METRICS_ENABLED))
+    Option(conf.get(METRICS_APP_STATUS_SOURCE_ENABLED))
       .filter(identity)
       .map { _ => new AppStatusSource() }
   }
```
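For context, `createSource` gates source creation on the renamed flag with a common Scala idiom: wrap the boolean in an `Option`, keep it only when it is `true`, and map to the source instance. A minimal standalone sketch of the same pattern (the stub type and `main` harness here are illustrative, not Spark's API):

```scala
// Standalone sketch of the gating idiom used in AppStatusSource.createSource:
// Option(flag).filter(identity) is Some(true) only when the flag is true,
// so .map (which constructs the source) runs only when metrics are enabled.
object GatedSourceSketch {
  final case class AppStatusSourceStub(name: String) // stand-in for AppStatusSource

  def createSource(enabled: Boolean): Option[AppStatusSourceStub] =
    Option(enabled)
      .filter(identity)                        // drops Some(false)
      .map(_ => AppStatusSourceStub("appStatus"))

  def main(args: Array[String]): Unit = {
    assert(createSource(enabled = true).isDefined)
    assert(createSource(enabled = false).isEmpty)
  }
}
```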

`docs/monitoring.md` (6 additions, 6 deletions)

```diff
@@ -924,7 +924,7 @@ This is the component with the largest amount of instrumented metrics

 - namespace=HiveExternalCatalog
   - **note:**: these metrics are conditional to a configuration parameter:
-  `spark.metrics.static.sources.enabled` (default is true)
+  `spark.metrics.staticSources.enabled` (default is true)
   - fileCacheHits.count
   - filesDiscovered.count
   - hiveClientCalls.count
@@ -933,7 +933,7 @@ This is the component with the largest amount of instrumented metrics

 - namespace=CodeGenerator
   - **note:**: these metrics are conditional to a configuration parameter:
-  `spark.metrics.static.sources.enabled` (default is true)
+  `spark.metrics.staticSources.enabled` (default is true)
   - compilationTime (histogram)
   - generatedClassSize (histogram)
   - generatedMethodSize (histogram)
@@ -962,8 +962,8 @@ This is the component with the largest amount of instrumented metrics
   - queue.executorManagement.listenerProcessingTime (timer)

 - namespace=appStatus (all metrics of type=counter)
-  - **note:** Introduced in Spark 3.0. Conditional to configuration parameter:
-  `spark.app.status.metrics.enabled=true` (default is false)
+  - **note:** Introduced in Spark 3.0. Conditional to a configuration parameter:
+  `spark.metrics.appStatusSource.enabled` (default is false)
   - stages.failedStages.count
   - stages.skippedStages.count
   - stages.completedStages.count
@@ -1057,7 +1057,7 @@ when running in local mode.

 - namespace=HiveExternalCatalog
   - **note:**: these metrics are conditional to a configuration parameter:
-  `spark.metrics.static.sources.enabled` (default is true)
+  `spark.metrics.staticSources.enabled` (default is true)
   - fileCacheHits.count
   - filesDiscovered.count
   - hiveClientCalls.count
@@ -1066,7 +1066,7 @@ when running in local mode.

 - namespace=CodeGenerator
   - **note:**: these metrics are conditional to a configuration parameter:
-  `spark.metrics.static.sources.enabled` (default is true)
+  `spark.metrics.staticSources.enabled` (default is true)
   - compilationTime (histogram)
   - generatedClassSize (histogram)
   - generatedMethodSize (histogram)
```
