
Commit aa76c01

jiangzho authored and dongjoon-hyun committed
[SPARK-51507] Support creating config map that can be mounted by Spark pods for apps
### What changes were proposed in this pull request?

This PR introduces a new field, `configMapSpecs`, which enables users to create ConfigMap(s) that can later be mounted on the driver and/or executors with values specified in SparkApplications. It also adds corresponding docs and unit tests.

### Why are the changes needed?

There is a common user demand for creating lightweight value / file overrides using a ConfigMap. This new feature saves users the trouble of creating ConfigMap(s) beforehand - the ConfigMap shares the lifecycle of the corresponding Spark app. Users may mount the ConfigMap as needed in the driver and executor pod templates - one example is provided in the doc.

In the future we may further enhance this flow by mounting the created ConfigMap automatically on Spark pods. This could be achieved by supporting the ConfigMap volume type in Spark on Kubernetes, or by adding a mutating webhook that alters pod templates after creation. For now, this PR opens up the possibility of creating ConfigMaps and mounting them with a single spec.

### Does this PR introduce _any_ user-facing change?

No - not yet released.

### How was this patch tested?

Pass the CIs.

### Was this patch authored or co-authored using generative AI tooling?

No.

Closes #166 from jiangzho/cm.

Authored-by: Zhou JIANG <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
1 parent 629d167 commit aa76c01
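
Not part of the commit itself: a minimal sketch of the Java object that a `configMapSpecs` entry maps onto, using the Lombok builder on the new `ConfigMapSpec` class added below. The class name, `main` method, and literal values are illustrative only.

```java
import java.util.Map;

import org.apache.spark.k8s.operator.spec.ConfigMapSpec;

public class ConfigMapSpecsSketch {
  public static void main(String[] args) {
    // Java-side equivalent of the docs example introduced by this commit:
    //   configMapSpecs:
    //     - name: "example-config-map"
    //       data:
    //         foo: "bar"
    ConfigMapSpec spec =
        ConfigMapSpec.builder().name("example-config-map").data(Map.of("foo", "bar")).build();
    System.out.println(spec.getName() + " -> " + spec.getData()); // example-config-map -> {foo=bar}
  }
}
```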

File tree

8 files changed (+234 -4 lines changed)


docs/spark_custom_resources.md

Lines changed: 56 additions & 0 deletions
@@ -138,6 +138,62 @@ (hunk context: "...default rule backed by the associated Service. It's recommended to always provide...")

Unchanged context:

to make sure it's compatible with your
[IngressController](https://kubernetes.io/docs/concepts/services-networking/ingress-controllers/).

Added:

## Create and Mount ConfigMap

It is possible to ask the operator to create ConfigMap(s) so they can be used by driver and/or
executor pods on the fly. `configMapSpecs` allows you to specify the desired metadata and data, as
string literals, for the ConfigMap(s) to be created.

```yaml
spec:
  configMapSpecs:
    - name: "example-config-map"
      data:
        foo: "bar"
```

Like other app-specific resources, the created ConfigMap carries an owner reference to the Spark
driver and therefore shares the same lifecycle and garbage-collection mechanism as the associated
app.

This feature can be used to create lightweight override config files for a given Spark app. For
example, the snippet below would create and mount a ConfigMap holding a metrics properties file,
then use it in SparkConf:

```yaml
spec:
  sparkConf:
    spark.metrics.conf: "/etc/metrics/metrics.properties"
  driverSpec:
    podTemplateSpec:
      spec:
        containers:
          - volumeMounts:
              - name: "config-override"
                mountPath: "/etc/metrics"
                readOnly: true
        volumes:
          - name: config-override
            configMap:
              name: metrics-configmap
  executorSpec:
    podTemplateSpec:
      spec:
        containers:
          - volumeMounts:
              - name: "config-override"
                mountPath: "/etc/metrics"
                readOnly: true
        volumes:
          - name: config-override
            configMap:
              name: metrics-configmap
  configMapSpecs:
    - name: "metrics-configmap"
      data:
        metrics.properties: "*.sink.jmx.class=org.apache.spark.metrics.sink.JmxSink\n"
```

Unchanged context:

## Understanding Failure Types

In addition to the general `Failed` state (that driver pod fails or driver container exits ...

spark-operator-api/src/main/java/org/apache/spark/k8s/operator/spec/ApplicationSpec.java

Lines changed: 1 addition & 0 deletions
```diff
@@ -55,4 +55,5 @@ public class ApplicationSpec extends BaseSpec {
   protected BaseApplicationTemplateSpec driverSpec;
   protected BaseApplicationTemplateSpec executorSpec;
   protected List<DriverServiceIngressSpec> driverServiceIngressList;
+  protected List<ConfigMapSpec> configMapSpecs;
 }
```
ConfigMapSpec.java (new file, package org.apache.spark.k8s.operator.spec)

Lines changed: 41 additions & 0 deletions

```java
/*
 * Licensed to the Apache Software Foundation (ASF) under one
 * or more contributor license agreements. See the NOTICE file
 * distributed with this work for additional information
 * regarding copyright ownership. The ASF licenses this file
 * to you under the Apache License, Version 2.0 (the
 * "License"); you may not use this file except in compliance
 * with the License. You may obtain a copy of the License at
 *
 *   http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing,
 * software distributed under the License is distributed on an
 * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
 * KIND, either express or implied. See the License for the
 * specific language governing permissions and limitations
 * under the License.
 */

package org.apache.spark.k8s.operator.spec;

import java.util.Map;

import com.fasterxml.jackson.annotation.JsonInclude;
import io.fabric8.generator.annotation.Required;
import lombok.AllArgsConstructor;
import lombok.Builder;
import lombok.Data;
import lombok.NoArgsConstructor;

@Data
@NoArgsConstructor
@AllArgsConstructor
@Builder
@JsonInclude(JsonInclude.Include.NON_NULL)
public class ConfigMapSpec {
  @Required protected String name;
  protected Map<String, String> labels;
  protected Map<String, String> annotations;
  protected Map<String, String> data;
}
```
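
A standalone sketch (not in this PR) of the `@JsonInclude(NON_NULL)` behavior on the new class: fields left unset, such as `labels` and `annotations`, are omitted when the spec is serialized. The class name and `main` method are illustrative, and it assumes jackson-databind plus the Lombok-generated accessors are available.

```java
import java.util.Map;

import com.fasterxml.jackson.databind.ObjectMapper;

import org.apache.spark.k8s.operator.spec.ConfigMapSpec;

public class ConfigMapSpecJsonSketch {
  public static void main(String[] args) throws Exception {
    ConfigMapSpec spec =
        ConfigMapSpec.builder().name("example-config-map").data(Map.of("foo", "bar")).build();
    // Null labels/annotations are dropped from the output because of @JsonInclude(NON_NULL),
    // e.g. {"name":"example-config-map","data":{"foo":"bar"}}.
    System.out.println(new ObjectMapper().writeValueAsString(spec));
  }
}
```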

spark-submission-worker/src/main/java/org/apache/spark/k8s/operator/SparkAppResourceSpec.java

Lines changed: 5 additions & 1 deletion
```diff
@@ -43,7 +43,9 @@
 import org.apache.spark.deploy.k8s.KubernetesDriverSpec;
 import org.apache.spark.deploy.k8s.SparkPod;
 import org.apache.spark.deploy.k8s.submit.KubernetesClientUtils;
+import org.apache.spark.k8s.operator.spec.ConfigMapSpec;
 import org.apache.spark.k8s.operator.spec.DriverServiceIngressSpec;
+import org.apache.spark.k8s.operator.utils.ConfigMapSpecUtils;
 import org.apache.spark.k8s.operator.utils.DriverServiceIngressUtils;
 
 /**
@@ -66,7 +68,8 @@ public class SparkAppResourceSpec {
   public SparkAppResourceSpec(
       SparkAppDriverConf kubernetesDriverConf,
       KubernetesDriverSpec kubernetesDriverSpec,
-      List<DriverServiceIngressSpec> driverServiceIngressList) {
+      List<DriverServiceIngressSpec> driverServiceIngressList,
+      List<ConfigMapSpec> configMapSpecs) {
     this.kubernetesDriverConf = kubernetesDriverConf;
     String namespace = kubernetesDriverConf.sparkConf().get(Config.KUBERNETES_NAMESPACE().key());
     Map<String, String> confFilesMap =
@@ -93,6 +96,7 @@ public SparkAppResourceSpec(
     this.driverResources.add(
         KubernetesClientUtils.buildConfigMap(
             kubernetesDriverConf.configMapNameDriver(), confFilesMap, new HashMap<>()));
+    this.driverPreResources.addAll(ConfigMapSpecUtils.buildConfigMaps(configMapSpecs));
     this.driverResources.addAll(configureDriverServerIngress(sparkPod, driverServiceIngressList));
     this.driverPreResources.forEach(r -> setNamespaceIfMissing(r, namespace));
     this.driverResources.forEach(r -> setNamespaceIfMissing(r, namespace));
```

spark-submission-worker/src/main/java/org/apache/spark/k8s/operator/SparkAppSubmissionWorker.java

Lines changed: 8 additions & 2 deletions
```diff
@@ -38,6 +38,7 @@
 import org.apache.spark.deploy.k8s.submit.PythonMainAppResource;
 import org.apache.spark.deploy.k8s.submit.RMainAppResource;
 import org.apache.spark.k8s.operator.spec.ApplicationSpec;
+import org.apache.spark.k8s.operator.spec.ConfigMapSpec;
 import org.apache.spark.k8s.operator.spec.DriverServiceIngressSpec;
 import org.apache.spark.k8s.operator.utils.ModelUtils;
 
@@ -84,7 +85,11 @@ public class SparkAppSubmissionWorker {
   public SparkAppResourceSpec getResourceSpec(
       SparkApplication app, KubernetesClient client, Map<String, String> confOverrides) {
     SparkAppDriverConf appDriverConf = buildDriverConf(app, confOverrides);
-    return buildResourceSpec(appDriverConf, app.getSpec().getDriverServiceIngressList(), client);
+    return buildResourceSpec(
+        appDriverConf,
+        app.getSpec().getDriverServiceIngressList(),
+        app.getSpec().getConfigMapSpecs(),
+        client);
   }
 
   protected SparkAppDriverConf buildDriverConf(
@@ -131,12 +136,13 @@ protected SparkAppDriverConf buildDriverConf(
   protected SparkAppResourceSpec buildResourceSpec(
       SparkAppDriverConf kubernetesDriverConf,
       List<DriverServiceIngressSpec> driverServiceIngressList,
+      List<ConfigMapSpec> configMapSpecs,
       KubernetesClient client) {
     KubernetesDriverBuilder builder = new KubernetesDriverBuilder();
     KubernetesDriverSpec kubernetesDriverSpec =
         builder.buildFromFeatures(kubernetesDriverConf, client);
     return new SparkAppResourceSpec(
-        kubernetesDriverConf, kubernetesDriverSpec, driverServiceIngressList);
+        kubernetesDriverConf, kubernetesDriverSpec, driverServiceIngressList, configMapSpecs);
   }
 
   /**
```
ConfigMapSpecUtils.java (new file, package org.apache.spark.k8s.operator.utils)

Lines changed: 51 additions & 0 deletions

```java
/*
 * Licensed to the Apache Software Foundation (ASF) under one
 * or more contributor license agreements. See the NOTICE file
 * distributed with this work for additional information
 * regarding copyright ownership. The ASF licenses this file
 * to you under the Apache License, Version 2.0 (the
 * "License"); you may not use this file except in compliance
 * with the License. You may obtain a copy of the License at
 *
 *   http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing,
 * software distributed under the License is distributed on an
 * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
 * KIND, either express or implied. See the License for the
 * specific language governing permissions and limitations
 * under the License.
 */

package org.apache.spark.k8s.operator.utils;

import java.util.Collections;
import java.util.List;
import java.util.stream.Collectors;

import io.fabric8.kubernetes.api.model.ConfigMap;
import io.fabric8.kubernetes.api.model.ConfigMapBuilder;

import org.apache.spark.k8s.operator.spec.ConfigMapSpec;

public final class ConfigMapSpecUtils {
  private ConfigMapSpecUtils() {}

  public static List<ConfigMap> buildConfigMaps(List<ConfigMapSpec> configMapMountList) {
    if (configMapMountList == null || configMapMountList.isEmpty()) {
      return Collections.emptyList();
    }
    return configMapMountList.stream()
        .map(
            c ->
                new ConfigMapBuilder()
                    .editOrNewMetadata()
                    .withName(c.getName())
                    .withLabels(c.getLabels())
                    .withAnnotations(c.getAnnotations())
                    .endMetadata()
                    .addToData(c.getData())
                    .build())
        .collect(Collectors.toList());
  }
}
```
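
A usage sketch for the helper above, not taken from the PR: it builds one ConfigMap with the optional labels and annotations populated. Note the resulting ConfigMap carries no namespace; per the `SparkAppResourceSpec` change earlier in this commit, the namespace is defaulted later via `setNamespaceIfMissing`. The class name, `main` method, and values are illustrative.

```java
import java.util.List;
import java.util.Map;

import io.fabric8.kubernetes.api.model.ConfigMap;

import org.apache.spark.k8s.operator.spec.ConfigMapSpec;
import org.apache.spark.k8s.operator.utils.ConfigMapSpecUtils;

public class ConfigMapSpecUtilsUsageSketch {
  public static void main(String[] args) {
    ConfigMapSpec spec =
        ConfigMapSpec.builder()
            .name("metrics-configmap")
            .labels(Map.of("illustrative-label", "spark-app"))           // optional metadata
            .annotations(Map.of("illustrative-annotation", "example"))   // optional metadata
            .data(Map.of(
                "metrics.properties",
                "*.sink.jmx.class=org.apache.spark.metrics.sink.JmxSink\n"))
            .build();

    List<ConfigMap> configMaps = ConfigMapSpecUtils.buildConfigMaps(List.of(spec));
    ConfigMap cm = configMaps.get(0);
    System.out.println(cm.getMetadata().getName());      // metrics-configmap
    System.out.println(cm.getMetadata().getNamespace()); // null here; defaulted by the worker
  }
}
```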

spark-submission-worker/src/test/java/org/apache/spark/k8s/operator/SparkAppResourceSpecTest.java

Lines changed: 2 additions & 1 deletion
```diff
@@ -70,7 +70,8 @@ void testDriverResourceIncludesConfigMap() {
     when(mockSpec.systemProperties()).thenReturn(new HashMap<>());
 
     SparkAppResourceSpec appResourceSpec =
-        new SparkAppResourceSpec(mockConf, mockSpec, Collections.emptyList());
+        new SparkAppResourceSpec(
+            mockConf, mockSpec, Collections.emptyList(), Collections.emptyList());
 
     Assertions.assertEquals(2, appResourceSpec.getDriverResources().size());
     Assertions.assertEquals(1, appResourceSpec.getDriverPreResources().size());
```
ConfigMapSpecUtilsTest.java (new file, package org.apache.spark.k8s.operator.utils)

Lines changed: 70 additions & 0 deletions

```java
/*
 * Licensed to the Apache Software Foundation (ASF) under one
 * or more contributor license agreements. See the NOTICE file
 * distributed with this work for additional information
 * regarding copyright ownership. The ASF licenses this file
 * to you under the Apache License, Version 2.0 (the
 * "License"); you may not use this file except in compliance
 * with the License. You may obtain a copy of the License at
 *
 *   http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing,
 * software distributed under the License is distributed on an
 * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
 * KIND, either express or implied. See the License for the
 * specific language governing permissions and limitations
 * under the License.
 */

package org.apache.spark.k8s.operator.utils;

import static org.junit.jupiter.api.Assertions.assertEquals;

import java.util.Collections;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

import io.fabric8.kubernetes.api.model.ConfigMap;
import org.junit.jupiter.api.Test;

import org.apache.spark.k8s.operator.spec.ConfigMapSpec;

class ConfigMapSpecUtilsTest {

  @Test
  void buildConfigMaps() {
    assertEquals(Collections.emptyList(), ConfigMapSpecUtils.buildConfigMaps(null));
    assertEquals(
        Collections.emptyList(), ConfigMapSpecUtils.buildConfigMaps(Collections.emptyList()));

    Map<String, String> labels = Map.of("foo-label-key", "foo-label-value");
    Map<String, String> annotations = Map.of("foo-annotation-key", "foo-annotation-value");
    Map<String, String> configMapData = new HashMap<>();
    // literal value
    configMapData.put("foo", "bar");
    // escaped property file
    configMapData.put(
        "conf.properties", "spark.foo.keyOne=\"valueOne\"\nspark.foo.keyTwo=\"valueTwo\"");
    // escaped yaml file
    configMapData.put("conf.yaml", "conf:\n foo: bar\n nestedField:\n keyName: value");
    ConfigMapSpec mount =
        ConfigMapSpec.builder()
            .name("test-config-map")
            .labels(labels)
            .annotations(annotations)
            .data(configMapData)
            .build();
    List<ConfigMap> configMaps =
        ConfigMapSpecUtils.buildConfigMaps(Collections.singletonList(mount));
    assertEquals(1, configMaps.size());
    ConfigMap configMap = configMaps.get(0);
    assertEquals("test-config-map", configMap.getMetadata().getName());
    assertEquals("foo-label-value", configMap.getMetadata().getLabels().get("foo-label-key"));
    assertEquals(
        "foo-annotation-value", configMap.getMetadata().getAnnotations().get("foo-annotation-key"));
    assertEquals(configMapData, configMap.getData());
  }
}
```
