
Commit 7f43847

adds support for the Dataproc on GDC SparkApplication resource (#12237) (#850)

[upstream:f954b7c9dab564bf88a97b24f94e9d795889faaf]
Signed-off-by: Modular Magician <[email protected]>

1 parent 71de6e8 · commit 7f43847

File tree: 24 files changed, +713 -0 lines changed
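The diffs below add worked examples of the new `google_dataproc_gdc_spark_application` resource; each example directory ships the resource config, shared scaffolding, a motd, and a Cloud Shell tutorial. For orientation, here is a minimal shape distilled from the basic example further down; all identifiers in this sketch are placeholders, not values from the commit:

```hcl
resource "google_dataproc_gdc_spark_application" "example" {
  spark_application_id = "my-spark-app"            # placeholder
  serviceinstance      = "my-gdc-service-instance" # placeholder
  project              = "my-project"              # placeholder
  location             = "us-west2"
  namespace            = "default"

  spark_application_config {
    main_class    = "org.apache.spark.examples.SparkPi"
    jar_file_uris = ["file:///usr/lib/spark/examples/jars/spark-examples.jar"]
    args          = ["10000"]
  }
}
```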
Lines changed: 15 additions & 0 deletions

```hcl
# This file has some scaffolding to make sure that names are unique and that
# a region and zone are selected when you try to create your Terraform resources.

locals {
  name_suffix = "${random_pet.suffix.id}"
}

resource "random_pet" "suffix" {
  length = 2
}

provider "google" {
  region = "us-central1"
  zone   = "us-central1-c"
}
```
Lines changed: 32 additions & 0 deletions

```hcl
resource "google_dataproc_gdc_application_environment" "app_env" {
  application_environment_id = "tf-e2e-spark-app-env-${local.name_suffix}"
  serviceinstance            = "do-not-delete-dataproc-gdc-instance"
  project                    = "my-project-${local.name_suffix}"
  location                   = "us-west2"
  namespace                  = "default"
}

resource "google_dataproc_gdc_spark_application" "spark-application" {
  spark_application_id = "tf-e2e-spark-app-${local.name_suffix}"
  serviceinstance      = "do-not-delete-dataproc-gdc-instance"
  project              = "my-project-${local.name_suffix}"
  location             = "us-west2"
  namespace            = "default"

  labels = {
    "test-label" : "label-value"
  }

  annotations = {
    "an_annotation" : "annotation_value"
  }

  properties = {
    "spark.executor.instances" : "2"
  }

  application_environment = google_dataproc_gdc_application_environment.app_env.name
  version                 = "1.2"

  spark_application_config {
    main_jar_file_uri = "file:///usr/lib/spark/examples/jars/spark-examples.jar"
    jar_file_uris     = ["file:///usr/lib/spark/examples/jars/spark-examples.jar"]
    archive_uris      = ["file://usr/lib/spark/examples/spark-examples.jar"]
    file_uris         = ["file:///usr/lib/spark/examples/jars/spark-examples.jar"]
  }
}
```
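Note how the Spark application binds to its environment through the `application_environment = google_dataproc_gdc_application_environment.app_env.name` reference; Terraform infers the creation order from that expression, so no explicit `depends_on` is needed. If you want the server-assigned identity of the application after `terraform apply`, an output block is a natural companion. A minimal sketch, assuming the resource exposes a `name` output attribute (as the application environment evidently does):

```hcl
output "spark_application_name" {
  description = "Server-assigned name of the Dataproc on GDC Spark application."
  value       = google_dataproc_gdc_spark_application.spark-application.name
}
```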

dataprocgdc_sparkapplication/motd

Lines changed: 7 additions & 0 deletions

```text
===

These examples use real resources that will be billed to the
Google Cloud Platform project you use - so make sure that you
run "terraform destroy" before quitting!

===
```
Lines changed: 79 additions & 0 deletions

# Dataprocgdc Sparkapplication - Terraform

## Setup

<walkthrough-author name="[email protected]" analyticsId="UA-125550242-1" tutorialName="dataprocgdc_sparkapplication" repositoryUrl="https://github.com/terraform-google-modules/docs-examples"></walkthrough-author>

Welcome to Terraform in Google Cloud Shell! We need you to let us know what project you'd like to use with Terraform.

<walkthrough-project-billing-setup></walkthrough-project-billing-setup>

Terraform provisions real GCP resources, so anything you create in this session will be billed against this project.

## Terraforming!

Let's use {{project-id}} with Terraform! Click the Cloud Shell icon below to copy the command
to your shell, and then run it from the shell by pressing Enter/Return. Terraform will pick up
the project name from the environment variable.

```bash
export GOOGLE_CLOUD_PROJECT={{project-id}}
```

After that, let's get Terraform started. Run the following to pull in the providers.

```bash
terraform init
```

With the providers downloaded and a project set, you're ready to use Terraform. Go ahead!

```bash
terraform apply
```

Terraform will show you what it plans to do and prompt you to accept. Type "yes" to accept the plan.

```bash
yes
```

## Post-Apply

### Editing your config

Now you've provisioned your resources in GCP! If you run a "plan", you should see no changes needed.

```bash
terraform plan
```

So let's make a change! Try editing a number, or appending a value to the name in the editor. Then,
run a "plan" again.

```bash
terraform plan
```

Afterwards you can run an apply, which implicitly does a plan and shows you the intended changes
at the "yes" prompt.

```bash
terraform apply
```

```bash
yes
```

## Cleanup

Run the following to remove the resources Terraform provisioned:

```bash
terraform destroy
```

```bash
yes
```
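One note on the setup step above: the tutorial relies on the Google provider picking the project up from the `GOOGLE_CLOUD_PROJECT` environment variable. If you adapt these examples outside Cloud Shell, you can instead pin the project explicitly in the scaffolding's provider block. A sketch (the project value is a placeholder):

```hcl
provider "google" {
  project = "my-project" # replaces exporting GOOGLE_CLOUD_PROJECT
  region  = "us-central1"
  zone    = "us-central1-c"
}
```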
Lines changed: 15 additions & 0 deletions

```hcl
# This file has some scaffolding to make sure that names are unique and that
# a region and zone are selected when you try to create your Terraform resources.

locals {
  name_suffix = "${random_pet.suffix.id}"
}

resource "random_pet" "suffix" {
  length = 2
}

provider "google" {
  region = "us-central1"
  zone   = "us-central1-c"
}
```
Lines changed: 12 additions & 0 deletions

```hcl
resource "google_dataproc_gdc_spark_application" "spark-application" {
  spark_application_id = "tf-e2e-spark-app-basic-${local.name_suffix}"
  serviceinstance      = "do-not-delete-dataproc-gdc-instance"
  project              = "my-project-${local.name_suffix}"
  location             = "us-west2"
  namespace            = "default"

  spark_application_config {
    main_class    = "org.apache.spark.examples.SparkPi"
    jar_file_uris = ["file:///usr/lib/spark/examples/jars/spark-examples.jar"]
    args          = ["10000"]
  }
}
```
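If an application with this ID already exists on the service instance, it can be adopted into state rather than recreated. A sketch using a Terraform 1.5+ `import` block; the ID format here follows the provider's usual `projects/.../locations/.../serviceInstances/.../sparkApplications/...` pattern and is my assumption, not something stated in this commit, and the ID values are placeholders:

```hcl
import {
  to = google_dataproc_gdc_spark_application.spark-application
  # Assumed ID format with placeholder values.
  id = "projects/my-project/locations/us-west2/serviceInstances/do-not-delete-dataproc-gdc-instance/sparkApplications/tf-e2e-spark-app-basic"
}
```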
Lines changed: 7 additions & 0 deletions

```text
===

These examples use real resources that will be billed to the
Google Cloud Platform project you use - so make sure that you
run "terraform destroy" before quitting!

===
```
Lines changed: 79 additions & 0 deletions

# Dataprocgdc Sparkapplication Basic - Terraform

## Setup

<walkthrough-author name="[email protected]" analyticsId="UA-125550242-1" tutorialName="dataprocgdc_sparkapplication_basic" repositoryUrl="https://github.com/terraform-google-modules/docs-examples"></walkthrough-author>

Welcome to Terraform in Google Cloud Shell! We need you to let us know what project you'd like to use with Terraform.

<walkthrough-project-billing-setup></walkthrough-project-billing-setup>

Terraform provisions real GCP resources, so anything you create in this session will be billed against this project.

## Terraforming!

Let's use {{project-id}} with Terraform! Click the Cloud Shell icon below to copy the command
to your shell, and then run it from the shell by pressing Enter/Return. Terraform will pick up
the project name from the environment variable.

```bash
export GOOGLE_CLOUD_PROJECT={{project-id}}
```

After that, let's get Terraform started. Run the following to pull in the providers.

```bash
terraform init
```

With the providers downloaded and a project set, you're ready to use Terraform. Go ahead!

```bash
terraform apply
```

Terraform will show you what it plans to do and prompt you to accept. Type "yes" to accept the plan.

```bash
yes
```

## Post-Apply

### Editing your config

Now you've provisioned your resources in GCP! If you run a "plan", you should see no changes needed.

```bash
terraform plan
```

So let's make a change! Try editing a number, or appending a value to the name in the editor. Then,
run a "plan" again.

```bash
terraform plan
```

Afterwards you can run an apply, which implicitly does a plan and shows you the intended changes
at the "yes" prompt.

```bash
terraform apply
```

```bash
yes
```

## Cleanup

Run the following to remove the resources Terraform provisioned:

```bash
terraform destroy
```

```bash
yes
```
Lines changed: 15 additions & 0 deletions

```hcl
# This file has some scaffolding to make sure that names are unique and that
# a region and zone are selected when you try to create your Terraform resources.

locals {
  name_suffix = "${random_pet.suffix.id}"
}

resource "random_pet" "suffix" {
  length = 2
}

provider "google" {
  region = "us-central1"
  zone   = "us-central1-c"
}
```
Lines changed: 17 additions & 0 deletions

```hcl
resource "google_dataproc_gdc_spark_application" "spark-application" {
  spark_application_id = "tf-e2e-pyspark-app-${local.name_suffix}"
  serviceinstance      = "do-not-delete-dataproc-gdc-instance"
  project              = "my-project-${local.name_suffix}"
  location             = "us-west2"
  namespace            = "default"
  display_name         = "A Pyspark application for a Terraform create test"
  dependency_images    = ["gcr.io/some/image"]

  pyspark_application_config {
    main_python_file_uri = "gs://goog-dataproc-initialization-actions-us-west2/conda/test_conda.py"
    jar_file_uris        = ["file:///usr/lib/spark/examples/jars/spark-examples.jar"]
    python_file_uris     = ["gs://goog-dataproc-initialization-actions-us-west2/conda/get-sys-exec.py"]
    file_uris            = ["file://usr/lib/spark/examples/spark-examples.jar"]
    archive_uris         = ["file://usr/lib/spark/examples/spark-examples.jar"]
    args                 = ["10"]
  }
}
```
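The PySpark variant swaps `spark_application_config` for `pyspark_application_config`, driving the job with `main_python_file_uri` instead of a `main_class` or `main_jar_file_uri`. All of the examples also hard-code the same service instance and namespace; if you adapt them, pulling those shared values into variables keeps the variants in sync. A minimal sketch (the variable names are mine, not part of the commit):

```hcl
variable "serviceinstance" {
  description = "Dataproc on GDC service instance that runs the applications."
  type        = string
  default     = "do-not-delete-dataproc-gdc-instance"
}

variable "namespace" {
  description = "Namespace on the service instance to run applications in."
  type        = string
  default     = "default"
}
```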
