dbt-spark: How to achieve spark parallel execution? #1243
Unanswered · UdroiuRuben asked this question in Q&A · Replies: 0 comments
Hello,
I'm trying to run three independent models in parallel using the following command: `dbt run --select model_1 model_2 model_3`
However, the Spark UI shows the models executing sequentially rather than in parallel.
Is it possible to run these Spark jobs in parallel? Do you have any suggestions?
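For context, dbt itself schedules independent models concurrently up to the `threads` value set in profiles.yml (models with DAG dependencies still run in order). A hedged sketch of a dbt-spark profile with multiple threads; the profile name, host, port, and schema below are placeholders, not values from the original post:

```yaml
# Hypothetical profiles.yml sketch; adjust method/host/port to your cluster.
my_spark_profile:
  target: dev
  outputs:
    dev:
      type: spark
      method: thrift        # connection method (thrift, http, odbc, or session)
      host: localhost       # placeholder Spark Thrift Server host
      port: 10000           # placeholder port
      schema: analytics     # placeholder target schema
      threads: 4            # dbt will run up to 4 independent models concurrently
```

With `threads: 1`, dbt submits one model at a time regardless of the `--select` list, which would match the sequential behavior seen in the Spark UI; whether the cluster then executes concurrent queries in parallel also depends on Spark's own scheduling and available resources.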
My profiles.yml configuration is as follows: