The correct way to submit Spark applications to a cluster is with `spark-submit`, which sets the cluster configuration options, jars, files, etc. However, the DGA runner attempts to set jars and other information itself based on its command-line options. This can conflict with the `spark-submit` configuration and produce odd errors (such as ClassNotFoundException when jars are not distributed to the workers).
Whenever a Spark program is launched, the `spark-submit` options should be used where available and should not be overridden in the code.
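As a sketch of what this looks like in practice, the following invocation passes the cluster, jar, and file settings through `spark-submit` flags rather than hard-coding them in the application (the class name, jar paths, and master URL here are placeholders, not values from this project):

```shell
# Hypothetical example: all cluster configuration comes from spark-submit,
# so the application code should not call conflicting setters on SparkConf.
spark-submit \
  --master spark://master-host:7077 \          # cluster manager URL (placeholder)
  --class com.example.dga.Runner \             # main class (placeholder)
  --jars /path/to/dependency1.jar,/path/to/dependency2.jar \  # distributed to all workers
  --files /path/to/app.properties \            # shipped to each executor's working dir
  --conf spark.executor.memory=4g \            # tuning via --conf, not in code
  /path/to/dga-runner.jar \
  [application arguments]
```

Because `spark-submit` distributes the `--jars` entries to every worker itself, setting the same jars again inside the application (for example via `SparkConf.setJars`) is redundant at best and at worst clobbers the submitted configuration, which is one way the class-not-found errors described above can arise.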