Default value SparkHelper overrides config for spark.master #120
Comments
@christopherfrieler thanks for the idea! @Jolanrensen it looks like Christopher provided us with a complete implementation; we should just change the default value :)
Sure! Just wanted to check whether this is the best way to go, since they did call it a "workaround" and "not nice", haha. But creating a new
Added!
Currently `org.jetbrains.kotlinx.spark.api.withSpark(props, master, appName, logLevel, func)` provides a default value for the parameter `master`. This overrides the value from the external Spark config provided with the spark-submit command.

In my case, I'm running Spark on YARN, so I have a submit command like this:

```shell
./spark-submit --master yarn ...
```

But when I start the SparkSession in my code, the default `"local[*]"` for the parameter `master` is used. This leads to a local Spark session on the application master (which kind of works, but does not make use of the executors provided by YARN) and to YARN complaining that I never started a Spark session after my job finishes, because it doesn't recognize the local one.

I think the value for `master` should by default be taken from the external config loaded by `SparkConf()`. As a workaround, I load the value myself. This works, but is not nice and duplicates the default value `"local[*]"`.