This article explains how to pass -D parameters or environment variables to a Spark job; the question and recommended answer below may be a useful reference for anyone facing the same problem.
Problem description
I want to change the Typesafe config of a Spark job in dev/prod environments. It seems to me that the easiest way to accomplish this is to pass -Dconfig.resource=ENVNAME to the job; the Typesafe config library will then do the rest for me.
Is there a way to pass that option directly to the job? Or is there perhaps a better way to change the job configuration at runtime?
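(For reference, this is how the config.resource override behaves for an ordinary JVM application; the configuration file names and the main class below are assumptions for illustration, not part of the original question.)

# dev.conf and prod.conf are assumed to sit on the classpath next to the default
# application.conf; ConfigFactory.load() then reads the resource named here.
java -Dconfig.resource=dev.conf -cp my-app.jar com.example.Main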
- Nothing happens when I add --conf "spark.executor.extraJavaOptions=-Dconfig.resource=dev" to the spark-submit command.
- Passing -Dconfig.resource=dev directly to the spark-submit command fails with Error: Unrecognized option '-Dconfig.resource=dev'.
Recommended answer
Change the spark-submit command line by adding three options:
--files
--conf 'spark.executor.extraJavaOptions=-Dconfig.resource=app'
--conf 'spark.driver.extraJavaOptions=-Dconfig.resource=app'
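Put together, a complete spark-submit invocation might look like the sketch below. The main class, jar name, master, and the dev.conf file name are assumptions for illustration; only the three options above come from the answer itself.

# Sketch only: --files ships dev.conf into each YARN container's working
# directory, which YARN places on the classpath, so -Dconfig.resource=dev.conf
# can be resolved there; when running in client mode the driver instead needs
# dev.conf on its own classpath (for example bundled inside the jar).
spark-submit \
  --class com.example.MyJob \
  --master yarn \
  --files /path/to/dev.conf \
  --conf 'spark.driver.extraJavaOptions=-Dconfig.resource=dev.conf' \
  --conf 'spark.executor.extraJavaOptions=-Dconfig.resource=dev.conf' \
  my-job.jar

Note that Typesafe Config treats config.resource as a classpath resource name including its extension, so the value passed with -D should match the name of the file that is actually shipped or bundled.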
This concludes the discussion of how to pass -D parameters or environment variables to a Spark job; hopefully the recommended answer above is helpful.