Problem Description
I'm starting a project using Spark, developed with Scala and the IntelliJ IDE.
I was wondering how to set --properties-file with a specific Spark configuration in an IntelliJ run configuration.
I'm reading configuration values like this: sc.getConf.get("param1")
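As a minimal sketch of that read (class name, app name, and default value are assumptions, not from the original), note that SparkConf automatically picks up any JVM system property whose name starts with "spark.", which is why VM options like -Dspark.param1=value1 become visible to the application:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object Main {
  def main(args: Array[String]): Unit = {
    // SparkConf().loadFromSystemProperties picks up -Dspark.* VM options
    val sc = new SparkContext(new SparkConf().setAppName("main"))

    // getOption avoids a NoSuchElementException when the key is absent
    val param1 = sc.getConf.getOption("spark.param1").getOrElse("defaultValue")
    println(s"spark.param1 = $param1")

    sc.stop()
  }
}
```

Keep in mind that only properties prefixed with "spark." are loaded from system properties; a bare key like "param1" would not be picked up that way.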
When I execute the Spark job from the command line, it works like a charm:

/opt/spark/bin/spark-submit --class "com.class.main" --master local --properties-file properties.conf ./target/scala-2.11/main.jar arg1 arg2 arg3 arg4
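For reference, the file passed to --properties-file uses Java properties syntax, and spark-submit loads only the keys prefixed with "spark." (other keys are ignored with a warning). A hypothetical properties.conf matching the keys above might look like:

```
spark.master    local
spark.param1    value1
spark.param2    value2
```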
The problem is when I execute the job using an IntelliJ Run Configuration with VM Options:
- I succeed with the --master param as -Dspark.master=local
- I succeed with --conf params as -Dspark.param1=value1
- I failed with --properties-file
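Put together, the VM Options field covering the two cases that work would look something like this (the values are just the examples from above):

```
-Dspark.master=local -Dspark.param1=value1
```

There is no -Dspark.properties-file equivalent, which is the gap the question is about.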
Can anyone point me at the right way to set this up?
Recommended Answer
I don't think it's possible to use --properties-file to launch a Spark application from within IntelliJ IDEA.
spark-submit is the shell script that submits a Spark application for execution, and it does a few extra things to create a proper submission environment for the application before launching it.
You can, however, mimic the behaviour of --properties-file by leveraging conf/spark-defaults.conf, which a Spark application loads by default.
You could create a conf/spark-defaults.conf under src/test/resources (or src/main/resources) with the contents of properties.conf. That is supposed to work.
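As a sketch of that workaround (assuming properties.conf holds the example keys used earlier), the copied file would live at src/test/resources/conf/spark-defaults.conf and contain the same key/value pairs:

```
# src/test/resources/conf/spark-defaults.conf
# (same contents as properties.conf)
spark.master    local
spark.param1    value1
```

Because spark-defaults.conf supplies default values, anything set explicitly in code or via -Dspark.* VM options will still take precedence over these entries.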
This concludes the article on how to launch a Spark application in IntelliJ IDEA using spark-submit's --properties-file option. We hope the recommended answer is helpful.