Problem Description
I am new to Apache Spark. I have tested some applications in Spark standalone mode, but now I want to run them in YARN mode. I am running apache-spark 2.1.0 on Windows. Here is my command:
c:\spark>spark-submit2 --master yarn --deploy-mode client --executor-cores 4 ^
  --jars C:\DependencyJars\spark-streaming-eventhubs_2.11-2.0.3.jar,C:\DependencyJars\scalaj-http_2.11-2.3.0.jar,C:\DependencyJars\config-1.3.1.jar,C:\DependencyJars\commons-lang3-3.3.2.jar ^
  --conf spark.driver.userClasspathFirst=true ^
  --conf spark.executor.extraClassPath=C:\DependencyJars\commons-lang3-3.3.2.jar ^
  --conf spark.executor.userClasspathFirst=true ^
  --class "GeoLogConsumerRT" C:\sbtazure\target\scala-2.11\azuregeologproject_2.11-1.0.jar
Exception: java.lang.Exception: When running with master 'yarn' either HADOOP_CONF_DIR or YARN_CONF_DIR must be set in the environment.
So, after searching the web, I created a folder named Hadoop_CONF_DIR, placed hive-site.xml in it, and pointed an environment variable at it. After that I ran spark-submit again, and this time I got a connection refused exception.
I think I have not configured YARN mode properly. Could anyone help me solve this issue? Do I need to install Hadoop and YARN separately? I want to run my application in pseudo-distributed mode. Kindly help me configure YARN mode on Windows. Thanks.
Recommended Answer
You need to export two variables, HADOOP_CONF_DIR and YARN_CONF_DIR, to make your configuration files visible to YARN. If you are using Linux, add the lines below to your .bashrc file.
export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
export YARN_CONF_DIR=$HADOOP_HOME/etc/hadoop
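Note that HADOOP_CONF_DIR should point at the directory holding the Hadoop client configuration, not at a folder containing only hive-site.xml. Assuming a standard Hadoop layout, Spark's YARN client typically reads at least these files from it:

$HADOOP_HOME/etc/hadoop/core-site.xml   # fs.defaultFS (HDFS address)
$HADOOP_HOME/etc/hadoop/hdfs-site.xml   # HDFS client settings
$HADOOP_HOME/etc/hadoop/yarn-site.xml   # ResourceManager host and port

Without yarn-site.xml, Spark falls back to the default ResourceManager address (0.0.0.0:8032), which typically produces the "connection refused" error you saw. A running Hadoop/YARN installation (pseudo-distributed mode is fine) is required for --master yarn.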
On Windows you need to set the same two environment variables.
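A minimal sketch using the Command Prompt, assuming HADOOP_HOME points at your Hadoop installation:

:: Current session only:
set HADOOP_CONF_DIR=%HADOOP_HOME%\etc\hadoop
set YARN_CONF_DIR=%HADOOP_HOME%\etc\hadoop

:: Persist across sessions (setx stores the expanded path; open a new console afterwards):
setx HADOOP_CONF_DIR "%HADOOP_HOME%\etc\hadoop"
setx YARN_CONF_DIR "%HADOOP_HOME%\etc\hadoop"

Alternatively, set them through System Properties > Environment Variables.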
Hope this helps!