Problem description
I recently started working with Spark (Scala), HDFS, sbt, and Livy. Currently I am trying to create a Livy batch job.
Warning: Skip remote jar hdfs://localhost:9001/jar/project.jar.
java.lang.ClassNotFoundException: SimpleApp
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.spark.util.Utils$.classForName(Utils.scala:225)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:686)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:185)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:210)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
This is the error shown in the Livy batch log.
My spark-submit command works perfectly fine with the local .jar file:
spark-submit --class "SimpleApp" --master local target/scala-2.11/simple-project_2.11-1.0.jar
But the same submission through Livy (via cURL) throws an error:
"requirement failed: Local path /target/scala-2.11/simple-project_2.11-1.0.jar cannot be added to user sessions."
So I moved the .jar file to HDFS. My new Livy request is:
curl -X POST \
  -H "Content-Type: application/json" \
  --data '{
    "file": "/jar/project.jar",
    "className": "SimpleApp",
    "args": ["ddd"]
  }' \
  http://server:8998/batches
This throws the error mentioned above.
Please let me know where I am going wrong.
Thanks in advance!
Recommended answer
To use local files for Livy batch jobs, you need to add the local folder to the livy.file.local-dir-whitelist property in livy.conf.
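For example, a minimal sketch of the setting, assuming the project from the spark-submit command above lives at /home/user/simple-project on the Livy server (that path is an assumption; point the whitelist at whatever local directory actually holds your jar):

# in livy.conf on the Livy server (hypothetical path)
livy.file.local-dir-whitelist = /home/user/simple-project/target/scala-2.11/

Restart the Livy server after editing livy.conf so the new whitelist takes effect.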
Paraphrasing the description in livy.conf.template: this property is the list of local directories from which files are allowed to be added to user sessions; it is empty by default, which means only remote URIs can be referenced when starting sessions.
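With the directory whitelisted, the batch request can reference the local jar directly. A sketch, reusing the hypothetical path above together with the className and args from the original request:

curl -X POST \
  -H "Content-Type: application/json" \
  --data '{
    "file": "/home/user/simple-project/target/scala-2.11/simple-project_2.11-1.0.jar",
    "className": "SimpleApp",
    "args": ["ddd"]
  }' \
  http://server:8998/batches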