Problem description
I just downloaded the latest version of Spark, and when I started the spark shell I got the following error:
java.net.BindException: Failed to bind to: /192.168.1.254:0: Service 'sparkDriver' failed after 16 retries!
at org.jboss.netty.bootstrap.ServerBootstrap.bind(ServerBootstrap.java:272)
at akka.remote.transport.netty.NettyTransport$$anonfun$listen$1.apply(NettyTransport.scala:393)
at akka.remote.transport.netty.NettyTransport$$anonfun$listen$1.apply(NettyTransport.scala:389)
...
...
java.lang.NullPointerException
at org.apache.spark.sql.SQLContext.<init>(SQLContext.scala:193)
at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:71)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:525)
at org.apache.spark.repl.SparkILoop.createSQLContext(SparkILoop.scala:1028)
at $iwC$$iwC.<init>(<console>:9)
...
...
<console>:10: error: not found: value sqlContext
import sqlContext.implicits._
^
<console>:10: error: not found: value sqlContext
import sqlContext.sql
^
Is there something that I missed in setting up Spark?
Recommended answer
Try setting the Spark env variable SPARK_LOCAL_IP to a local IP address.
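The value needs to be an address that is actually assigned to one of the machine's network interfaces. A minimal sketch of trying this out for a single session, assuming a Linux machine and that spark-shell is launched from the Spark install directory (the commands and the right address will differ per machine; 127.0.0.1 is only a placeholder):

# list the IPv4 addresses assigned to this machine and pick one of them
hostname -I
# launch the shell with SPARK_LOCAL_IP set only for this invocation
SPARK_LOCAL_IP=127.0.0.1 ./bin/spark-shell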
In my case, I was running Spark on an Amazon EC2 Linux instance. spark-shell stopped working, with an error message similar to yours. I was able to fix it by adding a setting like the following to the Spark config file spark-env.conf.
export SPARK_LOCAL_IP=172.30.43.105
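For reference, in a stock Spark distribution the per-machine environment file is normally conf/spark-env.sh (created by copying conf/spark-env.sh.template), which Spark's launch scripts source on startup. A minimal sketch of such a file with the same setting, assuming the EC2 private address used above:

# conf/spark-env.sh -- sourced by Spark's launch scripts on this machine
# bind the driver to this interface instead of whatever the hostname resolves to
export SPARK_LOCAL_IP=172.30.43.105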
You could also set it in ~/.profile or ~/.bashrc. Another option is /etc/hosts, as sketched below.
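Since the stack trace shows the driver failing to bind to 192.168.1.254 (likely the address the machine's hostname resolves to), the /etc/hosts route amounts to pointing the hostname at an address the machine can actually bind. A hedged sketch, assuming a hostname of ip-172-30-43-105 (hypothetical; use the output of hostname on your machine):

# /etc/hosts -- map this machine's hostname to a bindable local address
127.0.0.1       localhost
172.30.43.105   ip-172-30-43-105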