Problem Description
Hi, I'm running the self-contained application example from the Spark site at http://spark.apache.org/docs/1.2.0/quick-start.html#self-contained-applications
spark version: spark-1.4.0
sbt version: 0.13.8
When I ran the command "sbt run", I got the error "java.io.InvalidClassException: org.apache.spark.deploy.ApplicationDescription; local class incompatible".
The application fails at "val sc = new SparkContext(conf)", when I try to initialize the SparkContext. I've googled around and found this post, but I'm not using hadoop-client.
Could you please take a look? My guess is a version mismatch in build.sbt. Thank you very much.
Update: I tried submitting a Python application and it works fine, which means the Spark cluster is OK.
The Scala code is below:
/* SimpleApp.scala */
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf

object SimpleApp {
  def main(args: Array[String]) {
    val logFile = "YOUR_SPARK_HOME/README.md" // Should be some file on your system
    val conf = new SparkConf().setAppName("Simple Application")
    val sc = new SparkContext(conf)
    val logData = sc.textFile(logFile, 2).cache()
    val numAs = logData.filter(line => line.contains("a")).count()
    val numBs = logData.filter(line => line.contains("b")).count()
    println("Lines with a: %s, Lines with b: %s".format(numAs, numBs))
  }
}
The build.sbt is below:
name := "Simple Project"
version := "1.0"
scalaVersion := "2.10.4"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.0"
The error message is below:
15/08/27 05:23:38 ERROR Remoting: org.apache.spark.deploy.ApplicationDescription; local class incompatible: stream classdesc serialVersionUID = 7674242335164700840, local class serialVersionUID = -7685200927816255400
java.io.InvalidClassException: org.apache.spark.deploy.ApplicationDescription; local class incompatible: stream classdesc serialVersionUID = 7674242335164700840, local class serialVersionUID = -7685200927816255400
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:617)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1622)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1517)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
at akka.serialization.JavaSerializer$$anonfun$1.apply(Serializer.scala:136)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
at akka.serialization.JavaSerializer.fromBinary(Serializer.scala:136)
at akka.serialization.Serialization$$anonfun$deserialize$1.apply(Serialization.scala:104)
at scala.util.Try$.apply(Try.scala:161)
at akka.serialization.Serialization.deserialize(Serialization.scala:98)
at akka.remote.serialization.MessageContainerSerializer.fromBinary(MessageContainerSerializer.scala:63)
at akka.serialization.Serialization$$anonfun$deserialize$1.apply(Serialization.scala:104)
at scala.util.Try$.apply(Try.scala:161)
at akka.serialization.Serialization.deserialize(Serialization.scala:98)
at akka.remote.MessageSerializer$.deserialize(MessageSerializer.scala:23)
at akka.remote.DefaultMessageDispatcher.payload$lzycompute$1(Endpoint.scala:58)
at akka.remote.DefaultMessageDispatcher.payload$1(Endpoint.scala:58)
at akka.remote.DefaultMessageDispatcher.dispatch(Endpoint.scala:76)
at akka.remote.EndpointReader$$anonfun$receive$2.applyOrElse(Endpoint.scala:937)
at akka.actor.Actor$class.aroundReceive(Actor.scala:465)
at akka.remote.EndpointActor.aroundReceive(Endpoint.scala:415)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
at akka.actor.ActorCell.invoke(ActorCell.scala:487)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)
at akka.dispatch.Mailbox.run(Mailbox.scala:220)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
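The two serialVersionUID values in the message come from two different builds of org.apache.spark.deploy.ApplicationDescription: the one your driver (built against 1.2.0) serialized, and the one the 1.4.0 cluster deserialized. As a diagnostic sketch (the jar file names below are assumptions about your installation, not taken from the question), the JDK's serialver tool can print the UID each build actually carries:

```shell
# Hypothetical jar names; point -classpath at your actual Spark assembly jars.
serialver -classpath spark-assembly-1.2.0-hadoop2.4.0.jar \
    org.apache.spark.deploy.ApplicationDescription
serialver -classpath spark-assembly-1.4.0-hadoop2.6.0.jar \
    org.apache.spark.deploy.ApplicationDescription
```

If the two commands print different UIDs, that confirms the driver and cluster are on incompatible Spark versions.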
You say you are running against a Spark 1.4.0 cluster, but your build.sbt is building against 1.2.0. Change the dependency in your build.sbt to:
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.4.0"
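As a side note (a general sbt/Spark convention, not something stated in the question): if you eventually launch the application with spark-submit instead of sbt run, the cluster already supplies the Spark classes, so the dependency is usually marked "provided" to keep spark-core out of the packaged jar:

```scala
// build.sbt — version matched to the 1.4.0 cluster; "provided" excludes
// spark-core from the assembled jar when deploying via spark-submit.
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.4.0" % "provided"
```

Note that with "provided" the application can no longer be started with plain sbt run, since the Spark classes are then absent from the run classpath.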