Problem description
In my application, I am retrieving data from Cassandra and providing a REST API using Akka Spray. It works fine when I run it through the IDE, but when I run it with spark-submit locally on my machine, I get an uncaught fatal error from thread [default-akka.actor.default-dispatcher-5] shutting down the ActorSystem, as given below.
My build.sbt is as follows.
scalaVersion := "2.10.5"

resolvers ++= Seq(
  "Akka Repository" at "http://repo.akka.io/releases/",
  "Typesafe Repository" at "http://repo.typesafe.com/typesafe/releases/",
  "Spray Repository" at "http://repo.spray.io")

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.4.0"
libraryDependencies += "org.apache.spark" %% "spark-streaming" % "1.4.0"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "1.4.0"
libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "1.4.0" withSources() withJavadoc()
libraryDependencies += "com.typesafe.akka" %% "akka-actor" % "2.3.14"

libraryDependencies ++= {
  val sprayVersion = "1.3.2"
  Seq(
    "io.spray" %% "spray-can" % sprayVersion,
    "io.spray" %% "spray-routing" % sprayVersion,
    "io.spray" %% "spray-json" % sprayVersion
  )
}
Please let me know what mistake I have made. Thanks in advance; all your suggestions are valuable.
You're probably submitting a JAR that contains only your code. You have to add spark-cassandra-connector to the classpath for spark-submit by using the --jars parameter (see Spark Submit's advanced dependency management).
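As a sketch, the invocation could look like the following; the application class, master URL, and JAR paths are illustrative and need to be adjusted to your build:

```shell
# Hypothetical class name and file paths -- adjust to your project.
spark-submit \
  --class com.example.MyApp \
  --master local[2] \
  --jars /path/to/spark-cassandra-connector-assembly-1.4.0.jar \
  target/scala-2.10/myapp_2.10-1.0.jar
```

Every JAR listed via --jars is shipped to the executors and added to both the driver and executor classpaths, which is why the connector's classes become visible at runtime.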
Alternatively, you can build a fat JAR (which will include the dependencies) and submit it as you have been doing so far. You can build a fat JAR from SBT using sbt-assembly.
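A minimal sketch of the sbt-assembly setup, assuming a 0.14.x plugin version (the version and merge strategy shown are illustrative, not taken from the question):

```scala
// project/assembly.sbt (plugin version is illustrative):
//   addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.0")

// build.sbt additions: mark Spark itself "provided" so spark-submit's own
// copy is used instead of bundling it, and resolve duplicate files that
// appear when many dependency JARs are merged into one.
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.4.0" % "provided"

assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case _                             => MergeStrategy.first
}
```

Running `sbt assembly` then produces a single JAR under target/scala-2.10/ that can be passed directly to spark-submit without a --jars list.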