Hi, when I run this command
>sbt
I see this output
beyhan@beyhan:~/sparksample$ sbt
Starting sbt: invoke with -help for other options
[info] Set current project to Spark Sample (in build file:/home/beyhan/sparksample/)
Then, after I enter this command
>compile
I get this error
[error] {file:/home/beyhan/sparksample/}default-f390c8/*:update: sbt.ResolveException: unresolved dependency: org.apache.hadoop#hadoop-yarn-common;1.0.4: not found
[error] unresolved dependency: org.apache.hadoop#hadoop-yarn-client;1.0.4: not found
[error] unresolved dependency: org.apache.hadoop#hadoop-yarn-api;1.0.4: not found
[error] download failed: org.eclipse.jetty.orbit#javax.transaction;1.1.1.v201105210645!javax.transaction.orbit
[error] download failed: org.eclipse.jetty.orbit#javax.servlet;3.0.0.v201112011016!javax.servlet.orbit
[error] download failed: org.eclipse.jetty.orbit#javax.mail.glassfish;1.4.1.v201005082020!javax.mail.glassfish.orbit
[error] download failed: org.eclipse.jetty.orbit#javax.activation;1.1.0.v201105071233!javax.activation.orbit
[error] Total time: 14 s, completed Oct 16, 2015 3:58:48 PM
My sparksample directory contains this:
beyhan@beyhan:~/sparksample$ ll
total 20
drwxrwxr-x 4 beyhan beyhan 4096 Eki 16 16:02 ./
drwxr-xr-x 57 beyhan beyhan 4096 Eki 16 15:27 ../
drwxrwxr-x 2 beyhan beyhan 4096 Eki 16 16:02 project/
-rw-rw-r-- 1 beyhan beyhan 142 Eki 15 18:57 simple.sbt
drwxrwxr-x 3 beyhan beyhan 4096 Eki 15 11:14 src/
Also, the src directory contains
src>main>scala>SimpleCode.scala
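(The question doesn't show the file's contents; for context, a minimal Spark 1.2-era application that this layout would compile could look like the hypothetical sketch below.)
import org.apache.spark.{SparkConf, SparkContext}

object SimpleCode {
  def main(args: Array[String]): Unit = {
    // Hypothetical contents -- the original SimpleCode.scala is not shown in the question.
    val conf = new SparkConf().setAppName("Spark Sample").setMaster("local[*]")
    val sc = new SparkContext(conf)
    println(s"count = ${sc.parallelize(1 to 100).count()}")
    sc.stop()
  }
}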
And my simple.sbt file looks like this:
name := "Spark Sample"
version := "1.0"
scalaVersion := "2.11.7"
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "1.2.0"
What should I do?
I think this is related to YARN, since I don't have it installed.
Thanks.
Best Answer
Your dependency on
org.apache.hadoop#hadoop-yarn-client;1.0.4
does not appear to come from your build.sbt. It is more likely an issue with files cached under ~/.ivy2 or ~/.m2, or some project/*.sbt file pulling in additional dependencies.
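One way to act on that is to inspect project/ and evict the suspect cache entries before re-resolving; a minimal sketch (the cache paths are assumptions, adjust to your machine):
$ cat project/*.sbt                             # check for extra libraryDependencies or plugins
$ rm -rf ~/.ivy2/cache/org.apache.hadoop        # evict the possibly corrupted Hadoop artifacts
$ rm -rf ~/.ivy2/cache/org.eclipse.jetty.orbit  # and the jetty-orbit downloads that failed
$ sbt clean update                              # force a fresh dependency resolution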
It works fine for me with this build.sbt:
$ cat build.sbt
name := "Spark Sample"
version := "1.0"
scalaVersion := "2.11.7"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.0"
All the dependencies resolve:
$ sbt compile
Getting org.scala-sbt sbt 0.13.9 ...
:: retrieving :: org.scala-sbt#boot-app
	confs: [default]
	52 artifacts copied, 0 already retrieved (17785kB/791ms)
Getting Scala 2.10.5 (for sbt)...
:: retrieving :: org.scala-sbt#boot-scala
	confs: [default]
	5 artifacts copied, 0 already retrieved (24493kB/306ms)
[info] Set current project to Spark Sample (in build file:/home/tuxdna/tmp/p/)
[info] Updating {file:/home/tuxdna/tmp/p/}p...
[info] Resolving jline#jline;2.12.1 ...
[info] downloading https://repo1.maven.org/maven2/org/apache/spark/spark-core_2.11/1.2.0/spark-core_2.11-1.2.0.jar ...
[info] 	[SUCCESSFUL ] org.apache.spark#spark-core_2.11;1.2.0!spark-core_2.11.jar (31007ms)
[info] downloading https://repo1.maven.org/maven2/org/apache/spark/spark-network-common_2.11/1.2.0/spark-network-common_2.11-1.2.0.jar ...
[info] 	[SUCCESSFUL ] org.apache.spark#spark-network-common_2.11;1.2.0!spark-network-common_2.11.jar (1873ms)
[info] downloading https://repo1.maven.org/maven2/org/apache/spark/spark-network-shuffle_2.11/1.2.0/spark-network-shuffle_2.11-1.2.0.jar ...
[info] 	[SUCCESSFUL ] org.apache.spark#spark-network-shuffle_2.11;1.2.0!spark-network-shuffle_2.11.jar (2122ms)
[info] Done updating.
[success] Total time: 61 s, completed 17 Oct, 2015 12:48:49 AM
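Note the %% in that build.sbt: with scalaVersion := "2.11.7", sbt appends the Scala binary version to the artifact name, so the following two lines resolve to the same jar:
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.0"
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "1.2.0"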
Also note the Scala and sbt versions I have installed:
$ sbt sbt-version
[info] Set current project to Spark Sample (in build file:/home/tuxdna/tmp/p/)
[info] 0.13.9
$ scala -version
Scala code runner version 2.11.2 -- Copyright 2002-2013, LAMP/EPFL
Could you try these steps as a separate user (or perhaps on a separate machine)?
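A lighter-weight way to simulate a clean account is to point the sbt launcher at a fresh Ivy home via the standard sbt.ivy.home property (treat the exact path as an assumption for your setup):
$ sbt -Dsbt.ivy.home=/tmp/fresh-ivy clean update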
Regarding "hadoop - SBT yarn errors when using Spark", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/33171330/