I am trying to build Apache Spark on a Mac. Since I use MacPorts, the Homebrew route is not an option, so I installed the correct versions of Java and Maven manually and set up the paths:
Sankha-desktop:spark-1.6.1 user$ mvn -version
Apache Maven 3.3.9 (bb52d8502b132ec0a5a3f4c09453c07478323dc5; 2015-11-11T00:41:47+08:00)
Maven home: /opt/local/share/java/maven33
Java version: 1.7.0_79, vendor: Oracle Corporation
Java home:
/Library/Java/JavaVirtualMachines/jdk1.7.0_79.jdk/Contents/Home/jre
Default locale: en_US, platform encoding: UTF-8
OS name: "mac os x", version: "10.11.3", arch: "x86_64", family: "mac"
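As a sanity check, JAVA_HOME can be pointed at the JDK home (rather than the jre subdirectory shown above, since javac lives under the JDK's bin directory) using /usr/libexec/java_home; a minimal sketch, assuming JDK 1.7.0_79 is the installed 1.7 JDK:
# Point JAVA_HOME at the 1.7 JDK home and confirm the matching javac is picked up
export JAVA_HOME="$(/usr/libexec/java_home -v 1.7)"
echo "$JAVA_HOME"
"$JAVA_HOME/bin/javac" -version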
Now, when I try to build Spark, I get the following error:
Sankha-desktop:spark-1.6.1 user$ build/mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -DskipTests clean package
Using `mvn` from path: /opt/local/share/java/maven33/bin/mvn
[INFO] Scanning for projects...
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Build Order:
[INFO]
~~~~~~~ some more stuff ~~~~~~~
[info] Compiling 3 Java sources to /Users/user/Documents/installers/spark/spark-1.6.1/tags/target/scala-2.10/classes...
[error] javac: invalid source release: 1.7
[error] Usage: javac <options> <source files>
[error] use -help for a list of possible options
[error] Compile failed at Apr 11, 2016 6:14:07 AM [0.024s]
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Spark Project Parent POM ........................... SUCCESS [ 7.176 s]
[INFO] Spark Project Test Tags ............................ FAILURE [ 1.027 s]
[INFO] Spark Project Launcher ............................. SKIPPED
[INFO] Spark Project Networking ........................... SKIPPED
Apparently my Java version is wrong? I explicitly installed version 1.7 and linked it. I also have the latest 1.8 installed, but I have removed it from JAVA_PATH.
Does anyone know what is going on?
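For what it is worth, "javac: invalid source release: 1.7" generally means that the javac actually being invoked does not accept -source 1.7, for example an older compiler or one resolved relative to a JRE; a quick, assumed diagnostic (not from the original post) to see which compilers the build could pick up:
# Compare the compiler on the PATH with the one under JAVA_HOME
which javac && javac -version
"$JAVA_HOME/bin/javac" -version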
Best Answer
I think this is related to the Zinc compiler.
I was able to get it to compile by disabling the Zinc compiler in the pom.xml file. Change this:
<useZincServer>true</useZincServer>
to this:
<useZincServer>false</useZincServer>
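In Spark 1.6.x this switch sits in the scala-maven-plugin configuration of the top-level pom.xml; a quick way to locate every occurrence before editing (assuming you are in the Spark source root):
# List the lines of the root pom.xml that mention the Zinc switch
grep -n "useZincServer" pom.xml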
My error message was slightly different, with the Zinc compiler looking for javac in my JRE directory. I tried to figure out how to set a different java_home for Zinc, but could not work it out (there is a -java-home argument that I tried adding to the pom, but it did not seem to have any effect). Here is the Spark documentation on Zinc:
http://spark.apache.org/docs/latest/building-spark.html#speeding-up-compilation-with-zinc
Here is the Zinc GitHub page:
https://github.com/typesafehub/zinc
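If you would rather keep Zinc enabled, the build/mvn script starts a Zinc server for you, and the Spark docs above describe shutting it down with the bundled launcher; a hedged sketch of restarting it against an explicit JDK (the zinc-0.3.9 directory name is an assumption about the downloaded version, and whether -java-home actually helps here is untested, as noted above):
# Stop any Zinc server left over from a previous build
build/zinc-0.3.9/bin/zinc -shutdown
# Restart it with an explicit JDK, then rebuild
build/zinc-0.3.9/bin/zinc -start -java-home "$(/usr/libexec/java_home -v 1.7)"
build/mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -DskipTests clean package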
Update: this may be an issue in scala-maven-plugin - https://github.com/davidB/scala-maven-plugin/issues/173
A similar question about the Java requirements for Apache Spark can be found on Stack Overflow: https://stackoverflow.com/questions/36536586/