Question
I have seen many answers related to this error, but they all point to Scala version mismatches. I think my case is different.
I have a remote Spark master-worker cluster set up with version 2.1.0. I was able to verify it through http://master-ip:8080, which lists all worker nodes.
From my application, I am trying to create a SparkConf with Java 7 code. The code is below:
sparkConf = new SparkConf(true)
        .set("spark.cassandra.connection.host", "localhost")
        .set("spark.cassandra.auth.username", "username")
        .set("spark.cassandra.auth.password", "pwd")
        .set("spark.master", "spark://master-ip:7077")
        .set("spark.app.name", "Test App");
Following are the Maven dependencies I added:
<dependency>
    <groupId>com.datastax.spark</groupId>
    <artifactId>spark-cassandra-connector_2.10</artifactId>
    <version>2.0.1</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.1.0</version>
    <exclusions>
        <exclusion>
            <groupId>javax.validation</groupId>
            <artifactId>validation-api</artifactId>
        </exclusion>
    </exclusions>
</dependency>
I am getting the following error:
Caused by: java.lang.NoSuchMethodError: scala.Predef$.$conforms()Lscala/Predef$$less$colon$less;
at org.apache.spark.util.Utils$.getSystemProperties(Utils.scala:1710)
at org.apache.spark.SparkConf.loadFromSystemProperties(SparkConf.scala:73)
at org.apache.spark.SparkConf.<init>(SparkConf.scala:68)
Spark version from one of the worker nodes:
./spark-shell --version
Welcome to
____ __
/ __/__ ___ _____/ /__
_\ \/ _ \/ _ `/ __/ '_/
/___/ .__/\_,_/_/ /_/\_\ version 2.1.0
/_/
Using Scala version 2.11.8, Java HotSpot(TM) 64-Bit Server VM, 1.8.0_91
Branch
Compiled by user jenkins on 2016-12-16T02:04:48Z
Revision
Url
Type --help for more information.
Answer
It is related to the Scala version. Your dependencies mix two Scala binary versions: spark-cassandra-connector_2.10 is built against Scala 2.10, while spark-core_2.11 is built against Scala 2.11. All Spark artifacts on the classpath must carry the same Scala suffix, and that suffix must match the Scala version your cluster runs. Mixing them produces exactly this kind of NoSuchMethodError, because Scala 2.10 and 2.11 are not binary compatible.
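The Scala binary version is encoded after the underscore in each artifactId, so the mismatch can be spotted by comparing suffixes. As an illustration only (this helper class is hypothetical, not part of Spark or Maven), a small Java 7-compatible check might look like this:

```java
import java.util.Arrays;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

// Hypothetical helper: detect mixed Scala binary versions in a list
// of Spark artifactIds such as "spark-core_2.11".
public class ScalaSuffixCheck {

    // Returns the Scala binary version suffix, or "none" if the
    // artifactId carries no "_2.x" marker.
    static String scalaSuffix(String artifactId) {
        int i = artifactId.lastIndexOf('_');
        return i >= 0 ? artifactId.substring(i + 1) : "none";
    }

    public static void main(String[] args) {
        // The two artifacts from the question's pom.xml
        List<String> artifacts = Arrays.asList(
                "spark-cassandra-connector_2.10",
                "spark-core_2.11");

        Set<String> suffixes = new HashSet<String>();
        for (String a : artifacts) {
            suffixes.add(scalaSuffix(a));
        }

        if (suffixes.size() > 1) {
            System.out.println("Mixed Scala versions on classpath: " + suffixes);
        } else {
            System.out.println("Consistent Scala version: " + suffixes);
        }
    }
}
```

Running this against the question's two artifacts reports mixed suffixes (2.10 and 2.11), which is the root cause of the error.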
Since your worker's spark-shell banner reports Scala 2.11.8, align both dependencies on the _2.11 suffix (i.e. use spark-cassandra-connector_2.11) and it will work.
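Assuming the cluster really is on Scala 2.11.8 as the spark-shell banner shows, and keeping the version numbers from the question (DataStax publishes a _2.11 build of connector 2.0.1), the dependency section would become:

```xml
<dependency>
    <groupId>com.datastax.spark</groupId>
    <artifactId>spark-cassandra-connector_2.11</artifactId>
    <version>2.0.1</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.1.0</version>
    <exclusions>
        <exclusion>
            <groupId>javax.validation</groupId>
            <artifactId>validation-api</artifactId>
        </exclusion>
    </exclusions>
</dependency>
```

The key point is not the specific version but consistency: every `_2.x` artifact in the POM must use the same suffix, matching the cluster's Scala version.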