I'm running into a problem using com.typesafe.play play-json 2.4.0 on Spark.
The following code throws an exception on the Spark server, but runs fine on my PC.

import play.api.libs.json.Json

val json = Json.parse(json_string)

The exception:
java.lang.NoSuchMethodError: com.fasterxml.jackson.core.JsonToken.id()I
    at play.api.libs.json.jackson.JsValueDeserializer.deserialize(JacksonJson.scala:122)
    at play.api.libs.json.jackson.JsValueDeserializer.deserialize(JacksonJson.scala:108)
    at play.api.libs.json.jackson.JsValueDeserializer.deserialize(JacksonJson.scala:103)
    at com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:2860)
    at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:1569)
    at play.api.libs.json.jackson.JacksonJson$.parseJsValue(JacksonJson.scala:226)
    at play.api.libs.json.Json$.parse(Json.scala:21)
    at org.soprism.kafka.connector.TwitterToCassandraPostsParser$.ParseJson(TwitterToCassandraPostsParser.scala:74)
    at org.soprism.kafka.connector.TwitterToCassandraPostsParser$$anonfun$1$$anonfun$apply$1.apply(TwitterToCassandraPostsParser.scala:65)
    at org.soprism.kafka.connector.TwitterToCassandraPostsParser$$anonfun$1$$anonfun$apply$1.apply(TwitterToCassandraPostsParser.scala:65)
    at scala.collection.Iterator$class.foreach(Iterator.scala:727)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
    at org.apache.spark.rdd.RDD$$anonfun$foreach$1.apply(RDD.scala:798)
    at org.apache.spark.rdd.RDD$$anonfun$foreach$1.apply(RDD.scala:798)
    at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1503)
    at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1503)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:61)
    at org.apache.spark.scheduler.Task.run(Task.scala:64)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:203)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)

I run the job with the spark-submit command.

It looks like there is an incompatibility between two versions of the Jackson library. How can I fix this?

Thanks

Best answer

Spark nodes do not resolve your dependencies for you. You need to build an uber-jar that contains all of your dependencies and pass that jar to Spark, so it can be distributed to the other nodes.
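
For example, with sbt you can produce such an uber-jar with the sbt-assembly plugin. The sketch below is a minimal build.sbt; the project name, Scala/Spark versions, plugin version, and merge strategy are assumptions for illustration, not taken from the question (spark-core is marked "provided" because the cluster already supplies it):

// project/plugins.sbt
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.10")   // assumed plugin version

// build.sbt
name := "twitter-to-cassandra-parser"   // hypothetical project name
scalaVersion := "2.10.5"                // assumed; match the Scala build of your Spark cluster

libraryDependencies ++= Seq(
  // Spark is provided by the cluster, so keep it out of the uber-jar
  "org.apache.spark" %% "spark-core" % "1.3.1" % "provided",
  // play-json brings along the Jackson version it actually needs
  "com.typesafe.play" %% "play-json" % "2.4.0"
)

// Resolve duplicate files when all dependency jars are merged into one
assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case _                             => MergeStrategy.first
}

Running sbt assembly then produces a single fat jar under target/, and that is the jar you pass to spark-submit; Spark ships it to the executors, so the play-json and Jackson classes your code needs are available on every node.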
