This article describes how to resolve java.lang.NoSuchMethodError: net.jpountz.util.Utils.checkRange. It should be a useful reference for anyone who runs into the same problem.

Problem description

I am using Spark Streaming 2.2.0 with Python and reading data from a Kafka (2.11-0.10.0.0) cluster. I submit a Python script with:

    spark-submit --jars spark-streaming-kafka-0-8-assembly_2.11-2.2.0.jar hodor.py

Spark reports the following error:

    17/08/04 10:52:00 ERROR Utils: Uncaught exception in thread stdout writer for python
    java.lang.NoSuchMethodError: net.jpountz.util.Utils.checkRange([BII)V
        at org.apache.kafka.common.message.KafkaLZ4BlockInputStream.read(KafkaLZ4BlockInputStream.java:176)
        at java.io.FilterInputStream.read(FilterInputStream.java:107)
        at kafka.message.ByteBufferMessageSet$$anonfun$decompress$1.apply$mcI$sp(ByteBufferMessageSet.scala:67)
        at kafka.message.ByteBufferMessageSet$$anonfun$decompress$1.apply(ByteBufferMessageSet.scala:67)
        at kafka.message.ByteBufferMessageSet$$anonfun$decompress$1.apply(ByteBufferMessageSet.scala:67)
        at scala.collection.immutable.Stream$.continually(Stream.scala:1279)
        at kafka.message.ByteBufferMessageSet$.decompress(ByteBufferMessageSet.scala:67)
        at kafka.message.ByteBufferMessageSet$$anon$1.makeNextOuter(ByteBufferMessageSet.scala:179)
        at kafka.message.ByteBufferMessageSet$$anon$1.makeNext(ByteBufferMessageSet.scala:192)
        at kafka.message.ByteBufferMessageSet$$anon$1.makeNext(ByteBufferMessageSet.scala:146)
        at kafka.utils.IteratorTemplate.maybeComputeNext(IteratorTemplate.scala:66)
        at kafka.utils.IteratorTemplate.hasNext(IteratorTemplate.scala:58)
        at scala.collection.Iterator$$anon$18.hasNext(Iterator.scala:764)
        at org.apache.spark.streaming.kafka.KafkaRDD$KafkaRDDIterator.getNext(KafkaRDD.scala:214)
        at org.apache.spark.util.NextIterator.hasNext(NextIterator.scala:73)
        at scala.collection.Iterator$class.foreach(Iterator.scala:893)
        at org.apache.spark.util.NextIterator.foreach(NextIterator.scala:21)
        at org.apache.spark.api.python.PythonRDD$.writeIteratorToStream(PythonRDD.scala:509)
        at org.apache.spark.api.python.PythonRunner$WriterThread$$anonfun$run$3.apply(PythonRDD.scala:333)
        at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1954)
        at org.apache.spark.api.python.PythonRunner$WriterThread.run(PythonRDD.scala:269)

I think this may be caused by an lz4 version conflict: Spark depends on net.jpountz.lz4 1.3.0, but Kafka depends on net.jpountz.lz4 1.2.0. How can I fix this?

Recommended answer

I had the same issue when upgrading to Spark 2.3.0 (joining two streams with Structured Streaming). The error was:

    NoSuchMethodError: net.jpountz.lz4.LZ4BlockInputStream.<init>

I am using Maven, so the solution was to add the older of the two versions (Spark 2.3 uses 1.4.0, Kafka uses 1.3.0) as an explicit dependency in the pom.xml of my main project:

    <dependency>
        <groupId>net.jpountz.lz4</groupId>
        <artifactId>lz4</artifactId>
        <version>1.3.0</version>
    </dependency>

That concludes this article on java.lang.NoSuchMethodError: net.jpountz.util.Utils.checkRange. Hopefully the recommended answer above helps.
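If pinning lz4 this way still leaves a conflicting version on the classpath, excluding the transitive lz4 from the Kafka dependency is a common follow-up. The snippet below is a minimal sketch, not taken from the original answer: it assumes a Maven project that depends on kafka_2.11 0.10.0.0 (adjust these coordinates to whatever your build actually uses) and forces a single lz4 version by excluding Kafka's transitive one and declaring 1.3.0 explicitly.

    <dependency>
        <groupId>org.apache.kafka</groupId>
        <artifactId>kafka_2.11</artifactId>
        <version>0.10.0.0</version>
        <exclusions>
            <!-- drop Kafka's transitive net.jpountz.lz4 so only the pinned version below is used -->
            <exclusion>
                <groupId>net.jpountz.lz4</groupId>
                <artifactId>lz4</artifactId>
            </exclusion>
        </exclusions>
    </dependency>
    <dependency>
        <!-- explicitly pin lz4; 1.3.0 is the older of the two versions involved in this conflict -->
        <groupId>net.jpountz.lz4</groupId>
        <artifactId>lz4</artifactId>
        <version>1.3.0</version>
    </dependency>

After changing the pom, running mvn dependency:tree -Dincludes=net.jpountz.lz4 shows which modules still pull in lz4 and which version Maven ultimately resolves, which makes it easy to confirm that only one version remains.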