Problem description
Spark 2.3 is throwing the following exception. Can anyone please help? I tried adding the JARs.
308 [Driver] ERROR org.apache.spark.deploy.yarn.ApplicationMaster - User class threw exception: java.lang.NoSuchMethodError: io.netty.buffer.PooledByteBufAllocator.metric()Lio/netty/buffer/PooledByteBufAllocatorMetric;
java.lang.NoSuchMethodError: io.netty.buffer.PooledByteBufAllocator.metric()Lio/netty/buffer/PooledByteBufAllocatorMetric;
    at org.apache.spark.network.util.NettyMemoryMetrics.registerMetrics(NettyMemoryMetrics.java:80)
    at org.apache.spark.network.util.NettyMemoryMetrics.<init>(NettyMemoryMetrics.java:76)
    at org.apache.spark.network.client.TransportClientFactory.<init>(TransportClientFactory.java:109)
    at org.apache.spark.network.TransportContext.createClientFactory(TransportContext.java:99)
    at org.apache.spark.rpc.netty.NettyRpcEnv.<init>(NettyRpcEnv.scala:71)
    at org.apache.spark.rpc.netty.NettyRpcEnvFactory.create(NettyRpcEnv.scala:461)
    at org.apache.spark.rpc.RpcEnv$.create(RpcEnv.scala:57)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:249)
    at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:175)
    at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:256)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:423)
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
    at com.voicebase.etl.HBasePhoenixPerformance2.main(HBasePhoenixPerformance2.java:55)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$4.run(ApplicationMaster.scala:706)
315 [main] ERROR org.apache.spark.deploy.yarn.ApplicationMaster - Uncaught exception:
org.apache.spark.SparkException: Exception thrown in awaitResult:
    at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:205)
    at org.apache.spark.deploy.yarn.ApplicationMaster.runDriver(ApplicationMaster.scala:486)
    at org.apache.spark.deploy.yarn.ApplicationMaster.org$apache$spark$deploy$yarn$ApplicationMaster$$runImpl(ApplicationMaster.scala:345)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$run$2.apply$mcV$sp(ApplicationMaster.scala:260)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$run$2.apply(ApplicationMaster.scala:260)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$run$2.apply(ApplicationMaster.scala:260)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$5.run(ApplicationMaster.scala:800)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1836)
    at org.apache.spark.deploy.yarn.ApplicationMaster.doAsUser(ApplicationMaster.scala:799)
    at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:259)
    at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:824)
    at org.apache.spark.deploy.yarn.ApplicationMaster.main(ApplicationMaster.scala)
Caused by: java.util.concurrent.ExecutionException: Boxed Error
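The root cause can be confirmed before touching any jars: `PooledByteBufAllocator.metric()` was only added in Netty 4.1, so this error means an older netty-all 4.0.x jar is shadowing the 4.1.x jar that Spark 2.3 bundles. A minimal classpath-scan sketch, demonstrated on a mock directory layout so it runs anywhere (the `mktemp` paths are stand-ins; on a real cluster you would scan `$HADOOP_HOME/share/hadoop` and `$SPARK_HOME/jars`):

```shell
# Mock layout standing in for a real Hadoop/Spark install; on a cluster,
# scan $HADOOP_HOME/share/hadoop and $SPARK_HOME/jars instead.
demo="$(mktemp -d)"
mkdir -p "$demo/hadoop/share/hadoop/common/lib" "$demo/spark/jars"
touch "$demo/hadoop/share/hadoop/common/lib/netty-all-4.0.23.Final.jar" \
      "$demo/spark/jars/netty-all-4.1.17.Final.jar"

# List every Netty jar. Seeing two different netty-all versions, as here,
# means the 4.0.x copy can shadow 4.1.x on the classpath and break
# PooledByteBufAllocator.metric() at runtime.
find "$demo/hadoop/share/hadoop" "$demo/spark/jars" -name 'netty*.jar' | sort
```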
Recommended answer
I found the solution. This happens because the Hadoop binaries were compiled against an older Netty version, so those jars just need to be replaced. After replacing them I have not faced any issues with Hadoop.
You need to replace netty-3.6.2.Final.jar and netty-all-4.0.23.Final.jar under $HADOOP_HOME\share\hadoop with netty-all-4.1.17.Final.jar and netty-3.9.9.Final.jar.
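The swap itself can be scripted. A hedged sketch follows, again using a mock directory layout so it can run anywhere; on a real cluster, point `hadoop_share` at the actual `$HADOOP_HOME/share/hadoop` directory and copy in genuine jars downloaded from Maven Central instead of the empty placeholder files created by `touch` here:

```shell
# Mock stand-in for $HADOOP_HOME/share/hadoop; replace with the real path.
hadoop_share="$(mktemp -d)"
mkdir -p "$hadoop_share/common/lib"
touch "$hadoop_share/common/lib/netty-3.6.2.Final.jar" \
      "$hadoop_share/common/lib/netty-all-4.0.23.Final.jar"

# Remove every copy of the old jars (they can appear in several subdirs).
find "$hadoop_share" \( -name 'netty-3.6.2.Final.jar' \
                     -o -name 'netty-all-4.0.23.Final.jar' \) -delete

# Drop in the newer versions. On a real cluster these would be actual
# jars fetched from Maven Central, not the empty placeholders used here.
touch "$hadoop_share/common/lib/netty-3.9.9.Final.jar" \
      "$hadoop_share/common/lib/netty-all-4.1.17.Final.jar"

ls "$hadoop_share/common/lib"
```

Restart the affected services (or resubmit the job) after the swap so the new jars are picked up.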
This solved my problem. If you have an alternative solution, please do share.