This article explains how to fix "Unsupported class file major version 55" when executing "org.apache.spark.sql.DataSet.collectAsList()". Hopefully it is a useful reference for anyone running into the same problem.

Problem description

I'm creating a Java REST API Spring Boot application that uses Spark to get some data from the server. When I try to convert from Dataset to List, it fails.

I've tried JDK 8 and JDK 11 to compile and execute the code, but I get the same 'java.lang.IllegalArgumentException: Unsupported class file major version 55'. In the past I've solved this issue by updating the Java version, but that isn't working here.

I'm using:

  • JDK 11.0.2

  • Spring Boot 2.1.4

  • Spark 2.4.2

This is the code I'm executing:

Dataset<Row> dataFrame = sparkSession.read().json("/home/data/*.json");
dataFrame.createOrReplaceTempView("events");
Dataset<Row> resultDataFrame = sparkSession.sql("SELECT * FROM events WHERE " + predicate);
Dataset<Event> eventDataSet = resultDataFrame.as(Encoders.bean(Event.class));
return eventDataSet.collectAsList();
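
The Event class itself is not shown in the post. For Encoders.bean(Event.class) to work, Event has to be a plain JavaBean: a public class with a no-argument constructor and a getter/setter pair for each column it should map. A minimal sketch with hypothetical fields (the real names and types depend on the JSON data):

public class Event {
    // Hypothetical fields -- the real ones depend on the JSON files being read.
    private String id;
    private long timestamp;

    public Event() { }    // no-arg constructor required by Encoders.bean

    public String getId() { return id; }
    public void setId(String id) { this.id = id; }

    public long getTimestamp() { return timestamp; }
    public void setTimestamp(long timestamp) { this.timestamp = timestamp; }
}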

The query works; in fact, while debugging you can see the data in both resultDataFrame and eventDataSet.

I expect the output to be a proper list of Events, but I'm getting this exception:

[http-nio-8080-exec-2] ERROR org.apache.catalina.core.ContainerBase.[Tomcat].[localhost].[/].[dispatcherServlet] - Servlet.service() for servlet [dispatcherServlet] in context with path [] threw exception [Request processing failed; nested exception is java.lang.IllegalArgumentException: Unsupported class file major version 55] with root cause
java.lang.IllegalArgumentException: Unsupported class file major version 55
    at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:166)
    at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:148)
    at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:136)
    at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:237)
    at org.apache.spark.util.ClosureCleaner$.getClassReader(ClosureCleaner.scala:49)
    at org.apache.spark.util.FieldAccessFinder$$anon$3$$anonfun$visitMethodInsn$2.apply(ClosureCleaner.scala:517)
    at org.apache.spark.util.FieldAccessFinder$$anon$3$$anonfun$visitMethodInsn$2.apply(ClosureCleaner.scala:500)
    at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:733)
    at scala.collection.mutable.HashMap$$anon$1$$anonfun$foreach$2.apply(HashMap.scala:134)
    at scala.collection.mutable.HashMap$$anon$1$$anonfun$foreach$2.apply(HashMap.scala:134)
    at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:236)
    at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
    at scala.collection.mutable.HashMap$$anon$1.foreach(HashMap.scala:134)
    at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:732)
    at org.apache.spark.util.FieldAccessFinder$$anon$3.visitMethodInsn(ClosureCleaner.scala:500)
.....

UPDATE FROM THE COMMENTS: For Java 8, I changed the pom to target Java 8:

<java.version>1.8</java.version>

Then I updated the project, ran maven clean and maven install, and ran the application again. I'm still getting the same version 55 error.
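
For context (not part of the original question): class file major version 55 is the Java 11 bytecode format. The ClassReader at the top of the stack trace comes from the ASM 6 copy bundled with Spark 2.4.2 (org.apache.xbean.asm6), which throws exactly this IllegalArgumentException for class files newer than it understands, and Spark only gained Java 11 support in version 3.0. So if the error survives switching the pom to 1.8, either some classes are still being compiled with JDK 11 or the application is still launched on a JDK 11 JVM. Below is a hypothetical helper, not from the original project, that reads the major version straight out of a compiled class file so you can see what the build actually produced:

import java.io.DataInputStream;
import java.io.IOException;
import java.io.InputStream;

public class ClassVersionCheck {
    public static void main(String[] args) throws IOException {
        // Hypothetical resource path -- point it at any class from your own build.
        String resource = "/com/example/Event.class";
        try (InputStream in = ClassVersionCheck.class.getResourceAsStream(resource)) {
            if (in == null) {
                System.out.println("class not found on classpath: " + resource);
                return;
            }
            DataInputStream data = new DataInputStream(in);
            data.readInt();                          // magic number 0xCAFEBABE
            int minor = data.readUnsignedShort();    // minor_version
            int major = data.readUnsignedShort();    // major_version: 52 = Java 8, 55 = Java 11
            System.out.println(resource + " -> major " + major + ", minor " + minor);
        }
    }
}

If this prints 52 for your own classes but Spark still complains about version 55, then it is the JVM running the application, not the compiler, that is on Java 11, which is what the accepted answer below found.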

Accepted answer

The root cause of the issue was a symbolic link pointing at the wrong JDK, and that's why it wasn't working. JAVA_HOME was pointing to JDK 11, and Eclipse was running with that.
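
A quick way to catch that kind of JAVA_HOME or symlink mix-up is to log which JVM the application is actually started with. This is a minimal sketch, not code from the original application; the same print statements could also be dropped into the Spring Boot main method:

public class JvmCheck {
    public static void main(String[] args) {
        // Which JVM is really running? Catches a JAVA_HOME or symlink pointing at the wrong JDK.
        System.out.println("java.version       = " + System.getProperty("java.version"));
        System.out.println("java.home          = " + System.getProperty("java.home"));
        System.out.println("java.class.version = " + System.getProperty("java.class.version")); // 52.0 = Java 8, 55.0 = Java 11
    }
}

With Spark 2.4.2 the expected output is a Java 8 JVM (java.class.version 52.0); anything printing 55.0 means the process is on Java 11 regardless of what the pom targets. Running mvn -version is also worth doing, since it prints the JDK that Maven itself builds with.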

That's all for this article on how to fix "Unsupported class file major version 55" when executing "org.apache.spark.sql.DataSet.collectAsList()". We hope the answer above helps.
