Problem Description
I have installed a Spark/Hadoop environment on my Red Hat 64 machine. I also want to read and write code in the Spark source code project in IntelliJ IDEA, so I downloaded the Spark source code and got everything ready. But I got some errors when compiling the Spark project in IntelliJ IDEA. Here are the errors:
Error:(809, 34) not found: value SparkSqlParser
    case ast if ast.tokenType == SparkSqlParser.TinyintLiteral =>

Error:(812, 34) not found: value SparkSqlParser
    case ast if ast.tokenType == SparkSqlParser.SmallintLiteral =>

... ...
But actually I could not find a file named SparkSqlParser.scala anywhere in the whole project, nor a Scala class named SparkSqlParser.
However, I searched the web and found some files named SparkSqlParser.scala, but they don't have attributes like "TinyintLiteral", "SmallintLiteral", etc. Here are the file links:
Recommended Answer
I met the same problem. Here is my solution:
- Download the antlr4 (i.e. ANTLR v4) plugin for IntelliJ IDEA. Then the file "spark-2.0.1\sql\catalyst\src\main\antlr4\org\apache\spark\sql\catalyst\parser\SqlBase.g4" can be recognized by IntelliJ IDEA.
- Navigate to View -> Tool Windows -> Maven Projects. Select the project "Spark Project Catalyst", right-click on it, and then select "Generate Sources and Update Folders" (an equivalent command-line sketch follows this list).
- After that, you can see some generated files added under "spark-2.0.1\sql\catalyst\target\generated-sources\antlr4".
- Then you can build the project successfully.
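If the menu action is not available, a rough command-line equivalent (a sketch only, assuming a standard spark-2.0.1 source checkout with its bundled build/mvn wrapper and the ANTLR4 Maven plugin already configured in sql/catalyst/pom.xml) is to run the Maven generate-sources phase for the catalyst module from the project root and then re-import the Maven project in IntelliJ IDEA:

    # Run only the catalyst module's generate-sources phase; the ANTLR4 Maven
    # plugin bound to that phase turns SqlBase.g4 into Java parser sources.
    ./build/mvn -pl sql/catalyst generate-sources

    # The generated sources (for a grammar named SqlBase, ANTLR4 emits files
    # such as SqlBaseParser.java and SqlBaseLexer.java) should appear under:
    #   sql/catalyst/target/generated-sources/antlr4/

After the files are generated, make sure IntelliJ IDEA treats target/generated-sources/antlr4 as a source root (re-importing the Maven project normally does this); otherwise the generated parser symbols will still be unresolved during compilation.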
Hope it can help you.