Why does spark-shell fail with "was unexpected at this time"?

Problem Description
I am getting a "was unexpected at this time" error when running the spark-shell command.
I have downloaded the spark-2.1.1-bin-hadoop2.7.tgz file from http://spark.apache.org/downloads.html, extracted the tar file, and pasted the contents of the folder into the c:\Spark directory. After that I configured the environment variables for Spark and the JDK accordingly, but I am still getting this error. Any help will be appreciated.
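For reference, a minimal sketch of such a setup in a Command Prompt (the JDK path below is an assumption rather than something stated in the question; note the space in "Program Files"):

rem Sketch of the environment setup described above (assumed paths)
set SPARK_HOME=C:\Spark
set JAVA_HOME=C:\Program Files\Java\jdk1.8.0_131
set PATH=%SPARK_HOME%\bin;%JAVA_HOME%\bin;%PATH%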
Recommended Answer
I'm almost sure that your JAVA_HOME environment variable contains a space that breaks spark-shell. Please re-install Java to a directory with no spaces in the path.
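For example, either of the following removes the space (both paths are assumptions; the second relies on 8.3 short names being enabled on the volume):

rem Option 1: reinstall Java to a space-free directory (assumed path)
set JAVA_HOME=C:\Java\jdk1.8.0_45

rem Option 2: keep the existing install but point JAVA_HOME at the
rem 8.3 short form of "Program Files", which contains no space
set JAVA_HOME=C:\Progra~1\Java\jdk1.8.0_45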
You can see the relevant piece of code in bin/spark-class2.cmd, which spark-shell executes under the covers on Windows (through the bin/spark-submit2.cmd shell script):
if "x%1"=="x" (
So when spark-class2.cmd substitutes %1 with a path containing a space (or something similar), the line ends up as:
if "x"Files\Java\jdk1.8.0_45""=="x" (
which triggers the error because of the extra, unbalanced double quotes.
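The failure mode is easy to reproduce outside Spark. Here is a minimal sketch (quote-repro.cmd is a hypothetical script, not part of Spark):

@echo off
rem quote-repro.cmd -- minimal reproduction of the quoting failure
rem The test below mirrors the check in bin/spark-class2.cmd
if "x%1"=="x" (
  echo No argument given
) else (
  echo Got: %1
)

Calling quote-repro.cmd hello prints "Got: hello", but calling quote-repro.cmd "C:\Program Files\Java" expands %1 with the quotes included, so the test becomes if "x"C:\Program Files\Java""=="x" ( and cmd aborts with an error of the form "... was unexpected at this time.", the same shape as the failure above.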
The mystery is how JAVA_HOME ends up in this place. I can't seem to find the reason, but that's what we see here.
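As an aside, the usual defensive idiom in batch scripts is to strip the surrounding quotes with %~1 before comparing; this is a standard cmd feature for batch parameters, not something the Spark script quoted above does:

rem %~1 expands the first batch argument with its surrounding quotes removed,
rem so a quoted path containing spaces no longer unbalances the comparison
if "x%~1"=="x" (
  echo No argument given
)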