This article explains how to fix the PATH problem "Could not find valid SPARK_HOME while searching"; hopefully it is a useful reference for anyone hitting the same error.

Problem description

I updated my PATH to look like this:

PATH="$HOME/bin:$HOME/.local/bin:$PATH:/home/username/Installs/Spark/bin"

I think it worked, as I managed to call spark-shell from a different folder (although I wonder if I'm going crazy and it was really from the bin folder). However, after rebooting Ubuntu it no longer seems to work. Why?

Could not find valid SPARK_HOME while searching ['/home/username', '/usr/local/bin']
/usr/local/bin/spark-shell: line 57: /bin/spark-submit: No such file or directory

Recommended answer

Setting

PATH="$HOME/bin:$HOME/.local/bin:$PATH:/home/username/Installs/Spark/bin"

would let you run the executable scripts like spark-shell, spark-submit, pyspark etc. without needing to give the full path to the scripts.

Besides setting PATH, you also need to set

SPARK_HOME=/home/username/Installs/Spark

which is used internally when you start a Spark cluster or when you use spark-submit.
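This also explains the second line of the error output above. A minimal sketch, assuming spark-shell builds the path to spark-submit from $SPARK_HOME (an assumption based on the error message, not the actual script source): with SPARK_HOME unset, the prefix expands to nothing and the path collapses to /bin/spark-submit.

```shell
# Sketch of the failure: with SPARK_HOME unset, the constructed
# path "${SPARK_HOME}/bin/spark-submit" loses its prefix.
unset SPARK_HOME
echo "${SPARK_HOME}/bin/spark-submit"   # prints /bin/spark-submit
```

Since /bin/spark-submit does not exist, the wrapper fails with "No such file or directory".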

If you set the variables in your .bashrc file, you also need the export keyword, as in

export SPARK_HOME=/home/username/Installs/Spark
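Putting both settings together, the relevant .bashrc additions might look like this (the install path /home/username/Installs/Spark is taken from the question; replace it with your own Spark location):

```shell
# Example ~/.bashrc lines: export SPARK_HOME, then reuse it in PATH
# so both always point at the same Spark install.
export SPARK_HOME=/home/username/Installs/Spark
export PATH="$HOME/bin:$HOME/.local/bin:$PATH:$SPARK_HOME/bin"
```

Reusing $SPARK_HOME inside the PATH entry keeps the two settings from drifting apart if you later move the install.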

and if you don't want to reboot Ubuntu to test that it worked, type

. ~/.profile

on the command line, then try your Spark command.
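A quick way to confirm both settings took effect after sourcing the profile (the expected value assumes the install path from the question; yours may differ):

```shell
# Sanity checks after ". ~/.profile"
echo "$SPARK_HOME"                # expect /home/username/Installs/Spark
command -v spark-shell || echo "spark-shell not found on PATH"
```

If `command -v` prints nothing and the fallback message appears, the PATH entry did not take effect in the current shell.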


08-20 14:11