I have been following this tutorial to install Spark for Scala:
https://www.tutorialspoint.com/apache_spark/apache_spark_installation.htm

However, when I try to run spark-shell, I get this error in the console:

/usr/local/spark/bin/spark-shell: line 57: /usr/local/spark/bin/bin/spark-submit: No such file or directory

My bashrc looks like this:
export PATH = $PATH:/usr/local/spark/bin
export SCALA_HOME=/usr/local/scala/bin
export PYTHONPATH=$SPARK_HOME/python

So what am I doing wrong? I had previously installed Spark for Python, but now I'm trying to use it with Scala. Is Spark mixing up the variables? Thanks.

Accepted answer

The path being searched has one bin too many:

/usr/local/spark/bin/bin/spark-submit

should be
/usr/local/spark/bin/spark-submit

In your case, SPARK_HOME should be /usr/local/spark/, not /usr/local/spark/bin/ as it apparently is now.
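
A corrected ~/.bashrc might look like the sketch below, assuming Spark is installed in /usr/local/spark and Scala in /usr/local/scala (adjust the paths to your actual install locations). Note there are no spaces around =, SPARK_HOME points to the Spark root rather than its bin directory, and SCALA_HOME conventionally points to the Scala install directory with its bin added to PATH:

# point the home variables at the install roots, not the bin subdirectories
export SPARK_HOME=/usr/local/spark
export SCALA_HOME=/usr/local/scala
# add the bin directories of both installs to PATH
export PATH=$PATH:$SPARK_HOME/bin:$SCALA_HOME/bin
# keep the PySpark libraries importable for the existing Python setup
export PYTHONPATH=$SPARK_HOME/python

After editing, reload the file with source ~/.bashrc (or open a new terminal) and run spark-shell again.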
