This article covers how to connect the master and slaves in Apache Spark (standalone mode). The question and the recommended answer follow.

Problem description

I'm following the Spark Standalone Mode tutorial page to install Spark in standalone mode.

1- I have started a master with:

./sbin/start-master.sh

2- I have started a worker with:

./bin/spark-class org.apache.spark.deploy.worker.Worker spark://ubuntu:7077

Note: spark://ubuntu:7077 is my master URL, which I can see in the Master WebUI.

Problem: with the second command, the worker starts successfully, but it cannot associate with the master. It retries repeatedly and then gives this message:

15/02/08 11:30:04 WARN Remoting: Tried to associate with unreachable    remote address [akka.tcp://sparkMaster@ubuntu:7077]. Address is now gated for 5000 ms, all messages to this address will be delivered to dead letters. Reason: Connection refused: ubuntu/127.0.1.1:7077
15/02/08 11:30:04 INFO RemoteActorRefProvider$RemoteDeadLetterActorRef: Message [org.apache.spark.deploy.DeployMessages$RegisterWorker] from Actor[akka://sparkWorker/user/Worker#-1296628173] to Actor[akka://sparkWorker/deadLetters] was not delivered. [20] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'akka.log-dead-letters' and 'akka.log-dead-letters-during-shutdown'.
15/02/08 11:31:15 ERROR Worker: All masters are unresponsive! Giving up.

What is the problem here?

Thanks

Recommended answer

I usually start from the spark-env.sh template and set the properties I need. For a simple cluster, you need:


  • SPARK_MASTER_IP
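The worker log's "Connection refused: ubuntu/127.0.1.1:7077" line shows why this property matters: Ubuntu's /etc/hosts usually maps the machine's hostname to 127.0.1.1, so a master bound via its hostname is unreachable from other machines. A minimal spark-env.sh sketch, assuming a hypothetical routable master address of 192.168.1.10:

# conf/spark-env.sh (start by copying conf/spark-env.sh.template)
# Bind the master to a routable LAN address instead of the hostname's
# 127.0.1.1 loopback entry; 192.168.1.10 is a hypothetical value.
export SPARK_MASTER_IP=192.168.1.10

With this set, workers register against spark://192.168.1.10:7077 instead of spark://ubuntu:7077.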

Then, create a file called "slaves" in the same directory as spark-env.sh and list the slaves' IPs in it, one per line. Make sure you can reach all the slaves through ssh.
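For example, a conf/slaves file for a two-worker cluster could look like this (hypothetical addresses):

# conf/slaves -- one worker host or IP per line (hypothetical values)
192.168.1.11
192.168.1.12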

Finally, copy this configuration to every machine in the cluster. Then start the entire cluster by executing the start-all.sh script, and try spark-shell to check your configuration.

> sbin/start-all.sh
> bin/spark-shell
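To confirm the shell actually attached to the cluster instead of falling back to local mode, you can also pass the master URL explicitly (a sketch reusing the hypothetical address from above):

> bin/spark-shell --master spark://192.168.1.10:7077

The shell should then appear as a running application in the master WebUI (port 8080 by default).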

