This article covers how to handle the Apache Spark error "Could not connect to akka.tcp://sparkMaster@", with a recommended answer that should be a useful reference for anyone hitting the same problem.

Problem Description

These are our first steps using big data tools like Apache Spark and Hadoop.

We have installed Cloudera CDH 5.3. From Cloudera Manager we chose to install Spark. Spark is up and running very well on one of the nodes in the cluster.

From my machine I made a little application that connects and reads a text file stored on Hadoop HDFS.

I am trying to run the application from Eclipse and it displays these messages:

15/02/11 14:44:01 INFO client.AppClient$ClientActor: Connecting to master spark://10.62.82.21:7077...
15/02/11 14:44:02 WARN client.AppClient$ClientActor: Could not connect to akka.tcp://[email protected]:7077: akka.remote.InvalidAssociation: Invalid address: akka.tcp://[email protected]:7077
15/02/11 14:44:02 WARN Remoting: Tried to associate with unreachable remote address [akka.tcp://[email protected]:7077]. Address is now gated for 5000 ms, all messages to this address will be delivered to dead letters. Reason: Connection refused: no further information: /10.62.82.21:7077

The application has one class that creates a context using the following line:

JavaSparkContext sc = new JavaSparkContext(new SparkConf().setAppName("Spark Count").setMaster("spark://10.62.82.21:7077"));

where this IP is the IP of the machine Spark is running on.

Then I try to read a file from HDFS using the following line:

sc.textFile("hdfs://10.62.82.21/tmp/words.txt")

When I run the application I get the connection errors shown above.
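Putting the two snippets together, the asker's driver looks roughly like the following. This is a reconstruction, not the exact code from the question: the `SparkCount` class name, the surrounding `main` method, and the final `count()` call are assumptions, and it assumes the Spark 1.x Java API that ships with CDH 5.3.

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

public class SparkCount {
    public static void main(String[] args) {
        // Master URL and app name as given in the question.
        SparkConf conf = new SparkConf()
                .setAppName("Spark Count")
                .setMaster("spark://10.62.82.21:7077");
        JavaSparkContext sc = new JavaSparkContext(conf);

        // Read the text file from HDFS and do something trivial with it.
        JavaRDD<String> lines = sc.textFile("hdfs://10.62.82.21/tmp/words.txt");
        System.out.println("Line count: " + lines.count());

        sc.stop();
    }
}
```

This only runs with spark-core on the classpath and a reachable master; as written (IP instead of hostname) it reproduces the connection failure described above.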

Recommended Answer

Check your Spark master logs; you should see something like:

15/02/11 13:37:14 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkMaster@mymaster:7077]
15/02/11 13:37:14 INFO Remoting: Remoting now listens on addresses: [akka.tcp://sparkMaster@mymaster:7077]
15/02/11 13:37:14 INFO Master: Starting Spark master at spark://mymaster:7077

Then, when connecting to the master, be sure to use exactly the same hostname as found in the logs above (do not use the IP address):

.setMaster("spark://mymaster:7077"));

Spark standalone is a bit picky about this hostname/IP stuff.
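If the machine running the driver cannot resolve that hostname, one common workaround (an assumption about your environment, not part of the answer above) is to add a hosts entry on the client mapping the master's hostname from the logs to its IP:

```
# Hypothetical /etc/hosts entry on the developer machine:
10.62.82.21   mymaster
```

With that in place, `spark://mymaster:7077` resolves to the same address the master actually bound to, so the Akka addresses match.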

This concludes the article on the Apache Spark error "Could not connect to akka.tcp://sparkMaster@". We hope the recommended answer above is helpful.
