Problem Description
I have set up a 2-node Hadoop cluster on Ubuntu 12.04 with Hadoop 1.2.1. While trying to run the Hadoop word count example, I am getting a "Too many fetch failures" error. I have referred to many articles, but I am unable to figure out what the entries should be in the Masters, Slaves, and /etc/hosts files. My nodes are named "master" with IP 10.0.0.1 and "slaveone" with IP 10.0.0.2.
I need assistance with what the entries should be in the masters, slaves, and /etc/hosts files on both the master and the slave node.
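For reference, here is a minimal sketch of what these files commonly look like on a small Hadoop 1.x cluster, assuming only the two hostnames and IPs given above; everything else (which daemons run on which node) is a typical default layout, not something stated in the question.
/etc/hosts, identical on both nodes (the hostnames must resolve to the network IPs, not to 127.0.0.1):
    127.0.0.1   localhost
    10.0.0.1    master
    10.0.0.2    slaveone
conf/masters on the master node (despite its name, in Hadoop 1.x this file lists the host that runs the secondary namenode):
    master
conf/slaves on the master node (one hostname per line; each listed host runs a datanode and a tasktracker, so include master only if it should also store data and run tasks):
    master
    slaveone
The slave node does not need the masters and slaves files; they are only read by the start scripts on the node where you run start-all.sh. After changing /etc/hosts, restart the daemons and re-run the job, e.g. hadoop jar hadoop-examples-1.2.1.jar wordcount /input /output.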
If you're unable to upgrade the cluster for whatever reason, you can try the following:
- Ensure that your hostname is bound to the network IP and NOT 127.0.0.1 in /etc/hosts.
- Ensure that you're using only hostnames, not IPs, to reference services.
- If the above are correct, try the following settings (a sketch of how to apply them follows this list):
set mapred.reduce.slowstart.completed.maps=0.80
set tasktracker.http.threads=80
set mapred.reduce.parallel.copies=10 (>= 10; 10 should probably be sufficient)
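These properties can be passed per job on the command line (e.g. hadoop jar ... -D mapred.reduce.slowstart.completed.maps=0.80 ...) or set cluster-wide in conf/mapred-site.xml. Here is a sketch of the mapred-site.xml form, using the values suggested above; the jobtracker address is an illustrative example of referencing a service by hostname (second point above) and is not from the original post:
    <configuration>
      <!-- Refer to services by hostname, never by raw IP. -->
      <property>
        <name>mapred.job.tracker</name>
        <value>master:9001</value>
      </property>
      <!-- Delay reducer startup until 80% of the maps have finished. -->
      <property>
        <name>mapred.reduce.slowstart.completed.maps</name>
        <value>0.80</value>
      </property>
      <!-- More HTTP threads for serving map output (default is 40). -->
      <property>
        <name>tasktracker.http.threads</name>
        <value>80</value>
      </property>
      <!-- Parallel map-output fetches per reducer (default is 5). -->
      <property>
        <name>mapred.reduce.parallel.copies</name>
        <value>10</value>
      </property>
    </configuration>
Note that tasktracker.http.threads is read by the tasktrackers, so it must be set in mapred-site.xml on every node and needs a tasktracker restart; the other two can be overridden per job.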
Also check out this SO post: Why I am getting "Too many fetch-failures" every other day
And this one: Too many fetch failures: Hadoop on cluster (x2)
And also this if the above don't help: http://grokbase.com/t/hadoop/common-user/098k7y5t4n/how-to-deal-with-too-many-fetch-failures
For brevity and in the interest of time, I'm putting what I found to be the most pertinent here.
EDIT: The original answer said "Ensure that your hostname is bound to the network IP and 127.0.0.1 in /etc/hosts".