This article explains how to deal with Hadoop 2.2.0 failing to start after a 64-bit installation; hopefully it is a useful reference for anyone hitting the same problem.

Problem Description


I am trying to install a Hadoop 2.2.0 cluster on our servers. All of the servers are 64-bit. I downloaded Hadoop 2.2.0, and all the configuration files have been set up. When I run ./start-dfs.sh, I get the following error:

13/11/15 14:29:26 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library /home/hchen/hadoop-2.2.0/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now.It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'.namenode]
sed: -e expression #1, char 6: unknown option to `s' have: ssh: Could not resolve hostname have: Name or service not known
HotSpot(TM): ssh: Could not resolve hostname HotSpot(TM): Name or service not known
-c: Unknown cipher type 'cd'
Java: ssh: Could not resolve hostname Java: Name or service not known
The authenticity of host 'namenode (192.168.1.62)' can't be established.
RSA key fingerprint is 65:f9:aa:7c:8f:fc:74:e4:c7:a2:f5:7f:d2:cd:55:d4.
Are you sure you want to continue connecting (yes/no)? VM: ssh: Could not resolve hostname VM: Name or service not known
You: ssh: Could not resolve hostname You: Name or service not known
warning:: ssh: Could not resolve hostname warning:: Name or service not known
library: ssh: Could not resolve hostname library: Name or service not known
have: ssh: Could not resolve hostname have: Name or service not known
64-Bit: ssh: Could not resolve hostname 64-Bit: Name or service not known
...

Besides the 64-bit warning, are there any other errors here? I have already set up passwordless login between the namenode and the datanodes, so what do the other errors mean?
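Before looking at the fix, it helps to see where the bogus hostnames come from. A plausible reading of the log above: the JVM prints its stack-guard warning on stderr, that text gets mixed into the output the start scripts parse for slave hostnames, and each word is then handed to ssh as if it were a host. A minimal sketch of the effect, using the warning string from the log:

```python
# Sketch: why ssh complains about "hostnames" like Java, HotSpot(TM), and VM.
# The JVM emits this stack-guard warning, and the start scripts end up
# word-splitting it as if each token were a slave hostname.
jvm_warning = ("Java HotSpot(TM) 64-Bit Server VM warning: You have loaded "
               "library /home/hchen/hadoop-2.2.0/lib/native/libhadoop.so.1.0.0 "
               "which might have disabled stack guard.")

# Each whitespace-separated token becomes a bogus "host" for ssh, which
# matches the "Could not resolve hostname" lines in the log.
bogus_hosts = jvm_warning.split()
print(bogus_hosts[:6])  # → ['Java', 'HotSpot(TM)', '64-Bit', 'Server', 'VM', 'warning:']
```

Note how `Java`, `HotSpot(TM)`, `64-Bit`, `VM`, `You`, `have`, and `library` all appear in the ssh errors: they are simply the words of the warning. Silencing the warning (as the solution below does) makes the ssh errors disappear as well.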

Solution

Add the following entries to your .bashrc, where HADOOP_HOME is your Hadoop installation directory:

export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"
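If the NativeCodeLoader warning persists after this, one commonly reported culprit with the 2.2.0 release tarball is that the bundled native library was compiled for 32-bit. You can check the library's architecture with `file` (the path below assumes the install layout from the question; adjust HADOOP_HOME to your own):

```shell
# Check the architecture of the bundled native library.
file "$HADOOP_HOME/lib/native/libhadoop.so.1.0.0"
# A mismatched build reports "ELF 32-bit LSB shared object" even though
# the operating system itself is 64-bit; in that case the library needs
# to be rebuilt from source on a 64-bit machine.
```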

In addition, execute the following commands:

ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
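One caveat with the key setup above: sshd silently ignores an authorized_keys file with loose permissions (a standard OpenSSH requirement, not specific to Hadoop). If you still get a password prompt after running those commands, tightening the permissions and re-testing is worth a try:

```shell
# OpenSSH refuses keys when ~/.ssh or authorized_keys is group/world writable.
chmod 700 ~/.ssh
chmod 600 ~/.ssh/authorized_keys

# Confirm passwordless login works before re-running start-dfs.sh;
# this should return without prompting for a password.
ssh localhost true
```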
