This article describes how to deal with an error that occurs when starting the namenode in Hadoop 2.4.1. It should be a useful reference for anyone who runs into the same problem; interested readers can follow along below.

Problem Description

When I try to start dfs using:

start-dfs.sh

I get an error saying:

14/07/03 11:03:21 WARN util.NativeCodeLoader: Unable to load
native-hadoop library for your platform... using builtin-java classes
where applicable Starting namenodes on [OpenJDK 64-Bit Server VM
warning: You have loaded library
/usr/local/hadoop/lib/native/libhadoop.so.1.0.0 which might have
disabled stack guard. The VM will try to fix the stack guard now. It's
highly recommended that you fix the library with 'execstack -c
<libfile>', or link it with '-z noexecstack'. localhost] sed: -e
expression #1, char 6: unknown option to `s' Server: ssh: Could not
resolve hostname Server: Name or service not known
-c: Unknown cipher type 'cd' stack: ssh: Could not resolve hostname stack: Name or service not known 64-Bit: ssh: Could not resolve
hostname 64-Bit: Name or service not known guard.: ssh: Could not
resolve hostname guard.: Name or service not known The: ssh: Could not
resolve hostname The: Name or service not known guard: ssh: Could not
resolve hostname guard: Name or service not known might: ssh: Could
not resolve hostname might: Name or service not known stack: ssh:
Could not resolve hostname stack: Name or service not known will: ssh:
Could not resolve hostname will: Name or service not known the: ssh:
Could not resolve hostname the: Name or service not known fix: ssh:
Could not resolve hostname fix: Name or service not known VM: ssh:
Could not resolve hostname VM: Name or service not known You: ssh:
Could not resolve hostname You: Name or service not known which: ssh:
Could not resolve hostname which: Name or service not known It's: ssh:
Could not resolve hostname It's: Name or service not known disabled:
ssh: Could not resolve hostname disabled: Name or service not known
try: ssh: Could not resolve hostname try: Name or service not known
localhost: namenode running as process 4463. Stop it first. library:
ssh: Could not resolve hostname library: Name or service not known
with: ssh: Could not resolve hostname with: Name or service not known
the: ssh: Could not resolve hostname the: Name or service not known
warning:: ssh: Could not resolve hostname warning:: Name or service
not known VM: ssh: Could not resolve hostname VM: Name or service not
known now.: ssh: Could not resolve hostname now.: Name or service not
known loaded: ssh: Could not resolve hostname loaded: Name or service
not known library: ssh: Could not resolve hostname library: Name or
service not known <libfile>',: ssh: Could not resolve hostname
<libfile>',: Name or service not known to: ssh: connect to host to
port 22: Connection refused OpenJDK: ssh: Could not resolve hostname
OpenJDK: Name or service not known have: ssh: Could not resolve
hostname have: Name or service not known have: ssh: Could not resolve
hostname have: Name or service not known with: ssh: Could not resolve
hostname with: Name or service not known fix: ssh: Could not resolve
hostname fix: Name or service not known noexecstack'.: ssh: Could not
resolve hostname noexecstack'.: Name or service not known that: ssh:
Could not resolve hostname that: Name or service not known you: ssh:
Could not resolve hostname you: Name or service not known or: ssh:
Could not resolve hostname or: Name or service not known highly: ssh:
Could not resolve hostname highly: Name or service not known
recommended: ssh: Could not resolve hostname recommended: Name or
service not known 'execstack: ssh: Could not resolve hostname
'execstack: Name or service not known link: ssh: Could not resolve
hostname link: Name or service not known it: ssh: Could not resolve
hostname it: Name or service not known '-z: ssh: Could not resolve
hostname '-z: Name or service not known localhost: datanode running as
process 4561. Stop it first. Starting secondary namenodes [OpenJDK
64-Bit Server VM warning: You have loaded library
/usr/local/hadoop/lib/native/libhadoop.so.1.0.0 which might have
disabled stack guard. The VM will try to fix the stack guard now. It's
highly recommended that you fix the library with 'execstack -c
<libfile>', or link it with '-z noexecstack'.
0.0.0.0] sed: -e expression #1, char 6: unknown option to `s' OpenJDK: ssh: Could not resolve hostname OpenJDK: Name or service not known
-c: Unknown cipher type 'cd' VM: ssh: Could not resolve hostname VM: Name or service not known The authenticity of host '0.0.0.0 (0.0.0.0)'
can't be established. ECDSA key fingerprint is
dd:64:53:7e:c0:62:40:c0:63:2b:5c:6d:1e:b6:cd:23. Are you sure you want
to continue connecting (yes/no)? might: ssh: Could not resolve
hostname might: Name or service not known Server: ssh: Could not
resolve hostname Server: Name or service not known guard.: ssh: Could
not resolve hostname guard.: Name or service not known have: ssh:
Could not resolve hostname have: Name or service not known You: ssh:
Could not resolve hostname You: Name or service not known The: ssh:
Could not resolve hostname The: Name or service not known which: ssh:
Could not resolve hostname which: Name or service not known have: ssh:
Could not resolve hostname have: Name or service not known disabled:
ssh: Could not resolve hostname disabled: Name or service not known
VM: ssh: Could not resolve hostname VM: Name or service not known
It's: ssh: Could not resolve hostname It's: Name or service not known
fix: ssh: Could not resolve hostname fix: Name or service not known
the: ssh: Could not resolve hostname the: Name or service not known
warning:: ssh: Could not resolve hostname warning:: Name or service
not known will: ssh: Could not resolve hostname will: Name or service
not known the: ssh: Could not resolve hostname the: Name or service
not known library: ssh: Could not resolve hostname library: Name or
service not known that: ssh: Could not resolve hostname that: Name or
service not known highly: ssh: Could not resolve hostname highly: Name
or service not known 'execstack: ssh: Could not resolve hostname
'execstack: Name or service not known try: ssh: Could not resolve
hostname try: Name or service not known guard: ssh: Could not resolve
hostname guard: Name or service not known 64-Bit: ssh: Could not
resolve hostname 64-Bit: Name or service not known loaded: ssh: Could
not resolve hostname loaded: Name or service not known library: ssh:
Could not resolve hostname library: Name or service not known fix:
ssh: Could not resolve hostname fix: Name or service not known to:
ssh: connect to host to port 22: Connection refused link: ssh: Could
not resolve hostname link: Name or service not known stack: ssh: Could
not resolve hostname stack: Name or service not known '-z: ssh: Could
not resolve hostname '-z: Name or service not known you: ssh: Could
not resolve hostname you: Name or service not known with: ssh: Could
not resolve hostname with: Name or service not known with: ssh: Could
not resolve hostname with: Name or service not known recommended: ssh:
Could not resolve hostname recommended: Name or service not known
stack: ssh: Could not resolve hostname stack: Name or service not
known now.: ssh: Could not resolve hostname now.: Name or service not
known <libfile>',: ssh: Could not resolve hostname <libfile>',: Name
or service not known or: ssh: Could not resolve hostname or: Name or
service not known noexecstack'.: ssh: Could not resolve hostname
noexecstack'.: Name or service not known it: ssh: Could not resolve
hostname it: Name or service not known ^C0.0.0.0: Host key
verification failed. ^C

My core-site.xml file contains this:

<configuration>
    <property>
       <name>fs.default.name</name>
       <value>hdfs://localhost:9000</value>
    </property>
</configuration>

My .profile (replacement for .bashrc) contains these lines:

export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64
export HADOOP_INSTALL=/usr/local/hadoop
export PATH=$PATH:$HADOOP_INSTALL/bin
export PATH=$PATH:$HADOOP_INSTALL/sbin
export HADOOP_MAPRED_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_HOME=$HADOOP_INSTALL
export HADOOP_HDFS_HOME=$HADOOP_INSTALL
export YARN_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_INSTALL/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_INSTALL/lib"

And I can easily ssh to my localhost with:

ssh localhost

Welcome to Linux Mint 16 Petra (GNU/Linux
3.11.0-12-generic x86_64)

Welcome to Linux Mint  * Documentation:  http://www.linuxmint.com Last
login: Wed Jul  2 16:51:15 2014 from localhost
Solution

Stop the JVM from printing the stack guard warning to stdout/stderr, because this is what breaks the HDFS startup script.
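The reason this matters: start-dfs.sh builds its host list from the output of hdfs getconf -namenodes, so anything the JVM prints to stdout while that command runs is captured as well, and every word of the warning is then handed to ssh as if it were a hostname (which is exactly the wall of "Could not resolve hostname" errors above). A rough way to see this for yourself (a minimal sketch; it assumes HADOOP_INSTALL is set as in the .profile shown earlier):

$HADOOP_INSTALL/bin/hdfs getconf -namenodes
# A healthy setup prints just: localhost
# Here the "OpenJDK 64-Bit Server VM warning: ..." text comes out first,
# and start-dfs.sh treats each word of it as a separate host to ssh into.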


Do this by replacing the following line in your etc/hadoop/hadoop-env.sh:

export HADOOP_OPTS="$HADOOP_OPTS -Djava.net.preferIPv4Stack=true"

with:

export HADOOP_OPTS="$HADOOP_OPTS -XX:-PrintWarnings -Djava.net.preferIPv4Stack=true"


(This solution was found on Sumit Chawla's blog.)
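For completeness, the warning text itself points at an alternative: clearing the executable-stack flag on libhadoop.so with execstack, which removes the warning at its source instead of just silencing it. A hedged sketch (the package name is an assumption for Debian/Ubuntu-based systems such as Mint, and it modifies the library file, so it needs root):

sudo apt-get install execstack    # package name assumed for Mint/Ubuntu
sudo execstack -c /usr/local/hadoop/lib/native/libhadoop.so.1.0.0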

That concludes this article on the error when starting the namenode in Hadoop 2.4.1. We hope the answer above is helpful, and thank you for your continued support!
