Problem description
Why is an SSH login required before starting Hadoop? And why does Hadoop ask for a password when starting any of the services?
shravilp@shravilp-HP-15-Notebook-PC:~/hadoop-2.6.3$ sbin/start-all.sh
This script is Deprecated. Instead use start-dfs.sh and start-yarn.sh
Starting namenodes on [localhost]
shravilp@localhost's password:
localhost: starting namenode, logging to /home/shravilp/hadoop-
In Ubuntu, you can use the following one-time setup steps to eliminate the need to enter a password when running Hadoop commands such as start-dfs.sh and start-yarn.sh:
sudo apt-get install openssh-server openssh-client
ssh-keygen -t rsa
ssh-copy-id user@localhost
Replace user with your username. This was tested on Ubuntu 16.04.2 with hadoop-2.7.3 and jdk1.8.0_121.
Note: 1. When executing the "ssh-keygen -t rsa" command, you can simply press ENTER three times to accept the default values. 2. When executing the "ssh-copy-id user@localhost" command, answer "yes" to the prompt "Are you sure you want to continue connecting (yes/no)?", then enter your password.
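If you want to script this rather than answer the prompts by hand, the key-generation and key-installation steps can be sketched non-interactively. This is an illustrative sketch, not the exact ssh-copy-id behavior: the key file name id_rsa_hadoop is a hypothetical choice to avoid overwriting an existing key, and appending the public key to ~/.ssh/authorized_keys is essentially what ssh-copy-id does when the target is the local machine:

```shell
# Ensure ~/.ssh exists with the permissions sshd requires.
mkdir -p "$HOME/.ssh" && chmod 700 "$HOME/.ssh"

# Generate an RSA key pair with an empty passphrase (-N "").
# The path id_rsa_hadoop is a hypothetical name so an existing
# default key is not clobbered.
ssh-keygen -t rsa -N "" -q -f "$HOME/.ssh/id_rsa_hadoop"

# Roughly what ssh-copy-id does for localhost: append the public
# key to authorized_keys and tighten its permissions.
cat "$HOME/.ssh/id_rsa_hadoop.pub" >> "$HOME/.ssh/authorized_keys"
chmod 600 "$HOME/.ssh/authorized_keys"
```

After this, ssh localhost (and therefore the Hadoop start scripts) should no longer prompt for a password, assuming openssh-server is running. If you use a non-default key path as above, you may also need to point SSH at it, for example via an IdentityFile entry in ~/.ssh/config.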
See this question also