Hadoop 2.2.0 64-bit installed but won't start

use*_*504 19 hadoop

I am trying to install a Hadoop 2.2.0 cluster on my servers. All the servers are 64-bit. I downloaded Hadoop 2.2.0 and have set up all the configuration files. When I run ./start-dfs.sh, I get the following error:

13/11/15 14:29:26 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library /home/hchen/hadoop-2.2.0/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now.It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'.namenode]
sed: -e expression #1, char 6: unknown option to `s' have: ssh: Could not resolve hostname have: Name or service not known
HotSpot(TM): ssh: Could not resolve hostname HotSpot(TM): Name or service not known
-c: Unknown cipher type 'cd'
Java: ssh: Could not resolve hostname Java: Name or service not known
The authenticity of host 'namenode (192.168.1.62)' can't be established.
RSA key fingerprint is 65:f9:aa:7c:8f:fc:74:e4:c7:a2:f5:7f:d2:cd:55:d4.
Are you sure you want to continue connecting (yes/no)? VM: ssh: Could not resolve hostname VM: Name or service not known
You: ssh: Could not resolve hostname You: Name or service not known
warning:: ssh: Could not resolve hostname warning:: Name or service not known
library: ssh: Could not resolve hostname library: Name or service not known
have: ssh: Could not resolve hostname have: Name or service not known
64-Bit: ssh: Could not resolve hostname 64-Bit: Name or service not known
...

Apart from the 64-bit issue, are there other errors here? I have already set up passwordless login between the namenode and the datanodes. What do the other errors mean?

Rup*_*lia 22

Add the following entries to .bashrc, where HADOOP_HOME is your Hadoop folder:

export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"
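Before changing anything, it can help to confirm that the bundled native library really is 32-bit while your JVM is 64-bit. This is a hedged sketch; the library path is the default from the question and may differ on your install:

```shell
# Inspect the architecture of the bundled native library.
# A 32-bit build reports "ELF 32-bit ..."; on a 64-bit JVM you
# want to see "ELF 64-bit" here. Adjust HADOOP_HOME as needed.
file "$HADOOP_HOME/lib/native/libhadoop.so.1.0.0"

# Confirm the JVM itself is 64-bit.
java -version 2>&1 | grep -i '64-bit'
```

If the library reports 32-bit, that mismatch is what triggers the "stack guard" warning and the NativeCodeLoader fallback in the log above.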

Also, execute the following commands:

ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
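Once the key is in authorized_keys, a quick non-interactive check confirms that passwordless login actually works. The hostname `namenode` is taken from the question; substitute your own host, and note the chmod lines are an assumption about a common failure mode (sshd silently ignores keys with loose permissions):

```shell
# Key files must not be group/world readable, or sshd ignores them.
chmod 700 ~/.ssh
chmod 600 ~/.ssh/authorized_keys

# BatchMode forbids password prompts, so this succeeds only if
# key-based authentication is working.
ssh -o BatchMode=yes namenode true && echo "passwordless ssh OK"
```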


小智 9

The root cause is that the native libraries bundled with Hadoop are built for 32-bit. Solutions:

1) Set some environment variables in .bash_profile. See https://gist.github.com/ruo91/7154697

2) Rebuild your Hadoop native libraries. See http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/NativeLibraries.html
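As a sketch of option 2: building from the Hadoop source release with the `native` profile produces 64-bit libraries on a 64-bit host. The `-Pdist,native` invocation is from the NativeLibraries page linked above; the prerequisites (cmake, zlib headers, protobuf 2.5.0) and the output path are assumptions based on a standard Hadoop 2.2.0 source tree:

```shell
# Unpack the Hadoop 2.2.0 *source* tarball, then:
cd hadoop-2.2.0-src

# Build the distribution including native libraries.
# Requires a JDK, Maven, cmake, zlib headers, and protobuf 2.5.0.
mvn package -Pdist,native -DskipTests -Dtar

# Replace the bundled 32-bit libraries with the freshly built ones.
cp -r hadoop-dist/target/hadoop-2.2.0/lib/native/* "$HADOOP_HOME/lib/native/"
```

After restarting the daemons, the NativeCodeLoader warning and the stack-guard message should disappear.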