Cannot find start-all.sh in the Hadoop installation

Leg*_*ter 6 installation hadoop ubuntu-14.04

I am trying to set up Hadoop on my local machine and am following this guide. I have also set up the Hadoop home directory.

This is the command I am currently running:

hduser@ubuntu:~$ /usr/local/hadoop/bin/start-all.sh

This is the error I get:

-su: /usr/local/hadoop/bin/start-all.sh: No such file or directory

This is what I added to my $HOME/.bashrc file:

# Set Hadoop-related environment variables
export HADOOP_HOME=/usr/local/hadoop

# Set JAVA_HOME (we will also configure JAVA_HOME directly for Hadoop later on)
export JAVA_HOME=/usr/lib/jvm/java-8-oracle

# Some convenient aliases and functions for running Hadoop-related commands
unalias fs &> /dev/null
alias fs="hadoop fs"
unalias hls &> /dev/null
alias hls="fs -ls"

# If you have LZO compression enabled in your Hadoop cluster and
# compress job outputs with LZOP (not covered in this tutorial):
# Conveniently inspect an LZOP compressed file from the command
# line; run via:
#
# $ lzohead /hdfs/path/to/lzop/compressed/file.lzo
#
# Requires installed 'lzop' command.
#
lzohead () {
    hadoop fs -cat "$1" | lzop -dc | head -1000 | less
}

# Add Hadoop bin/ directory to PATH
export PATH=$PATH:$HADOOP_HOME/bin
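Note that .bashrc changes only take effect in a new shell, or after running `source ~/.bashrc`. As a minimal sketch of the PATH-append mechanism used in the last line above (assuming the /usr/local/hadoop install location from the question; this check itself needs no Hadoop installed):

```shell
# Sketch of the PATH append from .bashrc; /usr/local/hadoop is the
# assumed install location from the question.
HADOOP_HOME=/usr/local/hadoop
PATH="$PATH:$HADOOP_HOME/bin"
# Print the last PATH entry to confirm the append worked.
echo "$PATH" | tr ':' '\n' | tail -1
```

This prints `/usr/local/hadoop/bin`, confirming the directory was appended.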

EDIT: After trying the solution given by mahendra, I get the following output:

This script is Deprecated. Instead use start-dfs.sh and start-yarn.sh
Starting namenodes on [localhost]
localhost: starting namenode, logging to /usr/local/hadoop/logs/hadoop-hduser-namenode-mmt-HP-ProBook-430-G3.out
localhost: starting datanode, logging to /usr/local/hadoop/logs/hadoop-hduser-datanode-mmt-HP-ProBook-430-G3.out
Starting secondary namenodes [0.0.0.0]
0.0.0.0: starting secondarynamenode, logging to /usr/local/hadoop/logs/hadoop-hduser-secondarynamenode-mmt-HP-ProBook-430-G3.out
starting yarn daemons
starting resourcemanager, logging to /usr/local/hadoop/logs/yarn-hduser-resourcemanager-mmt-HP-ProBook-430-G3.out
localhost: starting nodemanager, logging to /usr/local/hadoop/logs/yarn-hduser-nodemanager-mmt-HP-ProBook-430-G3.out
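The deprecation notice in that output names the replacement scripts. On this setup they would be invoked as follows (paths assume the /usr/local/hadoop layout from the question, and running them requires a working Hadoop install):

```shell
# Replacements for the deprecated start-all.sh, per the notice above.
# Paths assume the /usr/local/hadoop install from the question.
/usr/local/hadoop/sbin/start-dfs.sh   # HDFS daemons: namenode, datanode, secondary namenode
/usr/local/hadoop/sbin/start-yarn.sh  # YARN daemons: resourcemanager, nodemanager
```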

Mah*_*dra 9

Try running:

hduser@ubuntu:~$ /usr/local/hadoop/sbin/start-all.sh

because start-all.sh and stop-all.sh are located in the sbin directory, while the hadoop binary is located in the bin directory.

Also update your .bashrc:

export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin

so that you can access start-all.sh directly.
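The PATH lookup this relies on can be sketched without a Hadoop install at all; in the demo below, a throwaway temp directory stands in for $HADOOP_HOME/sbin, and the start-all.sh it contains is a hypothetical stub, not Hadoop's real script:

```shell
# Hypothetical demo of PATH-based command lookup: a temp dir stands in
# for $HADOOP_HOME/sbin, and a stub script stands in for start-all.sh.
demo=$(mktemp -d)
printf '#!/bin/sh\necho daemons started\n' > "$demo/start-all.sh"
chmod +x "$demo/start-all.sh"
PATH="$PATH:$demo"
start-all.sh    # now resolves by name via PATH; prints "daemons started"
```

Once the real sbin directory is on PATH in the same way, `start-all.sh` works from any working directory without typing the full path.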