So telnet is actually working, in the sense that telnet localhost 25 connects; but telnet localhost or telnet localhost 9000 gives this result:
Trying 127.0.0.1...
telnet: Unable to connect to remote host: Connection refused
nmap results:
$ nmap localhost
Starting Nmap 6.00 ( http://nmap.org ) at 2013-10-03 00:54 MSK
Nmap scan report for localhost (127.0.0.1)
Host is up (0.00030s latency).
rDNS record for 127.0.0.1: localhost.localdomain
Not shown: 992 closed ports
PORT STATE SERVICE
22/tcp open ssh
25/tcp open smtp
80/tcp open http
587/tcp open submission
631/tcp open ipp
3306/tcp …
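nmap shows nothing listening on 9000, which is consistent with the refusal: telnet only succeeds on ports where a service is actually bound. A quick way to double-check from the machine itself, as a minimal sketch (assumes net-tools and lsof are installed, and that some daemon, e.g. a Hadoop NameNode, is what you expect on 9000):

$ sudo netstat -ltnp | grep -w 9000   # list TCP listeners; -p shows the owning process
$ sudo lsof -i :9000                  # alternative view of whatever holds port 9000

If neither prints anything, the service meant for port 9000 simply is not running (or is bound to a different interface).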
I am trying to run Hive 3.1 on Hadoop 3.0. Below is my system configuration:

Ubuntu 18.04.1 LTS
Hadoop version 3.0.3
Hive 3.1.0
Derby 10.14.2
When I execute the show tables; query, I get the following error.
FAILED: HiveException java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
Below is the detailed error from the hive log file.
2018-09-05T11:38:25,952 INFO [main] conf.HiveConf: Found configuration file file:/usr/local/apache-hive-3.1.0-bin/conf/hive-site.xml
2018-09-05T11:38:30,549 INFO [main] SessionState: Hive Session ID = 826ec55c-7fca-4fff-baa5-b5a010e5af89
2018-09-05T11:38:35,948 INFO [main] SessionState:
Logging initialized using configuration in jar:file:/usr/local/apache-hive-3.1.0-bin/lib/hive-common-3.1.0.jar!/hive-log4j2.properties Asy$
2018-09-05T11:38:47,015 INFO [main] session.SessionState: Created HDFS directory: /tmp/hive/hadoop
2018-09-05T11:38:47,069 INFO [main] session.SessionState: Created local directory: /tmp/mydir
2018-09-05T11:38:47,096 INFO [main] session.SessionState: Created HDFS directory: /tmp/hive/hadoop/826ec55c-7fca-4fff-baa5-b5a010e5af89
2018-09-05T11:38:47,104 INFO [main] …
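In Hive 3.x this SessionHiveMetaStoreClient failure very often means the metastore schema was never initialized. A minimal sketch of initializing it for the embedded Derby metastore listed above (run from $HIVE_HOME so Derby creates its metastore_db where Hive expects it; any half-created metastore_db directory may need removing first):

$ cd $HIVE_HOME
$ bin/schematool -dbType derby -initSchema   # creates the metastore tables for Derby

After that, restart hive and retry show tables;.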
I am using hadoop and need to change the number of open files, ulimit -n. I have seen similar questions on stackoverflow and elsewhere and tried everything in those answers, but it still does not work. I am working with ubuntu 12.04 LTS. Here is what I did:

Changed the limits in /etc/security/limits.conf, where I have set entries for * and for root. I also tried the limit with some other numbers, such as 10000 and unlimited.
* soft nofile 1513687
* hard nofile 1513687
root soft nofile 1513687
root hard nofile 1513687
I also tried the settings above with - instead of soft and hard. After those changes, I made changes to the /etc/pam.d/ files, such as:
common-session
common-session-noninterative
login
cron
sshd
su
sudo
I added session required pam_limits.so to the beginning of each of those files, as shown below. I rebooted the box in question, but the settings did not take effect.
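For reference, this is the line as described, at the top of e.g. /etc/pam.d/common-session (a sketch of the placement just mentioned, not a new fix):

session required pam_limits.so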
I also found files under /etc/security/limits.d/ for the users hbase, mapred, and hdfs. I tried changing the limits in those individual files too, but to no avail.
I also tried putting ulimit -S -n unlimited inside /etc/profile. That did not work.
Finally, I tried to put …
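For what it is worth, a quick way to check whether any of these changes take effect for a fresh login of the Hadoop users mentioned above (a minimal sketch, using hdfs as the example):

$ su - hdfs -c 'ulimit -Hn; ulimit -Sn'   # hard and soft open-file limits as seen at login

If these still print the old defaults (typically 1024 soft / 4096 hard on 12.04), pam_limits is not being applied on that login path.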
I have installed Hadoop-2.4.0 on my system (14.04) and I want to remove it completely and reinstall it. If I delete the configured hadoop folder, is that enough to remove hadoop from my Ubuntu 14.04 LTS?
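For a tarball install like this there is no package to purge; removal comes down to deleting the directories and environment entries created during setup. A minimal sketch, with paths assumed from the usual single-node tutorials (the real ones depend on how core-site.xml and hdfs-site.xml were set up):

$ sudo rm -rf /usr/local/hadoop        # the extracted Hadoop-2.4.0 tree (assumed location)
$ sudo rm -rf /usr/local/hadoop_tmp    # HDFS data/temp directory, if one was configured (assumed)
$ sed -i '/HADOOP/d' ~/.bashrc         # drop HADOOP_HOME/PATH exports added during setup

So deleting the Hadoop folder alone is not quite enough: the data directory and environment entries should go too before a clean reinstall.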
Hi everyone, I have another very basic question. Please bear with me.
I am following the instructions on the website below to download and configure Hadoop on my machine.
http://www.michael-noll.com/tutorials/running-hadoop-on-ubuntu-linux-single-node-cluster/
I am at the SSH section near the end, where the localhost command is used. When I run the following commands, I get an error message.
hadoop@amathew-Dimension-3000:~$ cat $HOME/.ssh/id_rsa.pub >> $HOME/.ssh/authorized_keys
hadoop@amathew-Dimension-3000:~$ ssh localhost
ssh: connect to host localhost port 22: Connection refused
Below is some other information that may be relevant.
hadoop@amathew-Dimension-3000:~$ ssh -vvv localhost
OpenSSH_5.5p1 Debian-4ubuntu5, OpenSSL 0.9.8o 01 Jun 2010
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Applying options for *
debug2: ssh_connect: needpriv 0
debug1: Connecting to localhost [::1] port 22.
debug1: connect to address ::1 port 22: Connection refused
debug1: Connecting to localhost [127.0.0.1] port 22.
debug1: connect to address 127.0.0.1 …
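Connection refused on port 22 for both ::1 and 127.0.0.1 almost always means no SSH daemon is running locally; on Ubuntu the ssh client is installed by default but the server is not. A minimal sketch of installing and starting it (standard package and service names assumed for this release):

$ sudo apt-get install openssh-server   # provides sshd
$ sudo service ssh start                # or: sudo /etc/init.d/ssh start on older releases
$ ssh localhost                         # retry the tutorial step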
I am using the following site to install and configure Hadoop on Ubuntu 10.10: http://arifn.web.id/blog/2010/07/29/running-hadoop-single-cluster.html

However, when I try to format the Hadoop file system, I get the following error.
amathew@amathew-Dimension-3000:~$ cd /usr/local/hadoop
amathew@amathew-Dimension-3000:/usr/local/hadoop$ bin/hadoop namenode -format
11/04/16 21:23:07 INFO namenode.NameNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting NameNode
STARTUP_MSG: host = amathew-Dimension-3000/192.168.1.66
STARTUP_MSG: args = [-format]
STARTUP_MSG: version = 0.20.2
STARTUP_MSG: build = https://svn.apache.org/repos/asf/hadoop/common/branches/branch-0.20 -r 911707; compiled by 'chrisdo' on Fri Feb 19 08:07:34 UTC 2010
************************************************************/
11/04/16 21:23:08 INFO namenode.FSNamesystem: fsOwner=amathew,amathew,adm,dialout,cdrom,plugdev,lpadmin,admin,sambashare
11/04/16 21:23:08 INFO namenode.FSNamesystem: supergroup=supergroup
11/04/16 21:23:08 INFO namenode.FSNamesystem: isPermissionEnabled=true
11/04/16 21:23:08 ERROR namenode.NameNode: java.io.IOException: Cannot create directory /usr/local/hadoop-datastore/hadoop/dfs/name/current
at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.clearDirectory(Storage.java:295) …
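The IOException above is a permissions problem rather than a Hadoop bug: the user running the format (fsOwner=amathew per the log) cannot create directories under /usr/local/hadoop-datastore. A minimal sketch of one common fix, with the owner taken from the log above:

$ sudo mkdir -p /usr/local/hadoop-datastore
$ sudo chown -R amathew:amathew /usr/local/hadoop-datastore   # let the Hadoop user write there
$ bin/hadoop namenode -format                                 # then retry from /usr/local/hadoop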
I installed Hadoop, but I cannot get its version:

ramesh@ramesh-H61M-S2P-B3:~$ echo $HADOOP_HOME
/home/hadoop/work/hadoop-1.1.2
ramesh@ramesh-H61M-S2P-B3:~$ hadoop -version
hadoop: command not found
What do I have to do to get Hadoop working?
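hadoop: command not found just means the shell cannot find the binary; $HADOOP_HOME points at /home/hadoop/work/hadoop-1.1.2, but its bin directory is not on the PATH. A minimal sketch (note that Hadoop 1.x spells the subcommand without a dash):

$ export PATH=$PATH:$HADOOP_HOME/bin   # make the hadoop launcher findable; add to ~/.bashrc to persist
$ hadoop version                       # 'hadoop version', not 'hadoop -version'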
hduser1@archit-HP-Notebook:~$ ssh localhost
/etc/ssh/ssh_config: line 11: Bad configuration option: hostkey
/etc/ssh/ssh_config: line 12: Bad configuration option: hostkey
/etc/ssh/ssh_config: line 13: Bad configuration option: hostkey
/etc/ssh/ssh_config: line 14: Bad configuration option: hostkey
/etc/ssh/ssh_config: line 16: Bad configuration option: useprivilegeseparation
/etc/ssh/ssh_config: line 19: Bad configuration option: keyregenerationinterval
/etc/ssh/ssh_config: line 20: Bad configuration option: serverkeybits
/etc/ssh/ssh_config: line 23: Bad configuration option: syslogfacility
/etc/ssh/ssh_config: line 27: Bad configuration option: logingracetime
/etc/ssh/ssh_config: line 28: Bad configuration option: permitrootlogin
/etc/ssh/ssh_config: line 29: Bad configuration option: strictmodes …
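Every rejected directive here (hostkey, useprivilegeseparation, keyregenerationinterval, serverkeybits, syslogfacility, logingracetime, permitrootlogin, strictmodes) is a server-side option that belongs in /etc/ssh/sshd_config, not in the client file /etc/ssh/ssh_config, so it looks like sshd_config contents were pasted into ssh_config. A minimal sketch of restoring the stock client config on Ubuntu (the reinstall trick is assumed to behave as on other Debian-based systems):

$ sudo mv /etc/ssh/ssh_config /etc/ssh/ssh_config.broken   # keep the bad file for reference
$ sudo apt-get install --reinstall -o Dpkg::Options::="--force-confmiss" openssh-client   # recreate ssh_config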