The problem is:
hduser@saket-K53SM:/usr/local/hadoop$ jps
The program 'jps' can be found in the following packages:
* openjdk-6-jdk
* openjdk-7-jdk
Try: sudo apt-get install <selected package>
My configuration is:
hduser@saket-K53SM:/usr/local/hadoop$ java -version
java version "1.6.0_33"
Java(TM) SE Runtime Environment (build 1.6.0_33-b04)
Java HotSpot(TM) 64-Bit Server VM (build 20.8-b03, mixed mode)
conf/hadoop-env.sh is set as follows:
hduser@saket-K53SM:/usr/local/hadoop$ cat conf/hadoop-env.sh | grep JAVA_HOME
# The only required environment variable is JAVA_HOME. All others are
# set JAVA_HOME in this file, so that it is correctly defined on
export JAVA_HOME=/usr/lib/jvm/jdk1.6.0_33/
I know there is a similar question (http://stackoverflow.com/questions/7843422/hadoop-jps-can-not-find-java-installed), but I have the Sun JDK installed here, so any help would be appreciated.
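For what it's worth, `jps` ships inside the JDK's own `bin` directory, and Ubuntu's package index only knows about the OpenJDK packages, so a manually installed Sun JDK is invisible to `apt`. A minimal sketch, assuming the install path shown in hadoop-env.sh above:

```shell
# jps lives in $JAVA_HOME/bin; putting that directory on PATH makes it
# resolvable without installing OpenJDK. Path taken from hadoop-env.sh above.
export JAVA_HOME=/usr/lib/jvm/jdk1.6.0_33
export PATH="$JAVA_HOME/bin:$PATH"
jps    # should now resolve to $JAVA_HOME/bin/jps
```

To make this persistent, the same two `export` lines can go in `~/.bashrc` for the `hduser` account.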
I tried to copy a file from local disk to HDFS. At first it threw a SafeModeException. While searching for a solution, I read that the problem does not occur if you execute the same command again. So I tried again, and the second time it threw no exception:
hduser@saket:/usr/local/hadoop$ bin/hadoop dfs -copyFromLocal /tmp/gutenberg/ /user/hduser/gutenberg
copyFromLocal: org.apache.hadoop.hdfs.server.namenode.SafeModeException: Cannot create directory /user/hduser/gutenberg. Name node is in safe mode.
hduser@saket:/usr/local/hadoop$ bin/hadoop dfs -copyFromLocal /tmp/gutenberg/ /user/hduser/gutenberg
Why does this happen? And should I keep safe mode turned off with this command (note that it is a dfsadmin subcommand, not dfs)?
hadoop dfsadmin -safemode leave
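In case it helps: the NameNode enters safe mode at startup and leaves it by itself once enough blocks have been reported by DataNodes, which is why the second attempt succeeded. Forcing it off is rarely needed; a sketch using the standard `dfsadmin` subcommands:

```shell
# Check whether the NameNode is still in safe mode:
bin/hadoop dfsadmin -safemode get    # prints "Safe mode is ON" or "... OFF"

# Block until the NameNode leaves safe mode on its own:
bin/hadoop dfsadmin -safemode wait

# Only as a last resort, force it off manually:
# bin/hadoop dfsadmin -safemode leave
```

If the NameNode stays in safe mode indefinitely, that usually points at missing or under-replicated blocks rather than something to override.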
I downloaded a CSV file from http://yann.lecun.com/exdb/mnist/index.html. I need to convert it to the ARFF file format.
I tried running
java weka.core.converters.CSVLoader /home/saket/Documents/Assignment/NIST7000 > /home/saket/Documents/Myfile.arff
but it gives the following error:
java.lang.IllegalArgumentException: Attribute names are not unique! Causes: '0' '0' '0' '0' '0' '0' '0'
Then I tried the Java code from http://weka.wikispaces.com/Converting+CSV+to+ARFF, but I still get the same error.
Can someone suggest what I am doing wrong?
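One likely cause, assuming the MNIST CSV has no header row: Weka's CSVLoader treats the first line of the file as attribute names, and a data row containing many `0` pixel values then yields duplicate attribute names, which matches the error. A minimal sketch that prepends a synthetic header of unique names before loading (the file paths and the tiny sample data here are stand-ins, not your actual file):

```shell
# Stand-in for the headerless MNIST CSV (first column = label, rest = pixels):
printf '7,0,0\n3,0,0\n' > /tmp/mnist_sample.csv

# Count the columns, generate unique attribute names attr1,attr2,...,
# and write a new file with that header prepended:
ncols=$(head -n1 /tmp/mnist_sample.csv | awk -F, '{print NF}')
header=$(seq -s, 1 "$ncols" | sed 's/[0-9][0-9]*/attr&/g')
{ echo "$header"; cat /tmp/mnist_sample.csv; } > /tmp/mnist_with_header.csv

head -n1 /tmp/mnist_with_header.csv    # -> attr1,attr2,attr3
```

Running CSVLoader on the file with the header should then succeed, since every attribute name is unique.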
I want to assign the value of the field Description to a hidden field test. The problem is that Description contains a sequence of words, and the code below assigns only the first word to test:
<s:hidden value=<s:property value="Description" /> name="test">
I am somewhat new to Struts. Can someone please help? A link to a good Struts2 tutorial would also be appreciated.
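For reference, the usual cause is the unquoted `value` attribute: the JSP parser splits it at the first space, and nesting an `<s:property>` tag inside another tag's attribute is not valid JSP anyway. The conventional Struts2 form passes a quoted OGNL expression instead; a sketch, assuming `Description` is a property on the action:

```jsp
<%-- value is a quoted OGNL expression, so multi-word strings survive intact --%>
<s:hidden name="test" value="%{Description}" />
```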
When I try to copy a directory of 3 files into HDFS, I get the following error:
hduser@saket-K53SM:/usr/local/hadoop$ bin/hadoop dfs -copyFromLocal /tmp/gutenberg /user/hduser/gutenberg
12/08/01 23:48:46 WARN hdfs.DFSClient: DataStreamer Exception: org.apache.hadoop.ipc.RemoteException: java.io.IOException: File /user/hduser/gutenberg/gutenberg/pg20417.txt could only be replicated to 0 nodes, instead of 1
12/08/01 23:48:46 WARN hdfs.DFSClient: Error Recovery for block null bad datanode[0] nodes == null
12/08/01 23:48:46 WARN hdfs.DFSClient: Could not get block locations. Source file "/user/hduser/gutenberg/gutenberg/pg20417.txt" - Aborting...
copyFromLocal: java.io.IOException: File /user/hduser/gutenberg/gutenberg/pg20417.txt could only be replicated to 0 nodes, instead of 1
org.apache.hadoop.ipc.RemoteException: java.io.IOException: File /user/hduser/gutenberg/gutenberg/pg20417.txt could only be replicated to 0 nodes, …
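"could only be replicated to 0 nodes, instead of 1" generally means the NameNode sees no live DataNode with free space, so the first things to check are whether the DataNode process is up and whether it has registered. A few standard checks, to be run on the cluster (output depends on your setup; the `/app/hadoop/tmp` path in the commented recovery step is an assumption based on a common single-node tutorial layout, so substitute your own `dfs.data.dir`):

```shell
# 1. Is a DataNode process actually running?
jps

# 2. What does the NameNode think? Look for "Datanodes available" and
#    the reported free capacity in the output.
bin/hadoop dfsadmin -report

# 3. On single-node setups, a DataNode often fails to start after the
#    NameNode has been reformatted, because its storage directory holds a
#    stale namespace ID. The blunt recovery (DESTROYS all HDFS data):
# bin/stop-all.sh
# rm -rf /app/hadoop/tmp/*          # your dfs.data.dir / hadoop.tmp.dir
# bin/hadoop namenode -format
# bin/start-all.sh
```

The DataNode log file under `logs/` usually states the exact reason it failed to register, so it is worth reading before wiping anything.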