I have set up a Hadoop 2.2.0 single node and started it. I can browse the FS at http://localhost:50070/.
Then I tried to write a dummy file using the following code.
import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class Test {
    public void write() throws IOException {
        // Uses whatever configuration is found on the classpath
        FileSystem fs = FileSystem.get(new Configuration());
        Path outFile = new Path("test.jpg");
        FSDataOutputStream out = fs.create(outFile);
        out.close(); // close the stream so the write is flushed
    }
}
I get the following exception:
INFO: DEBUG - field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, value=[Rate of successful kerberos logins and latency (milliseconds)], about=, type=DEFAULT, always=false, sampleName=Ops)
INFO: DEBUG - field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, value=[Rate of failed kerberos logins and latency (milliseconds)], about=, type=DEFAULT, always=false, sampleName=Ops)
INFO: DEBUG - UgiMetrics, User and group related metrics
INFO: DEBUG - Creating new Groups object
INFO: DEBUG - Trying to load the custom-built native-hadoop library...
INFO: DEBUG - Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path
INFO: DEBUG - java.library.path=/usr/lib/jvm/jdk1.7.0/jre/lib/amd64:/usr/lib/jvm/jdk1.7.0/jre/lib/i386::/usr/java/packages/lib/i386:/lib:/usr/lib
INFO: WARN - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
INFO: DEBUG - Falling back to shell based
INFO: DEBUG - Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping
INFO: DEBUG - Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; cacheTimeout=300000
INFO: DEBUG - hadoop login
INFO: DEBUG - hadoop login commit
INFO: DEBUG - using local user:UnixPrincipal: qualebs
INFO: DEBUG - UGI loginUser:qualebs (auth:SIMPLE)
INFO: DEBUG - Failed to detect a valid hadoop home directory
java.io.IOException: HADOOP_HOME or hadoop.home.dir are not set.
at org.apache.hadoop.util.Shell.checkHadoopHome(Shell.java:225)
at org.apache.hadoop.util.Shell.<clinit>(Shell.java:250)
at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:639)
at org.apache.hadoop.fs.FilterFileSystem.setPermission(FilterFileSystem.java:468)
at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:456)
at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:424)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:905)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:886)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:783)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:772)
at com.qualebs.managers.HadoopDFS.writer(HadoopDFS.java:41)
Where do I set HADOOP_HOME or hadoop.home.dir? The operating system is Ubuntu 11.10.
The only configuration files I have changed are the ones below, where I added the listed properties. In core-site.xml:
<configuration>
<property>
<name>fs.default.name</name>
<value>hdfs://localhost:9000</value>
</property>
</configuration>
In hdfs-site.xml:
<configuration>
<property>
<name>dfs.replication</name>
<value>1</value>
</property>
</configuration>
In mapred-site.xml:
<configuration>
<property>
<name>mapred.job.tracker</name>
<value>localhost:9001</value>
</property>
</configuration>
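Worth noting: the stack trace above runs through RawLocalFileSystem and ChecksumFileSystem, which means the Configuration never picked up core-site.xml from the classpath and silently fell back to the local filesystem. A minimal sketch of setting the address explicitly in code (reusing the value from core-site.xml above), in case the XML files are not on the client's classpath:

Configuration conf = new Configuration();
// Same value as fs.default.name in core-site.xml above; only needed
// when that file is not on the client's classpath.
conf.set("fs.default.name", "hdfs://localhost:9000");
FileSystem fs = FileSystem.get(conf);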
Eagerly awaiting your replies.
Yon*_*eng 13
I found my solution by looking at checkHadoopHome(), the method in org.apache.hadoop.util.Shell that throws this exception. Hope it helps!
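For reference, the failing check is roughly the following (a simplified paraphrase of Shell.checkHadoopHome() in 2.2.0, not the exact source); the system property takes precedence over the environment variable:

// Paraphrase of Shell.checkHadoopHome(): the hadoop.home.dir system
// property is consulted first, then the HADOOP_HOME environment variable.
String home = System.getProperty("hadoop.home.dir");
if (home == null) {
    home = System.getenv("HADOOP_HOME");
}
if (home == null) {
    throw new IOException("HADOOP_HOME or hadoop.home.dir are not set.");
}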
If you are not using a dedicated hadoop user, add HADOOP_HOME to your shell's .bashrc:
1. start a terminal
2. sudo vi .bashrc
3. export HADOOP_HOME=YOUR_HADOOP_HOME_DIRECTORY (don't include the bin folder)
4. save
5. restart the terminal and confirm it was saved by typing: echo $HADOOP_HOME
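Alternatively, if changing the environment is not an option, the same check can be satisfied from code by setting the system property before the first FileSystem call. The path below is hypothetical; point it at your own install, again without the bin folder:

// Hypothetical path: replace with your actual Hadoop install directory.
System.setProperty("hadoop.home.dir", "/usr/local/hadoop");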