S3N and S3A distcp do not work in Hadoop 2.6.0

Ste*_*ong 5 hadoop amazon-s3 hadoop2

Summary

A stock Hadoop 2.6.0 installation gives me "no filesystem for scheme: s3n". Adding hadoop-aws.jar to the classpath now gives me "ClassNotFoundException: org.apache.hadoop.fs.s3a.S3AFileSystem".

Details

I have a mostly stock installation of hadoop-2.6.0. I only set up the directories and set the following environment variables:

export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64/jre
export HADOOP_COMMON_HOME=/opt/hadoop
export HADOOP_HOME=$HADOOP_COMMON_HOME
export HADOOP_HDFS_HOME=$HADOOP_COMMON_HOME
export HADOOP_MAPRED_HOME=$HADOOP_COMMON_HOME
export HADOOP_OPTS=-XX:-PrintWarnings
export PATH=$PATH:$HADOOP_COMMON_HOME/bin

The output of hadoop classpath is:

/opt/hadoop/etc/hadoop:/opt/hadoop/share/hadoop/common/lib/*:/opt/hadoop/share/hadoop/common/*:/opt/hadoop/share/hadoop/hdfs:/opt/hadoop/share/hadoop/hdfs/lib/*:/opt/hadoop/share/hadoop/hdfs/*:/opt/hadoop/share/hadoop/yarn/lib/*:/opt/hadoop/share/hadoop/yarn/*:/opt/hadoop/share/hadoop/mapreduce/lib/*:/opt/hadoop/share/hadoop/mapreduce/*:/contrib/capacity-scheduler/*.jar:/opt/hadoop/share/hadoop/tools/lib/*

When I try to run hadoop distcp -update hdfs:///files/to/backup s3n://${S3KEY}:${S3SECRET}@bucket/files/to/backup, I get Error: java.io.Exception, no filesystem for scheme: s3n. If I use s3a instead, I get the same error complaining about s3a.
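As an aside, the credentials do not have to be embedded in the URI; they can be passed as Hadoop configuration properties instead, which also avoids secrets containing "/" breaking the URI. A sketch, assuming the standard s3n credential property names (fs.s3n.awsAccessKeyId, fs.s3n.awsSecretAccessKey) and an illustrative bucket:

# same copy, credentials supplied as -D properties rather than in the URI
hadoop distcp \
  -D fs.s3n.awsAccessKeyId=${S3KEY} \
  -D fs.s3n.awsSecretAccessKey=${S3SECRET} \
  -update hdfs:///files/to/backup s3n://bucket/files/to/backup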

The internet tells me that hadoop-aws.jar is not part of the default classpath. I added the following line to /opt/hadoop/etc/hadoop/hadoop-env.sh:

HADOOP_CLASSPATH=$HADOOP_CLASSPATH:$HADOOP_COMMON_HOME/share/hadoop/tools/lib/*

Now hadoop classpath has the following appended:

:/opt/hadoop/share/hadoop/tools/lib/*
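Since that entry is a wildcard, a quick sanity check (plain shell) is to confirm it actually matches the aws jar:

# list the jars the tools/lib wildcard expands to
ls /opt/hadoop/share/hadoop/tools/lib/hadoop-aws-*.jar
# print each classpath entry on its own line for easier reading
hadoop classpath | tr ':' '\n'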

This should cover /opt/hadoop/share/hadoop/tools/lib/hadoop-aws-2.6.0.jar. Now I get:

Caused by: java.lang.ClassNotFoundException:
Class org.apache.hadoop.fs.s3a.S3AFileSystem not found

The jar file does contain the class that cannot be found:

unzip -l /opt/hadoop/share/hadoop/tools/lib/hadoop-aws-2.6.0.jar |grep S3AFileSystem
28349  2014-11-13 21:20   org/apache/hadoop/fs/s3a/S3AFileSystem.class

Is there an order in which these jars must be added, or am I missing something else important?

Ste*_*ong 6

Based on Abhishek's comment on his answer, the only change I needed to make was to mapred-site.xml:

<property>
  <!-- Add to the classpath used when running an M/R job -->
  <name>mapreduce.application.classpath</name>
  <value>$HADOOP_MAPRED_HOME/share/hadoop/mapreduce/*,$HADOOP_MAPRED_HOME/share/hadoop/mapreduce/lib/*,$HADOOP_MAPRED_HOME/share/hadoop/tools/lib/*</value>
</property>

No other xml or sh file needed to be changed.
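For reference, a distcp invocation exercising the now-visible S3A filesystem might look like the sketch below; fs.s3a.access.key and fs.s3a.secret.key are the s3a credential properties in hadoop-aws, and the bucket is illustrative:

# back up to S3 over the s3a connector, credentials as properties
hadoop distcp \
  -D fs.s3a.access.key=${S3KEY} \
  -D fs.s3a.secret.key=${S3SECRET} \
  -update hdfs:///files/to/backup s3a://bucket/files/to/backup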


Abh*_*hek 4

You can resolve the s3n issue by adding the following lines to core-site.xml:

<property>
  <name>fs.s3n.impl</name>
  <value>org.apache.hadoop.fs.s3native.NativeS3FileSystem</value>
  <description>The FileSystem for s3n: (Native S3) uris.</description>
</property>

After adding that property, it should work.
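If s3a fails the same way, the analogous mapping would be the sketch below, assuming the stock 2.6.0 core-default.xml does not already declare fs.s3a.impl:

<property>
  <name>fs.s3a.impl</name>
  <value>org.apache.hadoop.fs.s3a.S3AFileSystem</value>
  <description>The FileSystem for s3a: uris.</description>
</property>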

Edit: If that does not resolve your problem, then you will have to add the jars to the classpath. You can check whether mapred-site.xml has mapreduce.application.classpath: /usr/hdp//hadoop-mapreduce/*. It will include the other related jars in the classpath :)
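One way to see what value is currently in effect, assuming the stock hdfs getconf utility:

# print the effective value of the classpath property, if set
hdfs getconf -confKey mapreduce.application.classpath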