I am trying to copy files from S3 to HDFS using a job flow in EMR. When I run the command below, the job flow starts successfully, but it then fails with an error while trying to copy the files to HDFS. Do I need to set any input file permissions?
Command:
./elastic-mapreduce --jobflow j-35D6JOYEDCELA --jar s3://us-east-1.elasticmapreduce/libs/s3distcp/1.latest/s3distcp.jar --args '--src,s3://odsh/input/,--dest,hdfs:///user'
Output:
Task TASKID="task_201301310606_0001_r_000000" TASK_TYPE="REDUCE" TASK_STATUS="FAILED" FINISH_TIME="1359612576612" ERROR="java.lang.RuntimeException: Reducer task failed to copy 1 files: s3://odsh/input/GL_01112_20121019.dat etc
    at com.amazon.external.elasticmapreduce.s3distcp.CopyFilesReducer.close(CopyFilesReducer.java:70)
    at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:538)
    at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:429)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1132)
    at org.apache.hadoop.mapred.Child.main(Child.java:249)
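One way to narrow this down before touching permissions (a diagnostic sketch, not a confirmed fix; it assumes SSH access to the master node and that the bucket/path names from the error log are correct):

# From the EMR master node, check whether the cluster's Hadoop user can
# actually read the object that the reducer failed to copy.
hadoop fs -ls s3://odsh/input/GL_01112_20121019.dat

# Try pulling a single file manually; if this also fails, the problem is
# S3 access (bucket policy / object ACL), not s3distcp itself.
hadoop fs -cp s3://odsh/input/GL_01112_20121019.dat hdfs:///user/

# Verify the HDFS destination exists and is writable by the job's user.
hadoop fs -ls hdfs:///user

If the manual copy fails with an access error, grant the EMR cluster's AWS credentials read access to the bucket (or fix the object ACLs) and retry the job flow.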