NoSuchMethodError: org.apache.hadoop.io.retry.RetryUtils.getDefaultRetryPolicy

Sau*_*rab 3 java hadoop hdfs

Previously I was creating a directory on HDFS from Java on a single node cluster and it worked smoothly, but as soon as I made the cluster multi-node I started getting this error. The stack trace looks like this:

Exception in thread "main" java.lang.NoSuchMethodError: org.apache.hadoop.io.retry.RetryUtils.getDefaultRetryPolicy(Lorg/apache/hadoop/conf/Configuration;Ljava/lang/String;ZLjava/lang/String;Ljava/lang/String;Ljava/lang/Class;)Lorg/apache/hadoop/io/retry/RetryPolicy;
    at org.apache.hadoop.hdfs.NameNodeProxies.createNNProxyWithClientProtocol(NameNodeProxies.java:410)
    at org.apache.hadoop.hdfs.NameNodeProxies.createNonHAProxy(NameNodeProxies.java:316)
    at org.apache.hadoop.hdfs.NameNodeProxies.createProxy(NameNodeProxies.java:178)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:665)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:601)
    at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:148)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2811)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:100)
    at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2848)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2830)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:389)
    at CreateDirectory.main(CreateDirectory.java:44)

Here is the CreateDirectory class:

import java.io.BufferedReader;
import java.io.IOException;
import java.net.HttpURLConnection;
import java.net.MalformedURLException;
import java.net.URI;
import java.net.URL;
import java.sql.SQLException;
import java.util.List;
import java.util.Map;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class CreateDirectory {
    public static void main(String[] args) throws SQLException, ClassNotFoundException {
        String hdfsUri = "hdfs://localhost:9000/";
        // String dirName = args[0];
        String dirName = null;
        // String filename = args[1];
        String filename;

        // Fall back to hard-coded defaults when no arguments are supplied
        if (args.length <= 0) {
            dirName = "ekbana";
            filename = "text.csv";
        }

        URL url = null;
        BufferedReader in = null;
        FileSystem hdfs = null;
        FSDataOutputStream outStream = null;
        HttpURLConnection conn = null;
        List<Map<String, String>> flatJson;
        Configuration con = new Configuration();
        try {
            url = new URL("http://crm.bigmart.com.np:81/export/export-sales-data.php?sdate=2016-12-01&edate=2016-12-02&key=jdhcvuicx8ruqe9djskjf90ueddishr0uy8v9hbjncvuw0er8idsnv");
        } catch (MalformedURLException ex) {
            // the malformed URL is silently swallowed here
        }

        try {
            // Force the HDFS and local filesystem implementations explicitly
            con.set("fs.hdfs.impl", org.apache.hadoop.hdfs.DistributedFileSystem.class.getName());
            con.set("fs.file.impl", org.apache.hadoop.fs.LocalFileSystem.class.getName());
            hdfs = FileSystem.get(URI.create(hdfsUri), con); // this is line 44
        } catch (IOException e) {
            e.printStackTrace();
        }

        try {
            System.out.println(hdfs.mkdirs(new Path(hdfsUri + "/" + dirName)));
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}

Solutions on many sites say I need hadoop-common, and I already have it, but I still get this error. I suspect the retry policy is related to my setup; if it isn't, then why does this error occur?
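Before swapping jars around, it can help to check which jar each side of the failing call is actually loaded from. This is a minimal diagnostic sketch (not from the original post; the class name `WhereFrom` is just illustrative): `RetryUtils` ships in hadoop-common and `NameNodeProxies`, the caller in the stack trace, ships in hadoop-hdfs, so if the two printed jar paths come from different Hadoop versions, that mismatch would explain the `NoSuchMethodError`:

```
public class WhereFrom {
    public static void main(String[] args) {
        // Print the jar each class was loaded from. RetryUtils lives in
        // hadoop-common; NameNodeProxies lives in hadoop-hdfs. Two different
        // version numbers in these paths mean mixed Hadoop jars on the classpath.
        System.out.println(org.apache.hadoop.io.retry.RetryUtils.class
                .getProtectionDomain().getCodeSource().getLocation());
        System.out.println(org.apache.hadoop.hdfs.NameNodeProxies.class
                .getProtectionDomain().getCodeSource().getLocation());
    }
}
```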

小智 5

Adding the Maven dependency helped:

```
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-hdfs</artifactId>
    <version>2.8.1</version>
</dependency>
```
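For what it's worth, this particular `NoSuchMethodError` usually means hadoop-hdfs and hadoop-common come from two different Hadoop releases, so `NameNodeProxies` (in hadoop-hdfs) calls a `RetryUtils` overload that the older hadoop-common jar doesn't have. A sketch of pinning both artifacts to one version (2.8.1 is taken from the answer above, and the `hadoop.version` property is illustrative; use whatever matches your cluster):

```
<!-- Sketch: keep both Hadoop client artifacts at the same version.
     The hadoop.version property is illustrative, not from the original post. -->
<properties>
    <hadoop.version>2.8.1</hadoop.version>
</properties>

<dependencies>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-common</artifactId>
        <version>${hadoop.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-hdfs</artifactId>
        <version>${hadoop.version}</version>
    </dependency>
</dependencies>
```

If the error persists, `mvn dependency:tree` shows which versions actually end up on the classpath.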