I have a very simple piece of Java code that reads data from HDFS:
try {
    InputStream s = new GzipCompressorInputStream(hdfsFileSystem.open(filePath), false);
    ByteStreams.copy(s, outputStream);
    s.close();
} catch (Exception ex) {
    logger.error("Problem with file " + filePath, ex);
}
Sometimes (not always) it throws this exception:
java.lang.NoSuchMethodError: org.apache.commons.io.IOUtils.closeQuietly(Ljava/io/Closeable;)V
at org.apache.hadoop.hdfs.DFSInputStream.getBlockReader(DFSInputStream.java:1099)
at org.apache.hadoop.hdfs.DFSInputStream.blockSeekTo(DFSInputStream.java:533)
at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:749)
at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:793)
at java.io.DataInputStream.read(DataInputStream.java:149)
at java.io.BufferedInputStream.fill(BufferedInputStream.java:235)
at java.io.BufferedInputStream.read(BufferedInputStream.java:254)
at org.apache.commons.compress.compressors.gzip.GzipCompressorInputStream.init(GzipCompressorInputStream.java:136)
at org.apache.commons.compress.compressors.gzip.GzipCompressorInputStream.<init>(GzipCompressorInputStream.java:129)
[...]
The failure occurs at this line:
InputStream s = new GzipCompressorInputStream(hdfsFileSystem.open(filePath), false);
I am loading the Hadoop client with the Maven dependency below:
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>2.2.0</version>
</dependency>
Does anyone know how to solve this? Of course, I could change catch (Exception e) to catch (Error e), but that would only be a workaround, not a real fix.
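For completeness, the catch-everything variant I mean would look like this (just a sketch, reusing the variables from my snippet above; note that NoSuchMethodError extends Error, not Exception, so my current catch block never even sees it):

try (InputStream s = new GzipCompressorInputStream(hdfsFileSystem.open(filePath), false)) {
    ByteStreams.copy(s, outputStream);
} catch (Throwable t) {
    // Throwable covers both Exception and Error, including NoSuchMethodError
    logger.error("Problem with file " + filePath, t);
}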
It looks like there are several different versions of commons-io.jar on your classpath. The method closeQuietly(Ljava/io/Closeable;)V first appeared in commons-io 2.0. Sometimes an older commons-io.jar gets loaded first, and then the NoSuchMethodError is thrown. You need to fix the classpath.
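A quick way to confirm which commons-io jar is actually being picked up at runtime is to ask the JVM where the class came from (a minimal diagnostic sketch; run it with the same classpath as the failing application, the class name here is made up):

import java.security.CodeSource;
import org.apache.commons.io.IOUtils;

public class WhichCommonsIo {
    public static void main(String[] args) {
        // Ask the JVM which jar the IOUtils class was actually loaded from,
        // e.g. .../commons-io-1.x.jar (no closeQuietly(Closeable))
        //  vs  .../commons-io-2.x.jar (has it, added in 2.0)
        CodeSource src = IOUtils.class.getProtectionDomain().getCodeSource();
        System.out.println(src != null ? src.getLocation() : "bootstrap classpath");
    }
}

With Maven, running mvn dependency:tree -Dincludes=commons-io shows which dependency drags in each commons-io version, so the stale one can be excluded or the version pinned.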