I have 300,000+ files in a data directory on HDFS.
When I run hadoop fs -ls on it, I get an out-of-memory error: "GC overhead limit exceeded". Each cluster node has 256 GB of RAM. How can I fix this?
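For reference, one commonly suggested workaround, assuming the OOM happens in the client-side JVM that runs the listing (not in the NameNode), is to raise the client heap through HADOOP_CLIENT_OPTS, an environment variable the hadoop launcher script reads. The 8 GB heap size and the directory path below are illustrative placeholders, not tuned values:

    # Give the hadoop CLI a larger heap before listing the large directory;
    # HADOOP_CLIENT_OPTS applies only to client commands such as fs -ls.
    export HADOOP_CLIENT_OPTS="-Xmx8g"
    hadoop fs -ls /path/to/data/dir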
I'm using Hive 0.10. When I run
hive -e "show tables" or hive -e "desc table_name", it works.
But when I run something like hive -e "select count(*) from table_name", I get the exception below. Is there a way to debug this? The same query worked on the previous cluster with an older version of Hive; the new cluster throws this error. What is the right way to debug a problem like this? Google has not turned up a fix. (A debugging sketch follows the stack trace below.)
java.lang.IllegalArgumentException: Can not create a Path from an empty string
at org.apache.hadoop.fs.Path.checkPathArg(Path.java:91)
at org.apache.hadoop.fs.Path.<init>(Path.java:99)
at org.apache.hadoop.hive.ql.exec.Utilities.getHiveJobID(Utilities.java:382)
at org.apache.hadoop.hive.ql.exec.Utilities.clearMapRedWork(Utilities.java:195)
at org.apache.hadoop.hive.ql.exec.ExecDriver.execute(ExecDriver.java:472)
at org.apache.hadoop.hive.ql.exec.MapRedTask.execute(MapRedTask.java:138)
at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:138)
at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1352)
at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1138)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:951)
at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:259)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:216)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:412)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:347)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:706)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:613)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.MapRedTask
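As a starting point (a hedged sketch, not a confirmed fix): the trace shows Utilities.getHiveJobID building a Path from an empty string, which points at some configuration value being empty on the new cluster. Re-running with debug console logging, and dumping the properties that feed into job and scratch paths, can reveal which value is blank. hive.root.logger is a standard Hive logging property; the specific properties checked below are assumptions, not a confirmed culprit list:

    # Re-run the failing query with Hive's root logger at DEBUG on the console
    hive -hiveconf hive.root.logger=DEBUG,console -e "select count(*) from table_name"

    # Print configuration values that commonly feed into job/scratch paths;
    # an empty value here would be consistent with the exception (assumption)
    hive -e "set hive.exec.scratchdir; set fs.default.name; set mapred.job.tracker;"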