Tag: hbasetestingutility

How to change the HBase base directory in HBaseTestingUtility

I am running into a problem with HBaseTestingUtility in the IntelliJ IDE. I can see the following error, which appears to be caused by the file path being too long:

16/03/14 22:45:13 WARN datanode.DataNode: IOException in BlockReceiver.run(): 
java.io.IOException: Failed to move meta file for ReplicaBeingWritten, blk_1073741825_1001, RBW
getNumBytes()     = 7
getBytesOnDisk()  = 7
getVisibleLength()= 7
getVolume()       = C:\Users\user1\Documents\work\Repos\hadoop-analys\reporting\mrkts-surveillance\target\test-data\9654a646-e923-488a-9e20-46396fd15292\dfscluster_6b264e6b-0218-4f30-ad5b-72e838940b1e\dfs\data\data1\current
getBlockFile()    = C:\Users\user1\Documents\work\Repos\hadoop-analys\reporting\mrkts-surveillance\target\test-data\9654a646-e923-488a-9e20-46396fd15292\dfscluster_6b264e6b-0218-4f30-ad5b-72e838940b1e\dfs\data\data1\current\BP-429386217-192.168.1.110-1457991908038\current\rbw\blk_1073741825
bytesAcked=7
bytesOnDisk=7 from C:\Users\user1\Documents\work\Repos\hadoop-analys\reporting\mrkts-surveillance\target\test-data\9654a646-e923-488a-9e20-46396fd15292\dfscluster_6b264e6b-0218-4f30-ad5b-72e838940b1e\dfs\data\data1\current\BP-429386217-192.168.1.110-1457991908038\current\rbw\blk_1073741825_1001.meta to    C:\Users\user1\Documents\work\Repos\hadoop-analys\reporting\mrkts-surveillance\target\test-data\9654a646-e923-488a-9e20-46396fd15292\dfscluster_6b264e6b-0218-4f30-ad5b-72e838940b1e\dfs\data\data1\current\BP-429386217-192.168.1.110-1457991908038\current\finalized\subdir0\subdir0\blk_1073741825_1001.meta
    at org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl.moveBlockFiles(FsDatasetImpl.java:615)
    at org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.BlockPoolSlice.addBlock(BlockPoolSlice.java:250)
    at org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsVolumeImpl.addBlock(FsVolumeImpl.java:229)
    at org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl.finalizeReplica(FsDatasetImpl.java:1119)
    at org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl.finalizeBlock(FsDatasetImpl.java:1100)
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver$PacketResponder.finalizeBlock(BlockReceiver.java:1293)
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver$PacketResponder.run(BlockReceiver.java:1233)
    at java.lang.Thread.run(Thread.java:745)
Caused by: 3: The system cannot find the path specified.

Does anyone know how I can specify the base directory for HBaseTestingUtility so that it does not use this huge generated path as its starting directory?

Thanks,
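One way to approach this (a minimal sketch rather than a verified fix, assuming an HBase 1.x HBaseTestingUtility; C:/tmp/hbt is just an example of a short writable directory): the utility derives its working directory from the test.build.data.basedirectory system property (the constant BASE_TEST_DIRECTORY_KEY), so pointing that property at a short path before the mini cluster starts should keep the generated block paths well under the Windows 260-character limit.

import org.apache.hadoop.hbase.HBaseTestingUtility;

public class ShortBaseDirExample {
    public static void main(String[] args) throws Exception {
        // Keep the root short so the DFS block/meta paths stay under the Windows limit.
        // "C:/tmp/hbt" is an example path, not something HBase requires.
        System.setProperty("test.build.data.basedirectory", "C:/tmp/hbt");

        HBaseTestingUtility utility = new HBaseTestingUtility();
        utility.startMiniCluster();   // test data now lands under C:/tmp/hbt/...
        try {
            // ... run whatever needs the mini cluster here ...
        } finally {
            utility.shutdownMiniCluster();
        }
    }
}

The same property can also be passed as -Dtest.build.data.basedirectory=C:/tmp/hbt in the IntelliJ run configuration's VM options, which keeps the test code itself unchanged.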

windows hbase scala intellij-idea hbasetestingutility

5 votes · 1 answer · 1296 views

Does HBaseTestingUtility work with a MiniCluster?

I have a simple unit test that I want to run against an HBaseTestingUtility MiniCluster. HBaseTestingUtility is missing transitive dependencies needed to run the test. I have been chasing down NoClassDefFoundErrors and have run into what may be a packaging error in one of the jar files. Here is the error:

java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/test/MetricsAssertHelper
    at org.apache.hadoop.hbase.MiniHBaseCluster.<init>(MiniHBaseCluster.java:91)

When I search the project's .jar files for MetricsAssertHelper, I find it under the META-INF/services directory:

jar tvf org/apache/hbase/hbase-hadoop2-compat/1.0.0/hbase-hadoop2-compat-1.0.0-tests.jar | grep MetricsAssertHelper
  53 Sat Feb 14 19:43:40 MST 2015 META-INF/services/org.apache.hadoop.hbase.test.MetricsAssertHelper
1337 Sat Feb 14 19:43:40 MST 2015 org/apache/hadoop/hbase/test/MetricsAssertHelperImpl$MockMetricsBuilder.class
3743 Sat Feb 14 19:43:40 MST 2015 org/apache/hadoop/hbase/test/MetricsAssertHelperImpl$MockRecordBuilder.class
6689 Sat Feb 14 19:43:40 MST 2015 org/apache/hadoop/hbase/test/MetricsAssertHelperImpl.class

However, that entry is not a .class file. I wonder whether MetricsAssertHelper.class is missing from the .jar file, since a MetricsAssertHelperImpl.class file is present there.
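A small diagnostic sketch (it only assumes the class names shown in the jar listing above) can confirm which of the two classes is actually loadable from the test classpath. If the Impl resolves but the interface does not, the missing piece is most likely the hbase-hadoop-compat test-jar (note: no "2"), which, as far as I can tell, is where the MetricsAssertHelper interface itself is packaged; the hbase-hadoop2-compat tests jar shown above would then not be broken at all.

public class MetricsHelperClasspathCheck {
    public static void main(String[] args) {
        // Try to load both the interface and its implementation by name.
        String[] names = {
            "org.apache.hadoop.hbase.test.MetricsAssertHelper",     // interface
            "org.apache.hadoop.hbase.test.MetricsAssertHelperImpl"  // implementation
        };
        for (String name : names) {
            try {
                Class.forName(name);
                System.out.println("found:   " + name);
            } catch (ClassNotFoundException e) {
                System.out.println("missing: " + name);
            }
        }
    }
}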

Here are my code and mvn dependencies. The error occurs in the HBaseTestingUtility.startMiniCluster() call.

private static HBaseTestingUtility utility;

@Before
public void setUp() throws Exception …
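For reference, this is roughly the shape a working MiniCluster test usually takes (a sketch only, assuming JUnit 4 and HBase 1.0.0 client/server/test artifacts, including the hadoop-compat test-jars, on the test classpath; the table, family, row and value names are invented for the example). startMiniCluster() is the call that fails above when the compat classes are missing.

import static org.junit.Assert.assertArrayEquals;

import org.apache.hadoop.hbase.HBaseTestingUtility;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;
import org.junit.AfterClass;
import org.junit.BeforeClass;
import org.junit.Test;

public class MiniClusterExampleTest {
    private static final HBaseTestingUtility utility = new HBaseTestingUtility();

    @BeforeClass
    public static void startCluster() throws Exception {
        utility.startMiniCluster();   // spins up mini ZooKeeper, HDFS and HBase
    }

    @AfterClass
    public static void stopCluster() throws Exception {
        utility.shutdownMiniCluster();
    }

    @Test
    public void putThenGet() throws Exception {
        Table table = utility.createTable(TableName.valueOf("t1"), Bytes.toBytes("cf"));
        table.put(new Put(Bytes.toBytes("row1"))
                .addColumn(Bytes.toBytes("cf"), Bytes.toBytes("q"), Bytes.toBytes("v")));
        byte[] value = table.get(new Get(Bytes.toBytes("row1")))
                .getValue(Bytes.toBytes("cf"), Bytes.toBytes("q"));
        assertArrayEquals(Bytes.toBytes("v"), value);
    }
}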

hbase hbasetestingutility

0 votes · 1 answer · 3623 views