Hadoop throws ClassCastException for java.nio.ByteBuffer as the key type

sam*_*rth 5 hadoop bytebuffer mapreduce hadoop-streaming

I am using "hadoop-0.20.203.0rc1.tar.gz" for my cluster setup. Whenever I set job.setMapOutputKeyClass(ByteBuffer.class); and run the job, I get the following exception:

    12/01/13 15:09:00 INFO mapred.JobClient: Task Id : attempt_201201131428_0005_m_000001_2, Status : FAILED
java.lang.ClassCastException: class java.nio.ByteBuffer
        at java.lang.Class.asSubclass(Class.java:3018)
        at org.apache.hadoop.mapred.JobConf.getOutputKeyComparator(JobConf.java:776)
        at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.<init>(MapTask.java:958)
        at org.apache.hadoop.mapred.MapTask$NewOutputCollector.<init>(MapTask.java:673)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:755)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:369)
        at org.apache.hadoop.mapred.Child$4.run(Child.java:259)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1059)
        at org.apache.hadoop.mapred.Child.main(Child.java:253)
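
For reference, a stripped-down driver along these lines is enough to hit the error; the mapper, job name and paths below are placeholders, not my actual job:

import java.nio.ByteBuffer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class ByteBufferKeyJob {

    // Placeholder mapper that declares ByteBuffer as its output key type.
    public static class DummyMapper extends Mapper<Object, Text, ByteBuffer, Text> {
    }

    public static void main(String[] args) throws Exception {
        Job job = new Job(new Configuration(), "bytebuffer-key-demo");
        job.setJarByClass(ByteBufferKeyJob.class);
        job.setMapperClass(DummyMapper.class);

        // The line in question: ByteBuffer as the map output key class.
        job.setMapOutputKeyClass(ByteBuffer.class);
        job.setMapOutputValueClass(Text.class);

        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}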

Also, I noticed that ByteBuffer is Comparable but not Writable; does that make any difference? Please let me know if any other information is needed.

Pra*_*ati 5

This is where the exception is being thrown. Here is the code from SVN:

public RawComparator getOutputKeyComparator() {
    Class<? extends RawComparator> theClass = getClass("mapred.output.key.comparator.class",
        null, RawComparator.class);
    if (theClass != null)
        return ReflectionUtils.newInstance(theClass, this);
    return WritableComparator.get(getMapOutputKeyClass().asSubclass(WritableComparable.class));
}

If the mapred.output.key.comparator.class property is not defined on the JobConf, the key class must implement the WritableComparable interface. ByteBuffer does not implement WritableComparable, hence the exception.
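
The simplest workaround, assuming your keys are just raw bytes, is to switch from ByteBuffer to org.apache.hadoop.io.BytesWritable, which already implements WritableComparable. A rough sketch of a mapper along those lines (the input types here are placeholders):

import java.io.IOException;

import org.apache.hadoop.io.BytesWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

// Sketch: emit BytesWritable keys instead of ByteBuffer.
public class BytesKeyMapper extends Mapper<Object, Text, BytesWritable, Text> {

    private final BytesWritable outKey = new BytesWritable();

    @Override
    protected void map(Object key, Text value, Context context)
            throws IOException, InterruptedException {
        // Whatever bytes were going into the ByteBuffer go into the BytesWritable instead.
        outKey.set(value.getBytes(), 0, value.getLength());
        context.write(outKey, value);
    }
}

and in the driver: job.setMapOutputKeyClass(BytesWritable.class);

Note that setting mapred.output.key.comparator.class (or Job.setSortComparatorClass in the new API) only removes this particular cast; the key class still has to be serialized by the framework, which by default means implementing Writable, so switching to a Writable key type is usually the cleaner fix.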

BTW, the WritableComparable interface extends both the Writable and Comparable interfaces.
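
For illustration, a minimal custom key that satisfies that contract could look roughly like this (ByteArrayKey is a made-up name, not a Hadoop class):

import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;

import org.apache.hadoop.io.WritableComparable;
import org.apache.hadoop.io.WritableComparator;

// Hypothetical key type: wraps a byte[] and implements both halves of WritableComparable.
public class ByteArrayKey implements WritableComparable<ByteArrayKey> {

    private byte[] bytes = new byte[0];

    public void set(byte[] b) {
        bytes = b;
    }

    @Override
    public void write(DataOutput out) throws IOException {
        out.writeInt(bytes.length);
        out.write(bytes);
    }

    @Override
    public void readFields(DataInput in) throws IOException {
        bytes = new byte[in.readInt()];
        in.readFully(bytes);
    }

    @Override
    public int compareTo(ByteArrayKey other) {
        // Lexicographic byte comparison, reusing Hadoop's helper.
        return WritableComparator.compareBytes(bytes, 0, bytes.length,
                other.bytes, 0, other.bytes.length);
    }
}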