Can a Hadoop mapper produce multiple keys in its output?

Can a single Mapper class produce multiple key/value pairs (of the same type) in one run?

We emit key/value pairs from the mapper like this:

context.write(key, value);

Here is a stripped-down, illustrative version of the Key class:

import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;

import org.apache.hadoop.io.ObjectWritable;
import org.apache.hadoop.io.WritableComparable;
import org.apache.hadoop.io.WritableComparator;


public class MyKey extends ObjectWritable implements WritableComparable<MyKey> {

    public enum KeyType {
        KeyType1,
        KeyType2
    }

    private KeyType keyType;
    private Long field1;
    private Integer field2 = -1;
    private String field3 = "";


    public KeyType getKeyType() {
        return keyType;
    }

    public void setKeyType(KeyType keyType) {
        this.keyType = keyType;
    }

    public Long getField1() {
        return field1;
    }

    public void setField1(Long field1) {
        this.field1 = field1;
    }

    public Integer getField2() {
        return field2;
    }

    public void setField2(Integer field2) {
        this.field2 = field2;
    }


    public String getField3() {
        return field3;
    }

    public void setField3(String field3) {
        this.field3 = field3;
    }

    @Override
    public void readFields(DataInput datainput) throws IOException {
        keyType = KeyType.valueOf(datainput.readUTF());
        field1 = datainput.readLong();
        field2 = datainput.readInt();
        field3 = datainput.readUTF();
    }

    @Override
    public void write(DataOutput dataoutput) throws IOException {
        dataoutput.writeUTF(keyType.toString());
        dataoutput.writeLong(field1);
        dataoutput.writeInt(field2);
        dataoutput.writeUTF(field3);
    }

    @Override
    public int compareTo(MyKey other) {
        if (getKeyType().compareTo(other.getKeyType()) != 0) {
            return getKeyType().compareTo(other.getKeyType());
        } else if (getField1().compareTo(other.getField1()) != 0) {
            return getField1().compareTo(other.getField1());
        } else if (getField2().compareTo(other.getField2()) != 0) {
            return getField2().compareTo(other.getField2());
        } else if (getField3().compareTo(other.getField3()) != 0) {
            return getField3().compareTo(other.getField3());
        } else {
            return 0;
        }
    }

    public static class MyKeyComparator extends WritableComparator {
        public MyKeyComparator() {
            super(MyKey.class);
        }

        @Override
        public int compare(byte[] b1, int s1, int l1, byte[] b2, int s2, int l2) {
            return compareBytes(b1, s1, l1, b2, s2, l2);
        }
    }

    static { // register this comparator
        WritableComparator.define(MyKey.class, new MyKeyComparator());
    }
}

This is how we try to emit two keys from the Mapper:

MyKey key1 = new MyKey();
key1.setKeyType(KeyType.KeyType1);
key1.setField1(1L);
key1.setField2(23);

MyKey key2 = new MyKey();
key2.setKeyType(KeyType.KeyType2);
key2.setField1(1L);
key2.setField3("abc");

context.write(key1, value1);
context.write(key2, value2);

Our job's output format class is org.apache.hadoop.mapreduce.lib.output.SequenceFileOutputFormat.

I mention this because, in the other output format classes I looked at, the output was not appended; the write method's implementation simply committed it.

We are also using the following classes for the Mapper and Context: org.apache.hadoop.mapreduce.Mapper and org.apache.hadoop.mapreduce.Context.

Answer:

Writing to the context multiple times within a single map task is perfectly normal.
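
For reference, here is a minimal sketch of such a mapper. The class name, input types, and output value type (Text) are assumptions made for illustration; only the MyKey usage mirrors the snippet from the question:

import java.io.IOException;

import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class MultiKeyMapper extends Mapper<LongWritable, Text, MyKey, Text> {

    @Override
    protected void map(LongWritable offset, Text line, Context context)
            throws IOException, InterruptedException {
        // Build the first key from this input record.
        MyKey key1 = new MyKey();
        key1.setKeyType(MyKey.KeyType.KeyType1);
        key1.setField1(1L);
        key1.setField2(23);

        // Build a second, differently typed key from the same record.
        MyKey key2 = new MyKey();
        key2.setKeyType(MyKey.KeyType.KeyType2);
        key2.setField1(1L);
        key2.setField3("abc");

        // Each write() call produces an independent map output record;
        // both pairs are shuffled to reducers based on their keys.
        context.write(key1, line);
        context.write(key2, line);
    }
}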

However, your key class may have a few problems. Whenever you implement a WritableComparable key, you should also implement the equals(Object) and hashCode() methods. These are not part of the WritableComparable interface, since they are defined on Object, but you must still provide implementations.

The default partitioner uses the hashCode() method to decide which reducer each key/value pair goes to. If you don't provide a sensible implementation, you can get strange results.
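
For context, the default partitioner (org.apache.hadoop.mapreduce.lib.partition.HashPartitioner) does roughly the following, which is why a missing or poor hashCode() directly affects how pairs are distributed:

// Roughly the body of HashPartitioner.getPartition(): the reducer index
// is derived entirely from the key's hashCode().
public int getPartition(MyKey key, Text value, int numReduceTasks) {
    return (key.hashCode() & Integer.MAX_VALUE) % numReduceTasks;
}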

As a rule of thumb, whenever you implement hashCode() or any kind of comparison method, you should also provide an equals(Object) method. Make sure it accepts an Object parameter, because that is how the method is declared in the Object class (whose implementation you would be overriding).
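
As an untested sketch, implementations along these lines would satisfy that contract for the MyKey class above (field names are taken from the question; everything else is illustrative):

@Override
public int hashCode() {
    // Combine all of the fields that participate in compareTo().
    int result = keyType != null ? keyType.hashCode() : 0;
    result = 31 * result + (field1 != null ? field1.hashCode() : 0);
    result = 31 * result + (field2 != null ? field2.hashCode() : 0);
    result = 31 * result + (field3 != null ? field3.hashCode() : 0);
    return result;
}

@Override
public boolean equals(Object obj) {
    // Must accept Object, not MyKey, to actually override Object.equals().
    if (this == obj) {
        return true;
    }
    if (!(obj instanceof MyKey)) {
        return false;
    }
    MyKey other = (MyKey) obj;
    return keyType == other.keyType
            && java.util.Objects.equals(field1, other.field1)
            && java.util.Objects.equals(field2, other.field2)
            && java.util.Objects.equals(field3, other.field3);
}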