Hadoop reducer is never called

eza*_*mur 6 hadoop mapreduce

All,

I have a simple map/reduce implementation. The mapper is invoked and does its job, but the reducer is never called.

Here is the mapper:

static public class InteractionMap extends Mapper<LongWritable, Text, Text, InteractionWritable> {

    @Override
    protected void map(LongWritable offset, Text text, Context context) throws IOException, InterruptedException {
        System.out.println("mapper");
        String[] tokens = text.toString().split(",");
        for (int idx = 0; idx < tokens.length; idx++) {
            String sourceUser = tokens[1];
            String targetUser = tokens[2];
            int points = Integer.parseInt(tokens[4]);
            context.write(new Text(sourceUser), new InteractionWritable(targetUser, points));
        }
    }
}
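For reference, the mapper assumes input lines with at least five comma-separated fields, reading the source user from index 1, the target user from index 2, and the points from index 4. A standalone sketch of that parsing (the sample line and field layout are made up for illustration):

```java
public class MapperParseSketch {
    public static void main(String[] args) {
        // Hypothetical input line matching the mapper's assumptions:
        // field 1 = sourceUser, field 2 = targetUser, field 4 = points.
        String line = "1,alice,bob,2023,42";
        String[] tokens = line.split(",");
        String sourceUser = tokens[1];
        String targetUser = tokens[2];
        int points = Integer.parseInt(tokens[4]);
        // This is the (key, value) pair the mapper would emit for this line.
        System.out.println(sourceUser + " -> " + targetUser + " : " + points);
    }
}
```

Note that because the indices are fixed, the `for` loop in the mapper emits this same pair once per token rather than once per line.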

Here is my reducer:

static public class InteractionReduce extends Reducer<Text, InteractionWritable, Text, Text> {

    @Override
    protected void reduce(Text token, Iterable<InteractionWritable> counts, Context context) throws IOException, InterruptedException {
        System.out.println("REDUCER");
        Iterator<InteractionWritable> i = counts.iterator();
        while (i.hasNext()) {
            InteractionWritable interaction = i.next();
            context.write(token, new Text(token.toString() + " " + interaction.getTargetUser().toString() + " " + interaction.getPoints().get()));
        }
    }

}

And here is the configuration part:

@Override
public int run(String[] args) throws Exception {
    Configuration configuration = getConf();
    Job job = new Job(configuration, "Interaction Count");
    job.setJarByClass(InteractionMapReduce.class);
    job.setMapperClass(InteractionMap.class);
    job.setCombinerClass(InteractionReduce.class);
    job.setReducerClass(InteractionReduce.class);
    job.setInputFormatClass(TextInputFormat.class);
    job.setOutputFormatClass(TextOutputFormat.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(Text.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));
    return job.waitForCompletion(true) ? 0 : -1;
}

Does anyone know why the reducer is not being called?

eza*_*mur 7

Well, as expected, it was my fault. The job configuration was wrong. It should look like this:

Configuration configuration = getConf();

Job job = new Job(configuration, "Interaction Count");
job.setJarByClass(InteractionMapReduce.class);
job.setMapperClass(InteractionMap.class);
job.setReducerClass(InteractionReduce.class);
job.setMapOutputKeyClass(Text.class);
job.setMapOutputValueClass(InteractionWritable.class);
job.setOutputKeyClass(Text.class);
job.setOutputValueClass(Text.class);

FileInputFormat.addInputPath(job, new Path(args[0]));
FileOutputFormat.setOutputPath(job, new Path(args[1]));

return job.waitForCompletion(true) ? 0 : -1;

The problem occurred because the map and reduce phases have different output types, and by default Hadoop assumes the map output types match the job's output types. The job was failing silently after the context.write call. I also dropped the setCombinerClass call, since InteractionReduce emits Text values while the reducer expects InteractionWritable values as input. So what I had to add were these lines:

job.setMapOutputKeyClass(Text.class);
job.setMapOutputValueClass(InteractionWritable.class);
job.setOutputKeyClass(Text.class);
job.setOutputValueClass(Text.class);
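To see why the map output types matter, here is a plain-Java sketch (no Hadoop dependencies, purely illustrative) of the shuffle step that sits between map and reduce: map output pairs are serialized using the declared map output classes, then grouped by key before reduce() is called once per key. The Interaction record is a hypothetical stand-in for InteractionWritable, and the sample data is made up:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.TreeMap;

public class ShuffleSketch {
    // Hypothetical stand-in for InteractionWritable: a (targetUser, points) pair.
    record Interaction(String targetUser, int points) {}

    public static void main(String[] args) {
        // Map output: (sourceUser, Interaction) pairs, as the mapper emits them.
        List<Map.Entry<String, Interaction>> mapOutput = List.of(
            Map.entry("alice", new Interaction("bob", 3)),
            Map.entry("alice", new Interaction("carol", 5)),
            Map.entry("bob", new Interaction("alice", 2)));

        // Shuffle: group values by key, as Hadoop does before invoking reduce().
        Map<String, List<Interaction>> grouped = new TreeMap<>();
        for (var e : mapOutput) {
            grouped.computeIfAbsent(e.getKey(), k -> new ArrayList<>()).add(e.getValue());
        }

        // Reduce: one call per key, iterating over all values for that key,
        // mirroring the reducer's "key targetUser points" output lines.
        for (var e : grouped.entrySet()) {
            for (Interaction i : e.getValue()) {
                System.out.println(e.getKey() + " " + i.targetUser() + " " + i.points());
            }
        }
    }
}
```

If the framework does not know the intermediate value class is InteractionWritable, it cannot deserialize the mapper's output during this grouping step, which is why setMapOutputKeyClass/setMapOutputValueClass are required whenever they differ from the job's final output classes.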