How to reference a nested class of a generic (static) Java class in Scala

Ale*_*lex 5 google-app-engine hadoop scala

I have this Java code:

public class TestMapper extends AppEngineMapper<Key, Entity, NullWritable, NullWritable> {
  public TestMapper() {
  }

  // [... other overridden methods ...]

  @Override
  public void setup(Context context) {
    log.warning("Doing per-worker setup");
  }
}

...which I converted to:

class TestMapper extends AppEngineMapper[Key, Entity, NullWritable, NullWritable] {
  // [... other overridden methods ...]

  override def setup(context: Context) {
    log.warning("Doing per-worker setup")
  }
}

Now the actual problem:

Context is defined as a nested class of org.apache.hadoop.mapreduce.Mapper:

public static class Mapper<KEYIN, VALUEIN, KEYOUT, VALUEOUT> {
  // [... some other methods ...]

  protected void setup(org.apache.hadoop.mapreduce.Mapper<KEYIN,VALUEIN,KEYOUT,VALUEOUT>.Context context)
      throws java.io.IOException, java.lang.InterruptedException { /* compiled code */ }

  public class Context extends org.apache.hadoop.mapreduce.MapContext<KEYIN,VALUEIN,KEYOUT,VALUEOUT> {

    public Context(org.apache.hadoop.conf.Configuration configuration,
                   org.apache.hadoop.mapreduce.TaskAttemptID conf,
                   org.apache.hadoop.mapreduce.RecordReader<KEYIN,VALUEIN> taskid,
                   org.apache.hadoop.mapreduce.RecordWriter<KEYOUT,VALUEOUT> reader,
                   org.apache.hadoop.mapreduce.OutputCommitter writer,
                   org.apache.hadoop.mapreduce.StatusReporter committer,
                   org.apache.hadoop.mapreduce.InputSplit reporter)
        throws java.io.IOException, java.lang.InterruptedException { /* compiled code */ }
  }
}

So I can't tell my Scala class what Context actually is. If Mapper had no type parameters, I could reference Context via

Mapper#Context

But how do I express that when Mapper is generic?

Mapper[_,_,_,_]#Context

...doesn't work.
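For illustration, a minimal self-contained sketch (with toy names Box/Inner standing in for Mapper/Context, not the Hadoop API) of how a type projection names a nested class of a generic outer class once concrete type parameters are given:

```scala
// Toy stand-ins for Mapper/Context: a generic outer class with a nested class.
class Box[A](val value: A) {
  class Inner {
    def describe: String = s"inner of a box holding $value"
  }
  def makeInner: Inner = new Inner
}

object ProjectionDemo {
  // The projection names the outer class with concrete type parameters,
  // analogous to Mapper[Key, Entity, NullWritable, NullWritable]#Context.
  def describeIt(inner: Box[Int]#Inner): String = inner.describe

  def main(args: Array[String]): Unit = {
    val box = new Box[Int](42)
    println(describeIt(box.makeInner))
  }
}
```

The path-dependent type `box.Inner` conforms to the projection `Box[Int]#Inner`, which is why instances created from any `Box[Int]` can be passed to `describeIt`.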

Mor*_*itz 9

In your case you have to provide the exact base types for the type projection:

Mapper[Key, Entity, NullWritable, NullWritable]#Context

So the overriding setup would be written as:

override def setup(context: Mapper[Key, Entity, NullWritable, NullWritable]#Context)

The usage can be simplified by introducing a type alias:

class TestMapper extends AppEngineMapper[Key, Entity, NullWritable, NullWritable] {

  type Context = Mapper[Key, Entity, NullWritable, NullWritable]#Context

  override def setup(context: Context) = {
    // ...
  }
}

If you want to write several mappers, you can refactor this into a trait that can be mixed into your implementations:

trait SMapper[A,B,C,D] extends Mapper[A,B,C,D] {
  type Context = Mapper[A,B,C,D]#Context
}

class TestMapper extends AppEngineMapper[Key, Entity, NullWritable, NullWritable]
                    with SMapper[Key, Entity, NullWritable, NullWritable] {
  override def setup(context: Context) = {
     // ...
  }
}

Or, for plain Hadoop:

class TestMapper extends SMapper[Key, Entity, NullWritable, NullWritable] {
  override def setup(context: Context) = {
     // ...
  }
}
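The mix-in refactoring can be sketched in isolation as follows. All names here (Base, SBase, Impl) are hypothetical stand-ins; to keep the toy compilable on its own, the nested class and the alias are given distinct names, whereas the answer above reuses the inherited name Context:

```scala
// Toy version of the SMapper pattern: a trait fixes the type projection once,
// so every concrete subclass can refer to it by a short name.
class Base[A, B] {
  class RawContext(val a: A, val b: B)
  def newContext(a: A, b: B): RawContext = new RawContext(a, b)
}

trait SBase[A, B] extends Base[A, B] {
  // One alias per instantiation, analogous to
  //   type Context = Mapper[A,B,C,D]#Context
  // in the answer above.
  type Context = Base[A, B]#RawContext
}

class Impl extends SBase[Int, String] {
  // Subclasses mention only the alias, never the full projection.
  def show(ctx: Context): String = s"${ctx.a} -> ${ctx.b}"
}

object MixinDemo {
  def main(args: Array[String]): Unit = {
    val impl = new Impl
    println(impl.show(impl.newContext(1, "one")))
  }
}
```

The alias lives in one place, so adding a second mapper-like subclass costs only `extends SBase[...]` plus methods written against `Context`.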

  • You don't actually need an existential type here; since setup is an overridden Mapper method, you can simply use Mapper[A,B,C,D]#Context (or the type alias). (2 upvotes)