I created a Mapper & Reducer that use AVRO for the input, the map output and the reduce output. When I run an MRUnit test, I get the following stack trace:

java.lang.NullPointerException
at org.apache.hadoop.io.serializer.SerializationFactory.getSerializer(SerializationFactory.java:73)
at org.apache.hadoop.mrunit.mock.MockOutputCollector.deepCopy(MockOutputCollector.java:74)
at org.apache.hadoop.mrunit.mock.MockOutputCollector.collect(MockOutputCollector.java:110)
at org.apache.hadoop.mrunit.mapreduce.mock.MockMapContextWrapper$MockMapContext.write(MockMapContextWrapper.java:119)
at org.apache.avro.mapreduce.AvroMapper.writePair(AvroMapper.java:22)
at com.bol.searchrank.phase.day.DayMapper.doMap(DayMapper.java:29)
at com.bol.searchrank.phase.day.DayMapper.doMap(DayMapper.java:1)
at org.apache.avro.mapreduce.AvroMapper.map(AvroMapper.java:16)
at org.apache.avro.mapreduce.AvroMapper.map(AvroMapper.java:1)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
at org.apache.hadoop.mrunit.mapreduce.MapDriver.run(MapDriver.java:200)
at org.apache.hadoop.mrunit.mapreduce.MapReduceDriver.run(MapReduceDriver.java:207)
at com.bol.searchrank.phase.day.DayMapReduceTest.shouldProduceAndCountTerms(DayMapReduceTest.java:39)

The driver is initialized as follows (I have written my own Avro MapReduce API implementation):
    driver = new MapReduceDriver<AvroWrapper<Pair<Utf8, LiveTrackingLine>>, NullWritable,
            AvroKey<Utf8>, AvroValue<Product>,
            AvroWrapper<Pair<Utf8, Product>>, NullWritable>()
        .withMapper(new DayMapper())
        .withReducer(new DayReducer());

Adding a Configuration object with io.serializations set does not help:
    Configuration configuration = new Configuration();
    configuration.setStrings("io.serializations", new String[] {
        AvroSerialization.class.getName()
    });
    driver = new MapReduceDriver<AvroWrapper<Pair<Utf8, LiveTrackingLine>>, NullWritable,
            AvroKey<Utf8>, AvroValue<Product>,
            AvroWrapper<Pair<Utf8, Product>>, NullWritable>()
        .withMapper(new DayMapper())
        .withReducer(new DayReducer())
        .withConfiguration(configuration);

I am using Cloudera's Hadoop & MRUnit 0.20.2-cdh3u2 and Avro MapRed 1.6.3.

Best answer

You get the NPE because the SerializationFactory cannot find an acceptable class implementing Serialization among the classes listed in io.serializations.
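For context, the factory walks the classes named in io.serializations and hands back a serializer from the first one whose accept() matches the key/value class; if none of them accepts AvroWrapper/AvroKey, the lookup yields null and MRUnit's deepCopy dereferences it. A minimal sketch of that lookup, for illustration only and not the actual Hadoop source:

    import java.util.List;
    import org.apache.hadoop.io.serializer.Serialization;
    import org.apache.hadoop.io.serializer.Serializer;

    // Simplified sketch of what SerializationFactory.getSerializer does.
    class SerializerLookupSketch {
        private final List<Serialization<?>> serializations; // built from io.serializations

        SerializerLookupSketch(List<Serialization<?>> serializations) {
            this.serializations = serializations;
        }

        @SuppressWarnings("unchecked")
        <T> Serializer<T> getSerializer(Class<T> c) {
            for (Serialization<?> s : serializations) {
                if (s.accept(c)) {
                    return ((Serialization<T>) s).getSerializer(c);
                }
            }
            // Nothing accepted AvroWrapper/AvroKey, so null comes back and
            // MockOutputCollector.deepCopy dereferences it -> NullPointerException.
            return null;
        }
    }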

MRUnit has a number of bugs around serializations other than Writable, including MRUNIT-45, MRUNIT-70, MRUNIT-77 and MRUNIT-86 at https://issues.apache.org/jira/browse/MRUNIT. These bugs involve the conf not being passed correctly to the SerializationFactory constructor, or the code requiring a default constructor on the Key or Value, which all Writables have. All of these fixes appear in Apache MRUnit 0.9.0-incubating, which will be released sometime this week.
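Once you are on a release that actually passes the conf through, the configuration still has to give the factory something usable for every key and value type in the pipeline. A sketch of what that might look like, assuming AvroSerialization is org.apache.avro.mapred.AvroSerialization; note that WritableSerialization should stay on the list, because NullWritable values are copied through the same factory:

    // Sketch only: keep the default Writable serialization and add Avro's,
    // rather than replacing the list outright.
    Configuration configuration = new Configuration();
    configuration.setStrings("io.serializations",
        org.apache.hadoop.io.serializer.WritableSerialization.class.getName(),
        org.apache.avro.mapred.AvroSerialization.class.getName());
    driver.withConfiguration(configuration);

Depending on the Avro version, the map-output and output schemas may also need to be present in the conf (what org.apache.avro.mapred.AvroJob sets up); treat this as a sketch rather than a verified fix against the cdh3u2 MRUnit.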

Cloudera's 0.20.2-cdh3u2 MRUnit is close to Apache MRUnit 0.5.0-incubating. I think your code may still hit a problem even on 0.9.0-incubating; please email your full code example to mrunit-user@incubator.apache.org and the Apache MRUnit project will be happy to take a look at it.

With MRUNIT-99 relaxing the restriction that the K2 type parameter has to be Comparable, this will now compile.

Original question on Stack Overflow (hadoop - Using MRUnit and AVRO together): https://stackoverflow.com/questions/10119616/
