Problem description
I've been playing around with MRUnit and tried running it on the Hadoop wordcount example, following the tutorial for wordcount and unit testing.
Though not a fan, I've been using Eclipse to run the code, and I keep getting an error on the setMapper function:
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mrunit.mapreduce.MapDriver;
import org.apache.hadoop.mrunit.mapreduce.MapReduceDriver;
import org.apache.hadoop.mrunit.mapreduce.ReduceDriver;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.junit.Before;
import org.junit.Test;

public class TestWordCount {

    MapReduceDriver<LongWritable, Text, Text, IntWritable, Text, IntWritable> mapReduceDriver;
    MapDriver<LongWritable, Text, Text, IntWritable> mapDriver;
    ReduceDriver<Text, IntWritable, Text, IntWritable> reduceDriver;

    @Before
    public void setUp() throws IOException {
        WordCountMapper mapper = new WordCountMapper();
        mapDriver = new MapDriver<LongWritable, Text, Text, IntWritable>();
        mapDriver.setMapper(mapper); // <-- Issue here

        WordCountReducer reducer = new WordCountReducer();
        reduceDriver = new ReduceDriver<Text, IntWritable, Text, IntWritable>();
        reduceDriver.setReducer(reducer);

        mapReduceDriver = new MapReduceDriver<LongWritable, Text, Text, IntWritable, Text, IntWritable>();
        mapReduceDriver.setMapper(mapper); // <-- Issue here
        mapReduceDriver.setReducer(reducer);
    }
}
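For reference, a test method that exercises the configured MapDriver could be added to the class above. A minimal sketch, where the sample input line and expected counts are made up for illustration:

@Test
public void testMapper() throws IOException {
    // One input record goes in; the expected (word, 1) pairs
    // must come out in this exact order.
    mapDriver.withInput(new LongWritable(1), new Text("cat cat dog"));
    mapDriver.withOutput(new Text("cat"), new IntWritable(1));
    mapDriver.withOutput(new Text("cat"), new IntWritable(1));
    mapDriver.withOutput(new Text("dog"), new IntWritable(1));
    mapDriver.runTest();
}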
Error message:
java.lang.Error: Unresolved compilation problems:
The method setMapper(Mapper<LongWritable,Text,Text,IntWritable>) in the type MapDriver<LongWritable,Text,Text,IntWritable> is not applicable for the arguments (WordCountMapper)
The method setMapper(Mapper<LongWritable,Text,Text,IntWritable>) in the type MapReduceDriver<LongWritable,Text,Text,IntWritable,Text,IntWritable> is not applicable for the arguments (WordCountMapper)
Looking into this issue, I think it might be an API conflict, but I'm not sure where to look for it. Has anybody else had this issue before?
EDIT: I'm using a user-defined library containing the hadoop2 jar and the latest JUnit (4.10) jar.
EDIT 2: Here is the code for WordCountMapper:
import java.io.IOException;
import java.util.StringTokenizer;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class WordCountMapper extends Mapper<Object, Text, Text, IntWritable> {

    private final static IntWritable one = new IntWritable(1);
    private Text word = new Text();

    public void map(Object key, Text value, Context context)
            throws IOException, InterruptedException {
        // Split the line into tokens and emit (token, 1) for each.
        StringTokenizer itr = new StringTokenizer(value.toString());
        while (itr.hasMoreTokens()) {
            word.set(itr.nextToken());
            context.write(word, one);
        }
    }
}
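Worth noting: the declaration above is the likely source of the compile error. WordCountMapper extends Mapper<Object, Text, Text, IntWritable>, but both drivers are parameterized with LongWritable as the input key type, and a Mapper<Object, ...> is not a Mapper<LongWritable, ...>. Instead of the raw-type workaround below, the mapper's type parameters could be aligned with the drivers. A minimal sketch, assuming the input keys really are the usual byte offsets:

import java.io.IOException;
import java.util.StringTokenizer;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

// Declaring LongWritable as the input key type makes this class
// assignable to Mapper<LongWritable, Text, Text, IntWritable>,
// which is exactly what the MapDriver above expects.
public class WordCountMapper extends Mapper<LongWritable, Text, Text, IntWritable> {

    private final static IntWritable one = new IntWritable(1);
    private Text word = new Text();

    @Override
    public void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        StringTokenizer itr = new StringTokenizer(value.toString());
        while (itr.hasMoreTokens()) {
            word.set(itr.nextToken());
            context.write(word, one);
        }
    }
}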
FINAL EDIT/WORKING
Turns out I needed to change

WordCountMapper mapper = new WordCountMapper();

to

Mapper mapper = new WordCountMapper();

since there was an issue with the generics: the raw Mapper type opts out of the compile-time type check (at the cost of an unchecked warning). I also needed to add the Mockito library to my user-defined library.
Recommended answer
Make sure you have imported the correct class. I hit the same error message reported above even though, unlike the case above, my program had the correct parameters in both the Reducer and reduce_test classes; the cause was simply importing the wrong class.
Wrongly imported class:
import org.apache.hadoop.mrunit.ReduceDriver;
Correct class:
import org.apache.hadoop.mrunit.mapreduce.ReduceDriver;
The same solution applies to mapper_test, if you are sure that the parameters are the same in your Mapper_class and Mapper_test.
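The underlying distinction: MRUnit ships two parallel sets of drivers. The classes directly under org.apache.hadoop.mrunit target the old org.apache.hadoop.mapred API, while those under org.apache.hadoop.mrunit.mapreduce target the new org.apache.hadoop.mapreduce API, and a mapper or reducer written against one API will not type-check against a driver from the other. A minimal sketch of the driver imports that match the new-API WordCount classes above:

// Drivers for the new (org.apache.hadoop.mapreduce) API.
// Note the extra ".mapreduce" package segment in each import.
import org.apache.hadoop.mrunit.mapreduce.MapDriver;
import org.apache.hadoop.mrunit.mapreduce.ReduceDriver;
import org.apache.hadoop.mrunit.mapreduce.MapReduceDriver;

With these imports in place, setMapper and setReducer type-check as long as the mapper's and reducer's generic parameters line up with the driver's.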