I have a genkey class that implements Writable and Comparable. I need to pass objects of this class from the mapper to the reducer. The mapper works fine, but when I finally try to emit the output via context.write I get the following error:

class org.apache.hadoop.io.Text is not class genkey
Here is my code:
public static class Mappertwo extends Mapper<Object, Text, genkey, IntWritable> {
    private genkey ky = new genkey();

    public void map(Object key, Text value, Context context)
            throws IOException, InterruptedException {
        .............
        context.write(ky, one);
    }
}
public static class Reducertwo extends Reducer<genkey, IntWritable, Text, IntWritable> {

    public void reduce(genkey ukey, Iterable<IntWritable> values, Context context)
            throws IOException, InterruptedException {
        IntWritable result = new IntWritable();
        Text nw = new Text();
        int sum = 0;
        for (IntWritable val : values) {
            sum += val.get();
        }
        String s = ukey.getkey();
        nw.set(s);
        result.set(sum);
        context.write(nw, result);   // this line here gives the error
    }
}
public static void main(String[] args) throws IOException, ClassNotFoundException, InterruptedException {
    if (args.length != 5) {
        System.err.println("exceeded array limit:");
        System.exit(-1);
    }
    Configuration conf1 = new Configuration();
    Job job1 = new Job(conf1, "round2");
    conf1.set("t", args[2]);
    DistributedCache.addCacheFile(new Path("/user/hduser/out/part-r-00000").toUri(), job1.getConfiguration());
    job1.setJarByClass(MapSide.class);
    job1.setMapperClass(Mappertwo.class);
    job1.setCombinerClass(Reducertwo.class);
    job1.setReducerClass(Reducertwo.class);
    job1.setOutputKeyClass(Text.class);
    job1.setOutputValueClass(IntWritable.class);
    job1.setMapOutputKeyClass(genkey.class);
    FileInputFormat.addInputPath(job1, new Path(args[0]));
    FileOutputFormat.setOutputPath(job1, new Path(args[3]));
    System.exit(job1.waitForCompletion(true) ? 0 : 1);
}
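
The genkey class itself is not posted. For reference, a class used as a map output key has to be serializable and sortable, which WritableComparable covers. Below is a minimal sketch of such a key; the single String field and the getkey() accessor are assumptions inferred from the reducer code above, not the actual implementation.

import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;
import org.apache.hadoop.io.WritableComparable;

// Hypothetical sketch of a key class like genkey: one String field,
// serialized via write()/readFields() and ordered via compareTo().
public class genkey implements WritableComparable<genkey> {
    private String key = "";

    public void setkey(String key) { this.key = key; }

    public String getkey() { return key; }

    @Override
    public void write(DataOutput out) throws IOException {
        out.writeUTF(key);          // serialize the single field
    }

    @Override
    public void readFields(DataInput in) throws IOException {
        key = in.readUTF();         // deserialize in the same order
    }

    @Override
    public int compareTo(genkey other) {
        return key.compareTo(other.key);   // used by the shuffle sort
    }

    @Override
    public int hashCode() {
        return key.hashCode();      // keeps HashPartitioner consistent with equals
    }

    @Override
    public boolean equals(Object o) {
        return o instanceof genkey && key.equals(((genkey) o).key);
    }
}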
Best answer
The error surfaces in the combine phase: Reducertwo is also registered as the combiner, and a combiner must emit the map output types (genkey, IntWritable), so its context.write(nw, result) call with a Text key fails. You have declared your reducer to write Text, IntWritable; if you want to write genkey, IntWritable, the reducer definition must look like this:
public static class Reducertwo extends Reducer<genkey, IntWritable, genkey, IntWritable>
Along the same lines, your job's output key class also needs to be set to:
job1.setOutputKeyClass(genkey.class);
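
A minimal sketch of the corrected reducer along those lines, assuming the same genkey class as above (the counting logic is taken from the posted code):

public static class Reducertwo
        extends Reducer<genkey, IntWritable, genkey, IntWritable> {

    private final IntWritable result = new IntWritable();

    @Override
    public void reduce(genkey ukey, Iterable<IntWritable> values, Context context)
            throws IOException, InterruptedException {
        int sum = 0;
        for (IntWritable val : values) {
            sum += val.get();   // aggregate counts for this key
        }
        result.set(sum);
        context.write(ukey, result);   // key type now matches the declared genkey
    }
}

with the matching driver settings:

job1.setOutputKeyClass(genkey.class);
job1.setOutputValueClass(IntWritable.class);
job1.setMapOutputKeyClass(genkey.class);
job1.setMapOutputValueClass(IntWritable.class);

With identical map and reduce output types, reusing Reducertwo as the combiner is also type-safe.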