I am trying to write a simple MapReduce program to find the largest prime number, using the new API (0.20.2). This is what my Map and Reduce classes look like...
public class PrimeNumberMap extends Mapper<LongWritable, Text, IntWritable, IntWritable> {

    public void map(LongWritable key, Text Kvalue, Context context) throws IOException, InterruptedException
    {
        Integer value = new Integer(Kvalue.toString());
        if (isNumberPrime(value))
        {
            context.write(new IntWritable(value), new IntWritable(new Integer(key.toString())));
        }
    }

    boolean isNumberPrime(Integer number)
    {
        if (number == 1) return false;
        if (number == 2) return true;
        for (int counter = 2; counter < (number / 2); counter++)
        {
            if (number % counter == 0)
                return false;
        }
        return true;
    }
}
public class PrimeNumberReduce extends Reducer<IntWritable, IntWritable, IntWritable, IntWritable> {

    public void reduce(IntWritable primeNo, Iterable<IntWritable> Values, Context context) throws IOException, InterruptedException
    {
        int maxValue = Integer.MIN_VALUE;
        for (IntWritable value : Values)
        {
            maxValue = Math.max(maxValue, value.get());
        }
        //output.collect(primeNo, new IntWritable(maxValue));
        context.write(primeNo, new IntWritable(maxValue));
    }
}
public static void main(String[] args) throws IOException, InterruptedException, ClassNotFoundException {
    if (args.length == 0)
    {
        System.err.println(" Usage:\n\tPrimenumber <input Directory> <output Directory>");
        System.exit(-1);
    }
    Job job = new Job();
    job.setJarByClass(Main.class);
    job.setJobName("Prime");
    // Creating job configuration object
    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));
    job.setMapOutputKeyClass(IntWritable.class);
    job.setMapOutputValueClass(IntWritable.class);
    job.setOutputKeyClass(IntWritable.class);
    job.setOutputValueClass(IntWritable.class);
    String star = "*********************************************";
    System.out.println(star + "\n Prime number computer \n" + star);
    System.out.println(" Application started ... keeping fingers crossed :/ ");
    System.exit(job.waitForCompletion(true) ? 0 : 1);
}
}
I am still getting an error about a key type mismatch from the map:

Type mismatch in key from map: expected org.apache.hadoop.io.IntWritable, recieved org.apache.hadoop.io.LongWritable

Can anyone suggest what the problem is? I have tried everything, by hook or by crook.
Best Answer
You have not configured the Mapper or Reducer class in your main block, so the default Mapper (known as the identity mapper) is used. It outputs every key/value pair it receives as input unchanged, which is why a LongWritable ends up as the map output key:
job.setMapperClass(PrimeNumberMap.class);
job.setReducerClass(PrimeNumberReduce.class);
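For completeness, here is a minimal sketch of the corrected driver with those two calls added. It assumes the driver class is named Main (taken from the question's setJarByClass call) and relies on the default TextInputFormat, so the line offsets arrive as LongWritable keys and the line text as Text values, matching the mapper's input types:

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class Main {
    public static void main(String[] args) throws Exception {
        if (args.length != 2) {
            System.err.println("Usage:\n\tPrimenumber <input Directory> <output Directory>");
            System.exit(-1);
        }

        Job job = new Job();
        job.setJarByClass(Main.class);
        job.setJobName("Prime");

        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));

        // Wire in the user-defined Mapper and Reducer. Without these two calls the
        // identity mapper runs instead and emits LongWritable keys, which is what
        // triggers the "Type mismatch in key from map" error.
        job.setMapperClass(PrimeNumberMap.class);
        job.setReducerClass(PrimeNumberReduce.class);

        job.setMapOutputKeyClass(IntWritable.class);
        job.setMapOutputValueClass(IntWritable.class);
        job.setOutputKeyClass(IntWritable.class);
        job.setOutputValueClass(IntWritable.class);

        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}

The existing setMapOutputKeyClass/setOutputKeyClass calls can stay as they are; once the mapper is actually wired in, the map output key really is an IntWritable and the mismatch goes away.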