The environment is as follows:

ubuntu-12.04.2-server-amd64

hadoop-1.0.4

VirtualBox


1. Install Ubuntu Server in VirtualBox

The username and password are both hadoop. After installation, clone the VM twice, giving three machines in total: one master and two slaves. The cloned slaves cannot reach the network. The cause is that cloning with "Reinitialize the MAC address of all network cards" gives each clone a new MAC address, while the MAC address cached by the Ubuntu system for the network card no longer matches the new one. The fix is to delete the cache file on each cloned machine with the following command:


sudo rm /etc/udev/rules.d/70-persistent-net.rules

(Hadoop *-site.xml settings: <configuration><property><name>…</name><value>…</value></property></configuration> blocks; the property names and values are not recoverable in this copy.)

The WordCount example (Hadoop 1.0.4 old MapReduce API):

package org.myorg;

import java.io.IOException;
import java.util.*;

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.*;
import org.apache.hadoop.mapred.*;

public class WordCount {

  // Mapper: split each input line into words, emit (word, 1).
  public static class Map extends MapReduceBase implements Mapper<LongWritable, Text, Text, IntWritable> {
    private final static IntWritable one = new IntWritable(1);
    private Text word = new Text();

    public void map(LongWritable key, Text value, OutputCollector<Text, IntWritable> output, Reporter reporter) throws IOException {
      String line = value.toString();
      StringTokenizer tokenizer = new StringTokenizer(line);
      while (tokenizer.hasMoreTokens()) {
        word.set(tokenizer.nextToken());
        output.collect(word, one);
      }
    }
  }

  // Reducer: sum the counts for each word.
  public static class Reduce extends MapReduceBase implements Reducer<Text, IntWritable, Text, IntWritable> {
    public void reduce(Text key, Iterator<IntWritable> values, OutputCollector<Text, IntWritable> output, Reporter reporter) throws IOException {
      int sum = 0;
      while (values.hasNext()) {
        sum += values.next().get();
      }
      output.collect(key, new IntWritable(sum));
    }
  }

  public static void main(String[] args) throws Exception {
    JobConf conf = new JobConf(WordCount.class);
    conf.setJobName("wordcount");

    conf.setOutputKeyClass(Text.class);
    conf.setOutputValueClass(IntWritable.class);

    conf.setMapperClass(Map.class);
    conf.setCombinerClass(Reduce.class);
    conf.setReducerClass(Reduce.class);

    conf.setInputFormat(TextInputFormat.class);
    conf.setOutputFormat(TextOutputFormat.class);

    FileInputFormat.setInputPaths(conf, new Path(args[0]));
    FileOutputFormat.setOutputPath(conf, new Path(args[1]));

    JobClient.runJob(conf);
  }
}
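The map-side tokenizing and reduce-side summing of WordCount can be exercised locally in plain Java, with no cluster involved. The LocalWordCount class below is a sketch for illustration only; it is not part of the Hadoop API:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.StringTokenizer;

public class LocalWordCount {
    // Tokenize the text (map phase) and sum occurrences per word (reduce phase),
    // mirroring what WordCount does across the cluster.
    public static Map<String, Integer> count(String text) {
        Map<String, Integer> counts = new HashMap<>();
        StringTokenizer tokenizer = new StringTokenizer(text);
        while (tokenizer.hasMoreTokens()) {
            counts.merge(tokenizer.nextToken(), 1, Integer::sum);
        }
        return counts;
    }

    public static void main(String[] args) {
        System.out.println(count("hello hadoop hello world"));
    }
}
```

On the cluster itself, the compiled WordCount class is packaged into a jar and submitted with bin/hadoop jar, as described in the official 1.0.4 tutorial linked below.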

  Reference links:

    http://blog.chinaunix.net/uid-26867092-id-3213709.html

    http://www.cnblogs.com/xia520pi/archive/2012/05/16/2503949.html
    Official installation guide for 1.0.4: http://hadoop.apache.org/docs/r1.0.4/index.html

    Official installation guide for 2.6.0: http://hadoop.apache.org/docs/r2.6.0/hadoop-project-dist/hadoop-common/SingleCluster.html

    Reposted from: http://www.cnblogs.com/sunjie21/archive/2013/04/01/2994309.html
    01-29 20:40