Error when running a jar file in Hadoop map-reduce

I am getting an error when running a jar file in Hadoop, and I can't work out what the problem is.

Below is the Mapper code

//Mapper class 
public static class E_EMapper extends MapReduceBase implements 
   Mapper<LongWritable ,/*Input key Type */ 
   Text,                /*Input value Type*/ 
   Text,                /*Output key Type*/ 
   IntWritable>        /*Output value Type*/ 
   { 

  //Map function 
  public void map(LongWritable key, Text value, 
  OutputCollector<Text, IntWritable> output,   
  Reporter reporter) throws IOException 
  { 
     String line = value.toString(); 
     String lasttoken = null; 
     StringTokenizer s = new StringTokenizer(line,"	"); 
     String year = s.nextToken(); 

     while(s.hasMoreTokens())
        {
           lasttoken=s.nextToken();
        } 

     int avgprice = Integer.parseInt(lasttoken); 
     output.collect(new Text(year), new IntWritable(avgprice)); 
  } 
} 
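
Since the question does not include the actual error message, one thing worth double-checking is the parsing in map(): the last tab-separated field of every input line is fed to Integer.parseInt, so any line whose last field is not a plain integer (a header row, an empty line, or a decimal value) will make the task fail with a NumberFormatException. Below is a minimal standalone sketch of the same parsing logic; the sample line and class name are purely illustrative and not from the original post.

import java.util.StringTokenizer;

public class MapLogicCheck {
   public static void main(String[] args) {
      // Hypothetical tab-separated record: year first, value of interest last
      String line = "1979\t23\t2\t43\t24\t25";
      StringTokenizer s = new StringTokenizer(line, "\t");
      String year = s.nextToken();          // first field: the year
      String lasttoken = null;
      while (s.hasMoreTokens()) {
         lasttoken = s.nextToken();         // keep only the last field
      }
      // The real job would fail here with NumberFormatException
      // if the last field is not a plain integer.
      int avgprice = Integer.parseInt(lasttoken);
      System.out.println(year + " -> " + avgprice);
   }
}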

Below is the Reducer code

//Reducer class 

public static class E_EReduce extends MapReduceBase implements 
 Reducer< Text, IntWritable, Text, IntWritable > 
   {  
  //Reduce function 
  public void reduce( Text key, Iterator <IntWritable> values, 
     OutputCollector<Text, IntWritable> output, Reporter reporter) throws IOException 
     { 
        int maxavg=30; 
        int val=Integer.MIN_VALUE; 

        while (values.hasNext()) 
        { 
           if((val=values.next().get())>maxavg) 
           { 
              output.collect(key, new IntWritable(val)); 
           } 
        } 

     } 
}  
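
For clarity, the reduce() above only emits a value for a key when that value exceeds maxavg (30); it does not compute a maximum or an average itself. A small standalone check of the same filter, using hypothetical values, looks like this:

import java.util.Arrays;
import java.util.Iterator;

public class ReduceLogicCheck {
   public static void main(String[] args) {
      int maxavg = 30;
      // Hypothetical values arriving for a single key
      Iterator<Integer> values = Arrays.asList(25, 31, 29, 40).iterator();
      while (values.hasNext()) {
         int val = values.next();
         if (val > maxavg) {
            // The real job would call output.collect(key, new IntWritable(val)) here
            System.out.println("emit: " + val);
         }
      }
   }
}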

Below is the main (driver) code

//Main function 
public static void main(String args[]) throws Exception 
{ 
   JobConf conf = new JobConf(ProcessUnits.class); 

   conf.setJobName("max_eletricityunits"); 
   conf.setOutputKeyClass(Text.class); 
   conf.setOutputValueClass(IntWritable.class); 
   conf.setMapperClass(E_EMapper.class); 
   conf.setCombinerClass(E_EReduce.class); 
   conf.setReducerClass(E_EReduce.class); 
   conf.setInputFormat(TextInputFormat.class); 
   conf.setOutputFormat(TextOutputFormat.class); 

   FileInputFormat.setInputPaths(conf, new Path(args[0])); 
   FileOutputFormat.setOutputPath(conf, new Path(args[1])); 

   JobClient.runJob(conf); 
} 
}
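
For completeness, these snippets use the old org.apache.hadoop.mapred API, so all three pieces are assumed to live inside a single ProcessUnits class with imports along these lines (a sketch, not taken from the original post):

import java.io.IOException;
import java.util.Iterator;
import java.util.StringTokenizer;

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.FileInputFormat;
import org.apache.hadoop.mapred.FileOutputFormat;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.MapReduceBase;
import org.apache.hadoop.mapred.Mapper;
import org.apache.hadoop.mapred.OutputCollector;
import org.apache.hadoop.mapred.Reducer;
import org.apache.hadoop.mapred.Reporter;
import org.apache.hadoop.mapred.TextInputFormat;
import org.apache.hadoop.mapred.TextOutputFormat;

A job packaged this way is normally launched with something like hadoop jar units.jar ProcessUnits <input_dir> <output_dir> (the jar name here is illustrative). Mixing these mapred imports with the newer org.apache.hadoop.mapreduce classes, or passing a class name on that command line that does not match the actual main class, are both common causes of errors when running the jar.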
