Writing a Simple WordCount Word-Frequency Program with Spark
Preface: This article, compiled by the editors of cha138.com, introduces how to write a simple WordCount word-frequency program with Spark. We hope it is of some reference value to you.
import java.util.Arrays;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.FlatMapFunction;
import org.apache.spark.api.java.function.Function2;
import org.apache.spark.api.java.function.PairFunction;
import org.apache.spark.api.java.function.VoidFunction;

import scala.Tuple2;

public class WordCountLocal {
    public static void main(String[] args) {
        // Run Spark locally with 2 worker threads
        SparkConf conf = new SparkConf().setAppName("WordCountLocal").setMaster("local[2]");
        JavaSparkContext sc = new JavaSparkContext(conf);
        // Read the input file and split each line into words
        JavaRDD<String> words = sc.textFile("C://words.txt").flatMap(new FlatMapFunction<String, String>() {
            @Override
            public Iterable<String> call(String line) throws Exception {
                return Arrays.asList(line.split(" "));
            }
        });
        // Map each word to a (word, 1) pair
        JavaPairRDD<String, Integer> mapToPair = words.mapToPair(new PairFunction<String, String, Integer>() {
            @Override
            public Tuple2<String, Integer> call(String word) throws Exception {
                return new Tuple2<String, Integer>(word, 1);
            }
        });
        // Sum the counts for each word
        JavaPairRDD<String, Integer> result = mapToPair.reduceByKey(new Function2<Integer, Integer, Integer>() {
            @Override
            public Integer call(Integer v1, Integer v2) throws Exception {
                return v1 + v2;
            }
        });
        // Print every (word, count) pair
        result.foreach(new VoidFunction<Tuple2<String, Integer>>() {
            @Override
            public void call(Tuple2<String, Integer> wordCount) throws Exception {
                System.out.println(wordCount._1 + " appears " + wordCount._2 + " times!");
            }
        });
        sc.close();
    }
}
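For comparison, here is a minimal sketch of the same job written with Java 8 lambdas against the Spark 2.x Java API (where flatMap expects an Iterator rather than an Iterable). The class name WordCountLambda and the input path C://words.txt are placeholders carried over from the example above; adjust them to your own environment.

import java.util.Arrays;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaSparkContext;

import scala.Tuple2;

public class WordCountLambda {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("WordCountLambda").setMaster("local[2]");
        JavaSparkContext sc = new JavaSparkContext(conf);

        JavaPairRDD<String, Integer> counts = sc.textFile("C://words.txt")   // assumed input path
                .flatMap(line -> Arrays.asList(line.split(" ")).iterator())  // Spark 2.x: return an Iterator of words
                .mapToPair(word -> new Tuple2<>(word, 1))                    // emit a (word, 1) pair for every word
                .reduceByKey((a, b) -> a + b);                               // sum the 1s per distinct word

        counts.foreach(pair -> System.out.println(pair._1 + " appears " + pair._2 + " times!"));
        sc.close();
    }
}

As a quick sanity check: if words.txt contains the single line "hello world hello", the output should include "hello appears 2 times!" and "world appears 1 times!".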
This article is from the 星月情缘 blog. Please keep the original source when reposting: http://xuegodxingyue.blog.51cto.com/5989753/1933462
The above covers the main points of writing a simple WordCount word-frequency program with Spark. If it did not solve your problem, see the following article:
Using Scala with the Flink framework for a WordCount word-frequency test of how different parallelism settings affect computation speed