Flink Basics 02: Kafka WordCount

Posted by yin-fei


1. Start the ZooKeeper service

./bin/zookeeper-server-start.sh config/zookeeper.properties

2. Start the Kafka service

.\bin\windows\kafka-server-start.bat .\config\server.properties
./bin/kafka-server-start.sh config/server.properties

3. Create the topic

.\bin\windows\kafka-topics.bat --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic test_flink
./bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic test_flink
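
These topic commands assume an older Kafka distribution whose admin tools still go through ZooKeeper. If you run Kafka 2.2 or later, the same topic is created against the broker instead, roughly:

./bin/kafka-topics.sh --create --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1 --topic test_flink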

4. Create a console producer

.\bin\windows\kafka-console-producer.bat --broker-list localhost:9092 --topic test_flink
./bin/kafka-console-producer.sh --broker-list localhost:9092 --topic test_flink
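
Once the Flink job from step 5 is running, type a few whitespace-separated lines into the producer console, for example:

hello flink
hello kafka

Every two seconds the job emits the counts for the words seen in that window, so the lines above should print something like (hello,2), (flink,1), (kafka,1); the exact grouping depends on which 2-second window each line falls into.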

(Optional) Start a console consumer to check that messages are arriving on the topic:

.\bin\windows\kafka-console-consumer.bat --zookeeper localhost:2181 --topic test_flink
./bin/kafka-console-consumer.sh --zookeeper localhost:2181 --topic test_flink --from-beginning
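
On newer Kafka releases the console consumer no longer accepts --zookeeper; point it at the broker instead, for example:

./bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic test_flink --from-beginning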

5. The Flink word count job (Scala)

package flink

import java.util.Properties

import org.apache.flink.api.common.serialization.SimpleStringSchema
import org.apache.flink.core.fs.FileSystem.WriteMode
import org.apache.flink.streaming.api.scala.StreamExecutionEnvironment
import org.apache.flink.streaming.api.windowing.time.Time
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer011
import org.apache.flink.streaming.api.scala._

object KafkaWordCount {

  // Build a Kafka consumer that reads the test_flink topic as plain strings.
  def getFlinkKafkaConsumer(): FlinkKafkaConsumer011[String] = {
    val prop = new Properties()
    prop.setProperty("zookeeper.connect", "localhost:2181") // ZOOKEEPER_HOST (not required by the 0.11 consumer)
    prop.setProperty("bootstrap.servers", "localhost:9092") // KAFKA_BROKER
    prop.setProperty("group.id", "group1")                  // TRANSACTION_GROUP
    new FlinkKafkaConsumer011[String]("test_flink", new SimpleStringSchema(), prop) // TOPIC
  }

  def main(args: Array[String]): Unit = {

    val env = StreamExecutionEnvironment.getExecutionEnvironment

    val source = getFlinkKafkaConsumer()
    source.setStartFromEarliest() // read the topic from the beginning

    val dStream = env.addSource(source)

    // Split lines on whitespace and count each word in 2-second tumbling windows.
    val result = dStream.flatMap(x => x.split("\\s"))
      .map(x => (x, 1))
      .keyBy(0)
      .timeWindow(Time.seconds(2L))
      .sum(1)

    result.setParallelism(1).print()

    // Also write the windowed counts to a local file.
    result.writeAsText("E:\\sparkproject\\src\\test\\data\\result2.txt", WriteMode.OVERWRITE)

    env.execute("KafkaWordCount")
  }
}

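To compile the job you need the Flink Scala streaming API and the Kafka 0.11 connector on the classpath. A minimal build.sbt sketch, assuming Flink 1.7.2 and Scala 2.11 (these version numbers are assumptions; match them to your cluster):

// build.sbt -- dependency sketch only; versions are assumptions, adjust to your setup
scalaVersion := "2.11.12"

val flinkVersion = "1.7.2"

libraryDependencies ++= Seq(
  "org.apache.flink" %% "flink-scala"                % flinkVersion,
  "org.apache.flink" %% "flink-streaming-scala"      % flinkVersion,
  "org.apache.flink" %% "flink-connector-kafka-0.11" % flinkVersion
)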