Integrating Kafka with Trident
Posted by tangzhe
First, write a print function, KafkaPrintFunction. It reads the field named "str" (the field name emitted by StringScheme), prints the message, and re-emits it:
import org.apache.storm.trident.operation.BaseFunction;
import org.apache.storm.trident.operation.TridentCollector;
import org.apache.storm.trident.tuple.TridentTuple;
import org.apache.storm.tuple.Values;

public class KafkaPrintFunction extends BaseFunction {

    @Override
    public void execute(TridentTuple input, TridentCollector collector) {
        String msg = input.getStringByField("str");
        System.out.println(this.getClass().getSimpleName() + ": " + msg);
        collector.emit(new Values(msg));
    }
}
Next, write the topology that wires the Kafka spout into Trident:
import net.baiqu.shop.report.utils.Constants;
import org.apache.storm.Config;
import org.apache.storm.LocalCluster;
import org.apache.storm.kafka.BrokerHosts;
import org.apache.storm.kafka.StringScheme;
import org.apache.storm.kafka.ZkHosts;
import org.apache.storm.kafka.trident.TransactionalTridentKafkaSpout;
import org.apache.storm.kafka.trident.TridentKafkaConfig;
import org.apache.storm.spout.SchemeAsMultiScheme;
import org.apache.storm.trident.Stream;
import org.apache.storm.trident.TridentTopology;
import org.apache.storm.tuple.Fields;

/**
 * Connects Kafka to a Trident topology.
 */
public class KafkaTrident {

    public static void main(String[] args) {
        TridentTopology topology = new TridentTopology();

        BrokerHosts hosts = new ZkHosts(Constants.ZK_HOSTS);
        String topic = "tridentTestTopic";
        String id = "kafka.queue.tridentTestTopic";
        TridentKafkaConfig kafkaConfig = new TridentKafkaConfig(hosts, topic, id);
        kafkaConfig.scheme = new SchemeAsMultiScheme(new StringScheme());
        TransactionalTridentKafkaSpout kafkaSpout = new TransactionalTridentKafkaSpout(kafkaConfig);

        Stream stream = topology.newStream("kafkaSpout", kafkaSpout);
        stream.shuffle().each(new Fields("str"), new KafkaPrintFunction(), new Fields("result"));

        LocalCluster cluster = new LocalCluster();
        cluster.submitTopology("kafkaTridentDemo", new Config(), topology.build());
    }
}
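The topology reads the ZooKeeper connection string from Constants.ZK_HOSTS, which is not shown in the original post. Below is a minimal sketch of what that class might look like; the package matches the import above, but the address is an assumption and should be replaced with your own ZooKeeper connection string.

package net.baiqu.shop.report.utils;

/**
 * Minimal sketch of the Constants class assumed by KafkaTrident.
 * The ZooKeeper address below is a placeholder; point it at the
 * ZooKeeper ensemble used by your Kafka cluster.
 */
public class Constants {
    public static final String ZK_HOSTS = "localhost:2181";
}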
In a separate Java project, send test data to Kafka on a schedule:
@Scheduled(fixedRate = 3000)
public void shopDataTestJob9() {
    for (int i = 0; i < 1; i++) {
        kafkaTemplate.send("tridentTestTopic", "test kafka trident");
        System.out.println("test kafka trident");
    }
}
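This snippet assumes a Spring application with spring-kafka and scheduling enabled, where kafkaTemplate is an injected KafkaTemplate<String, String>. A minimal sketch of an enclosing class (the class name and bean wiring are assumptions, not part of the original post):

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;

// Hypothetical enclosing class for the scheduled producer job.
// Requires @EnableScheduling on a configuration class and a
// KafkaTemplate bean configured against the same Kafka broker.
@Component
public class TridentTestProducerJob {

    @Autowired
    private KafkaTemplate<String, String> kafkaTemplate;

    // Sends one test message to tridentTestTopic every 3 seconds.
    @Scheduled(fixedRate = 3000)
    public void shopDataTestJob9() {
        for (int i = 0; i < 1; i++) {
            kafkaTemplate.send("tridentTestTopic", "test kafka trident");
            System.out.println("test kafka trident");
        }
    }
}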
Finally, run the Storm project and the Java project. Run the Java producer project first so that it sends data to Kafka and creates the topic; only then can Storm consume from it.
Watch the output: the Storm project's console prints
KafkaPrintFunction: test kafka trident
KafkaPrintFunction: test kafka trident
KafkaPrintFunction: test kafka trident
which shows that Storm Trident is successfully consuming data from Kafka.