
Problem Description

I have a Kafka topic called A.

The messages look like this:

{ id : 1, name:stackoverflow, created_at:2017-09-28 22:30:00.000}
{ id : 2, name:confluent, created_at:2017-09-28 22:00:00.000}
{ id : 3, name:kafka, created_at:2017-09-28 24:42:00.000}
{ id : 4, name:apache, created_at:2017-09-28 24:41:00.000}

Now, on the consumer side, I want to get only the latest data within a one-hour window. That is, every hour I need to fetch the latest value from the topic, based on created_at.

My expected output is:

{ id : 1, name:stackoverflow, created_at:2017-09-28 22:30:00.000}
{ id : 3, name:kafka, created_at:2017-09-28 24:42:00.000}

I think this can be solved with KSQL, but I'm not sure. Please help me.

Thanks.

Answer

Yes, you can use KSQL for this. Try the following:

CREATE STREAM S1 (id BIGINT, name VARCHAR, created_at VARCHAR) WITH (kafka_topic = 'topic_name', value_format = 'JSON');

CREATE TABLE maxRow AS SELECT id, name, MAX(STRINGTOTIMESTAMP(created_at, 'yyyy-MM-dd HH:mm:ss.SSS')) AS created_at FROM S1 WINDOW TUMBLING (SIZE 1 HOUR) GROUP BY id, name;

The result will have the created_at time as a Unix timestamp in milliseconds. You can convert it back into your desired string format using the TIMESTAMPTOSTRING UDF in a new query. Please let me know if you find any issues.
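For instance, a follow-up query along these lines should do the conversion (a minimal sketch, not tested; it assumes the maxRow table created above, and created_at_str is just an illustrative alias -- the format string mirrors the one passed to STRINGTOTIMESTAMP):

-- Assumes the maxRow table from the previous step exists.
-- TIMESTAMPTOSTRING converts epoch milliseconds back into a formatted string.
SELECT id, name, TIMESTAMPTOSTRING(created_at, 'yyyy-MM-dd HH:mm:ss.SSS') AS created_at_str FROM maxRow;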
