Is there a Kafka client API for Scala?

Problem description

I am just starting with Kafka, it sounds really good for Microservices, but I work essentially in Scala.

I added kafka to my sbt project with this:

libraryDependencies += "org.apache.kafka" %% "kafka" % "2.0.0"
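
(Side note, not part of the original question: the producer classes imported below also ship in the standalone kafka-clients artifact, so a dependency on just the Java client, assuming the same 2.0.0 version, could look like the line below. Note the single %, since kafka-clients is a plain Java artifact with no Scala version suffix.)

libraryDependencies += "org.apache.kafka" % "kafka-clients" % "2.0.0"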

Then I do this:

import org.apache.kafka.clients.producer.{Callback, KafkaProducer, Producer, ProducerRecord}

...

val producer = new KafkaProducer[String, String](props)
val record = new ProducerRecord[String, String]("my-topic", "key", "value")
val fut = producer.send(record, callBack)
...

My problem here is that I am not getting a Scala Future when I call producer.send, it is a Java Future. I don't know how Java Futures work, and I would prefer to skip that learning curve. This time it is Future, but I mean Java in general.

So I am wondering if there is a full Scala API to work with Kafka. It should normally be the case since Kafka is written in Scala.

Recommended answer

From the notable changes in Kafka 2.0.0:

The Scala producers, which have been deprecated since 0.10.0.0, have been removed. The Java producer has been the recommended option since 0.9.0.0. Note that the behaviour of the default partitioner in the Java producer differs from the default partitioner in the Scala producers. Users migrating should consider configuring a custom partitioner that retains the previous behaviour. Note that the Scala producers in 1.1.0 (and older) will continue to work even if the brokers are upgraded to 2.0.0.
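
Since the Java producer is the recommended client, one common way to get a Scala Future back from send is to wrap the callback overload of send in a Promise. The sketch below is only an illustration: sendAsScalaFuture is a made-up helper name, and the broker address and serializers are assumptions, but every call it makes is part of the standard kafka-clients Java API.

import java.util.Properties

import scala.concurrent.{Future, Promise}

import org.apache.kafka.clients.producer.{Callback, KafkaProducer, ProducerRecord, RecordMetadata}

// Hypothetical helper (the name is mine, not part of the Kafka API):
// adapts the callback overload of send() to a scala.concurrent.Future.
def sendAsScalaFuture(producer: KafkaProducer[String, String],
                      record: ProducerRecord[String, String]): Future[RecordMetadata] = {
  val promise = Promise[RecordMetadata]()
  producer.send(record, new Callback {
    override def onCompletion(metadata: RecordMetadata, exception: Exception): Unit =
      if (exception == null) promise.success(metadata) // write acknowledged by the broker
      else promise.failure(exception)                  // send failed
  })
  promise.future
}

// Usage, assuming a local broker and String serializers:
val props = new Properties()
props.put("bootstrap.servers", "localhost:9092")
props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer")
props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer")

val producer = new KafkaProducer[String, String](props)
val scalaFut: Future[RecordMetadata] =
  sendAsScalaFuture(producer, new ProducerRecord[String, String]("my-topic", "key", "value"))

With that, scalaFut can be composed with map, flatMap, or onComplete like any other scala.concurrent.Future, which sidesteps the Java Future question entirely.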
