Kafka 2.1.0 Java Consumer vs Scala Consumer


Question:

I would like to upgrade my team's Kafka clusters from version 0.10.1.1 to version 2.1.0. However, the official Kafka documentation says the following.

Kafka Official Documentation

Note that the older Scala clients, which are no longer maintained, do not support the message format introduced in 0.11, so to avoid conversion costs (or to take advantage of exactly once semantics), the newer Java clients must be used.

I do not understand that sentence well. Our team currently uses a Kafka consumer application written in Scala. Do we need to rewrite it in Java? I do not know exactly what disadvantages we would face if we keep using the Scala application as it is.


Answer 1:

I think you're confusing the old kafka.consumer and kafka.producer packages that lived in the Kafka core module with the new kafka-clients dependency, which is implemented in Java.

If your imports are the following, you are fine: you don't need to switch to different classes, and at most you may need to adjust a few method-call parameters after the upgrade.

org.apache.kafka.clients.consumer.KafkaConsumer
org.apache.kafka.clients.producer.KafkaProducer
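To make the distinction concrete, here is a minimal sketch of configuring the Java client from Scala code. The broker address, group id, and topic name are hypothetical placeholders; the commented-out lines show where the kafka-clients classes above would come in once that dependency is on the classpath.

```scala
import java.util.Properties

object ConsumerConfigSketch {
  // Build the standard configuration the Java KafkaConsumer expects.
  // "localhost:9092" and "my-group" are placeholder values for this sketch.
  def buildProps(bootstrap: String, group: String): Properties = {
    val props = new Properties()
    props.put("bootstrap.servers", bootstrap)
    props.put("group.id", group)
    props.put("key.deserializer",
      "org.apache.kafka.common.serialization.StringDeserializer")
    props.put("value.deserializer",
      "org.apache.kafka.common.serialization.StringDeserializer")
    props
  }

  def main(args: Array[String]): Unit = {
    val props = buildProps("localhost:9092", "my-group")
    // With kafka-clients on the classpath, the next steps would be:
    // val consumer =
    //   new org.apache.kafka.clients.consumer.KafkaConsumer[String, String](props)
    // consumer.subscribe(java.util.Arrays.asList("my-topic"))
    println(props.getProperty("group.id"))
  }
}
```

The point is that the "newer Java clients" the documentation refers to are just JVM classes, so a Scala application keeps using them directly; only the old kafka.consumer / kafka.producer (Scala) APIs would force a rewrite.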

should we turn this into Java? If we use the application written in Scala at present, I do not know exactly what disadvantages it can have

Java is more verbose and doesn't have as rich a type system as Scala. You're welcome to write the same code in Scala, Kotlin, Clojure, etc. At the end of the day, it all runs on the JVM.

  • Posted on 2019-01-04 21:48