Read from a Kafka topic, process the data, and write back to a Kafka topic using Scala and Spark
How to read data from a Kafka topic in Scala
I am trying to loop over the data we receive from a Kafka topic, but the code below fails with a type mismatch:
def consume[K, V](consumer: KafkaConsumer[K, V], topic: String, timeoutMillis: Long): Unit = {
  logger.info(s"Start to consume from $topic")
  consumer.subscribe(List(topic).asJavaCollection)
  Try {
    while (true) {
      val records: ConsumerRecords[K, V] = consumer.poll(Duration.ofMillis(timeoutMillis))
      records.iterator().forEachRemaining { record: ConsumerRecord[K, V] =>
        logger.info(s"""
          |message
          |  offset=${record.offset}
          |  partition=${record.partition}
          |  key=${record.key}
          |  value=${record.value}
        """.stripMargin)
      }
    }
  } match {
    case Success(_) =>
      logger.info(s"Finish to consume from $topic")
    case Failure(exception) =>
      logger.error(s"Finish to consume from $topic with error", exception)
  }
  consumer.close()
}
The error I get when compiling this snippet is:

C:\Users\abcde\IdeaProjects\demo\src\main\scala\com\cbe\mem\xerox\KakfaHelper.scala:24:76
type mismatch;
 found   : org.apache.kafka.clients.consumer.ConsumerRecord[K,V] => Unit
 required: java.util.function.Consumer[_ >: org.apache.kafka.clients.consumer.ConsumerRecord[K,V]]
records.iterator().forEachRemaining { record: ConsumerRecord[K, V] =>

Please let me know how to get this snippet running. Thanks in advance.
Typically, you would convert the polled records into a Scala collection and iterate over it with foreach:
import scala.collection.JavaConverters._
val records = consumer.poll(Duration.ofMillis(timeoutMillis)).asScala
records.foreach(record =>
logger.info(s"""
|message
| offset=${record.offset}
| partition=${record.partition}
| key=${record.key}
| value=${record.value}
""".stripMargin)
)
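For completeness, here is a sketch of the whole consume method rewritten with that conversion. It assumes the same logger and Kafka client dependencies as in the question; only the iteration changes. (On Scala 2.13+, scala.collection.JavaConverters is deprecated in favour of scala.jdk.CollectionConverters, which provides the same asScala.)

```scala
import java.time.Duration
import org.apache.kafka.clients.consumer.KafkaConsumer
import scala.collection.JavaConverters._ // use scala.jdk.CollectionConverters._ on 2.13+
import scala.util.{Failure, Success, Try}

def consume[K, V](consumer: KafkaConsumer[K, V], topic: String, timeoutMillis: Long): Unit = {
  logger.info(s"Start to consume from $topic")
  consumer.subscribe(List(topic).asJavaCollection)
  Try {
    while (true) {
      // asScala wraps the Java Iterable in a Scala one, so an ordinary
      // Scala function literal works with foreach -- no SAM conversion needed
      val records = consumer.poll(Duration.ofMillis(timeoutMillis)).asScala
      records.foreach { record =>
        logger.info(
          s"offset=${record.offset} partition=${record.partition} " +
            s"key=${record.key} value=${record.value}")
      }
    }
  } match {
    case Success(_) =>
      logger.info(s"Finish to consume from $topic")
    case Failure(exception) =>
      logger.error(s"Finish to consume from $topic with error", exception)
  }
  consumer.close()
}
```

As an aside, the original forEachRemaining call compiles as-is on Scala 2.12+, where function literals are automatically converted to Java SAM types such as java.util.function.Consumer; the error in the question suggests Scala 2.11, where you would need to pass an explicit anonymous Consumer instance instead.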