
How can I delete Topics using confluent-kafka-python

I am using Kafka to let multiple microservices communicate with each other. The services are written in Python, and I use the Confluent library to work with Kafka. At some point I know that certain topics are simply "finished", so I would like to clean them up automatically.

Is there a way to delete topics using the Confluent library? I cannot find any documentation on this...

Thanks

You can use the Confluent Admin API to delete topics.

Example

The following function takes an AdminClient instance and a list of topic names:

def example_delete_topics(a, topics):
    """ delete topics """

    # Call delete_topics to asynchronously delete topics, a future is returned.
    # By default this operation on the broker returns immediately while
    # topics are deleted in the background. But here we give it some time (30s)
    # to propagate in the cluster before returning.
    #
    # Returns a dict of <topic,future>.
    fs = a.delete_topics(topics, operation_timeout=30)

    # Wait for operation to finish.
    for topic, f in fs.items():
        try:
            f.result()  # The result itself is None
            print("Topic {} deleted".format(topic))
        except Exception as e:
            print("Failed to delete topic {}: {}".format(topic, e))
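The dict-of-futures shape that delete_topics() returns can be mimicked with the standard library's concurrent.futures, which makes the waiting loop above easy to try without a broker. This is only a sketch: fake_delete is a hypothetical stand-in for the broker-side delete, and the topic names are made up.

```python
from concurrent.futures import ThreadPoolExecutor

def fake_delete(topic):
    # Hypothetical stand-in for the broker call; real futures resolve to None.
    if topic == "missing":
        raise ValueError("unknown topic")
    return None

def wait_for_deletions(fs):
    # Consume a dict of <topic, future>, the same shape delete_topics() returns.
    outcome = {}
    for topic, f in fs.items():
        try:
            f.result()  # raises if the delete failed
            outcome[topic] = "deleted"
        except Exception as e:
            outcome[topic] = "failed: {}".format(e)
    return outcome

with ThreadPoolExecutor() as ex:
    futures = {t: ex.submit(fake_delete, t) for t in ["orders", "missing"]}
result = wait_for_deletions(futures)
```

Here result maps each topic to "deleted" or a "failed: ..." message, mirroring the success/error branches in the function above.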

To answer your question literally: no, AFAIK you can't delete topics using the client library.

This is possible using the AdminClient API.

To address some of your follow-up questions, I would suggest there is no point deleting these topics. If you're running a sandbox/prototype, then maybe you end up with lots of topics you want to clear down after experimenting. But once you move these microservices into a production environment, each topic will serve a purpose: perhaps it has finished processing the current set of data, but what about the next set? Unless you're doing something unusual with your topic naming that ties it to specific attributes in the data, topics will be re-used.

I use this simple Python function:

from subprocess import call

def delete_kafka_topic(topic_name):
    # Shells out to the Kafka CLI; assumes kafka-topics is installed at this
    # path and a ZooKeeper node named zookeeper-1 is reachable.
    call(["/usr/bin/kafka-topics", "--zookeeper", "zookeeper-1:2181",
          "--delete", "--topic", topic_name])

My team uses this in automated tests, where we want to be able to rerun tests, verify results, and not see results from previous test attempts.
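As a side note, newer Kafka releases removed the --zookeeper flag from the kafka-topics tool in favour of --bootstrap-server, so a variant of the helper above might build the command like this. The binary lookup and the broker address are assumptions, not part of the original answer.

```python
import shutil
from subprocess import call

def build_delete_cmd(topic_name, bootstrap_server="localhost:9092"):
    # Prefer a kafka-topics found on PATH; fall back to the path used above.
    binary = shutil.which("kafka-topics") or "/usr/bin/kafka-topics"
    # --bootstrap-server (talking to a broker directly) replaced --zookeeper,
    # which was removed from the tool in Kafka 3.0.
    return [binary, "--bootstrap-server", bootstrap_server,
            "--delete", "--topic", topic_name]

def delete_kafka_topic(topic_name):
    # call() returns the CLI's exit code (0 on success).
    return call(build_delete_cmd(topic_name))
```

Separating command construction from execution also makes the helper easy to unit-test without a running cluster.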

Simplest example (note that this uses the kafka-python library, not confluent-kafka):

from kafka import KafkaAdminClient


admin_client = KafkaAdminClient(bootstrap_servers=['localhost:port'])
admin_client.delete_topics(topics=['test1', 'test2'])

Doc: https://docs.confluent.io/platform/current/clients/confluent-kafka-python/html/index.html#confluent_kafka.admin.AdminClient.delete_topics

