
RabbitMQ Python Pika - Connection handling for multiple messages

I have been reading multiple blogs and documentation about properly setting up a RabbitMQ connection for publishing. Below is my scenario:

  • A few scheduled jobs execute certain tasks and publish their output to RabbitMQ
  • The jobs run at different time intervals, but the output is published to the same RabbitMQ queue

The following is the implementation:

import json

import pika

# config_reader is the application's own configuration helper; both methods
# below are part of the publisher class used by the scheduled jobs.

def get_credentials(self):
    print("Host", config_reader.get_lookup_data('RABBITMQ', 'host'))
    credentials = pika.PlainCredentials(config_reader.get_lookup_data('RABBITMQ', 'user'),
                                        config_reader.get_lookup_data('RABBITMQ', 'password'))
    return credentials

def publish_message(self, message):
    print("Publish message", message)
    # A brand-new connection and channel are opened for every single message
    connection = pika.BlockingConnection(
        pika.ConnectionParameters(host=config_reader.get_lookup_data('RABBITMQ', 'host'),
                                  credentials=self.get_credentials()))
    channel = connection.channel()
    # Passive declare: only checks that the exchange already exists
    channel.exchange_declare(exchange=config_reader.get_lookup_data('RABBITMQ', 'exchange'), passive=True)
    result = channel.queue_declare(exclusive=False,
                                   queue=config_reader.get_lookup_data('RABBITMQ', 'sensor_queue'))
    channel.queue_bind(result.method.queue,
                       exchange=config_reader.get_lookup_data('RABBITMQ', 'exchange'),
                       routing_key=config_reader.get_lookup_data('RABBITMQ', 'routing_key'))
    print('Publishing message', message)
    channel.basic_publish(exchange=config_reader.get_lookup_data('RABBITMQ', 'exchange'), body=json.dumps(message),
                          routing_key=config_reader.get_lookup_data('RABBITMQ', 'routing_key'),
                          properties=pika.BasicProperties(
                              headers={'Content-Type': 'application/json'}  # Add a key/value header
                          ))
    print('published')

With the above implementation, every job establishes a new connection and then a channel. I suspect this approach causes unnecessary overhead.

Can someone suggest the right way to handle the connection object? I personally feel that creating a connection for every message is certainly overhead.

https://www.rabbitmq.com/tutorials/amqp-concepts.html#amqp-connections

TL;DR

AMQP connections are long-lived. The handshake process for an AMQP connection is quite complex and requires at least 7 TCP packets (more if TLS is used). A best practice is to reuse connections and multiplex a connection between threads with channels.
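A minimal sketch of that idea with pika's BlockingConnection is shown below: open the connection once, keep it on the publisher object, and reconnect only if it has been closed. The class name, constructor parameters, and the single-retry error handling are illustrative assumptions, not part of the original code.

import json

import pika
import pika.exceptions


class LongLivedPublisher:
    """Sketch only: one long-lived connection reused for every publish."""

    def __init__(self, host, user, password, exchange, routing_key):
        self._params = pika.ConnectionParameters(
            host=host, credentials=pika.PlainCredentials(user, password))
        self._exchange = exchange
        self._routing_key = routing_key
        self._connection = None
        self._channel = None

    def _ensure_channel(self):
        # (Re)open the connection and channel only when needed.
        if self._connection is None or self._connection.is_closed:
            self._connection = pika.BlockingConnection(self._params)
            self._channel = self._connection.channel()

    def publish(self, message):
        self._ensure_channel()
        try:
            self._channel.basic_publish(
                exchange=self._exchange,
                routing_key=self._routing_key,
                body=json.dumps(message),
                properties=pika.BasicProperties(content_type='application/json'))
        except pika.exceptions.AMQPConnectionError:
            # Connection was dropped (e.g. broker restart, idle timeout):
            # reconnect and retry once.
            self._connection = None
            self._ensure_channel()
            self._channel.basic_publish(
                exchange=self._exchange,
                routing_key=self._routing_key,
                body=json.dumps(message),
                properties=pika.BasicProperties(content_type='application/json'))

    def close(self):
        if self._connection is not None and self._connection.is_open:
            self._connection.close()

With something like this, each scheduled job publishes through a shared instance instead of opening a fresh connection per message; only the first publish (or a publish after a disconnect) pays the handshake cost.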

Connection pool details: use a connection pool with a minimum of 10 connections. If more than 10 are needed, a new connection can be created, up to a maximum of 40 connections in the pool. An idle time limit can be set after which a connection is closed, so that connections do not stay open forever.
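Below is a rough sketch of such a pool under those assumptions (min 10, max 40, idle timeout). The class and parameter names are hypothetical and this is not a pika API; it only illustrates the bookkeeping the paragraph describes, and assumes each connection is used by one thread at a time between acquire and release.

import queue
import threading
import time

import pika


class SimpleConnectionPool:
    """Illustrative pool: pre-open min_size connections, grow up to max_size,
    and discard connections that have sat idle longer than idle_timeout."""

    def __init__(self, params, min_size=10, max_size=40, idle_timeout=300):
        self._params = params
        self._max_size = max_size
        self._idle_timeout = idle_timeout
        self._idle = queue.Queue()      # holds (connection, last_used) pairs
        self._lock = threading.Lock()
        self._created = 0
        for _ in range(min_size):
            self._idle.put((self._new_connection(), time.monotonic()))

    def _new_connection(self):
        with self._lock:
            if self._created >= self._max_size:
                raise RuntimeError("connection pool exhausted")
            self._created += 1
        return pika.BlockingConnection(self._params)

    def acquire(self):
        # Reuse an idle connection if it is still open and fresh enough,
        # otherwise discard it and open a new one (bounded by max_size).
        while True:
            try:
                conn, last_used = self._idle.get_nowait()
            except queue.Empty:
                return self._new_connection()
            if conn.is_open and time.monotonic() - last_used < self._idle_timeout:
                return conn
            with self._lock:
                self._created -= 1
            if conn.is_open:
                conn.close()

    def release(self, conn):
        self._idle.put((conn, time.monotonic()))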

Reference: https://www.cloudamqp.com/blog/2018-01-19-part4-rabbitmq-13-common-errors.html
