
Respond to multiple incoming Slack-Bot requests in parallel

As a novice in this field, I have been trying to work around the GIL limitation while using threads to process multiple incoming RTM events for my custom Slack bot (written in Python).

Use Case: A Slack bot that is added to several channels and has to serve multiple requests (bot commands) in parallel.

Challenge faced: Since threads in Python are subject to the GIL, the incoming bot requests are not actually executed in parallel.

Solutions looked into: As an alternative, I was looking into multiprocessing.Pool, which would let me spawn several workers so that each request is served by its own process.

Question: Since I am polling for incoming events indefinitely within a while loop, I need a way to spawn processes that respond to each incoming request without blocking the processing of another request (which could be posted from another channel at the same time) and without exhausting all the available memory.

Code:

import time
from threading import Thread

from slackclient import SlackClient

slack_client = SlackClient(<bot_token>)  # placeholder for the bot token
if slack_client.rtm_connect(auto_reconnect=True):
    while True:
        incoming_events = slack_client.rtm_read()
        command = parse_bot_mention(incoming_events)  # returns the command issued to the bot, if any
        if command:
            # args must be a tuple, hence the trailing comma
            handle_command_thread = Thread(target=handle_bot_command, args=(command,))
            handle_command_thread.start()
        time.sleep(1)  # RTM read delay of 1 sec

This approach works fine when I have the bot subscribed to a single channel and multiple commands are issued to it. The problem is that when there are multiple channels, each with many participants, the response time becomes very long.

Is there any approach/programming paradigm that can be adopted to address this?

Seems you're encountering one of Python's weaknesses: handling concurrency. Dude, that's a painful one! I'll throw you some of my thoughts within the confines of your question, but not before I commit my biggest StackOverflow pet peeve... answering your question by telling you to try something else entirely...

Annoying Answer: The use case you are describing seems to require a tool that makes concurrency a first-class citizen, which is why I would recommend you look into using Golang. Coming from a pretty heavy Python background, I found it wasn't extremely difficult to pick up Golang. Golang uses goroutines to easily handle problems like the one you are having. It also has a bunch of other really nice features you should check out (like typing... ooh yea!). It's a little strange at first if you're used to Python dev work, but it's fairly simple after you get the concepts!

Not Annoying Answer: Okay, if you read the above, I appreciate it. Let me pitch a few ideas within the confines of your question.

  1. One option would be to use the multiprocessing library, like you are mentioning. Your best bet would probably be to instantiate a pool of workers. As the program works through the while loop and registers that a command has been given, it passes that command off to an idle worker to get the job done.

So you'd create a function to call from inside the while loop, something like:

def dispatch_worker_boi(pool, command):
    # hand the command off to an idle worker without blocking the polling loop (see Pool.apply_async)
    pool.apply_async(handle_bot_command, args=(command,))
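To make that concrete, here is a minimal sketch of the whole loop with a worker pool, reusing parse_bot_mention and handle_bot_command from your snippet; the pool size of 4 is an arbitrary choice:

import time
from multiprocessing import Pool

def run_bot(slack_client, pool_size=4):
    # a fixed-size pool caps memory use: at most pool_size commands are
    # processed at once, the rest wait in the pool's internal queue
    with Pool(processes=pool_size) as pool:
        while True:
            incoming_events = slack_client.rtm_read()
            command = parse_bot_mention(incoming_events)
            if command:
                # apply_async returns immediately; a worker process does the actual work
                # (handle_bot_command must be defined at module level so it can be pickled)
                pool.apply_async(handle_bot_command, args=(command,))
            time.sleep(1)  # RTM read delay of 1 sec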
  2. The only other idea that comes to mind is the asyncio package. If your bot spends most of its time waiting around and listening for commands, it may come in handy. I'll be honest though, I have much less experience with that package. Worth a look! A rough sketch follows below.
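I haven't used it much, so take this as a minimal sketch only. It assumes handle_bot_command can be rewritten as a coroutine (i.e. its slow parts are awaitable I/O) and keeps the same synchronous rtm_read polling from your code:

import asyncio

async def handle_bot_command_async(command):
    # assumption: the handler's slow parts are I/O (Slack/web calls) that can be awaited
    await asyncio.sleep(0)  # placeholder for the real awaitable work
    print("handled", command)

async def poll_rtm(slack_client):
    while True:
        incoming_events = slack_client.rtm_read()  # a quick, non-blocking read
        command = parse_bot_mention(incoming_events)
        if command:
            # schedule the handler as a task and keep polling without waiting for it
            asyncio.create_task(handle_bot_command_async(command))
        await asyncio.sleep(1)  # RTM read delay of 1 sec

# asyncio.run(poll_rtm(slack_client))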

Final Notes: Another method, as listed by Chiragjn, would be to use some type of queue service like Celery. This would allow you to stick with Python and get much more scalability than a single in-process approach. To be honest though, if you're dealing with a large amount of scale, Golang would serve you well.
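For completeness, a minimal sketch of the Celery route (the Redis broker URL is a placeholder, and handle_bot_command is the same function as in your question):

# tasks.py
from celery import Celery

app = Celery("slack_bot", broker="redis://localhost:6379/0")  # placeholder broker URL

@app.task
def handle_bot_command_task(command):
    # runs inside a separate Celery worker process, not in the bot's polling loop
    handle_bot_command(command)

In the polling loop you would then call handle_bot_command_task.delay(command), which only enqueues the job and returns, and run the workers separately with something like celery -A tasks worker.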

Hope this was somewhat helpful, always appreciate feedback if you have any!
