
Respond to multiple incoming Slack-Bot requests in parallel

As a novice in this field, I have been trying to work around the GIL limitation while using threads to process multiple incoming RTM events for my custom Slack bot (written in Python).

Use Case: A Slack bot that is added to several channels and has to serve multiple requests (bot commands) in parallel.

Challenge faced: Since CPython threads are subject to the GIL, only one thread executes Python bytecode at a time, so the incoming bot requests are not actually executed in parallel.

Solutions looked into: As an alternative, I was looking into multiprocessing.Pool, which would let me spawn a number of worker processes, each serving a request independently of the others.

Question: Since I am polling for incoming events indefinitely inside a while loop, I need a way to spawn processes that respond to each incoming request without blocking the processing of other requests (which could be posted from other channels at the same time) and without exhausting the available memory.

Code:

from threading import Thread
import time

from slackclient import SlackClient

slack_client = SlackClient(<bot_token>)
if slack_client.rtm_connect(auto_reconnect=True):
    while True:
        incoming_events = slack_client.rtm_read()
        command = parse_bot_mention(incoming_events)  # returns the command issued to the bot, if any
        if command:
            # args must be a tuple: (command,) rather than (command)
            handle_command_thread = Thread(target=handle_bot_command, args=(command,))
            handle_command_thread.start()
        time.sleep(1)  # RTM read delay of 1 second

This approach works fine when the bot is subscribed to a single channel and multiple commands are issued to it. The problem is that when there are multiple channels with many participants per channel, the response time becomes very long.

Is there any approach/programming paradigm that can be adopted to address this?

Seems you're encountering one of Python's weaknesses: handling concurrency. Dude, that's a painful one! I'll throw you some of my thoughts within the confines of your question, but not before I commit one of my biggest Stack Overflow pet peeves... answering your question by telling you to try something else entirely...

Annoying Answer: The use case you are describing seems to require a tool that makes concurrency a first-class citizen, which is why I would recommend you look into Golang. Coming from a pretty heavy Python background, I found it wasn't extremely difficult to pick up. Golang uses "goroutines" to easily handle problems like the one you are having. It also has a bunch of other really nice features you should check out (like typing... ooh yeah!). It's a little strange at first if you're used to Python dev work, but it's fairly simple after you get the concepts.

Not Annoying Answer: Okay, if you read the above, I appreciate it; now let me pitch a few ideas within the confines of your question.

  1. One option would be to use the multiprocessing library, as you mention. Your best bet would probably be to instantiate a pool of workers. As the program works through the while loop and registers that a command has been given, it passes that command off to an idle worker to get the job done (a fuller sketch of wiring this into your loop follows after this list).

So you'd create a function to call from inside the while loop, something like:

def dispatch_worker(pool, command):
    # hand the command off to an idle worker without blocking the loop (see Pool.apply_async)
    pool.apply_async(handle_bot_command, args=(command,))
  2. The only other idea that comes to mind is the asyncio package. If your bot spends its time waiting around listening for command responses, it may come in handy. I'll be honest though, I have much less experience with that package. Worth a look! (A rough sketch is also included after this list.)
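
To make option 1 concrete, here is a minimal, untested sketch of how the dispatcher above could be wired into your existing loop. The pool size of 4 is an arbitrary illustrative choice, and parse_bot_mention / handle_bot_command are your own functions, assumed to be defined elsewhere:

from multiprocessing import Pool
import time

from slackclient import SlackClient

if __name__ == "__main__":       # guard is required for multiprocessing on Windows
    pool = Pool(processes=4)     # a fixed pool of workers caps memory use
    slack_client = SlackClient(<bot_token>)
    if slack_client.rtm_connect(auto_reconnect=True):
        while True:
            command = parse_bot_mention(slack_client.rtm_read())
            if command:
                dispatch_worker(pool, command)  # non-blocking hand-off, defined above
            time.sleep(1)

Because apply_async returns immediately, the RTM loop keeps polling while a worker handles each command, and the fixed pool size keeps the number of live processes bounded.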
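
For option 2, a rough sketch of the asyncio flavour of the same idea (Python 3.7+). This is an assumption-heavy illustration, not the library's prescribed pattern: it keeps your blocking handle_bot_command by offloading it to an executor, and in a real asyncio design you would ideally swap the blocking RTM client for an async Slack client:

import asyncio

from slackclient import SlackClient

async def handle_command_async(command):
    # run the existing blocking handler in the default thread pool executor
    loop = asyncio.get_running_loop()
    await loop.run_in_executor(None, handle_bot_command, command)

async def main():
    slack_client = SlackClient(<bot_token>)
    if slack_client.rtm_connect(auto_reconnect=True):
        while True:
            command = parse_bot_mention(slack_client.rtm_read())
            if command:
                # schedule the handler and keep polling without waiting for it
                asyncio.create_task(handle_command_async(command))
            await asyncio.sleep(1)

asyncio.run(main())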

Final Notes: Another method, as mentioned by Chiragjn, would be to use some kind of task queue such as Celery (a tiny sketch is below). This would let you stick with Python and get much more scalability than a pure threading approach. To be honest though, if you're dealing with a large amount of scale, Golang would serve you well.
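
For completeness, a minimal sketch of the Celery route, assuming a Redis broker at its default URL; the module name tasks and the broker URL are illustrative choices, not requirements:

# tasks.py -- start workers with: celery -A tasks worker
from celery import Celery

app = Celery("tasks", broker="redis://localhost:6379/0")

@app.task
def handle_bot_command_task(command):
    handle_bot_command(command)  # your existing handler

# in the RTM loop, instead of starting a thread:
#     handle_bot_command_task.delay(command)

The .delay() call just enqueues the command and returns, so the bot's polling loop stays responsive while separate worker processes (possibly on other machines) do the actual work.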

Hope this was somewhat helpful, always appreciate feedback if you have any!
