
Forking multiple processes in Twisted with a shared object

I want to fork multiple processes with Twisted. I know from the discussion here that Twisted and multiprocessing are not compatible with each other. It's true that I could launch separate processes from different terminals to achieve the same effect, but that is not an option in my case.

There is a big object (several GB in size) that I want to share among different Python processes (I cannot load the same object multiple times into RAM, due to the RAM limitation on my computer).

What I am trying to do is:

  1. Start multiple rabbit-mq consumers asynchronously in a single process.
  2. Fork multiple such processes with a common shared object to leverage all CPUs in my system.

I am able to achieve step one. Here is what I came up with:

import logging

import pika
from pika.adapters import twisted_connection
from twisted.internet import protocol

logger = logging.getLogger(__name__)
total_tasks = 4  # number of concurrent consumers per process

class PikaFactory(protocol.ReconnectingClientFactory):
    def __init__(self, parameters, task_number=0, shared_obj=None):
        self.parameters = parameters
        self.task_count = total_tasks
        self.task_number = task_number
        self.shared_obj = shared_obj

    def buildProtocol(self, addr):
        self.resetDelay()
        logger.info('Task: %s, Connected' % self.task_number)
        proto = twisted_connection.TwistedProtocolConnection(self.parameters)
        # run is an async function that consumes the rabbit-mq queue
        proto.ready.addCallback(run, self.task_number, self.shared_obj)
        return proto

    # Rest of the implementation ...........

def run_tasks(shared_obj):
    from twisted.internet import reactor
    try:
        parameters = pika.ConnectionParameters(**CONNECTION_PARAMETERS)
        factory = PikaFactory(parameters, 0, shared_obj)

        for i in range(total_tasks):
            # Launch multiple async-tasks in the same process
            reactor.connectTCP(parameters.host, parameters.port, factory)

        logger.info(' [*] Waiting for messages. To exit press CTRL+C')
        reactor.run()
    except Exception:
        logger.exception("Error")
        reactor.stop()

if __name__ == '__main__':
    obj = Fibonacci()
    run_tasks(obj)

Now, to fork multiple processes, I have written this code:

import multiprocessing
import time
from multiprocessing.managers import BaseManager
class MyManager(BaseManager):
    """
    This Manager is responsible for coordinating shared
    information state between all processes
    """
    pass

# Register your custom "Fibonacci" class with the manager
# This is the class I want to share among multiple processes
MyManager.register('Fibonacci', Fibonacci)

def Manager():
    m = MyManager()
    m.start()
    return m

workers = multiprocessing.cpu_count()  # one worker process per CPU core

def run_multiple_processes():
    manager = Manager()
    # object I want to share among multiple processes 
    fibonacci = manager.Fibonacci()
    pool = multiprocessing.Pool(processes=workers)
    for i in range(0, workers):
        pool.apply_async(run_tasks, (fibonacci, ))

    # Stay alive without busy-waiting on the CPU
    try:
        while True:
            time.sleep(1)
    except KeyboardInterrupt:
        logger.error(' [*] Exiting...')
        pool.terminate()
        pool.join()

I am getting seemingly random errors while running the above code, such as:

builtins.AttributeError: '_SIGCHLDWaker' object has no attribute 'doWrite'

What would be the Twisted way to launch multiple processes and share a custom object between them? No writes will be performed on the object; it is only read from.

Thanks in advance.

While searching I encountered this post: twisted incompatible with python multiprocessing. The answer to my question is in that post as well; quoting its words:

"not loading any of Twisted until you've already created the child processes. This means not even importing Twisted until after you've created the child processes."

Thanks to @Jean-Paul Calderone for this useful comment.
