
Why doesn't this queue work in multiprocessing? (append() method in function body)

I am building a script that uses multiprocessing, but while implementing the queue I found that it doesn't behave as I expected, and I still don't know why.

Here is my pseudo code, which consists of three parts: a function that builds sentences from the strings, a strings list plus an outcome list that the sentences are appended to, and the main code that sets up the queue and the multiprocessing.

import time
from multiprocessing import Process, Queue, Manager

def process_string(que_object):

    # wait until something shows up in the queue
    while que_object.empty():
        time.sleep(2)

    q = que_object.get(timeout=2)
    sentence = "Here is your string_" + q
    print(sentence)
    final_sentences.append(sentence)  # append to the module-level outcome list

strings = ["alskfj", "alksjf"]  # ...
final_sentences = []

if __name__ == "__main__":

    que_object = Queue()
    for s in strings:
        que_object.put(s)

    with Manager() as manager:
        L = manager.list(strings)  # currently unused
        process_list = []
        for i in range(2):
            p = Process(target=process_string, args=(que_object,))
            process_list.append(p)
            p.start()
        for p in process_list:
            p.join()
        print(final_sentences)

multiprocessing.Process instances are, as the name suggests, separate OS-level processes. Therefore, when you do final_sentences.append(...), the append happens in the memory/address space of another process and hence isn't visible to the process that does print(final_sentences).
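If you want to keep your Process/Queue structure, one way to make the results visible to the parent is to pass a Manager().list() into the worker and append to that shared proxy instead of a module-level list. This is only a minimal sketch of that idea; the worker loop and names here are mine, not your original logic:

import queue
from multiprocessing import Process, Queue, Manager

def process_string(que_object, results):
    # pull items until the queue is drained; queue.Empty ends the loop
    while True:
        try:
            s = que_object.get(timeout=2)
        except queue.Empty:
            break
        results.append("Here is your string_" + s)

if __name__ == "__main__":
    strings = ["alskfj", "alksjf"]

    que_object = Queue()
    for s in strings:
        que_object.put(s)

    with Manager() as manager:
        final_sentences = manager.list()  # proxy list shared across processes
        workers = [Process(target=process_string, args=(que_object, final_sentences))
                   for _ in range(2)]
        for p in workers:
            p.start()
        for p in workers:
            p.join()
        print(list(final_sentences))  # copy the proxy into a plain list before printing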

I'd suggest using a Pool and mapping the function over your data, as in the first example in the docs.
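For example, a minimal sketch along the lines of that docs example, using two worker processes and your strings list:

from multiprocessing import Pool

def process_string(s):
    return "Here is your string_" + s

if __name__ == "__main__":
    strings = ["alskfj", "alksjf"]
    with Pool(2) as pool:
        final_sentences = pool.map(process_string, strings)
    print(final_sentences)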
