I am not sure why multiprocessing Queue isn't working. What is it doing?

I am using Python's built-in socket and multiprocessing libraries to scan TCP ports of a host. I know my first function works, and I am just trying to make it work with multiprocessing Queue and Process; I'm not sure where I am going wrong.

If I remove the Queue, everything seems to complete; I just actually need to get the results from it.

from multiprocessing import Process, Queue
import socket

def tcp_connect(ip, port_number):
    try:
        scanner = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        scanner.settimeout(0.1)
        scanner.connect((str(ip), port_number))
        scanner.close()

        #put into queue
        ## Remove the line below to test without the Queue
        q.put(port_number)
    except:
        pass

RESULTS = []
list_of_numbs = list(range(1,501))

for numb in list_of_numbs:

    #make my queue
    q = Queue()
    p = Process(target=tcp_connect, args=('google',numb))
    p.start()
    #take my results from my queue and append to a list
    RESULTS.append(q.get())
    p.join()

print(RESULTS)

I would just like it to print out the port numbers that were open. Right now, since it is scanning google.com, it should really only return 80 and 443.

EDIT: This would work if I used Pool, but the reason I went with Process and Queue is that the bigger piece of this runs in Django with Celery, and they don't allow daemon processes when executing with Pool.

For work like this, a multiprocessing.Pool would be a better way of handling it.

You don't have to worry about creating Processes and Queues; all that is done for you in the background. Your worker function only has to return a result, and that will be transported to the parent process for you.

I would suggest using multiprocessing.Pool.imap_unordered(), because it starts returning results as soon as they become available.

One thing: the worker function takes only one argument. If you need multiple different arguments for each call, wrap them in a tuple. If you have arguments that are the same for all calls, use functools.partial. A sketch combining both suggestions follows below.
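
Putting the two suggestions together, here is a minimal sketch of how the scan could look with Pool.imap_unordered() and functools.partial(). The host name 'google.com', the OSError handling, and the sorted() at the end are my assumptions, not code from the question:

import socket
from functools import partial
from multiprocessing import Pool

def tcp_connect(ip, port_number):
    # Return the port number on a successful connect, None otherwise;
    # the Pool transports the return value back to the parent for us.
    try:
        scanner = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        scanner.settimeout(0.1)
        scanner.connect((ip, port_number))
        scanner.close()
        return port_number
    except OSError:
        return None

if __name__ == '__main__':
    # partial() pins the host, so each pool task only needs a port number.
    connect = partial(tcp_connect, 'google.com')
    with Pool() as pool:
        # imap_unordered() yields results in completion order, not submission order.
        results = [p for p in pool.imap_unordered(connect, range(1, 501)) if p is not None]
    print(sorted(results))

The if __name__ == '__main__' guard matters here: on platforms that spawn rather than fork (Windows, and macOS by default on recent Python versions), child processes re-import the module, and unguarded top-level code would run again in every child.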


A slightly more modern approach would be to use the Executor.map() method from concurrent.futures. Since your work consists mainly of socket calls, you could use a ThreadPoolExecutor here, I think. That should be slightly less resource-intensive than a ProcessPoolExecutor.
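
For example, here is a sketch of the same scan with a thread pool (again, the worker, the host name, and the max_workers value are my assumptions):

import itertools
import socket
from concurrent.futures import ThreadPoolExecutor

def tcp_connect(ip, port_number):
    # Same worker as above: return the port on success, None otherwise.
    try:
        scanner = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        scanner.settimeout(0.1)
        scanner.connect((ip, port_number))
        scanner.close()
        return port_number
    except OSError:
        return None

# Threads suit this I/O-bound work: no pickling and no child processes,
# and threads can be created even inside a daemonized worker,
# which sidesteps the restriction mentioned in the edit.
with ThreadPoolExecutor(max_workers=50) as executor:
    # map() takes one iterable per worker parameter, so repeat() supplies the host.
    results = executor.map(tcp_connect, itertools.repeat('google.com'), range(1, 501))

print(sorted(p for p in results if p is not None))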
