I want to stop my processes as soon as one of them finds a working URL.
But I get NameError: name 'status' is not defined,
apparently because the variable is checked before any process has had a chance to set it.
import requests
from multiprocessing import Process

def first_function(i):
    global status
    status = False
    try:
        response = requests.get(f'https://ru.hexlet.io/{i}')
        if response.status_code == 200:
            status = True
    except Exception as e:
        print(str(e))

def second_function():
    for i in [1, 2, 3, 4, 5, 6, 'courses', 8, 9, 10]:
        Process(target=first_function, args=(i,)).start()
        print(i)
        if status:
            print('working', i)
            break

if __name__ == '__main__':
    second_function()
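As an aside, the NameError is not a timing problem: each process has its own memory, so a global set in a child is invisible to the parent. A minimal sketch (not part of the original post) demonstrating this, with multiprocessing.Value as one way to actually share a flag:

```python
import multiprocessing as mp

def set_flag():
    global status
    status = True  # only changes this child process's copy

def set_shared(flag):
    flag.value = True  # writes to memory shared with the parent

if __name__ == '__main__':
    p = mp.Process(target=set_flag)
    p.start()
    p.join()
    # The parent never sees 'status': processes do not share globals.
    print('status' in globals())  # False

    flag = mp.Value('b', False)  # shared boolean ('b' = signed char)
    p = mp.Process(target=set_shared, args=(flag,))
    p.start()
    p.join()
    print(bool(flag.value))  # True
```

This is why the `if status:` check in second_function() raises NameError: first_function() assigns `status` only inside the child.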
I suggest using a multiprocessing.pool.Pool to limit the number of processes that run concurrently (since you have indicated there might be a very large number of them). If you use its apply_async() method to "submit" tasks to it, you can use the optional callback argument to specify a function that will be called whenever one of the subprocesses finishes. This provides a way to terminate further processing of the other tasks submitted to the pool.
import multiprocessing as mp
import requests

def worker(i):
    try:
        response = requests.get(f'https://ru.hexlet.io/{i}')
        if response.status_code == 200:
            return i
    except Exception as exc:
        print(f'{i!r} caused {exc}')
    return None

if __name__ == '__main__':
    def notify(i):
        """Called when a Pool worker process finishes execution."""
        if i is not None:
            print(f'{i!r} worked')
            pool.terminate()  # Stops worker processes immediately.

    pool = mp.Pool()
    for i in [1, 2, 3, 4, 5, 6, 'courses', 8, 9, 10]:
        pool.apply_async(worker, (i,), callback=notify)
    pool.close()
    pool.join()

    print('fini')
Output:
'courses' worked
fini
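If you prefer pulling results rather than using callbacks, Pool.imap_unordered() yields each task's return value as soon as its worker finishes, so you can break on the first success. A minimal sketch with a stand-in check() function (assume the real worker() from above, which returns i on success and None otherwise):

```python
import multiprocessing as mp

def check(i):
    # Stand-in for the real worker(): pretend only 'courses' is a working URL.
    return i if i == 'courses' else None

if __name__ == '__main__':
    with mp.Pool() as pool:
        # Results arrive in completion order, not submission order.
        for result in pool.imap_unordered(check, [1, 2, 3, 'courses', 5]):
            if result is not None:
                print(f'{result!r} worked')
                break  # leaving the with-block terminates remaining workers
```

Exiting the with-block calls pool.terminate(), so no explicit cleanup is needed after the break.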