multiprocessing pool doesn't close and join, terminating the script before all the processes finish
I have created a multiprocessing application that just loops over some files and compares them, but for some reason the pool never closes and waits to join all the process results.
from multiprocessing import Pool
from datetime import datetime
import sqlite3

def compare_from_database(row_id, connection_to_database):
    now = datetime.now()
    connection1 = sqlite3.connect(connection_to_database)
    cursor = connection1.cursor()
    grab_row_id_query = "SELECT * FROM MYTABLE WHERE rowid = {0};".format(row_id)
    grab_row_id = cursor.execute(grab_row_id_query)
    work_file_path = grab_row_id.fetchone()[1]
    all_remaining_files_query = "SELECT * FROM MYTABLE WHERE rowid > {0};".format(row_id)
    # fetchall() first, so the UPDATE below doesn't clobber this cursor's iterator
    all_remaining_files = cursor.execute(all_remaining_files_query).fetchall()
    for i in all_remaining_files:
        if i[1] == work_file_path:
            completed_query = "UPDATE MYTABLE SET REPEATED = 1 WHERE rowid = {0};".format(row_id)
            work_file = cursor.execute(completed_query)
    connection1.commit()
    cursor.close()
    connection1.close()
    return "id {0} took: {1}".format(row_id, datetime.now() - now)
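(Aside: the same statements can be written with sqlite3's "?" parameter substitution, which sidesteps the str.format pitfalls entirely. A minimal sketch with a hypothetical mark_repeated helper that takes an already open connection instead of a path:)

```python
import sqlite3

def mark_repeated(connection, row_id):
    # flags row_id as REPEATED when any later row shares its file path,
    # using "?" placeholders instead of str.format
    cursor = connection.cursor()
    cursor.execute("SELECT * FROM MYTABLE WHERE rowid = ?;", (row_id,))
    work_file_path = cursor.fetchone()[1]
    cursor.execute("SELECT * FROM MYTABLE WHERE rowid > ?;", (row_id,))
    for row in cursor.fetchall():
        if row[1] == work_file_path:
            cursor.execute("UPDATE MYTABLE SET REPEATED = 1 WHERE rowid = ?;", (row_id,))
    connection.commit()
    cursor.close()
```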
I have tried it with:
def apply_async(range_max, connection_to_database):
    pool = Pool()
    for i in range(1, range_max + 1):
        h = pool.apply_async(compare_from_database, args=(i, connection_to_database))
    pool.close()
    pool.join()
Also using a context manager, and forcing the close/join inside it:
from multiprocessing import Pool

with Pool() as pool:
    for i in range_of_ids:
        h = pool.apply_async(compare_from_database, args=(i, connection_to_database))
    pool.close()
    pool.join()
Even though, with the context manager, the close/join shouldn't be needed.
The script just submits all the jobs: I can see all the Python instances running in Task Manager, and the print statements inside the function do print to the console fine, but once the main script finishes submitting all the functions to the pool, it just ends. It doesn't respect the close/join:
Process finished with exit code 0
If I run the function by itself, it runs fine and returns the string:
compare_from_database(1, connection_to_database="my_path/sqlite.db")
or in a loop it works fine as well:
for i in range(1, 4):
    compare_from_database(i, connection_to_database="my_path/sqlite.db")
I tried Python 3.7 and 3.8, and wanted to validate the behavior against the documentation: https://docs.python.org/2/library/multiprocessing.html#multiprocessing.pool.multiprocessing.Pool.join
Has anyone run into a similar issue, or any ideas what it might be?
Since you want all the processes to finish before proceeding to the next part of the script, use apply instead of apply_async; that forces each process to run and waits for its result.