I'm trying to write 100 files in parallel. I have a list of data, and each item in it should be written to a separate file. I've tried concurrent.futures, but it didn't work (the files end up empty).
import concurrent.futures

data = ['a', 'b', 'c']

def write2file(i):
    outhandles[i].write(data[i])

with concurrent.futures.ThreadPoolExecutor(max_workers=10) as executor:
    executor.map(write2file, range(s, e))
Expected results:
file1.txt:'a',
file2.txt:'b',
...
Put your data into a Queue object, then have worker threads pop items off the queue in parallel. Whether this is thread-safe depends on your logic and on how you combine the files.
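A minimal sketch of that queue-based approach, assuming each item goes to its own file and the file names follow the file1.txt pattern from the question. Note that each worker opens its own file rather than sharing handles, and the file is closed (and therefore flushed) by the with block, which is a common reason for files coming out empty:

```python
import queue
import threading

data = ['a', 'b', 'c']

# Each task pairs an index with the item to write.
work = queue.Queue()
for i, item in enumerate(data):
    work.put((i, item))

def worker():
    while True:
        try:
            i, item = work.get_nowait()
        except queue.Empty:
            return  # queue drained, thread exits
        # Each thread opens its own file, so no handles are shared.
        with open(f'file{i + 1}.txt', 'w') as f:
            f.write(item)
        work.task_done()

threads = [threading.Thread(target=worker) for _ in range(10)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

With 100 items, the same ten threads simply keep pulling tasks until the queue is empty; queue.Queue itself is thread-safe, so no extra locking is needed for the task hand-off.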