
How to parallelize this nested loop in Python that calls Abaqus

I have the nested loops below. How can I parallelize the outside loop so that it is distributed into 4 simultaneous runs, and wait for all 4 runs to complete before moving on with the rest of the script?

    for r in range(4):
        for k in range( r*nAnalysis/4, (r+1)*nAnalysis/4 ):

            # - Write Abaqus INP file - #
            writeABQfile(ppos,props,totalTime[k],recInt[k],inpFiles[k],i,lineNum[k],aPath[k])

            # - Delete LCK file to Enable Another Analysis - #
            delFile(aPath[k]+"/"+inpFiles[k]+".lck")

            # - Run Analysis - #
            runABQfile(inpFiles[k],aPath[k])

I tried using multiprocessing.Pool as shown below, but it never gets into the function:

    def parRunABQfiles(nA,nP,r,ppos,prop0,prop1,totalTime2Run_,recIntervals_,inpFiles_,i,lineNumbers_,aPath_):
        from os import path
        from auxFunctions import writeABQfile, runABQfile
        print("I am Here")
        for k in range( r*nA/nP, (r+1)*nA/nP ):
            # - Write Abaqus INP file - #
            writeABQfile(ppos,prop0,prop1,totalTime2Run_,recIntervals_,inpFiles_,i,lineNumbers_,aPath_)
            # - Delete LCK file to Enable Another Analysis - #
            delFile(aPath_+"/"+inpFiles[k]+".lck")
            # - Run Analysis - #
            runABQfile(inpFiles_,aPath_)
            # - Make Sure Analysis is not Bypassed - #
            while os.path.isfile(aPath_+"/"+inpFiles[k]+".lck") == True:
                sleep(0.1)
        return k

    results = zip(*pool.map(parRunABQfiles, range(0, 4, 1)))
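One thing to note: `pool.map` hands each worker exactly one item from the iterable, so a function with this many parameters cannot be called that way; the fixed arguments have to be bound first. A minimal sketch with `functools.partial` (the `par_run` stand-in and the `nA=100, nP=4` values are illustrative, not the original code):

```python
from functools import partial
from multiprocessing import Pool

def par_run(r, nA, nP):
    # Hypothetical stand-in for parRunABQfiles; returns this chunk's bounds.
    return (r * nA // nP, (r + 1) * nA // nP)

pool = Pool(processes=4)
worker = partial(par_run, nA=100, nP=4)  # bind everything except r
results = pool.map(worker, range(4))     # each worker call receives one r
pool.close()
pool.join()
```

`pool.map` also blocks until every worker has returned, which gives the "wait for all 4 runs" behavior for free.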

The runABQfile is just a subprocess.call to an sh script that runs Abaqus:

    def runABQfile(inpFile,path):
        import subprocess
        import os

        prcStr1 = ('sbatch '+path+'/runJob.sh')

        process = subprocess.call(prcStr1, stdin=None, stdout=None, stderr=None, shell=True )

        return

I have no errors showing up, so I am not sure why it is not getting in there. I know it isn't because writeABQfile never writes the input file. The question again is:

How can I parallelize the outside loop so that it is distributed into 4 simultaneous runs, and wait for all 4 runs to complete before moving on with the rest of the script?

Use the concurrent.futures module if multiprocessing is what you want.

from concurrent.futures import ProcessPoolExecutor

def each(r):
    for k in range( r*nAnalysis//4, (r+1)*nAnalysis//4 ):  # // keeps the bounds integers on Python 3
        writeABQfile(ppos,props,totalTime[k],recInt[k],inpFiles[k],i,lineNum[k],aPath[k])
        delFile(aPath[k]+"/"+inpFiles[k]+".lck")
        runABQfile(inpFiles[k],aPath[k])

with ProcessPoolExecutor(max_workers=4) as executor:
    output = executor.map(each, range(4)) # returns an iterable

If you just want to "do" stuff rather than "produce" results, check out the as_completed function from the same module. There are direct examples in the docs.
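As a rough sketch of that pattern (the body of `each` here is a placeholder, not the actual Abaqus calls), submitting the chunks and iterating with as_completed lets you react to each run as it finishes, while collecting the results still blocks until all four are done:

```python
from concurrent.futures import ProcessPoolExecutor, as_completed

def each(r):
    # Placeholder for the per-chunk Abaqus work; just returns the chunk index.
    return r

with ProcessPoolExecutor(max_workers=4) as executor:
    futures = [executor.submit(each, r) for r in range(4)]
    # as_completed yields futures in the order they finish, not submission order.
    done = [f.result() for f in as_completed(futures)]
```

Leaving the `with` block also waits for all outstanding work, so anything after it only runs once every chunk has completed.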
