
Python script runs slower when executed multiple times

I have a Python script with a normal runtime of ~90 seconds. However, when I change only minor things in it (like the colors in my final pyplot figure) and then execute it multiple times in quick succession, its runtime increases to close to 10 minutes.

Some bullet points of what I'm doing:

  • I'm not downloading anything, nor am I creating new files with my script.
  • I merely open some locally saved .dat files using numpy.genfromtxt and crunch some numbers with them (see the loading sketch after this list).
  • I transform my data into a recarray and use attribute-style indexing via array.columnname extensively.
  • For each file I loop over a range of criteria that basically constitute different maximum and minimum values for evaluation, and embedded in that I use an inner loop over the lines of the data arrays. A few ifs here and there, but nothing fancy, really (the loop sketch after this list shows the shape of it).
  • I use the multiprocessing module as follows (a self-contained skeleton of the full pattern appears after this list):

     import multiprocessing
     npro = multiprocessing.cpu_count()  # Count the number of processors
     pool = multiprocessing.Pool(processes=npro)
     bigdata = list(pool.map(analyze, range(len(FileEndings))))
     pool.close()

    with analyze being my main function and FileEndings its input, a string used to build the right name of the file I want to load and then evaluate. Afterwards, I use it a second time with

     pool2 = multiprocessing.Pool(processes=npro)
     listofaverages = list(pool2.map(averaging, range(8)))
     pool2.close()

    averaging being another function of mine.

  • I use numba's @jit decorator to speed up the basic calculations I do in my inner loops, with nogil, nopython, and cache all set to True. Commenting these out doesn't resolve the issue.
  • I run the script on Ubuntu 16.04 and am using a recent Anaconda build of Python.
  • I write the code in PyCharm and run it in its console most of the time. However, switching to bash doesn't help either.
  • Simply not running the script for about 3 minutes lets it go back to its normal runtime.
  • Using htop reveals that all processors are at full capacity while the script is running. I also see a lot of processes stemming from PyCharm (50 or so), each at an equal MEM% of 7.9; the CPU% is at 0 for most of them, with a few exceptions in the range of several percent.
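
For reference, here is a minimal sketch of the loading-and-indexing pattern from the second and third bullets. The file name and column names (run01.dat, time, value) are made-up placeholders, not the actual data:

     import numpy as np

     # Load a locally saved, whitespace-delimited .dat file into a
     # structured array; file and column names are hypothetical.
     data = np.genfromtxt("run01.dat", names=["time", "value"])

     # View it as a recarray so columns are reachable as attributes,
     # i.e. the array.columnname style of indexing mentioned above.
     rec = data.view(np.recarray)
     print(rec.time[:5], rec.value.max())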
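And a sketch of the kind of criteria loop the fourth and numba bullets describe: an inner loop over the lines of an array, compiled with the same @jit flags. The function name and the (min, max) windows are assumptions for illustration only:

     import numpy as np
     from numba import jit

     @jit(nopython=True, nogil=True, cache=True)
     def count_in_window(values, lo, hi):
         # Inner loop over the lines of the data array, with a simple
         # min/max criterion -- the "few ifs here and there".
         n = 0
         for i in range(values.shape[0]):
             if values[i] >= lo and values[i] <= hi:
                 n += 1
         return n

     values = np.random.rand(1_000_000)
     criteria = [(0.0, 0.1), (0.1, 0.5), (0.5, 1.0)]  # hypothetical (min, max) pairs
     counts = [count_in_window(values, lo, hi) for lo, hi in criteria]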
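Finally, a self-contained skeleton of the two-stage pooling pattern quoted above. analyze, averaging, and FileEndings are stubbed out here, and a __main__ guard is added so the sketch runs safely on any platform:

     import multiprocessing

     FileEndings = ["a", "b", "c", "d"]  # placeholder file-name endings

     def analyze(i):
         # Stub for the main per-file number-crunching function.
         return i * i

     def averaging(i):
         # Stub for the second-pass averaging function.
         return float(i)

     if __name__ == "__main__":
         npro = multiprocessing.cpu_count()
         pool = multiprocessing.Pool(processes=npro)
         bigdata = list(pool.map(analyze, range(len(FileEndings))))
         pool.close()

         pool2 = multiprocessing.Pool(processes=npro)
         listofaverages = list(pool2.map(averaging, range(8)))
         pool2.close()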

Has anyone experienced such an issue before? And if so, any suggestions as to what might help? Or are any of the things I use simply prone to causing these problems?

This can probably be closed; the issue was caused by a faulty fan in my machine.
