I have about 100 files, each stored in a different directory. I have written a script, but currently I run it on each of these files one at a time. I know that if I saved the files in a single directory I could process them one after another using os.chdir and os.listdir, but moving them into one directory is not an option. Is there a way to execute the script on all of these files consecutively, in order, and make my life easier?
You can generally use os.walk for this kind of thing:
import os

for root, dirs, files in os.walk(os.path.abspath("/parent/dir/")):
    for file in files:
        if os.path.splitext(file)[1] == '.py':
            print(os.path.join(root, file))
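On Python 3 the same traversal can also be written with pathlib, which handles the recursion and the extension filter in one call. A minimal sketch, with "/parent/dir/" being the same placeholder path as above:

```python
from pathlib import Path

# Recursively find every .py file under the parent directory.
# rglob("*.py") is equivalent to walking the tree and filtering by extension.
for path in Path("/parent/dir/").rglob("*.py"):
    print(path)
```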
os.walk also works nicely with fnmatch:
import fnmatch
import os

for root, dirnames, filenames in os.walk("/parent/dir/"):
    for filename in fnmatch.filter(filenames, '*.py'):
        # do your thing here with os.path.join(root, filename)
        # (filename alone is relative to root, so join them before running it)
        pass
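One way to fill in the "do your thing" step on Python 3 (where execfile no longer exists) is to run each matched script as a subprocess. A sketch under assumptions not in the original answer: the function name run_matching_scripts is made up here, sys.executable is used so the scripts run under the same interpreter, and each script is run with its own directory as the working directory in case it uses relative paths:

```python
import fnmatch
import os
import subprocess
import sys

def run_matching_scripts(parent, pattern="*.py"):
    """Run every script under parent whose name matches pattern.

    Each script runs with its own directory as the working directory,
    so relative paths inside the script keep working. Returns a list of
    (script path, return code) pairs.
    """
    results = []
    for root, dirnames, filenames in os.walk(parent):
        for filename in fnmatch.filter(filenames, pattern):
            completed = subprocess.run(
                [sys.executable, filename],
                cwd=root,  # run from the script's own directory
                capture_output=True,
                text=True,
            )
            results.append((os.path.join(root, filename), completed.returncode))
    return results
```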
I'm a little confused. If you want to do this all from within Python by changing the current directory (presumably because your script uses relative paths):
import os

directory_list = [ ... ]  # list of directories; you could build it with glob.glob
here = os.getcwd()  # remember the "root" directory
for directory in directory_list:
    os.chdir(directory)  # change to the "work" directory
    # do work in the "work" directory
    os.chdir(here)  # go back to the root directory
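One hazard with that loop: if the work raises an exception, the chdir back to the root directory never happens. A try/finally (or contextlib.chdir on Python 3.11+) guards against that. A minimal sketch, where work_in is a hypothetical helper name:

```python
import os

def work_in(directory, work):
    """Call work() with directory as the current working directory,
    restoring the original directory even if work() raises."""
    here = os.getcwd()
    os.chdir(directory)
    try:
        work()
    finally:
        os.chdir(here)
```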
Of course, if you already have the script cloned into your 100 directories, then you can just run it via bash (quoting $DIR in case any directory name contains spaces):

for DIR in directory_glob_pattern; do cd "$DIR" && python runscript.py && cd -; done
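If the script has the same known name in each directory, find can replace the glob loop entirely: -execdir runs the command from the directory containing each match, so relative paths inside the scripts keep working. A sketch under assumptions not in the original answer (the temporary tree below is only a stand-in for your real parent directory, and python3 is assumed to be on PATH):

```shell
# Stand-in tree for the demo; in real use, point find at your parent directory.
parent=$(mktemp -d)
mkdir -p "$parent/a" "$parent/b"
echo 'print("running")' > "$parent/a/runscript.py"
echo 'print("running")' > "$parent/b/runscript.py"

# Run every runscript.py under $parent, each from its own directory.
find "$parent" -name 'runscript.py' -execdir python3 {} \;
```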