
How to repeatedly pass arguments to a python file

I have a Python script which is run as follows:

python script.py -n name 

Also, I have a file (or, say, a list) which contains all the name values, as follows:

name1
name2
name3
name4
...
...
name1000

So I want to run the Python script once for each of these names, passing each name as the argument. The dumbest and laziest way of doing this is that I created a shell script (I wrote a Python script to generate its contents), say run.sh, as follows:

python script.py -n name1
python script.py -n name2
python script.py -n name3
python script.py -n name4
........
........
python script.py -n name1000

and I run this shell script as sh run.sh.

I am sure that there must be a smarter/more elegant way of doing this. Any tips?

Also, is it possible to free (clear) the Python memory after each execution?

I would suggest passing a file location (e.g. 'names.txt') as the parameter to your Python script when calling it from the shell. Then, within the Python script, read the names and work with them one by one.
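
For example, a minimal sketch of that idea (the -f/--file option and the file name names.txt are assumptions for illustration, not part of the original script):

# script.py -- minimal sketch, assuming a -f/--file option that takes the list file
import argparse

parser = argparse.ArgumentParser()
parser.add_argument("-f", "--file", required=True,
                    help="path to a file with one name per line")
args = parser.parse_args()

with open(args.file) as fh:
    for line in fh:
        name = line.strip()
        if not name:
            continue  # skip blank lines
        print(name)   # placeholder: put the real per-name work here

You would then call the script once, e.g. python script.py -f names.txt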

To your second question: if you wrap the logic of script.py in a function (e.g. called script_function) which takes a name parameter, and call script_function(name) for each name in 'names.txt', you should keep your memory use down. The reason is that all the variables created in script_function are local to that function call and become eligible for garbage collection once the call returns, before the next call handles the next name.
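
A rough sketch of that structure, where script_function is a hypothetical name for whatever script.py currently does for a single name:

def script_function(name):
    # everything script.py previously did for one name goes here;
    # this body is only a placeholder
    result = f"processed {name}"
    print(result)
    # result and any other locals go out of scope when the function
    # returns, so they can be garbage-collected before the next call

def main(path):
    with open(path) as fh:
        for line in fh:
            name = line.strip()
            if name:
                script_function(name)

if __name__ == "__main__":
    main("names.txt")  # assumed file name, one name per line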

Another option is awk's built-in system() function, so:

awk '{ system("python script.py -n "$0) }' filename

try:

cat <file_with_list_of_names_1_per_line> | xargs -n 1 python script.py -n

that should do it; if not ... try wrapping "python script.py -n" in a bash script. Something simple, let's say a script named "call_my_script.sh", which would contain:

#!/bin/bash
python script.py -n "$1"

Then you could call it with:

cat <file_with_list_of_names_1_per_line> | xargs -n 1 ./call_my_script.sh
# this will call/execute "./call_my_script.sh name" for each name in the file, 1 at a time
# (make the script executable first: chmod +x call_my_script.sh)

You could also use the "-P" option of "xargs" to run several instances in parallel (watch out for concurrent access to resources, like writing to the same output file, which could produce strange results):

.. | xargs -P <n> ...

to run "n" instances of the script in parallel

Side note: an important aspect for anyone not familiar with "xargs": by default it treats each whitespace-separated word individually, meaning that if a line contained 2 (or more) words, "word1 word2", the script would be called 2 (or more) times, once for each word. That might not be the expected behavior, so it is worth mentioning.

Considering you have a list of names, you can do the following in a shell script:

list="name1 name2 name 3 ... name1000"
for i in $list; do
python script.py -n $i
done
