qsub and Python: imports .py modules but returns ImportError for .pyf

I am trying to submit a job via qsub myScript.sh -cwd, where the shell script runs a Python 2.7 script, myPython.py.

    # myScript.sh
    python2.7 myPython.py

Now, myPython.py needs to import a Python package that is stored in my local directory, so the script starts with:

    # myPython.py
    import sys
    sys.path.append('/file/path/for/package')
    import package

After submitting the job, the log file contains the following output from Python:

    ImportError: No module named problem_c

This error occurs after several modules have been imported from *.py files without returning errors, so I am fairly certain that this is not a problem with the path. The only difference I can find is that the problem_c file ends in .pyf (i.e. problem_c.pyf). It is stored in the same directory as the modules ending in .py.

When I run the script locally, as python2.7 myPython.py, this error does not occur when the package loads. However, the job is too memory intensive to complete without submitting it to the queue. Why are modules written as .pyf files treated differently when I use qsub, and how can I import them in this case?

(If it matters, the cluster I'm working on was set up with Sun Grid Engine.)

Thanks for your thoughts.

That file is an f2py signature file. A .pyf file is not itself an importable Python module; it is the input f2py uses to build a compiled extension module. It looks like that compiled module was never built (or installed) on your cluster, which would explain why the import only works locally.
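As a rough sketch of the usual fix, assuming the package ships its Fortran sources next to the signature file (the file name problem_c.f90 below is hypothetical; substitute whatever sources the package actually provides), the extension would be rebuilt on the cluster like this:

    # build the compiled extension in the package directory on the cluster
    cd /file/path/for/package
    # compile the Fortran sources against the .pyf signature file;
    # this produces a shared library (problem_c.so) that "import problem_c" can load
    f2py -c problem_c.pyf problem_c.f90

Once the compiled problem_c.so sits next to the package's .py modules, the existing sys.path.append line should be enough for the import to succeed under qsub as well.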

And if the module you need is a plain .py file, here's how you can import it given its path:

    import imp
    module_name = imp.load_source('module_name', '/abs/path/to/module/module_name.py')

    # usage
    module_name.function()
