
How to import other Python modules and packages

I have the following project structure:

work_directory:
    merge.py
    a_package

(i.e. a Python file merge.py and a directory a_package under the directory work_directory)

I wrote a MapReduce job using MRJob in merge.py, in which I need to import a_package (e.g. from a_package import something). But I am having difficulty getting a_package uploaded to Hadoop.

I have tried this method (https://mrjob.readthedocs.io/en/latest/guides/writing-mrjobs.html#using-other-python-modules-and-packages): I wrote

from mrjob.job import MRJob

class MRPackageUsingJob(MRJob):
    DIRS = ['a_package']  # upload the a_package directory along with the job

and imported the code from inside the mapper:

def mapper(self, key, value):
    # deferred import, so it is resolved on the Hadoop task nodes
    from a_package import something

I also tried this one: https://mrjob.readthedocs.io/en/latest/guides/setup-cookbook.html#uploading-your-source-tree
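For reference, that recipe boils down to tarring up the source tree and putting the unpacked archive on PYTHONPATH via a setup command, roughly like this (the archive name a_package.tar.gz and input.txt are placeholders, not from the original post):

tar -C work_directory -czf a_package.tar.gz a_package
python merge.py -r hadoop --setup 'export PYTHONPATH=$PYTHONPATH:a_package.tar.gz#/' input.txt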

But neither of them works; it keeps showing ImportError: No module named a_package.

What should I do?

You just need to create an empty file "__init__.py" in the folder that you want to use as a package. For example:

work_directory:
    merge.py
    a_package:
        __init__.py
        something.py
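With that in place, here is a minimal sketch of how the pieces fit together (the module name something.py and its function process_line are placeholders for illustration, not from the original post):

# a_package/something.py (hypothetical example module)
def process_line(line):
    return line.strip()

# merge.py
from mrjob.job import MRJob

class MRPackageUsingJob(MRJob):
    DIRS = ['a_package']  # ship the whole package directory with the job

    def mapper(self, key, value):
        # deferred import, resolved on the Hadoop task nodes
        from a_package import something
        yield key, something.process_line(value)

if __name__ == '__main__':
    MRPackageUsingJob.run()

Running it locally first (python merge.py input.txt) is a quick way to confirm the import works before submitting with -r hadoop.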
