I made a package in Python using distutils in this sort of fashion:
#! /usr/bin/env python
from distutils.core import setup

setup(name='mypackage4.py',
      version='1.0',
      description='Description',
      packages=['mypackage4']
      )
setup.py then installs it to my python2.7/site-packages directory.
When I run
import mypackage4
it tries to run the .py files in the mypackage4 directory. The directory contains python files which contain functions (def commands). All I want to do is make the functions available for use, not to run directly on import.
If however, I do
cd python2.7/site-packages
and then do the import so it picks up mypackage4 locally, it imports without running anything, and I can then access the functions from the Python command line in the normal way.
Why does python seem to run the functions when the package is not in the local directory, but not run the functions when the package is in the local directory?
Secondly, I read about placing a
if __name__ == '__main__':
    pass
else:
    <define functions here>
block in my .py files to prevent them from running, but then surely it doesn't run the function definitions either, so how does Python know about them?
Any help is greatly appreciated!
PS (in case this helps): in my package directory I have an __init__.py file to tell Python which files to import:
__all__ = [
    'pyfile1',
    'pyfile2',
    'pyfile3',
]
The directory contains python files which contain functions (def commands). All I want to do is make the functions available for use, not to run directly on import.
The def statements are code that has to get run, or the functions don't get defined. Python always runs your code when you import a module or package. Function, class, and global variable/constant definitions are code, just like anything else.
The functions don't get called unless you write code to call them. But if you do write such code, it gets run. There's no special state where def foo(i): print(i) happens but foo(3) does not; all of your code gets run.
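To see this concretely, here's a minimal sketch that uses exec to do roughly what import does with a module's source. The module_source string and namespace dict are illustrative stand-ins, not part of the question's package:

```python
# Simulate importing a module whose source contains a top-level print,
# a def statement, and a top-level call to the defined function.
module_source = """
print('module top level runs on import')

def foo(i):       # the def statement itself executes on import...
    return i * 2  # ...but the body only runs when foo is called

result = foo(3)   # a top-level call also runs on import
"""

namespace = {}
exec(module_source, namespace)  # roughly what import does with module code

print(namespace["result"])   # the top-level call already ran: prints 6
print(namespace["foo"](5))   # calling the function ourselves: prints 10
```

The print and the foo(3) call run as a side effect of the "import", while foo's body runs only when something actually calls it.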
Sometimes, you want to write a module that can be imported by other code, but can also be run as a script. (Or you may just want the module to run its tests when executed as a script.) That's where the __main__ idiom comes in.
You've got the idiom a little bit wrong—or, rather, you're putting an uncommon case in front of the typical one. Usually, you do something like this:
def foo(f):
    blahblah(f)

def bar(fname):
    with open(fname) as f:
        foo(f)

if __name__ == '__main__':
    import sys
    for arg in sys.argv[1:]:
        bar(arg)
When you import the module, the foo and bar definitions get executed, so the importing code can do qux.bar(fname), but the stuff inside the if statement does not get executed (because __name__ != '__main__' for an imported module). When you run the module as a script, that stuff does get executed (because __name__ == '__main__'). So, you can do ./qux.py myfile.txt (or C:\Python33\Python.exe .\qux.py myfile.txt, or whatever) and it'll foo the heck out of myfile.txt for you.
You rarely need to put anything in an else block for the if __name__ == '__main__'.
But rarely isn't never. Sometimes you have definitions that are needed to use the module as a module, but not needed to run it as a script. If compiling those definitions takes a long time (e.g., because the module precomputes a 100000-element dict to speed up access later), you don't want to do that every time you run the script. So, those definitions would go into the else block.