This code spawns an endless stream of processes and eventually brings my PC down.
    import multiprocessing

    def f(process_name):
        print process_name, 'says hi'

    p1 = multiprocessing.Process(None, f, '1', ('1st',))
    p1.start()
In the console I see '1st says hi' over and over again, and in the Task Manager I see a bazillion interpreters starting. I ran this both from inside PyDev/Eclipse and from the command line, with the same result.
Another result I got was a single Python interpreter running (though dying and respawning so fast that I couldn't kill it from the Task Manager - "invalid handle"). It was eating up 100% of the processor, though.
I am used to the Java threading API, and the multiprocessing module claims to have an interface similar to the threading module, which in turn is modeled on Java's.
Why are the processes not dying? What am I missing to simply have this spawn a single process, print its message, and exit?
Thx, you guys rule!
You need to wrap your initialization code in an `if __name__ == "__main__":` block.
When multiprocessing starts a new interpreter (which is what happens on Windows, where there is no fork), the child re-executes your main module on import. Anything not guarded by this construct runs again in every child - and each of those children then spawns another child, which creates the effect you see.
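To make the re-import visible, here is a minimal sketch (assuming Python 3 and the `spawn` start method, which is the default on Windows and macOS; under `fork` on Linux the child inherits the parent's memory and does not re-import the module):

```python
import multiprocessing

# This line runs in the parent AND again in each spawned child,
# because the child re-imports the main module before calling the target.
print('module imported as', __name__)

def f():
    print('child says hi')

if __name__ == '__main__':
    # Guarded: only the original parent reaches this point, so exactly
    # one child is spawned instead of an endless cascade.
    p = multiprocessing.Process(target=f)
    p.start()
    p.join()  # wait for the child to finish
```

In the parent the first print reports `__main__`; in the spawned child it reports a different module name, which is how the guard tells the two apart.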
The child process dies on its own as soon as `f` returns - there is no `exit` event you need to set on a `Process` object. What the parent needs is the `__main__` guard (and, optionally, a `join()` to wait for the child). I would edit your code to make it look like this:
    import multiprocessing

    def f(process_name):
        print process_name, 'says hi'

    if __name__ == '__main__':
        p1 = multiprocessing.Process(None, f, '1', ('1st',))
        p1.start()
        p1.join()  # wait for the child; it exits when f returns
I wouldn't rely on unguarded module-level code in larger programs either, as it causes exactly this kind of issue. The "programming guidelines" section of the multiprocessing documentation covers this in more detail, and is worth reading.