I am trying to run 2 processes in parallel using Python's multiprocessing module, but the second process always hangs. I saw in other posts that importing the keras library inside the function solves the problem, but it didn't work for me. I can't figure out what I am missing. Here is a short summary of my code:
import multiprocessing
import numpy as np

def detect_img():
    from keras.layers import Dense
    from keras.models import Sequential
    # Not the actual model
    model = Sequential()
    model.add(Dense(15, input_shape=(4,), activation='relu'))
    model.add(Dense(15, activation='relu'))
    model.add(Dense(15, activation='relu'))
    model.add(Dense(20, activation='relu'))
    model.add(Dense(4, activation='linear'))
    while True:
        x = np.random.rand(1, 4)
        y = model.predict(x)
        print(y)

def show_cam():
    import cv2
    cap = cv2.VideoCapture(0)
    while True:
        ret, frame = cap.read()
        cv2.imshow('frame', frame)
        if cv2.waitKey(1) & 0xFF == ord('q'):
            break
    cap.release()
    cv2.destroyAllWindows()

if __name__ == '__main__':
    p1 = multiprocessing.Process(target=show_cam())
    p2 = multiprocessing.Process(target=detect_img())
    p1.start()
    p2.start()
OK, I faced quite similar behaviour in many experiments. After searching the web, I found a post explaining that OpenCV implements its own parallelization, which can interfere with Python's GIL (Global Interpreter Lock) (I can go into details if you like...). The solution that worked for me was to avoid using cv2!
Of course, this is not the best approach at the moment, but there was not enough time to look for a workaround...
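For what it's worth, a likely culprit in the question's code is that `Process` is given the *result* of calling each function (`target=show_cam()`) rather than the function object itself, so `show_cam` runs its infinite loop in the parent process and the second `Process` is never even created. Below is a minimal sketch of the intended pattern, with dummy workers standing in for the camera and model loops (the worker bodies and the queue are placeholders, not the original code):

```python
import multiprocessing

def show_cam(q):
    # Stand-in for the cv2 capture loop
    q.put('cam frame')

def detect_img(q):
    # Stand-in for the keras predict loop
    q.put('prediction')

def run_both():
    q = multiprocessing.Queue()
    # Pass the function objects themselves; target=show_cam() would
    # execute show_cam in the parent and hand its return value to Process.
    p1 = multiprocessing.Process(target=show_cam, args=(q,))
    p2 = multiprocessing.Process(target=detect_img, args=(q,))
    p1.start()
    p2.start()
    p1.join()
    p2.join()
    return {q.get(), q.get()}

if __name__ == '__main__':
    print(run_both())
```

Passing `show_cam` without parentheses hands the callable to the child process; any arguments go in the `args` tuple.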