Tensorflow.keras.Model served by a Flask app + uwsgi gets stuck in model.predict

I am trying to serve a tensorflow.keras.Model in a Flask + nginx + uwsgi application, using TensorFlow v1.14.

I load the model in the constructor of a class named Prediction inside my Flask application factory function, and save the graph as an attribute of the Flask app, as suggested here.

Then I run the prediction by calling the method Prediction.process from a route named _process of my Flask app, but it hangs during the call to tf.keras.Model.predict (the self.model.summary() call in predict.py executes, i.e. the summary is printed, but print("Never gets here :(") is never reached).

If I instead instantiate Prediction inside _process (which I would like to avoid, so that the model is not reloaded for every prediction), everything works fine; a sketch of that per-request variant is shown after the code below.

It also works fine with the Flask development server, so the problem seems to be related to the uwsgi configuration.

Any suggestions?

import os

import tensorflow as tf
from flask import Flask


def create_app():
    app = Flask(__name__)
    # (...)
    app.register_blueprint(bp)
    load_tf_model(app)
    return app


def load_tf_model(app):

    # create a dedicated graph and session and keep them on the app,
    # so the route can re-enter the same graph for every prediction
    sess = tf.Session(graph=tf.Graph())
    app.sess = sess
    app.graph = sess.graph  # used by the _process route below

    with sess.graph.as_default():
        weights = os.path.join(app.static_folder, 'weights/model.32-0.81.h5')
        app.prediction = Prediction(weights)

# predict.py (simple_cnn, tf_feature_utils, audio2data, SHAPE and
# N_CLASSES are the application's own modules and constants)
import numpy as np
import tensorflow as tf


class Prediction:

    def __init__(self, weights):

        # build model and set weights
        inputs = tf.keras.Input(shape=SHAPE, batch_size=1)
        outputs = simple_cnn.build_model(inputs, N_CLASSES)
        self.model = tf.keras.Model(inputs=inputs, outputs=outputs)
        self.model.load_weights(weights)
        # build the predict function up front so it can be called
        # later, possibly from another thread
        self.model._make_predict_function()

        # create TF mel extractor
        self.melspec_ex = tf_feature_utils.MelSpectrogram()


    def process(self, audio, sr):

        # compute features (in NCHW format) and labels
        data = audio2data(
            audio,
            sr,
            class_list=np.arange(N_CLASSES))
        features = np.asarray([d[0] for d in data])
        features = tf.reshape(features, (features.shape[0], 1, features.shape[1], features.shape[2]))
        labels = np.asarray([d[1] for d in data])

        # make tf.data.Dataset (the map keeps only the mel-spectrogram
        # features, so predict receives feature batches without labels)
        dataset = tf.data.Dataset.from_tensor_slices((features, labels))
        dataset = dataset.batch(1)
        dataset = dataset.map(lambda data, labels: (
            tf.expand_dims(self.melspec_ex.process(tf.squeeze(data, axis=[1,2])), 1)))

        # show model (debug)
        self.model.summary()

        # run prediction
        predictions = self.model.predict(dataset)

        print("Never gets here :(")

        # integrate predictions over time
        return np.mean(predictions, axis=0)

# routes module (bp is the blueprint registered in create_app)
import os

import librosa
from flask import current_app, jsonify, session


@bp.route('/_process', methods=['POST'])
def _process():

    with current_app.graph.as_default():

        # load audio
        filepath = session['filepath']
        audio, sr = librosa.load(filepath)

        # predict
        predictions = current_app.prediction.process(audio, sr)

        # delete file
        os.remove(filepath)

        return jsonify(prob=predictions.tolist())
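
For reference, a minimal sketch of the per-request variant mentioned above (the route name is made up for illustration; it works, but rebuilds the model and reloads the weights on every call):

@bp.route('/_process_per_request', methods=['POST'])
def _process_per_request():

    # build a fresh graph and model for this request only
    with tf.Graph().as_default():
        weights = os.path.join(current_app.static_folder,
                               'weights/model.32-0.81.h5')
        prediction = Prediction(weights)

        filepath = session['filepath']
        audio, sr = librosa.load(filepath)
        predictions = prediction.process(audio, sr)

    os.remove(filepath)
    return jsonify(prob=predictions.tolist())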

It was a threading issue. I had to configure uwsgi with the following options:

master = false 
processes = 1
cheaper = 0
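
For context, a minimal uwsgi.ini carrying these settings might look like the sketch below; the module, socket and related options are placeholders for whatever the actual deployment uses, and only the three options above are the actual fix:

[uwsgi]
; hypothetical entry point and socket -- adjust to the real deployment
module = myapp.wsgi:app
socket = /tmp/myapp.sock
chmod-socket = 660
vacuum = true

; the fix: one non-forking worker with the cheaper subsystem disabled
master = false
processes = 1
cheaper = 0

With master = false and processes = 1 there is a single worker that both loads the app (and therefore the TensorFlow graph and session) and serves the requests, rather than inheriting the already-loaded app from a forked master process, which is where the threading problem showed up.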
