
Is predicting with a saved model more CPU-consuming than training and predicting in a Python app?

I recently made a disease prediction API (the issue is still not solved), but that's not the main point.

In the same app, I first deployed it so that it trains and predicts on every request, and that worked fine. But when I saved a model and used that saved model to predict, I got a 500 Internal Server Error.

I believe this would directly affect the response time of the API.

So I was curious whether predicting with a saved model is the more CPU-consuming task, or training and predicting together, so that I can work further on my API, since cloud machines have specific CPU performance limits, etc.

Of course, it also depends on the tier we choose, and I am working on the free tier of Heroku.

It would be really nice if you guys could answer this.

Regards, Roshan

If I understand it correctly, you are hitting some API endpoint with your request, and the code that runs when that endpoint is hit trains a model and then returns a prediction.

I can't really imagine how this would work in general. Training is a time-consuming process that can take hours or even months (who knows how long). Also, how are you sending the training data to your backend (assuming this data can be arbitrarily large)?

The general approach is to build/train a model offline and then perform only predictions via your API (unless you are building some very low-level cloud API that is to be consumed by other ML developers).
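For illustration, here is a minimal sketch of that offline step, assuming scikit-learn and joblib; the dataset and file name are placeholders, not something from your app:

```python
# offline_train.py -- a minimal sketch of training once, offline, and saving the model.
# The iris dataset and "model.joblib" are stand-ins for your disease data and file name.
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
import joblib

X, y = load_iris(return_X_y=True)              # placeholder for your training data

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X, y)                                # the expensive step, done once, offline

joblib.dump(model, "model.joblib")             # persist the trained model to disk
```

Your API process then only loads the saved file at startup and calls `predict()` per request.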

But to answer your question: no, predicting can't take more CPU than training-and-predicting (assuming you are making the prediction on the same data). Training-and-predicting just adds one more, much more computationally intensive, operation to the equation. And since training and predicting are two separate steps that do not directly influence each other, your prediction time stays the same whether you are just predicting or training-and-predicting.
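As a rough illustration of that gap, you can time the two steps separately; this sketch assumes scikit-learn and a toy dataset, so the absolute numbers will not match your app:

```python
# timing_sketch.py -- compare the cost of fitting versus predicting on the same data.
import time
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier

X, y = load_iris(return_X_y=True)
model = RandomForestClassifier(n_estimators=200, random_state=0)

t0 = time.perf_counter()
model.fit(X, y)                    # training: the computationally intensive step
t1 = time.perf_counter()
model.predict(X)                   # predicting: comparatively cheap
t2 = time.perf_counter()

print(f"train   : {t1 - t0:.3f}s")
print(f"predict : {t2 - t1:.3f}s")
```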

Training + predicting is definitely more intensive than only predicting.

Typically, we train a model and save it as a binary file. Once saved, we use it for predicting.

Keep in mind that at prediction time you need to perform the same pre-processing steps you used during training.
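One way to guarantee that, if you happen to be using scikit-learn, is to bundle the pre-processing and the model into a single Pipeline and save that instead of the bare estimator; the estimators and file name below are just examples:

```python
# pipeline_train.py -- saving preprocessing + model together so the same
# transformations are reapplied automatically at prediction time.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
import joblib

X, y = load_iris(return_X_y=True)

pipe = Pipeline([
    ("scale", StandardScaler()),                 # scaling learned during training
    ("clf", LogisticRegression(max_iter=1000)),  # the actual classifier
])
pipe.fit(X, y)
joblib.dump(pipe, "pipeline.joblib")

# later, in the API process:
pipe = joblib.load("pipeline.joblib")
print(pipe.predict(X[:1]))                       # preprocessing + prediction in one call
```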

As for the error, I'd suggest you do the following step by step to pinpoint what is causing it:

  1. Try to access the API endpoint and have it return a simple JSON reply.
  2. Send the input data to the API endpoint and try to return the input as JSON, just to verify that your server is receiving the data as intended. You can also print it out instead of sending back a JSON response.
  3. Now, once you have the data, perform the same pre-processing steps as in training, make a prediction, and send it back to your frontend (see the sketch after this list).
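Here is a hypothetical Flask sketch of those three steps in one app; the route names, payload shape, and model file are assumptions, not taken from your code:

```python
# app.py -- one route per debugging step from the list above.
from flask import Flask, jsonify, request
import joblib

app = Flask(__name__)
model = joblib.load("pipeline.joblib")   # load once at startup, not on every request

@app.route("/ping")
def ping():
    # Step 1: confirm the endpoint itself is reachable.
    return jsonify({"status": "ok"})

@app.route("/echo", methods=["POST"])
def echo():
    # Step 2: return the incoming payload to verify the server receives it.
    return jsonify(request.get_json())

@app.route("/predict", methods=["POST"])
def predict():
    # Step 3: run the saved pipeline (preprocessing + model) on the input.
    features = request.get_json()["features"]   # e.g. [[5.1, 3.5, 1.4, 0.2]]
    prediction = model.predict(features)
    return jsonify({"prediction": prediction.tolist()})

if __name__ == "__main__":
    app.run()
```

Testing each route in order (for example with curl) tells you whether the failure is in routing, in receiving the data, or in the prediction step itself.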
