
Keras: Question on GPhilo's post on predict vs predict_on_batch. Can't predict() already handle single images just fine?

What is the difference between the predict and predict_on_batch methods of a Keras model?

"The difference lies in when you pass as x data that is larger than one batch.

predict will go through all the data, batch by batch, predicting labels. It thus internally does the splitting in batches and feeding one batch at a time.

predict_on_batch, on the other hand, assumes that the data you pass in is exactly one batch and thus feeds it to the network. It won't try to split it (which, depending on your setup, might prove problematic for your GPU memory if the array is very big)" - GPhilo

1) I am having trouble understanding why we would ever use predict_on_batch on a single image instead of running predict() directly on a single batch. GPhilo says that the advantage of predict_on_batch is that it won't try to split the data passed as its argument.

For a single image, why would we care whether the method tries to split the data or not? Shouldn't predict() be able to handle the single image just fine, i.e., correctly recognize it as one batch of one?

So let's say we have a single greyscale image, single_image, of shape (1, 128, 128, 1).

Then can't we just do:

model.predict(single_image)

instead of

model.predict_on_batch(single_image) ?
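For concreteness, here is a minimal sketch of the two calls side by side, assuming TensorFlow 2.x with tf.keras; the tiny model and the random image are hypothetical, just enough to make both calls runnable:

```python
import numpy as np
import tensorflow as tf

# A toy model purely for illustration: flatten a (128, 128, 1) image,
# then classify into 10 classes.
inputs = tf.keras.Input(shape=(128, 128, 1))
outputs = tf.keras.layers.Dense(10, activation="softmax")(
    tf.keras.layers.Flatten()(inputs)
)
model = tf.keras.Model(inputs, outputs)

# A single greyscale image, already shaped as one batch of one
single_image = np.random.rand(1, 128, 128, 1).astype("float32")

# predict() splits its input into batches internally;
# a single image is simply one batch of one, so this works fine
p1 = model.predict(single_image)

# predict_on_batch() feeds the array to the network as-is,
# as exactly one batch -- same result here
p2 = model.predict_on_batch(single_image)

print(p1.shape, tuple(p2.shape))  # both are (1, 10)
```

So for a single image the two calls are interchangeable; the distinction only matters when the input array spans more than one batch.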

And there seem to be disadvantages to using predict_on_batch. So why would we ever use it if it can cause GPU memory problems? What advantage does predict_on_batch give?

2) Or am I misunderstanding? Is there some reason I haven't thought of why .predict cannot handle single images well?

I am a beginner, and I think perhaps I am misunderstanding/not noticing something obvious in GPhilo's message.

Yes, you can.

  • You can pass data spanning many batches to predict_on_batch (and it will treat it all as one batch)
  • You can pass a single batch, such as one image, to predict (and it won't need to split anything)

Both options are just fine.

There is no disadvantage in using predict_on_batch. On the contrary, it can be faster than predict because it does fewer things (no batch-splitting overhead).

Any memory problem that could happen in predict_on_batch would happen exactly the same in predict with batch_size == len(data).
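A sketch of that equivalence, again assuming TensorFlow 2.x with tf.keras and a hypothetical toy model: calling predict with batch_size equal to the dataset length pushes everything through the network in a single forward pass, exactly like predict_on_batch, so the memory footprint and the results are the same:

```python
import numpy as np
import tensorflow as tf

# Same toy model as before, for illustration only
inputs = tf.keras.Input(shape=(128, 128, 1))
outputs = tf.keras.layers.Dense(10, activation="softmax")(
    tf.keras.layers.Flatten()(inputs)
)
model = tf.keras.Model(inputs, outputs)

# 64 random images standing in for a dataset
data = np.random.rand(64, 128, 128, 1).astype("float32")

# One forward pass over the whole array in both cases
out_a = model.predict(data, batch_size=len(data))
out_b = np.asarray(model.predict_on_batch(data))

# Same weights, same input, one batch either way: predictions match
print(np.allclose(out_a, out_b, atol=1e-5))
```

In other words, the "GPU memory risk" is not a property of predict_on_batch itself; it comes from choosing to run the whole array as one batch, which you can do (or avoid) with either method.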
