
How to convert NumPy array of images to Tensorflow

I have two arrays of shape (600,) containing images and labels. When I try to pass them to Keras/Tensorflow in any form I get the error:

ValueError: Failed to convert a NumPy array to a Tensor (Unsupported object type numpy.ndarray).

As far as I understand, the images are stored in an array of arrays. The inner arrays (single images) have the following properties:

Array of dtype=uint8 with shape: (x, 500, 3) where x is between 300 and 500.
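The variable height is the core of the problem: NumPy can only hold images of differing shapes in an object-dtype array, which TensorFlow cannot convert into a single tensor. A minimal sketch (with made-up miniature shapes) reproducing the situation:

```python
import numpy as np

# Hypothetical tiny images: same width and channels, different heights
img_a = np.zeros((3, 5, 3), dtype=np.uint8)   # height 3
img_b = np.zeros((4, 5, 3), dtype=np.uint8)   # height 4

images = np.empty(2, dtype=object)            # shape (2,), like the (600,) array
images[0], images[1] = img_a, img_b

print(images.shape, images.dtype)             # (2,) object

# Stacking into one (N, H, W, C) array fails because the heights differ
try:
    np.stack(images)
except ValueError as e:
    print("stack failed:", e)
```

This is why the images must be brought to a common shape before any conversion to a tensor can succeed.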

I was able to apply TF preprocessing layers to the array of images via pandas.apply, in the hope that the issue was the inconsistent image sizes:

import tensorflow as tf

resize_and_rescale = tf.keras.Sequential([
  tf.keras.layers.experimental.preprocessing.Resizing(IMG_SIZE, IMG_SIZE),
  tf.keras.layers.experimental.preprocessing.Rescaling(1./255)
])
train_df.image = train_df.image.apply(resize_and_rescale)

This code executed successfully, but the resulting eager tensors are still not accepted by tensorflow:

train_dataset = tf.data.Dataset.from_tensor_slices((train_df.image.values, train_df.label.values))

ValueError: Failed to convert a NumPy array to a Tensor (Unsupported object type tensorflow.python.framework.ops.EagerTensor).

How can I load the array of images into tf?

I already tried the following load functions unsuccessfully:

NumpyArrayIterator

from_tensor_slices

ImageDataGenerator.flow

First, try converting the input of from_tensor_slices into a list of 600 arrays. The function currently treats your (600,) object array as a single sample, an array whose first dimension happens to be 600, which causes the error. Note that the 600 arrays must also share a common shape (i.e. after resizing), otherwise they still cannot form one tensor.
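A sketch of that fix, assuming the images have already been resized to a common IMG_SIZE (e.g. by the resize_and_rescale pipeline above), using NumPy only; the sizes and dummy data here are placeholders:

```python
import numpy as np

IMG_SIZE = 4  # stand-in for the real target size

# Assume each image was already resized to (IMG_SIZE, IMG_SIZE, 3);
# in the question this would be train_df.image.tolist()
resized = [np.zeros((IMG_SIZE, IMG_SIZE, 3), dtype=np.float32) for _ in range(600)]
labels = np.zeros(600, dtype=np.int64)

# Stack the list into one (600, IMG_SIZE, IMG_SIZE, 3) array so that
# from_tensor_slices sees 600 samples instead of one object array
images = np.stack(resized)
print(images.shape)  # (600, 4, 4, 3)

# With TensorFlow available, this now converts cleanly:
# import tensorflow as tf
# train_dataset = tf.data.Dataset.from_tensor_slices((images, labels))
```

The key point is that from_tensor_slices receives a regular numeric array whose first dimension is the sample count, rather than a (600,) array of dtype object.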
