How can I change the interpreter's output shape?

I am trying to build a tflite file to use on Android.

So I created the following model in a Jupyter notebook.

After converting the model to a tflite file, I checked whether it had been converted properly.

I want the interpreter's output shape to come out as 1, but the result is still [1 10]. What should I do?

I build the model layers like this:

model = tf.keras.models.Sequential([
  tf.keras.layers.Conv2D(32, (3,3), padding="same", input_shape=X_train.shape[1:], activation="relu"),
  tf.keras.layers.MaxPooling2D(pool_size=(2,2)),
  tf.keras.layers.Conv2D(32, (3,3), padding="same", activation="relu"),
  tf.keras.layers.MaxPooling2D(pool_size=(2,2)),

  tf.keras.layers.Conv2D(64, (3,3), padding="same", activation="relu"),
  tf.keras.layers.MaxPooling2D(pool_size=(2,2)),
  tf.keras.layers.Dropout(0.25),

  tf.keras.layers.Conv2D(64, (3,3), padding="same", activation="relu"),
  tf.keras.layers.MaxPooling2D(pool_size=(2,2)),
  tf.keras.layers.Dropout(0.25),

  tf.keras.layers.Flatten(),
  tf.keras.layers.Dense(256, activation="relu"),
  tf.keras.layers.Dropout(0.5),
  tf.keras.layers.Dense(1, activation="sigmoid")
])
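
Since the model ends in Dense(1, activation="sigmoid"), Keras reports an output shape of (None, 1), i.e. one value per sample, and the converted tflite model should therefore report a [1 1] output. A minimal sketch to confirm this on the Keras side, using a hypothetical 64x64 RGB input in place of X_train.shape[1:] (which is not shown in the question):

import tensorflow as tf

# Hypothetical input shape; the real model uses X_train.shape[1:],
# which is not shown in the question.
example_input_shape = (64, 64, 3)

demo = tf.keras.models.Sequential([
  tf.keras.layers.Conv2D(32, (3,3), padding="same", input_shape=example_input_shape, activation="relu"),
  tf.keras.layers.MaxPooling2D(pool_size=(2,2)),
  tf.keras.layers.Flatten(),
  tf.keras.layers.Dense(1, activation="sigmoid")
])

# The final Dense(1) layer means one output value per sample.
print(demo.output_shape)  # (None, 1)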

This is the part that converts the trained model to a tflite file:

model = tf.keras.models.load_model("./model/model.h5")
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()
open("converted_model.tflite", "wb").write(tflite_model)


interpreter = tf.lite.Interpreter(model_path="converted_model.tflite")
interpreter.allocate_tensors()

# Get input and output tensors.
input_details = interpreter.get_input_details()
print("--------------")
print("shape:", input_details[0]['shape'])
print("type:", input_details[0]['dtype'])
output_details = interpreter.get_output_details()
print("--------------")
print("shape:", output_details[0]['shape'])
print("type:", output_details[0]['dtype'])

# Attempt to resize both the input and output tensors to a batch of 39
interpreter.resize_tensor_input(input_details[0]['index'], (39, 64, 64))
interpreter.resize_tensor_input(output_details[0]['index'], (39, 5))
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
print("--------------")
print("shape:", input_details[0]['shape'])
print("type:", input_details[0]['dtype'])
output_details = interpreter.get_output_details()
print("--------------")
print("shape:", output_details[0]['shape'])
print("type:", output_details[0]['dtype'])

[screenshot of the printed input and output shapes]

When creating the model with Keras you may use:

from tensorflow.keras.layers import Input
from tensorflow.keras.models import Model

# some stuff your code does

input = Input((THE_HEIGHT_YOU_WANT, THE_WIDTH_YOU_WANT, THE_CHANNELS_YOU_WANT))

# all the Tensorflow ops you want on "input"

model = Model(input, THE_OUTPUT_YOU_WANT)

# any other stuff your code might do before saving

model.save(THE_PATH_YOU_WANT)

loaded_model = tf.keras.models.load_model(THE_PATH_YOU_WANT)

# rest of your code for converting and saving the model
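
Concretely, a minimal sketch of that approach; the 64x64x3 input size and the file names below are placeholders, not values taken from the question:

import tensorflow as tf
from tensorflow.keras.layers import Input, Conv2D, MaxPooling2D, Flatten, Dense
from tensorflow.keras.models import Model

# Placeholder input size; substitute the height/width/channels your data uses.
inputs = Input((64, 64, 3))
x = Conv2D(32, (3,3), padding="same", activation="relu")(inputs)
x = MaxPooling2D(pool_size=(2,2))(x)
x = Flatten()(x)
outputs = Dense(1, activation="sigmoid")(x)

model = Model(inputs, outputs)
model.save("model_fixed_input.h5")  # placeholder path

loaded_model = tf.keras.models.load_model("model_fixed_input.h5")
converter = tf.lite.TFLiteConverter.from_keras_model(loaded_model)
tflite_model = converter.convert()
with open("converted_model_fixed_input.tflite", "wb") as f:
    f.write(tflite_model)

# Check the shapes that will be visible on the Android side.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
print(interpreter.get_input_details()[0]['shape'])   # [ 1 64 64  3]
print(interpreter.get_output_details()[0]['shape'])  # [1 1]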

Now, when running on Android, you can use:

tensorflowLiteInterpreterInstance.getInputTensor(inputTensorIndex).shape()

to get the shape of the model's input tensor.

That shape should match the (THE_HEIGHT_YOU_WANT, THE_WIDTH_YOU_WANT, THE_CHANNELS_YOU_WANT) shape you specified when building the model.
