I'm seeing strange behavior from a model deployed on Vertex AI. I have a CNN model built with TensorFlow/Keras 2.7. My input data is a 3-dimensional array with the following shape: (1, 570, 33). When I pass the input data to the model locally, I get a correct response.
model = keras.models.load_model('model')
x = model.predict(input_data) # input_data is a numpy array of shape (1, 570, 33)
print(x)
[[0.1259355 0.9124526 0.65782744 0.2628207 ]]
This is a correct prediction; the model does what it was trained to do. No problems there.
When I upload the model to Vertex AI using the prebuilt TensorFlow 2.7 Docker container with no extra settings (no acceleration, for example) and deploy it to an endpoint, this is what I get when I call predict with the same input_data formatted for Vertex AI:
resp = client.predict(
    endpoint=endpoint_path,
    instances=input_data.tolist(),  # tolist(), not toList()
    parameters=parameters,
)
input must be 4-dimensional[1,570,33]
	 [[{{function_node __inference__wrapped_model_28143}}{{node sequential/conv2d/BiasAdd}}]]
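The rank mismatch can be reproduced locally with plain NumPy: round-tripping the 3-D array through tolist() gives the server a 3-D batch, while the error says the Conv2D stack wants a 4-D one. A minimal sketch of one possible fix, assuming (and this is my assumption, not something stated in the question) that the model expects a single trailing channel axis:

```python
import numpy as np

# What was sent: a 3-D batch -- the server deserializes the nested
# lists back into an array of the same rank.
input_data = np.zeros((1, 570, 33))
print(np.array(input_data.tolist()).shape)  # (1, 570, 33)

# A Conv2D stack consumes 4-D batches (batch, height, width, channels).
# Hypothetical fix, assuming a single-channel input:
fixed = np.expand_dims(input_data, axis=-1)
print(fixed.shape)  # (1, 570, 33, 1)
```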
Here is the summary of the model:
Model: "sequential"
_________________________________________________________________
Layer (type)                                Output Shape          Param #
=================================================================
conv2d (Conv2D)                             (None, 570, 33, 32)   320
batch_normalization (BatchNormalization)    (None, 570, 33, 32)   128
activation (Activation)                     (None, 570, 33, 32)   0
conv2d_1 (Conv2D)                           (None, 570, 33, 32)   9248
batch_normalization_1 (BatchNormalization)  (None, 570, 33, 32)   128
activation_1 (Activation)                   (None, 570, 33, 32)   0
conv2d_2 (Conv2D)                           (None, 570, 33, 32)   9248
batch_normalization_2 (BatchNormalization)  (None, 570, 33, 32)   128
activation_2 (Activation)                   (None, 570, 33, 32)   0
conv2d_3 (Conv2D)                           (None, 285, 17, 64)   18496
batch_normalization_3 (BatchNormalization)  (None, 285, 17, 64)   256
activation_3 (Activation)                   (None, 285, 17, 64)   0
conv2d_4 (Conv2D)                           (None, 285, 17, 64)   36928
batch_normalization_4 (BatchNormalization)  (None, 285, 17, 64)   256
activation_4 (Activation)                   (None, 285, 17, 64)   0
conv2d_5 (Conv2D)                           (None, 285, 17, 64)   36928
batch_normalization_5 (BatchNormalization)  (None, 285, 17, 64)   256
activation_5 (Activation)                   (None, 285, 17, 64)   0
conv2d_6 (Conv2D)                           (None, 285, 17, 64)   36928
batch_normalization_6 (BatchNormalization)  (None, 285, 17, 64)   256
activation_6 (Activation)                   (None, 285, 17, 64)   0
conv2d_7 (Conv2D)                           (None, 143, 9, 96)    55392
batch_normalization_7 (BatchNormalization)  (None, 143, 9, 96)    384
activation_7 (Activation)                   (None, 143, 9, 96)    0
conv2d_8 (Conv2D)                           (None, 143, 9, 96)    83040
batch_normalization_8 (BatchNormalization)  (None, 143, 9, 96)    384
activation_8 (Activation)                   (None, 143, 9, 96)    0
conv2d_9 (Conv2D)                           (None, 143, 9, 96)    83040
batch_normalization_9 (BatchNormalization)  (None, 143, 9, 96)    384
activation_9 (Activation)                   (None, 143, 9, 96)    0
conv2d_10 (Conv2D)                          (None, 143, 9, 96)    83040
batch_normalization_10 (BatchNormalization) (None, 143, 9, 96)    384
activation_10 (Activation)                  (None, 143, 9, 96)    0
conv2d_11 (Conv2D)                          (None, 72, 5, 128)    110720
batch_normalization_11 (BatchNormalization) (None, 72, 5, 128)    512
activation_11 (Activation)                  (None, 72, 5, 128)    0
conv2d_12 (Conv2D)                          (None, 72, 5, 128)    147584
batch_normalization_12 (BatchNormalization) (None, 72, 5, 128)    512
activation_12 (Activation)                  (None, 72, 5, 128)    0
conv2d_13 (Conv2D)                          (None, 72, 5, 128)    147584
batch_normalization_13 (BatchNormalization) (None, 72, 5, 128)    512
activation_13 (Activation)                  (None, 72, 5, 128)    0
conv2d_14 (Conv2D)                          (None, 72, 5, 128)    147584
batch_normalization_14 (BatchNormalization) (None, 72, 5, 128)    512
activation_14 (Activation)                  (None, 72, 5, 128)    0
conv2d_15 (Conv2D)                          (None, 36, 3, 160)    184480
batch_normalization_15 (BatchNormalization) (None, 36, 3, 160)    640
activation_15 (Activation)                  (None, 36, 3, 160)    0
conv2d_16 (Conv2D)                          (None, 36, 3, 160)    230560
batch_normalization_16 (BatchNormalization) (None, 36, 3, 160)    640
activation_16 (Activation)                  (None, 36, 3, 160)    0
conv2d_17 (Conv2D)                          (None, 36, 3, 160)    230560
batch_normalization_17 (BatchNormalization) (None, 36, 3, 160)    640
activation_17 (Activation)                  (None, 36, 3, 160)    0
conv2d_18 (Conv2D)                          (None, 36, 3, 160)    230560
batch_normalization_18 (BatchNormalization) (None, 36, 3, 160)    640
activation_18 (Activation)                  (None, 36, 3, 160)    0
conv2d_19 (Conv2D)                          (None, 18, 2, 192)    276672
batch_normalization_19 (BatchNormalization) (None, 18, 2, 192)    768
activation_19 (Activation)                  (None, 18, 2, 192)    0
conv2d_20 (Conv2D)                          (None, 18, 2, 192)    331968
batch_normalization_20 (BatchNormalization) (None, 18, 2, 192)    768
activation_20 (Activation)                  (None, 18, 2, 192)    0
conv2d_21 (Conv2D)                          (None, 18, 2, 192)    331968
batch_normalization_21 (BatchNormalization) (None, 18, 2, 192)    768
activation_21 (Activation)                  (None, 18, 2, 192)    0
conv2d_22 (Conv2D)                          (None, 18, 2, 192)    331968
batch_normalization_22 (BatchNormalization) (None, 18, 2, 192)    768
activation_22 (Activation)                  (None, 18, 2, 192)    0
conv2d_23 (Conv2D)                          (None, 9, 1, 224)     387296
batch_normalization_23 (BatchNormalization) (None, 9, 1, 224)     896
activation_23 (Activation)                  (None, 9, 1, 224)     0
reshape (Reshape)                           (None, 9, 224)        0
masking (Masking)                           (None, 9, 224)        0
lambda (Lambda)                             (None, 224)           0
dense (Dense)                               (None, 4)             900
=================================================================
Total params: 3,554,532
Trainable params: 3,548,772
Non-trainable params: 5,760
I've got a classic case of "it works on my machine" here and could use any input or help. :)
When using the Vertex AI Prediction service with TensorFlow models, the following is the expected format of the input request, per the docs:
{
  "instances": [
    <value>|<simple/nested list>|<object>,
    ...
  ]
}
For your scenario, try:
{
  "instances": [
    [
      [
        [138, 30, 66, ...],
        [130, 20, 56, ...],
        ...
      ],
      [
        [126, 38, 61, ...],
        [122, 24, 57, ...],
        ...
      ],
      ...
    ],
    ...
  ]
}
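That nested structure doesn't need to be written by hand; it can be generated directly from the NumPy array with tolist(). A sketch, assuming a per-instance shape of (570, 33, 1) (the trailing channel axis is my assumption, inferred from the 4-dimensional error):

```python
import json

import numpy as np

# Hypothetical batch with an explicit channel axis, so each element
# under "instances" is the 3-D nested list the endpoint expects.
input_data = np.random.rand(1, 570, 33, 1).astype(np.float32)

# tolist() (note the lowercase name) yields plain nested Python lists,
# which serialize cleanly into the JSON request body.
body = {"instances": input_data.tolist()}
print(len(body["instances"]), len(body["instances"][0]))  # 1 570
```

The same dict can be passed as the request body to the REST endpoint, or its "instances" value passed to client.predict(instances=...).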