I'm running into problems using a PyTorch model exported as an ONNX model with Caffe2. Here is my export code:
the_model = torchvision.models.densenet121(pretrained=True)
garbage, model_inputs = preprocessing("test.jpg")
torch_out = torch.onnx._export(the_model,
                               model_inputs,
                               "model_weights/chexnet-py.onnx",
                               export_params=True)
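The post's preprocessing() helper isn't shown. For context, torchvision's pretrained DenseNet expects a normalized 1x3xHxW float batch; here is a minimal numpy sketch of that pipeline (the function name and the assumption that the image is already an HxWx3 float array in [0, 1] are mine, not from the original):

```python
import numpy as np

# ImageNet normalization constants used by torchvision's pretrained models
MEAN = np.array([0.485, 0.456, 0.406], dtype=np.float32)
STD = np.array([0.229, 0.224, 0.225], dtype=np.float32)

def preprocess_array(img_hwc):
    """Normalize an HxWx3 float image in [0, 1] and return a 1x3xHxW batch."""
    normalized = (img_hwc.astype(np.float32) - MEAN) / STD
    chw = normalized.transpose(2, 0, 1)  # HWC -> CHW
    return chw[np.newaxis, ...]          # add the batch dimension

batch = preprocess_array(np.random.rand(224, 224, 3))
print(batch.shape)  # (1, 3, 224, 224)
```

In the real pipeline you would load and resize "test.jpg" first (e.g. with PIL) and convert the result to a torch tensor before passing it to the export call.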
Now here is my testing code:
model = onnx.load("model_weights/chexnet-py.onnx")
garbage, model_inputs = preprocessing("test.jpg")
prepared_backend = onnx_caffe2.backend.prepare(model)
W = {model.graph.input[0].name: model_inputs.numpy()}
c2_out = prepared_backend.run(W)[0]
This returns the following error:
ValueError: Don't know how to translate op Unsqueeze when running converted PyTorch Model
Additional information: PyTorch version 1.0.0a0+6f664d3; Caffe2 is the latest version (I attempted building from source, pip, and conda, and all gave the same result).
Try looking into this; you may have to edit the package called onnx-caffe2 to add the mapping between Unsqueeze and ExpandDims: https://github.com/onnx/onnx/issues/1481
Answer:
I found that the Caffe2 equivalent of ONNX's Unsqueeze is ExpandDims, and there is a special mapping in onnx_caffe2/backend.py (around line 121) for operators that differ only in their names and attribute names, but somehow Unsqueeze isn't listed there (I have no idea why). So I manually added the mapping rules for it in the _renamed_operators and _per_op_renamed_attrs dicts, and the code looks like:
_renamed_operators = {
    'Caffe2ConvTranspose': 'ConvTranspose',
    'GlobalMaxPool': 'MaxPool',
    'GlobalAveragePool': 'AveragePool',
    'Pad': 'PadImage',
    'Neg': 'Negative',
    'BatchNormalization': 'SpatialBN',
    'InstanceNormalization': 'InstanceNorm',
    'MatMul': 'BatchMatMul',
    'Upsample': 'ResizeNearest',
    'Equal': 'EQ',
    'Unsqueeze': 'ExpandDims',  # add this line
}

_global_renamed_attrs = {'kernel_shape': 'kernels'}

_per_op_renamed_attrs = {
    'Squeeze': {'axes': 'dims'},
    'Transpose': {'perm': 'axes'},
    'Upsample': {'mode': ''},
    'Unsqueeze': {'axes': 'dims'},  # add this line
}
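The rename is a pure name/attribute translation: ONNX's Unsqueeze (with an axes attribute) and Caffe2's ExpandDims (with a dims attribute) both insert size-1 dimensions at the given positions. numpy's expand_dims illustrates the shared semantics (the numpy stand-in is my illustration, not part of either backend):

```python
import numpy as np

x = np.zeros((3, 224, 224))

# ONNX Unsqueeze(axes=[0]) and Caffe2 ExpandDims(dims=[0]) both
# insert a size-1 dimension at the given position:
y = np.expand_dims(x, axis=0)
print(y.shape)  # (1, 3, 224, 224)
```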
And everything works as expected.
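To confirm the patched backend really matches PyTorch numerically, the usual check is to compare torch_out against c2_out with np.testing.assert_almost_equal. A sketch, with numpy arrays standing in for the two outputs (the helper name and the sample values are mine):

```python
import numpy as np

def outputs_match(torch_out, c2_out, decimal=3):
    """Raise AssertionError if the two outputs disagree beyond `decimal` places."""
    np.testing.assert_almost_equal(torch_out, c2_out, decimal=decimal)
    return True

# Stand-ins for torch_out.data.numpy() and the Caffe2 backend's output;
# a tiny numerical difference between backends is expected and tolerated:
a = np.array([0.12345, 0.67890])
b = a + 1e-5
print(outputs_match(a, b))  # True
```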
I am not the OP, but thanks to the OP.