xcrun coreml model error for YOLACT onnx with no priors layer and softmax layer
I have converted the YOLACT PyTorch model to ONNX without the softmax and priors layers, and then tried to convert the ONNX model to Core ML. The terminal shows the conversion finished without errors, and it also reports that model compilation is done, as below:
210/211: Converting Node Type Concat
211/211: Converting Node Type Concat
Translation to CoreML spec completed. Now compiling the CoreML model.
Model Compilation done.
But when I compile the Core ML model on macOS, this error appears:
xcrun coremlc compile yolact_test_nosoftmax_simplify.mlmodel
coremlc: Error: compiler error: Espresso exception: Invalid blob shape generic_elementwise_kernel: cannot broadcast [18, 18, 128, 1, 399] and [35, 35, 128, 1, 399]
I have no idea how to debug this right now. Any suggestions would be appreciated.
There is an operation in your model that tries to apply an elementwise operation to a tensor of shape (18, 18, 128, 1, 399) and a tensor of shape (35, 35, 128, 1, 399). These two tensor shapes are not compatible, hence the error message.
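You can reproduce the incompatibility outside of Core ML. NumPy follows the same broadcasting rule the compiler applies here (dimensions must be equal or 1), so pairing the two shapes from the error message fails the same way:

```python
import numpy as np

# The two shapes from the coremlc error message. Broadcasting only
# matches dimensions that are equal or 1; here 18 vs 35 satisfies
# neither rule, so the pair is rejected just as the Core ML compiler
# rejects it.
a = np.zeros((18, 18, 128, 1, 399))
b = np.zeros((35, 35, 128, 1, 399))

try:
    a + b
except ValueError as e:
    print("not broadcastable:", e)
```

The 18x18 and 35x35 leading dimensions look like feature-map sizes from two different pyramid levels, which suggests two branches of the network are being combined at the wrong point.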
To solve this, you need to find out at which point in your model this happens and then fix the issue. Something may have gone wrong in the PyTorch -> ONNX conversion, or in the ONNX -> Core ML conversion.