Describe the Bug
PC environment:
paddlepaddle-gpu=3.3.0
paddleocr=3.3.2
paddlex=3.3.11
paddle2onnx=2.1.0
Debian GNU/Linux 12, RTX 5090, CUDA 13.0, cuDNN 9.13
I fine-tuned PP-OCRv5_mobile_det. The fine-tuned model can be exported to an inference model normally, but exporting it to an ONNX model fails.
The command is as follows:
```bash
paddle2onnx --model_dir output/ppocr_mc20251216/ \
    --model_filename inference.json \
    --params_filename inference.pdiparams \
    --save_file ./ppocr_mobile_ocr20260207/ppocr_mobile_ocr20260207.onnx \
    --enable_onnx_checker True
```
The error is as follows:
```text
[Paddle2ONNX] Start parsing the Paddle model file...
[Paddle2ONNX] Use opset_version = 11 for ONNX export.
2026-02-08 11:46:26 [ERROR] Failed to convert PaddlePaddle model: (Unimplemented) the 0th elementwise MUST be ir::FloatAttribute
[Hint: Expected array_list[0].isa<::pir::FloatAttribute>() == true, but received array_list[0].isa<::pir::FloatAttribute>():0 != true:1.] (at /github/workspace/paddle2onnx/parser/pir_parser.cc:814)
```
After hitting this error, I ran paddle2onnx on an inference model I had fine-tuned and exported last December (also a det model), and it converted to ONNX successfully. That environment used a 4090 GPU with CUDA 12.6, but paddlepaddle-gpu was not the latest version, so I suspect the problem is in paddlepaddle rather than in paddle2onnx.
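For reference, this is a minimal sketch of how I compare the two environments; it only assumes the packages are installed in the active Python environment:

```bash
# Print the framework version and the CUDA version it was built against.
python -c "import paddle; print(paddle.__version__, paddle.version.cuda())"
# Show the installed versions of the related packages.
python -m pip show paddlepaddle-gpu paddleocr paddlex paddle2onnx | grep -E '^(Name|Version)'
```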
Additional Supplementary Information
1. Tried changing opset_version from 11 through 18; every value produces the same error (the variant command is sketched after this list).
2. 2026-02-08: Switching paddlepaddle-gpu back to 3.2.1 resolved the issue; the ONNX model exports normally and inference succeeds (see the downgrade sketch after this list).
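For item 1, the attempts were the same export command as above with an explicit --opset_version value; only the number changed between runs:

```bash
# Same export as above; opset values 11 through 18 were tried and all fail identically.
paddle2onnx --model_dir output/ppocr_mc20251216/ \
    --model_filename inference.json \
    --params_filename inference.pdiparams \
    --save_file ./ppocr_mobile_ocr20260207/ppocr_mobile_ocr20260207.onnx \
    --opset_version 18 \
    --enable_onnx_checker True
```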
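For item 2, a minimal sketch of the workaround; the exact wheel/index to install from depends on your CUDA setup, so use the source recommended by the official PaddlePaddle install guide:

```bash
# Downgrade the framework only, keep paddle2onnx as-is, then re-run the export.
python -m pip uninstall -y paddlepaddle-gpu
python -m pip install paddlepaddle-gpu==3.2.1   # pick the wheel matching your CUDA version
paddle2onnx --model_dir output/ppocr_mc20251216/ \
    --model_filename inference.json \
    --params_filename inference.pdiparams \
    --save_file ./ppocr_mobile_ocr20260207/ppocr_mobile_ocr20260207.onnx \
    --enable_onnx_checker True
```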