The error occurred because, when running inference with the fine-tuned model, paddlex==3.3.3 tries by default to load a quantization_config from config.json, which the fine-tuned model does not generate. paddlex==3.3.3 is therefore incompatible; upgrading paddlex to 3.3.9 resolved the issue.
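
As a quick sanity check before upgrading, the minimal sketch below (the model directory path is a placeholder, not from the original report) inspects the fine-tuned model's config.json to confirm that quantization_config is indeed missing, which is what trips up paddlex==3.3.3:

```python
import json
from pathlib import Path

# Placeholder path to the fine-tuned model directory; replace with your own.
model_dir = Path("./finetuned_model")

config = json.loads((model_dir / "config.json").read_text(encoding="utf-8"))

if "quantization_config" not in config:
    print("config.json has no quantization_config key; "
          "paddlex==3.3.3 will fail when loading this model, so upgrade:")
    print('    pip install "paddlex==3.3.9"')
```

Upgrading paddlex is preferable to hand-editing config.json, since a missing quantization_config is expected for an unquantized fine-tuned model.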

Answer selected by LT-07