Conversation

@rattus128
Contributor

There are paths where this can be None. Default to fp32 in that case.

https://discordapp.com/channels/1218270712402415686/1243610878314807337/1447705628758904832

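A minimal sketch of the fallback this PR proposes, using a hypothetical helper (`resolve_model_dtype` is illustrative, not ComfyUI's actual API):

```python
import torch

def resolve_model_dtype(model_dtype):
    # Some code paths (e.g. GGUF + LoRA) can leave model_dtype as None,
    # which crashes later dtype handling; fall back to fp32 in that case.
    if model_dtype is None:
        return torch.float32
    return model_dtype
```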
rattus128 changed the title from "Handle the None case for the dtype (Fixes GGUF + Lora crash)" to "Handle the None case for the model_dtype (Fixes GGUF + Lora crash)" on Dec 8, 2025
@comfyanonymous
Owner

Already fixed: 3b0368a

manual_cast_dtype set to None means there's no cast, so it should default to the weight dtype, right?
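A sketch of those semantics, again with hypothetical names: `manual_cast_dtype` being None means "no manual cast", so fall back to the weights' own dtype rather than a hard-coded fp32.

```python
import torch  # only for the dtype objects in the example below

def effective_dtype(manual_cast_dtype, weight_dtype):
    # None means "no manual cast": run in the weights' own dtype.
    if manual_cast_dtype is not None:
        return manual_cast_dtype
    return weight_dtype

# Example: an fp16 model with no manual cast stays in fp16.
assert effective_dtype(None, torch.float16) == torch.float16
```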

@rattus128
Contributor Author

> Already fixed: 3b0368a
>
> manual_cast_dtype set to None means there's no cast, so it should default to the weight dtype, right?

Yes, that makes sense. I think the problem I'm actually chasing here is when the input X comes in at a higher dtype than the model and forces this cast, which still needs implementation. That is self-consistent with figuring out casts for the sake of the model types.

Closing.
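For context, a hypothetical sketch of the remaining cast described above, where an input tensor arrives at a higher-precision dtype than the model (`cast_input_to_model` is illustrative, not existing code):

```python
import torch

def cast_input_to_model(x, model_dtype):
    # Illustrative only: if the input arrives at a different (e.g. higher
    # precision) dtype than the model's weights, cast it so the forward
    # pass sees a consistent dtype.
    if x.dtype != model_dtype:
        x = x.to(dtype=model_dtype)
    return x

# Example: an fp32 input into an fp16 model gets cast down.
x = cast_input_to_model(torch.randn(2, 4), torch.float16)
assert x.dtype == torch.float16
```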

rattus128 closed this on Dec 8, 2025