What happened?
The Qwen VAE uses too much VRAM on torch 2.9.1 on 16 GB GPUs: pytorch/pytorch#177406
This is already fixed in torch 2.10, but we are currently not upgrading because of pytorch/pytorch#175058.
Workaround: upgrade to torch 2.10 or downgrade to torch 2.8.
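Based on the report above, only the 2.9.x line is affected (2.8 and 2.10 are fine). A minimal sketch of a version check that flags an affected install; the function name and the assumed affected range (all of 2.9.x) are illustrative, not part of any official API:

```python
def is_affected(torch_version: str) -> bool:
    """Return True if this torch version falls in the assumed
    affected range (2.9.x) for the Qwen VAE VRAM regression."""
    major, minor = (int(part) for part in torch_version.split(".")[:2])
    return (major, minor) == (2, 9)

print(is_affected("2.9.1"))   # True  -> hit by the regression
print(is_affected("2.8.0"))   # False -> safe to downgrade to
print(is_affected("2.10.0"))  # False -> fixed upstream
```

In a real environment you would pass `torch.__version__` (stripping any local suffix such as `+cu126`) instead of a literal string.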
What did you expect would happen?
Relevant log output
No response