Describe the bug
AttributeError: module 'torch' has no attribute 'accelerator' when running distributed gather on PyTorch versions < 2.6.
This error happens because gather_size_by_comm in src/diffusers/models/_modeling_parallel.py uses torch.accelerator.current_accelerator(), which only exists in PyTorch 2.6+. Diffusers officially supports PyTorch 2.1+, so this causes a crash on versions 2.1–2.5 with AttributeError: module 'torch' has no attribute 'accelerator'.
Reproduction
Since this is a utility function, it can be triggered directly with a minimal distributed setup:
```python
import torch.distributed as dist

from diffusers.models._modeling_parallel import gather_size_by_comm

dist.init_process_group(
    backend="gloo",
    init_method="file:///tmp/pg",
    rank=0,
    world_size=1,
)
gather_size_by_comm(1, dist.group.WORLD)
```
Logs
[rank0]: Traceback (most recent call last):
[rank0]: File "/home/aja/diffusers/test.py", line 11, in <module>
[rank0]: gather_size_by_comm(1, dist.group.WORLD)
[rank0]: File "/home/aja/diffusers/src/diffusers/models/_modeling_parallel.py", line 293, in gather_size_by_comm
[rank0]: gather_device = "cpu" if "cpu" in comm_backends else torch.accelerator.current_accelerator()
[rank0]: ^^^^^^^^^^^^^^^^^
[rank0]: File "/home/aja/diffusers/.venv/lib/python3.11/site-packages/torch/__init__.py", line 2216, in __getattr__
[rank0]: raise AttributeError(f"module '{__name__}' has no attribute '{name}'")
[rank0]: AttributeError: module 'torch' has no attribute 'accelerator'
System Info
diffusers: 0.37.0.dev0
torch: 2.4.0
python: 3.11
system: Linux
Who can help?
@sayakpaul @DN6