FullyShardedDataParallel — PyTorch 2.4 documentation
A wrapper for sharding module parameters across data parallel workers.
pytorch.org