`torch.nn.parallel.DistributedDataParallel` implements distributed data parallelism based on `torch.distributed` at module level: the wrapped module is replicated one process per device, and gradients are averaged across processes during the backward pass.
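A minimal sketch of wrapping a module in DistributedDataParallel. To keep it self-contained it runs a single-process, world-size-1 group on CPU with the `gloo` backend; real training launches one process per device (e.g. via `torchrun`), and the model, data, and hyperparameters here are illustrative placeholders.

```python
import os
import torch
import torch.distributed as dist
import torch.nn as nn
from torch.nn.parallel import DistributedDataParallel as DDP

# Single-process group on CPU (illustrative; torchrun normally sets these).
os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
os.environ.setdefault("MASTER_PORT", "29500")
dist.init_process_group("gloo", rank=0, world_size=1)

model = nn.Linear(8, 2)      # toy model
ddp_model = DDP(model)       # gradients are all-reduced across ranks

opt = torch.optim.SGD(ddp_model.parameters(), lr=0.1)
x, y = torch.randn(4, 8), torch.randn(4, 2)

loss = nn.functional.mse_loss(ddp_model(x), y)
loss.backward()              # backward triggers the gradient synchronization
opt.step()
loss_value = loss.item()

dist.destroy_process_group()
```

With more than one process, each rank constructs the same wrapper and DDP keeps the replicas in sync automatically; no manual gradient averaging is needed.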
Instances of `autocast` serve as context managers or decorators that allow regions of a script to run in mixed precision.
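A short sketch showing both usages. It uses `device_type="cpu"` with `bfloat16` so it runs without a GPU; the model and shapes are placeholders.

```python
import torch

model = torch.nn.Linear(16, 4)
x = torch.randn(2, 16)

# As a context manager: autocast-eligible ops inside the block
# (such as linear layers) run in the lower-precision dtype.
with torch.autocast(device_type="cpu", dtype=torch.bfloat16):
    y = model(x)

# As a decorator: the whole function body runs under autocast.
@torch.autocast(device_type="cpu", dtype=torch.bfloat16)
def forward(inp):
    return model(inp)

z = forward(x)
```

On CUDA devices the same pattern is typically used with `device_type="cuda"` (where `float16` is the default dtype), usually together with a gradient scaler for training.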