FullyShardedDataParallel (FSDP) is a wrapper for sharding module parameters across data parallel workers.| pytorch.org
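Below is a minimal sketch of wrapping a module with PyTorch's FullyShardedDataParallel. It assumes a NCCL process group initialized by a launcher such as torchrun; the model dimensions, optimizer, and hyperparameters are illustrative, not prescribed by the source.

```python
import torch
import torch.nn as nn
import torch.distributed as dist
from torch.distributed.fsdp import FullyShardedDataParallel as FSDP

def main():
    # Assumes launch via torchrun, which sets the env vars that
    # init_process_group needs (RANK, WORLD_SIZE, MASTER_ADDR, ...).
    dist.init_process_group("nccl")
    torch.cuda.set_device(dist.get_rank() % torch.cuda.device_count())

    # Illustrative model; sizes are arbitrary.
    model = nn.Sequential(
        nn.Linear(1024, 1024),
        nn.ReLU(),
        nn.Linear(1024, 10),
    ).cuda()

    # Wrapping shards the parameters across the data-parallel workers;
    # shards are gathered on demand during forward/backward.
    sharded = FSDP(model)

    # Build the optimizer AFTER wrapping, so it sees the sharded parameters.
    optim = torch.optim.AdamW(sharded.parameters(), lr=1e-3)

    x = torch.randn(8, 1024, device="cuda")
    loss = sharded(x).sum()   # toy loss, just to drive a backward pass
    loss.backward()
    optim.step()

    dist.destroy_process_group()

if __name__ == "__main__":
    main()
```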