torchtnt.utils.distributed.barrier¶
torchtnt.utils.distributed.barrier() → None¶

Adds a synchronization point across all processes when running distributed training. If torch.distributed is initialized, this function invokes a barrier across the global process group; otherwise it returns immediately. For more granular process group control, please refer to PGWrapper.