Shard

class paddle.distributed.Shard
The Shard placement describes how a Tensor is split across multiple devices along a specified dimension.
Parameters

    dim (int) – the tensor dimension along which the data is sharded.
Examples

>>> import paddle
>>> import paddle.distributed as dist
>>> mesh = dist.ProcessMesh([[2, 4, 5], [0, 1, 3]], dim_names=['x', 'y'])
>>> a = paddle.to_tensor([[1,2,3],[5,6,7]])
>>>
>>> # distributed tensor
>>> d_tensor = dist.shard_tensor(a, mesh, [dist.Shard(0), dist.Shard(1)])
get_dim(self: paddle.base.libpaddle.Shard) → int
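A minimal usage sketch, assuming get_dim simply returns the dim value the Shard placement was constructed with:

>>> import paddle.distributed as dist
>>> placement = dist.Shard(0)
>>> # query the sharding dimension of this placement
>>> placement.get_dim()
0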
is_partial(self: paddle.base.libpaddle.Placement) → bool
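A short sketch of is_partial, assuming a Shard placement is never partial and that the companion dist.Partial placement (with its default reduce type) is:

>>> import paddle.distributed as dist
>>> # a sharded placement is not a partial placement
>>> dist.Shard(0).is_partial()
False
>>> # assumed companion placement holding partial (unreduced) values
>>> dist.Partial().is_partial()
True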
is_replicated(self: paddle.base.libpaddle.Placement) → bool
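A short sketch of is_replicated, assuming dist.Replicate is the companion placement for fully replicated tensors:

>>> import paddle.distributed as dist
>>> # a sharded placement is not replicated
>>> dist.Shard(0).is_replicated()
False
>>> # assumed companion placement that copies the tensor to every device
>>> dist.Replicate().is_replicated()
True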
is_shard(self: paddle.base.libpaddle.Placement, dim: Optional[int] = None) → bool
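A short sketch of is_shard, assuming that passing dim restricts the check to that particular sharding dimension:

>>> import paddle.distributed as dist
>>> placement = dist.Shard(1)
>>> # any Shard placement is a shard placement
>>> placement.is_shard()
True
>>> # with dim given, the check only succeeds for the matching dimension
>>> placement.is_shard(dim=1)
True
>>> placement.is_shard(dim=0)
False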