get_world_size
paddle.distributed.get_world_size(group=None)
Returns the number of trainers (the number of processes participating in the current job) in the given group. If no group is given, the global group is used by default.
Parameters

group (Group, optional) – The communication group whose world size is queried. The global group is used by default if group is None.
Returns

(int) The number of trainers in the given group. Returns -1 if the process is not part of the given group.
Warning

The argument group is only supported in dygraph mode.

Examples
>>> # Execute this script using distributed launch with one card configs.
>>> import paddle
>>> import paddle.distributed as dist
>>> dist.init_parallel_env()
>>> print("The world_size is %d" % dist.get_world_size())
The world_size is 1
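To illustrate the group argument, here is a minimal sketch assuming a two-card distributed launch and a sub-group built with paddle.distributed.new_group; the rank list is a hypothetical choice for illustration, not part of the original example.

>>> # Execute this script using distributed launch with two card configs (assumed setup).
>>> import paddle
>>> import paddle.distributed as dist
>>> dist.init_parallel_env()
>>> # Hypothetical sub-group containing only rank 0.
>>> sub_group = dist.new_group(ranks=[0])
>>> # On rank 0 this prints 1; a rank outside the group would get -1.
>>> print("The sub-group world_size is %d" % dist.get_world_size(group=sub_group))
The sub-group world_size is 1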