SequenceParallelEnable
class paddle.distributed.SequenceParallelEnable [source]
Sequence-parallel plan for the model-parallel (mp) config. Applies sequence parallelism to the marked layer. Note that the layer's input must be in [b, s, h] (batch, sequence length, hidden size) format.
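To illustrate what sequence parallelism does with a [b, s, h] input, the sketch below shards the sequence axis across ranks using plain Python lists. This is an illustration only, not Paddle's implementation; the `split_sequence` helper is hypothetical.

```python
# Illustrative sketch (not Paddle's implementation): sequence parallelism
# shards the sequence axis (s) of a [b, s, h] activation across ranks,
# so each rank holds only s // num_ranks positions of every sample.

def split_sequence(tensor, num_ranks):
    """Split a nested-list tensor of shape [b, s, h] along the s axis.

    Hypothetical helper for illustration only; assumes s is evenly
    divisible by num_ranks.
    """
    s = len(tensor[0])
    chunk = s // num_ranks
    # Rank r keeps sequence positions [r*chunk, (r+1)*chunk) of each sample.
    return [
        [sample[r * chunk:(r + 1) * chunk] for sample in tensor]
        for r in range(num_ranks)
    ]

# A [b=1, s=4, h=2] activation split across 2 ranks:
x = [[[0, 0], [1, 1], [2, 2], [3, 3]]]
shards = split_sequence(x, 2)
# shards[0] is the first half of the sequence: [[[0, 0], [1, 1]]]
# shards[1] is the second half:               [[[2, 2], [3, 3]]]
```

Each rank then runs the layer on its shard, which is why the input layout must expose the sequence dimension in the expected position.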
Examples
>>> import paddle
>>> import paddle.distributed as dist

>>> class MLP(paddle.nn.Layer):
...     def __init__(self):
...         super().__init__()
...         self.fc1 = paddle.nn.Linear(8, 8)
...         self.fc2 = paddle.nn.Linear(8, 8)
...
...     def forward(self, input):
...         return self.fc2(self.fc1(input))

>>> layer = MLP()
>>> mp_config = {
...     'fc1': dist.SequenceParallelEnable()
... }
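A hedged sketch of how such a plan is typically consumed. The `dist.parallelize` call and its `mp_config`/`parallelize_plan` config keys are assumptions based on Paddle's auto-parallel API; verify them against the documentation of the Paddle release you use. Running this requires a launched distributed environment.

```python
>>> # Assumed usage, not verified against every release: pass the plan
>>> # through the model-parallel config when parallelizing the layer.
>>> layer = dist.parallelize(
...     layer,
...     config={'mp_config': {'parallelize_plan': mp_config}},
... )
```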