graph_send_recv

paddle.incubate.graph_send_recv(x, src_index, dst_index, pool_type='sum', name=None) [source]

Graph Learning Send_Recv combine operator.

This operator is mainly used in the Graph Learning domain, and its main purpose is to reduce intermediate memory consumption during message passing. Taking x as the input tensor, it first uses src_index to gather the corresponding rows, and then uses dst_index to update the corresponding positions of the output tensor with one of the pooling types sum, mean, max, or min.

Given:

X = [[0, 2, 3],
     [1, 4, 5],
     [2, 6, 7]]

src_index = [0, 1, 2, 0]

dst_index = [1, 2, 1, 0]

pool_type = "sum"

Then:

Out = [[0, 2, 3],
       [2, 8, 10],
       [1, 4, 5]]
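
For reference, the same sum-pooled result can be reproduced with an explicit gather followed by a scatter-add. The minimal NumPy sketch below (variable names such as gathered are illustrative, not part of the API) materializes the intermediate tensor that graph_send_recv avoids building, which is where the memory saving comes from.

import numpy as np

x = np.array([[0, 2, 3], [1, 4, 5], [2, 6, 7]], dtype="float32")
src_index = np.array([0, 1, 2, 0])
dst_index = np.array([1, 2, 1, 0])

gathered = x[src_index]              # explicit intermediate of shape [4, 3]
out = np.zeros_like(x)
np.add.at(out, dst_index, gathered)  # accumulate gathered rows at their destination positions
# out: [[0., 2., 3.], [2., 8., 10.], [1., 4., 5.]]
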
Parameters
  • x (Tensor) – The input tensor. The available data types are float32, float64, int32, int64.

  • src_index (Tensor) – A 1-D tensor holding the indices used to gather rows from x. The available data types are int32, int64.

  • dst_index (Tensor) – A 1-D tensor with the same shape as src_index, holding the output positions to update. The available data types are int32, int64.

  • pool_type (str) – The pooling type of graph_send_recv, which can be sum, mean, max, or min. Default value is sum.

  • name (str, optional) – Name for the operation (optional, default is None). For more information, please refer to Name.

Returns

The output tensor, which has the same shape and dtype as the input tensor x.

Return type

out (Tensor)

Examples

import paddle

x = paddle.to_tensor([[0, 2, 3], [1, 4, 5], [2, 6, 7]], dtype="float32")
indexes = paddle.to_tensor([[0, 1], [1, 2], [2, 1], [0, 0]], dtype="int32")
src_index = indexes[:, 0]
dst_index = indexes[:, 1]
out = paddle.incubate.graph_send_recv(x, src_index, dst_index, pool_type="sum")
# Outputs: [[0., 2., 3.], [2., 8., 10.], [1., 4., 5.]]
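
Other pooling types follow the same pattern. Reusing x, src_index, and dst_index from the example above, the sketch below uses pool_type="mean"; the expected result assumes mean pooling averages all rows gathered for the same destination index.

out_mean = paddle.incubate.graph_send_recv(x, src_index, dst_index, pool_type="mean")
# Expected: [[0., 2., 3.], [1., 4., 5.], [1., 4., 5.]]
# Row 1 averages x[0] and x[2]; rows 0 and 2 each receive a single row.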