graph_send_recv
- paddle.incubate.graph_send_recv(x, src_index, dst_index, pool_type='sum', out_size=None, name=None)
-
Warning
API “paddle.incubate.operators.graph_send_recv.graph_send_recv” is deprecated since 2.4.0 and will be removed in future versions. Please use “paddle.geometric.send_u_recv” instead. Reason: graph_send_recv in paddle.incubate will be removed in a future release.
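For reference, a minimal migration sketch to the recommended replacement is shown below; it assumes that the reduce_op keyword of paddle.geometric.send_u_recv plays the same role as pool_type here.
>>> import paddle
>>> x = paddle.to_tensor([[0, 2, 3], [1, 4, 5], [2, 6, 7]], dtype="float32")
>>> src_index = paddle.to_tensor([0, 1, 2, 0], dtype="int32")
>>> dst_index = paddle.to_tensor([1, 2, 1, 0], dtype="int32")
>>> # Deprecated call:
>>> # out = paddle.incubate.graph_send_recv(x, src_index, dst_index, pool_type="sum")
>>> # Recommended replacement (reduce_op is assumed to accept the same values as pool_type):
>>> out = paddle.geometric.send_u_recv(x, src_index, dst_index, reduce_op="sum")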
Graph Learning Send_Recv combine operator.
This operator is mainly used in the Graph Learning domain; its main purpose is to reduce intermediate memory consumption during message passing. Taking x as the input tensor, we first use src_index to gather the corresponding rows, and then use dst_index to scatter them into the corresponding positions of the output tensor with the chosen pooling type: sum, mean, max, or min. In addition, out_size can be set to control the output shape.
Given:

    X = [[0, 2, 3],
         [1, 4, 5],
         [2, 6, 7]]

    src_index = [0, 1, 2, 0]
    dst_index = [1, 2, 1, 0]
    pool_type = "sum"
    out_size = None

Then:

    Out = [[0, 2, 3],
           [2, 8, 10],
           [1, 4, 5]]
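To make the semantics concrete, here is a minimal NumPy sketch of the "sum" reduction; it is a reference illustration of the gather-then-scatter behavior, not the actual kernel, and send_recv_sum is a hypothetical helper name.
>>> import numpy as np
>>> def send_recv_sum(x, src_index, dst_index, out_size=None):
...     # Rows of x are gathered by src_index and accumulated into the
...     # output rows selected by dst_index.
...     rows = out_size if out_size is not None else x.shape[0]
...     out = np.zeros((rows, x.shape[1]), dtype=x.dtype)
...     for s, d in zip(src_index, dst_index):
...         out[d] += x[s]
...     return out
>>> x = np.array([[0, 2, 3], [1, 4, 5], [2, 6, 7]], dtype="float32")
>>> send_recv_sum(x, [0, 1, 2, 0], [1, 2, 1, 0])
array([[ 0.,  2.,  3.],
       [ 2.,  8., 10.],
       [ 1.,  4.,  5.]], dtype=float32)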
- Parameters
-
x (Tensor) – The input tensor, and the available data type is float32, float64, int32, int64.
src_index (Tensor) – A 1-D tensor, and the available data type is int32, int64.
dst_index (Tensor) – A 1-D tensor, which should have the same shape as src_index. The available data type is int32, int64.
pool_type (str) – The pooling type of graph_send_recv, which can be sum, mean, max, or min. Default value is sum.
out_size (int|Tensor|None) – Controls the 0th dimension of the output. If not set, or if out_size is smaller than or equal to 0, this input is ignored. Otherwise, out_size should be equal to or larger than max(dst_index) + 1.
name (str, optional) – Name for the operation (optional, default is None). For more information, please refer to Name.
- Returns
-
- The output tensor, which has the same dtype as the input tensor x and, by default, the same shape. If out_size is set correctly, the output has the same shape as x except for the 0th dimension, which equals out_size.
- Return type
-
out (Tensor)
Examples
>>> import paddle

>>> x = paddle.to_tensor([[0, 2, 3], [1, 4, 5], [2, 6, 7]], dtype="float32")
>>> indexes = paddle.to_tensor([[0, 1], [1, 2], [2, 1], [0, 0]], dtype="int32")
>>> src_index = indexes[:, 0]
>>> dst_index = indexes[:, 1]
>>> out = paddle.incubate.graph_send_recv(x, src_index, dst_index, pool_type="sum")
>>> print(out)
Tensor(shape=[3, 3], dtype=float32, place=Place(cpu), stop_gradient=True,
       [[0. , 2. , 3. ],
        [2. , 8. , 10.],
        [1. , 4. , 5. ]])

>>> x = paddle.to_tensor([[0, 2, 3], [1, 4, 5], [2, 6, 7]], dtype="float32")
>>> indexes = paddle.to_tensor([[0, 1], [2, 1], [0, 0]], dtype="int32")
>>> src_index = indexes[:, 0]
>>> dst_index = indexes[:, 1]
>>> out_size = paddle.max(dst_index) + 1
>>> out = paddle.incubate.graph_send_recv(x, src_index, dst_index, pool_type="sum", out_size=out_size)
>>> print(out)
Tensor(shape=[2, 3], dtype=float32, place=Place(cpu), stop_gradient=True,
       [[0. , 2. , 3. ],
        [2. , 8. , 10.]])

>>> x = paddle.to_tensor([[0, 2, 3], [1, 4, 5], [2, 6, 7]], dtype="float32")
>>> indexes = paddle.to_tensor([[0, 1], [2, 1], [0, 0]], dtype="int32")
>>> src_index = indexes[:, 0]
>>> dst_index = indexes[:, 1]
>>> out = paddle.incubate.graph_send_recv(x, src_index, dst_index, pool_type="sum")
>>> print(out)
Tensor(shape=[3, 3], dtype=float32, place=Place(cpu), stop_gradient=True,
       [[0. , 2. , 3. ],
        [2. , 8. , 10.],
        [0. , 0. , 0. ]])
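For comparison, the same graph with pool_type="mean" averages the gathered rows instead of summing them; the values in the comments below are worked out by hand from the inputs above rather than copied from an actual run.
>>> x = paddle.to_tensor([[0, 2, 3], [1, 4, 5], [2, 6, 7]], dtype="float32")
>>> src_index = paddle.to_tensor([0, 1, 2, 0], dtype="int32")
>>> dst_index = paddle.to_tensor([1, 2, 1, 0], dtype="int32")
>>> out = paddle.incubate.graph_send_recv(x, src_index, dst_index, pool_type="mean")
>>> # Row 0 receives x[0]; row 1 receives the mean of x[0] and x[2]; row 2 receives x[1].
>>> # Expected values: [[0., 2., 3.], [1., 4., 5.], [1., 4., 5.]]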