vjp
paddle.incubate.autograd.vjp(func, xs, v=None) [source]
Computes the Vector-Jacobian product, a functional form of reverse mode automatic differentiation.
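Conceptually, given a differentiable function f evaluated at xs and a cotangent vector v matching the shape of f's output, the operation returns the product of v with the Jacobian of f. In the sketch below, J_f denotes that Jacobian (notation introduced here for illustration, not part of the API):

    \mathrm{vjp}(f, xs, v) = v^{\top} J_f(xs), \qquad (J_f)_{ij} = \frac{\partial f_i}{\partial xs_j}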
Warning
This API is in beta; the signature may change in future versions.
Parameters

func (Callable) – A function that takes xs as its input parameters and returns a Tensor or a sequence of Tensors.
xs (Tensor|Sequence[Tensor]) – Used as positional arguments to evaluate func. xs is accepted as one Tensor or a sequence of Tensors.
v (Tensor|Sequence[Tensor]|None, optional) – The cotangent vector involved in the VJP computation. v matches the size and shape of func's output. Defaults to None, which is equivalent to an all-ones Tensor of the same size as func's output.
Returns

func_out (Tensor|tuple[Tensor]): The output of func(xs).
vjp (Tensor|tuple[Tensor]): The VJP result.

Return type

output (tuple)
Examples
import paddle

def func(x):
    return paddle.matmul(x, x)

x = paddle.ones(shape=[2, 2], dtype='float32')
_, vjp_result = paddle.incubate.autograd.vjp(func, x)
print(vjp_result)
# Tensor(shape=[2, 2], dtype=float32, place=Place(gpu:0), stop_gradient=False,
#        [[4., 4.],
#         [4., 4.]])

v = paddle.to_tensor([[1.0, 0.0], [0.0, 0.0]])
_, vjp_result = paddle.incubate.autograd.vjp(func, x, v)
print(vjp_result)
# Tensor(shape=[2, 2], dtype=float32, place=Place(gpu:0), stop_gradient=False,
#        [[2., 1.],
#         [1., 0.]])
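As a further illustration, the sketch below passes xs as a list to differentiate a function of two inputs. The function name bilinear is introduced here for the example; the sketch assumes that a sequence xs is unpacked as positional arguments of func and that the returned VJP is then a tuple with one entry per input, as the parameter description above suggests.

import paddle

def bilinear(x, y):
    # f(x, y) = x @ y; its VJPs are v @ y^T (w.r.t. x) and x^T @ v (w.r.t. y).
    return paddle.matmul(x, y)

x = paddle.ones(shape=[2, 2], dtype='float32')
y = 2.0 * paddle.ones(shape=[2, 2], dtype='float32')

# With v=None the cotangent defaults to all ones of the output's shape.
# Assumes the second return value is a tuple with one cotangent per input.
_, (vjp_x, vjp_y) = paddle.incubate.autograd.vjp(bilinear, [x, y])
print(vjp_x)  # expected: v @ y^T = [[4., 4.], [4., 4.]]
print(vjp_y)  # expected: x^T @ v = [[2., 2.], [2., 2.]]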