svd_lowrank
- paddle.linalg.svd_lowrank(x, q=None, niter=2, M=None, name=None)
Return the singular value decomposition (SVD) of a low-rank matrix or batches of such matrices.
If \(X\) is the input matrix or a batch of input matrices, the output satisfies:

\[X \approx U * diag(S) * V^{T}\]

When \(M\) is given, the output satisfies:

\[X - M \approx U * diag(S) * V^{T}\]

- Parameters
-
x (Tensor) – The input tensor. Its shape should be […, N, M], where … is zero or more batch dimensions. N and M can be arbitrary positive integers. The data type of x should be float32 or float64.
q (int, optional) – A slightly overestimated rank of \(X\). Default value is None, which means the overestimated rank is 6.
niter (int, optional) – The number of iterations to perform. Default: 2.
M (Tensor, optional) – The input tensor’s mean. Its shape should be […, 1, M]. Default value is None.
name (str, optional) – Name for the operation. For more information, please refer to Name. Default: None.
- Returns
-
tuple (U, S, V): the nearly optimal approximation of a singular value decomposition of the matrix \(X\) or \(X - M\), where
Tensor U is an N x q matrix,
Tensor S is a vector of length q,
Tensor V is an M x q matrix.
Examples
>>> import paddle
>>> paddle.seed(2024)
>>> x = paddle.randn((5, 5), dtype='float64')
>>> U, S, V = paddle.linalg.svd_lowrank(x)
>>> print(U)
Tensor(shape=[5, 5], dtype=float64, place=Place(cpu), stop_gradient=True,
[[-0.03586982, -0.17211503,  0.31536566, -0.38225676, -0.85059629],
 [-0.38386839,  0.67754925,  0.23222694,  0.51777188, -0.26749766],
 [-0.85977150, -0.28442378, -0.41412094, -0.08955629, -0.01948348],
 [ 0.18611503,  0.56047358, -0.67717019, -0.39286761, -0.19577062],
 [ 0.27841082, -0.34099254, -0.46535957,  0.65071250, -0.40770727]])
>>> print(S)
Tensor(shape=[5], dtype=float64, place=Place(cpu), stop_gradient=True,
[4.11253399, 3.03227120, 2.45499752, 1.25602436, 0.45825337])
>>> print(V)
Tensor(shape=[5, 5], dtype=float64, place=Place(cpu), stop_gradient=True,
[[ 0.46401347,  0.50977695, -0.08742316, -0.11140428, -0.71046833],
 [-0.48927226, -0.35047624,  0.07918771,  0.45431083, -0.65200463],
 [-0.20494730,  0.67097011, -0.05427719,  0.66510472,  0.24997083],
 [-0.69645001,  0.40237917,  0.09360970, -0.58032322, -0.08666357],
 [ 0.13512270,  0.07199989,  0.98710572,  0.04529277,  0.01134594]])