pca_lowrank
paddle.linalg.pca_lowrank(x, q=None, center=True, niter=2, name=None) [source]
Performs linear Principal Component Analysis (PCA) on a low-rank matrix or batches of such matrices.
Let \(X\) be the input matrix or a batch of input matrices. The output satisfies:
\[X = U * diag(S) * V^{T}\]
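For intuition, here is a minimal sketch (not part of the formal reference; the tolerance is an assumption) verifying that the returned factors approximately reconstruct the centered input:

import paddle

x = paddle.randn((5, 5), dtype='float64')
U, S, V = paddle.linalg.pca_lowrank(x)

# pca_lowrank centers x by default, so compare against the
# column-centered input (mean taken over the rows).
x_centered = x - x.mean(axis=-2, keepdim=True)

# U * S scales each column of U by its singular value, i.e. U @ diag(S);
# transpose_y=True supplies the trailing V^T.
recon = paddle.matmul(U * S, V, transpose_y=True)

# The factorization is a nearly optimal approximation, so the
# reconstruction should match up to numerical error (atol is an assumption).
assert paddle.allclose(recon, x_centered, atol=1e-6).item()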
Parameters
x (Tensor) – The input tensor. Its shape should be […, N, M], where … is zero or more batch dimensions. N and M can be arbitrary positive integers. The data type of x should be float32 or float64.
q (int, optional) – a slightly overestimated rank of \(X\) (see the sketch after this list for passing it explicitly). Default value is \(q=min(6,N,M)\).
center (bool, optional) – if True, center the input tensor. Default value is True.
niter (int, optional) – number of iterations to perform. Default: 2.
name (str, optional) – Name for the operation. For more information, please refer to Name. Default: None.
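As referenced in the parameter list, a short sketch passing q and center explicitly; the output shapes follow from the Returns section below, while the center=False behavior of skipping the mean subtraction (yielding a low-rank SVD of x itself) is an assumption mirroring the centered case:

import paddle

x = paddle.randn((100, 20), dtype='float64')

# Request 5 components and skip centering; assumed to factorize x as-is.
U, S, V = paddle.linalg.pca_lowrank(x, q=5, center=False)

print(U.shape, S.shape, V.shape)  # [100, 5] [5] [20, 5]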
Returns
tuple (U, S, V): the nearly optimal approximation of a singular value decomposition of the centered matrix \(X\), where:

Tensor U is an N x q matrix.
Tensor S is a vector of length q.
Tensor V is an M x q matrix.
Examples
>>> import paddle
>>> paddle.seed(2023)
>>> x = paddle.randn((5, 5), dtype='float64')
>>> U, S, V = paddle.linalg.pca_lowrank(x)
>>> print(U)
Tensor(shape=[5, 5], dtype=float64, place=Place(cpu), stop_gradient=True,
[[ 0.80131563,  0.11962647,  0.27667179, -0.25891214,  0.44721360],
 [-0.12642301,  0.69917551, -0.17899393,  0.51296394,  0.44721360],
 [ 0.08997135, -0.69821706, -0.20059228,  0.51396579,  0.44721360],
 [-0.23871837, -0.02815453, -0.59888153, -0.61932365,  0.44721360],
 [-0.52614559, -0.09243040,  0.70179595, -0.14869394,  0.44721360]])
>>> print(S)
Tensor(shape=[5], dtype=float64, place=Place(cpu), stop_gradient=True,
[2.60101614, 2.40554940, 1.49768346, 0.19064830, 0.00000000])
>>> print(V)
Tensor(shape=[5, 5], dtype=float64, place=Place(cpu), stop_gradient=True,
[[ 0.58339481, -0.17143771,  0.00522143,  0.57976310,  0.54231640],
 [ 0.22334335,  0.72963474, -0.30148399, -0.39388750,  0.41438019],
 [ 0.05416913,  0.34666487,  0.93549758,  0.00063507,  0.04162998],
 [-0.39519094,  0.53074980, -0.16687419,  0.71175586, -0.16638919],
 [-0.67131070, -0.19071018,  0.07795789, -0.04615811,  0.71046714]])
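A common follow-up (a sketch, not part of the original reference) is to project the centered data onto the leading k principal components via the standard PCA projection \(X_{centered} V[:, :k]\), reusing the tensors computed above:

>>> k = 2
>>> x_centered = x - x.mean(axis=-2, keepdim=True)
>>> scores = paddle.matmul(x_centered, V[:, :k])
>>> print(scores.shape)
[5, 2]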