This post covers performing singular value decomposition with NumPy (np.linalg.svd), based on a question about reconstructing a matrix from its SVD factors.

Question

I'm reading Abdi & Williams (2010), "Principal Component Analysis", and I'm trying to redo the SVD to attain values for further PCA.

The article states the following SVD:

X = P D Q^t

I load my data into an np.array X.

X = np.array(data)
P, D, Q = np.linalg.svd(X, full_matrices=False)
D = np.diag(D)
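
For reference, here is what the three returned factors look like with random data standing in for my actual dataset (a quick check of shapes and orthonormality, assuming X has more rows than columns):

import numpy as np
X = np.random.normal(size=(20, 18))              # hypothetical stand-in for np.array(data)
P, D, Q = np.linalg.svd(X, full_matrices=False)
print(P.shape, D.shape, Q.shape)                 # (20, 18) (18,) (18, 18); D is a 1-D array of singular values
print(np.allclose(np.dot(P.T, P), np.eye(18)))   # True: columns of P are orthonormal
print(np.allclose(np.dot(Q, Q.T), np.eye(18)))   # True: rows of Q are orthonormal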

But I do not get the above equality when checking with

X_a = np.dot(np.dot(P, D), Q.T)

X_a and X have the same dimensions, but the values are not the same. Am I missing something, or is the functionality of the np.linalg.svd function somehow not compatible with the equation in the paper?

Answer

TL;DR: numpy's SVD computes X = PDQ, so the Q is already transposed.

SVD effectively decomposes the matrix X into rotations P and Q and the diagonal matrix D. The version of linalg.svd() I have returns forward rotations for P and Q. You don't want to transpose Q again when you calculate X_a.

import numpy as np
X = np.random.normal(size=[20, 18])
P, D, Q = np.linalg.svd(X, full_matrices=False)   # D comes back as a 1-D array of singular values
X_a = np.matmul(np.matmul(P, np.diag(D)), Q)      # use Q exactly as returned, not Q.T
print(np.std(X), np.std(X_a), np.std(X - X_a))
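
Put differently, the Q returned by np.linalg.svd already plays the role of Q^t in the paper's notation. If you specifically want a matrix matching the paper's Q, a sketch (assuming the only mismatch is the transposition) is to transpose the returned factor and then apply the formula exactly as written:

import numpy as np
X = np.random.normal(size=[20, 18])
P, D, Qt = np.linalg.svd(X, full_matrices=False)        # Qt corresponds to Q^t in the paper
Q_paper = Qt.T                                          # columns are the right singular vectors
X_a = np.matmul(np.matmul(P, np.diag(D)), Q_paper.T)    # X = P D Q^t, as written in the paper
print(np.abs(X - X_a).max())                            # close to machine precision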

I get: 1.02, 1.02, 1.8e-15, showing that X_a very accurately reconstructs X.

If you are using Python 3, the @ operator implements matrix multiplication and makes the code easier to follow:

import numpy as np
X = np.random.normal(size=[20, 18])
P, D, Q = np.linalg.svd(X, full_matrices=False)
X_a = P @ np.diag(D) @ Q                                # np.diag, and again no transpose on Q
print(np.std(X), np.std(X_a), np.std(X - X_a))
print('Is X close to X_a?', np.isclose(X, X_a).all())
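
Since the end goal is PCA, here is a minimal sketch of how the corrected decomposition might feed into it, assuming the rows of X are observations, the columns have been mean-centered, and the scores follow the common convention of P times the singular values:

import numpy as np
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 18))
X = X - X.mean(axis=0)                     # center each column before PCA
P, D, Qt = np.linalg.svd(X, full_matrices=False)
scores = P @ np.diag(D)                    # component scores: projections of the rows of X
loadings = Qt.T                            # one right singular vector (component) per column
print(np.allclose(scores, X @ loadings))   # True: the scores are X projected onto the loadings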
