The title says it all. I'm calling np.linalg.eig and getting this error message, yet if I call np.isnan(X).any() and np.isinf(X).any(), both return False.

The class I use to perform the eigendecomposition is as follows:

import numpy as np


class LDA():

    def __init__(self, n_discriminants=None, centered=False):
        self.n_discriminants = n_discriminants
        self.centered        = centered

    def scatter_matrix_col(self, X, y, val):
        # Mean vector of the samples that belong to class `val`
        matrix_col = X[y == val].mean(0)
        return matrix_col

    def build_scatter_matrix(self, X, y):
        y_vals         = np.unique(y)
        # np.hstack expects a sequence, so build a list (not a generator)
        scatter_matrix = np.hstack([self.scatter_matrix_col(X, y, val)[:, np.newaxis]
                                    for val in y_vals])
        return scatter_matrix

    def within_class_matrix(self, X, y):
        m_features     = X.shape[1]
        y_vals         = np.unique(y)
        S_w            = np.zeros((m_features, m_features))
        for val in y_vals:
            scat_matrix = np.cov(X[y == val].T)
            S_w        += scat_matrix
        return S_w

    def between_class_matrix(self, X, y):
        col_means  = X.mean(0)[:, np.newaxis]
        mean_vecs  = self.build_scatter_matrix(X, y)
        y_vals     = np.unique(y)
        m_features = X.shape[1]
        S_b        = np.zeros((m_features, m_features))

        for i, val in enumerate(y_vals):
            n           = np.sum(y == val)
            mean_diff   = mean_vecs[:, i][:, np.newaxis] - col_means
            scat_matrix = mean_diff @ mean_diff.T * n
            S_b        += scat_matrix
        return S_b

    def fit(self, X, y):
        if self.n_discriminants is None:
            self.n_discriminants = X.shape[1]

        if self.centered:
            X_fit = standardize(X)
        else:
            X_fit = X

        S_b    = self.between_class_matrix(X_fit, y)
        S_w    = self.within_class_matrix(X_fit, y)
        inv_Sw = np.linalg.inv(S_w)

        eigen_vals, eigen_vecs = np.linalg.eig(inv_Sw @ S_b)

        eigen_pairs           = [(eigen_vals[i], eigen_vecs[:, i]) for i in range(len(eigen_vals))]
        # eig can return complex values; sort by magnitude, since complex
        # numbers cannot be ordered directly
        sorted_pairs          = sorted(eigen_pairs, key=lambda x: np.abs(x[0]), reverse=True)
        self.discriminants_   = np.hstack([sorted_pairs[i][1][:, np.newaxis].real
                                           for i in range(self.n_discriminants)])
        self.variance_ratios_ = [np.abs(pair[0].real) / np.sum(eigen_vals.real)
                                 for pair in sorted_pairs[:self.n_discriminants]]

        return self


I used one of the preinstalled datasets from scikit-learn:

from sklearn.datasets import load_boston
from sklearn.preprocessing import StandardScaler

boston = load_boston()

sc = StandardScaler()
X_std = sc.fit_transform(boston.data)
y = boston.target


The LDA is invoked like so:

lda = LDA()
lda.fit(X_std, y)


which then produces the following traceback:

__main__:65: RuntimeWarning: Degrees of freedom <= 0 for slice
C:\Users\Jonat\Anaconda\lib\site-packages\numpy\lib\function_base.py:2326: RuntimeWarning: divide by zero encountered in true_divide
  c *= np.true_divide(1, fact)
C:\Users\Jonat\Anaconda\lib\site-packages\numpy\lib\function_base.py:2326: RuntimeWarning: invalid value encountered in multiply
  c *= np.true_divide(1, fact)
Traceback (most recent call last):

  File "<ipython-input-172-1c94e8f13082>", line 1, in <module>
    lda.fit(X_std, y)

  File "C:/Users/Jonat/OneDrive/Dokumentumok/Python Scripts/easyml/easyml/algorithms/VarianceReduction/lda.py", line 114, in fit
    eigen_vals, eigen_vecs = np.linalg.eig(inv_Sw @ S_b)

  File "C:\Users\Jonat\Anaconda\lib\site-packages\numpy\linalg\linalg.py", line 1262, in eig
    _assertFinite(a)

  File "C:\Users\Jonat\Anaconda\lib\site-packages\numpy\linalg\linalg.py", line 220, in _assertFinite
    raise LinAlgError("Array must not contain infs or NaNs")

LinAlgError: Array must not contain infs or NaNs


This error has been asked about before, but never in a case that produced these warnings while the offending ndarray contained no nan or inf values.

I don't know whether this is a bug, or whether it points to something wrong in what I'm doing to obtain the eigenvalues.
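One way to narrow this down is to check the intermediate matrices rather than the input: non-finite values can be introduced by the scatter computations even when X itself is clean. A minimal, self-contained sketch of that idea, using a made-up toy dataset (not the Boston data) in which one label occurs only once:

```python
import numpy as np

# Toy data: three samples, two features; the label 2.0 occurs only
# once, mimicking a continuous regression target.
X = np.array([[1.0, 2.0],
              [2.0, 3.0],
              [5.0, 1.0]])
y = np.array([1.0, 1.0, 2.0])

# X itself is clean ...
print(np.isnan(X).any(), np.isinf(X).any())  # False False

# ... but the within-class scatter built from per-class covariances
# is not: np.cov of a single-sample slice divides by n - 1 = 0.
S_w = np.zeros((2, 2))
for val in np.unique(y):
    S_w += np.cov(X[y == val].T)

print(np.isnan(S_w).any())  # True
```

So checking np.isnan / np.isinf on each matrix that feeds into np.linalg.eig, not just on X, localizes where the non-finite values first appear.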

Best answer

Alright, I realized what I was doing wrong.

The problem is the within_class_matrix method, which produces the following warnings:

__main__:65: RuntimeWarning: Degrees of freedom <= 0 for slice
C:\Users\Jonat\Anaconda\lib\site-packages\numpy\lib\function_base.py:2326: RuntimeWarning: divide by zero encountered in true_divide
  c *= np.true_divide(1, fact)
C:\Users\Jonat\Anaconda\lib\site-packages\numpy\lib\function_base.py:2326: RuntimeWarning: invalid value encountered in multiply
  c *= np.true_divide(1, fact)


I think the biggest problem here is that I used LDA on a regression dataset: np.unique(y) treats nearly every value of the continuous target as its own class, so many "classes" contain a single sample, and the matrix computations break down.
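That is exactly what the "Degrees of freedom <= 0 for slice" warning is saying: np.cov normalizes by n - 1 observations by default, so a one-sample class divides by zero and fills the covariance matrix with NaN. A minimal reproduction of just that step (the warning filters are only there to keep the output quiet):

```python
import warnings

import numpy as np

# One observation of three "features", shaped as X[y == val].T would
# be for a class containing exactly one sample.
single_sample = np.array([[1.0, 2.0, 3.0]])

with warnings.catch_warnings():
    warnings.simplefilter("ignore", RuntimeWarning)
    c = np.cov(single_sample.T)

# ddof defaults to 1, so the normalization factor is n - 1 = 0 and
# every entry of the covariance matrix comes out as NaN.
print(np.isnan(c).all())  # True
```

Those NaNs then propagate through S_w and inv_Sw @ S_b until np.linalg.eig finally rejects the array, which is why checking X alone finds nothing.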

Regarding python - LinAlgError: Array must not contain infs or NaNs, but there are no NaNs or infs, a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/56632418/
