I am implementing an image-classification algorithm: an SVM with a histogram intersection kernel, trained by stochastic gradient descent. The code I have so far:

import numpy as np

# kernel values K(x_i, u) between every stored example x_i and u
# returns an (n_samples x 1) column vector
def histogram_intersection_kernel(x, u):
    n_samples, n_features = x.shape
    K = np.zeros((n_samples, 1), dtype=float)
    for d in range(n_samples):
        # x[d] and u are 1-D, so no axis argument is needed
        K[d, 0] = np.sum(np.minimum(x[d], u))
    return K

# subgradient indicator of the hinge loss
# returns 1 if y * f_{t-1}(xt) < 1
#         0 otherwise
def get_sigma(y, f_value):
    return 1 if y * f_value < 1 else 0

# raw decision value f(u) = sum_i alpha_i * K(x_i, u)
def decision_value(X, alpha, u):
    return np.dot(alpha.T, histogram_intersection_kernel(X, u)).item(0)

# predicted label y' in {-1, 0, +1} for a given u
def get_prediction(X, alpha, u):
    return np.sign(decision_value(X, alpha, u))

# one stochastic update of the coefficient vector alpha
# eta is the learning rate, lmbda the regularization constant
# t is the index of the example (xt, yt) drawn this step
# update rule : f = (1-lambda*eta)*f_{t-1} + eta*sigma*yt*K(xt,.)
# note: K(xt,.) is the unit vector e_t in coefficient space, so only
# alpha[t] is bumped, and the hinge condition uses the raw value
# f(xt), not its sign
def update_rule(alpha, eta, lmbda, X, y, t):
    f_value = decision_value(X, alpha, X[t])
    alpha = (1 - lmbda * eta) * alpha
    alpha[t, 0] += eta * get_sigma(y[t], f_value) * y[t]
    return alpha

# one pass over all of the examples
def gradient_descent(X, y, alpha, eta, lmbda):
    for t in range(X.shape[0]):
        alpha = update_rule(alpha, eta, lmbda, X, y, t)
    return alpha
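For what it's worth, the same idea can be exercised end-to-end on made-up toy histograms. This is a minimal self-contained sketch (data, learning rate, and epoch count are all invented for illustration) of a kernelized hinge-loss SGD with the histogram intersection kernel:

```python
import numpy as np

def hik(x, u):
    # histogram intersection kernel values between the rows of x and u
    return np.minimum(x, u).sum(axis=1)

# toy normalized histograms: class +1 has its mass in the left bins,
# class -1 in the right bins (made-up data)
X = np.array([[0.9, 0.1, 0.0],
              [0.8, 0.2, 0.0],
              [0.0, 0.1, 0.9],
              [0.0, 0.2, 0.8]])
y = np.array([1.0, 1.0, -1.0, -1.0])

alpha = np.zeros(len(X))   # one coefficient per training example
eta, lmbda = 0.5, 0.01     # made-up learning rate and regularizer

for epoch in range(20):
    for t in range(len(X)):
        f_t = np.dot(alpha, hik(X, X[t]))   # raw decision value f(x_t)
        alpha *= (1 - lmbda * eta)          # regularization decay
        if y[t] * f_t < 1:                  # hinge subgradient active
            alpha[t] += eta * y[t]          # bump only the t-th coefficient

preds = [float(np.sign(np.dot(alpha, hik(X, X[i])))) for i in range(len(X))]
print(preds)  # → [1.0, 1.0, -1.0, -1.0]
```

The key point the sketch illustrates is that the new term eta*sigma*yt*K(xt,.) touches only the coefficient of example t, rather than adding a whole kernel column to alpha.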


The program does not work on my data, so it seems I am missing something. I would like to know whether the basic idea is correct.


This looks like a gradient-descent implementation. To find the support vectors, should I compute the vectors at which the prediction function evaluates to 0?
How should I update eta?


Thanks.

Best answer

Alpha is a vector of weights over the original examples; the support vectors are the examples with nonzero (positive) weight.
You can plot the cost function over the course of gradient descent: watch the trend of the curve and choose a learning rate eta for which it converges.
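A rough sketch of that advice, reusing the toy setup above (data and candidate eta values are made up): track the regularized hinge cost after each epoch for several learning rates, and read the support vectors off the nonzero coefficients. Note that with this parameterization (where alpha_i absorbs the label), support vectors have nonzero rather than strictly positive alpha_i; in the f = sum_i alpha_i y_i K(x_i, .) form they would be the positive ones.

```python
import numpy as np

def hik(x, u):
    return np.minimum(x, u).sum(axis=1)

def cost(X, y, alpha, lmbda):
    # regularized hinge cost: lambda/2 * ||f||^2_K + mean hinge loss
    K = np.minimum(X[:, None, :], X[None, :, :]).sum(axis=2)  # Gram matrix
    f = K.dot(alpha)
    hinge = np.maximum(0.0, 1.0 - y * f).mean()
    return 0.5 * lmbda * alpha.dot(K).dot(alpha) + hinge

X = np.array([[0.9, 0.1, 0.0], [0.8, 0.2, 0.0],
              [0.0, 0.1, 0.9], [0.0, 0.2, 0.8]])
y = np.array([1.0, 1.0, -1.0, -1.0])

for eta in (0.01, 0.1, 0.5):              # candidate learning rates
    alpha, lmbda = np.zeros(len(X)), 0.01
    history = []
    for epoch in range(20):
        for t in range(len(X)):
            f_t = np.dot(alpha, hik(X, X[t]))
            alpha *= (1 - lmbda * eta)
            if y[t] * f_t < 1:
                alpha[t] += eta * y[t]
        history.append(cost(X, y, alpha, lmbda))
    support = np.flatnonzero(alpha != 0)  # examples with nonzero weight
    print(eta, round(history[-1], 3), support)
```

Plotting each `history` (e.g. with matplotlib) and keeping the eta whose curve decreases and flattens is the curve-watching procedure the answer describes.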

Original question (python - SVM with histogram intersection kernel and gradient descent) on Stack Overflow: https://stackoverflow.com/questions/21023705/
