Hi,
I'm new to machine learning and Python, and I want to predict prices on the Kaggle House Sales in King County dataset with gradient descent.
I split the data into 70% training (15,000 rows) and 30% testing (6,000 rows), and I picked 5 of the 19 features, but I have performance problems: the algorithm takes a very long time (over 11 hours), memory usage reaches 100%, and execution fails.
Here is my gradient descent class:
import numpy as np
import pandas as pd
from sklearn.metrics import r2_score

class GradientDescent:
    X_train = []
    Y_train = []
    X_test = []
    Y_test = []
    lr = 0
    max_iter = 0
    theta = 0

    def __init__(self, X_train, Y_train, X_test, Y_test, lr=0.01, max_iter=100):
        self.X_train = X_train
        self.Y_train = Y_train
        self.X_test = X_test
        self.Y_test = Y_test
        self.lr = lr
        self.max_iter = max_iter
        self.theta = np.random.randn(X_train.shape[1], 1)
        print(self.theta)

    def costFunction(self, theta, X, y):
        "1/2m * E(h0-y)**2"
        m = len(y)
        y_pred = X.dot(theta)
        cost = (1/2*m) * np.sum(np.square(y_pred - y))
        return cost

    def estimate(self):
        m = len(self.Y_train)
        mse_hist = np.zeros(self.max_iter)
        #theta_hist = np.zeros(max_iter)
        i = 0
        while i < self.max_iter or mse_hist[i] > 0.01:
            y_pred = np.dot(self.X_train, self.theta)
            error = y_pred - self.Y_train
            self.theta = self.theta - (1/m)*self.lr*(self.X_train.T.dot(error))
            mse_hist[i] = self.costFunction(self.theta, self.X_train, self.Y_train)
            #print(mse_hist[i])
            i += 1
        return (self.theta, mse_hist)

    def test(self):
        res = pd.DataFrame()
        for i, row in self.X_test.iterrows():
            price_pred = np.dot(row.values, self.theta)
            res = row
            res['price_actual'] = self.Y_test[i]
            res['price_predict'] = price_pred
            res['r2_score'] = r2_score(res['price_actual'].values, res['price_predict'])
        res.to_csv('output.csv')
Any suggestions to make it better?
Best answer
Overall the code looks fine, although I haven't tested it. The only error I can find is that you may not be incrementing i inside the while loop, so the loop never exits.
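To illustrate the loop-exit point, here is a minimal sketch (my own illustrative code with a made-up toy dataset, not the poster's): a for-loop bounds the iteration count so it cannot run forever, the stopping test reads the cost value that was just computed (avoiding the out-of-range index the original while condition can hit), and the whole test matrix is predicted with a single matrix product instead of one np.dot per DataFrame row.

```python
import numpy as np

def gradient_descent(X, y, lr=0.01, max_iter=1000, tol=1e-2):
    """Batch gradient descent for least-squares linear regression."""
    m = len(y)
    theta = np.zeros((X.shape[1], 1))
    cost_hist = []
    for _ in range(max_iter):                  # a for-loop cannot hang forever
        error = X.dot(theta) - y               # shape (m, 1)
        theta = theta - (lr / m) * X.T.dot(error)
        cost = np.sum((X.dot(theta) - y) ** 2) / (2 * m)
        cost_hist.append(cost)
        if cost < tol:                         # check the value just computed
            break
    return theta, cost_hist

# Tiny made-up dataset: y = 1 + x, with a bias column already prepended.
X = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([[2.0], [3.0], [4.0]])
theta, hist = gradient_descent(X, y, lr=0.1, max_iter=2000)
y_pred = X.dot(theta)                          # whole test set in one product
```

On a 15,000-row training set the same idea applies: one vectorized matrix product per iteration, rather than per-row work inside iterrows.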
For more on python - multivariate linear regression with gradient descent, see the similar question on Stack Overflow: https://stackoverflow.com/questions/56268136/