I have been trying to apply logistic regression to a classification problem, but it gives me really strange results. I got decent scores with gradient boosting and random forests, so I wanted to go back to basics and see what the best achievable result is. Can you help me spot what I am doing wrong here?
You can get the data from
https://www.kaggle.com/c/santander-customer-satisfaction/data
Here is my code:
import pandas as pd
import numpy as np

train = pd.read_csv("path")
test = pd.read_csv("path")
test["TARGET"] = 0
fullData = pd.concat([train, test], ignore_index=True)

# Drop constant columns
remove1 = []
for col in fullData.columns:
    if fullData[col].std() == 0:
        remove1.append(col)
fullData.drop(remove1, axis=1, inplace=True)

# Drop duplicated columns
remove = []
cols = fullData.columns
for i in range(len(cols) - 1):
    v = fullData[cols[i]].values
    for j in range(i + 1, len(cols)):
        if np.array_equal(v, fullData[cols[j]].values):
            remove.append(cols[j])
fullData.drop(remove, axis=1, inplace=True)

# sklearn.cross_validation was removed in scikit-learn 0.20;
# use sklearn.model_selection instead
from sklearn.model_selection import train_test_split, cross_val_score

X_train, X_test = train_test_split(fullData, test_size=0.20, random_state=1729)
print(X_train.shape, X_test.shape)

y_train = X_train["TARGET"].values
X = X_train.drop(["TARGET", "ID"], axis=1, inplace=False)

# Feature selection via an extra-trees model
from sklearn.ensemble import ExtraTreesClassifier
from sklearn.feature_selection import SelectFromModel

clf = ExtraTreesClassifier(random_state=1729)
selector = clf.fit(X, y_train)
fs = SelectFromModel(selector, prefit=True)

X_t = X_test.drop(["TARGET", "ID"], axis=1, inplace=False)
X_t = fs.transform(X_t)
X_tr = X_train.drop(["TARGET", "ID"], axis=1, inplace=False)
X_tr = fs.transform(X_tr)

from sklearn.linear_model import LogisticRegression

log = LogisticRegression(penalty="l2", C=1, random_state=1)

scores = cross_val_score(log, X_tr, y_train, cv=10)
print(scores.mean())

log.fit(X_tr, y_train)
predictions = log.predict(X_t)
predictions = predictions.astype(int)
print(predictions.mean())
Best Answer
You are not tuning the C parameter (technically you are just using its default value), and that is one of the usual causes of overfitting. Take a look at GridSearchCV and try several values of C (say, from 10^-5 to 10^5) to see whether that alleviates the problem. Changing the penalty to 'l1' may also help.
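A minimal sketch of that grid search, with a synthetic dataset from `make_classification` standing in for your Santander features (that dataset, and the grid values, are assumptions for illustration; substitute your own `X_tr` and `y_train`):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV

# Synthetic stand-in for the real training data (assumption).
X, y = make_classification(n_samples=500, n_features=20, random_state=1729)

# Sweep C on a log grid from 10^-5 to 10^5, trying both penalties.
param_grid = {"C": np.logspace(-5, 5, 11), "penalty": ["l1", "l2"]}
grid = GridSearchCV(
    # liblinear supports both l1 and l2 penalties
    LogisticRegression(solver="liblinear", random_state=1),
    param_grid,
    scoring="roc_auc",
    cv=5,
)
grid.fit(X, y)
print(grid.best_params_)
print(grid.best_score_)
```

`best_params_` then tells you which regularization strength the cross-validation preferred, which you can plug back into your final model.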
In addition, the competition itself poses some challenges: the dataset is imbalanced, and the distributions of the training set and the private leaderboard differ somewhat. All of that works against you, especially when you use a simple algorithm such as logistic regression.
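For the imbalance specifically, one common mitigation is to reweight the loss with `class_weight="balanced"` and to evaluate with ROC-AUC rather than accuracy, since accuracy is misleading when one class dominates. A sketch, again on synthetic imbalanced data (an assumption; the ~5% positive rate here only mimics the skew of TARGET):

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Imbalanced synthetic data, roughly 5% positives (assumption).
X, y = make_classification(n_samples=2000, n_features=20,
                           weights=[0.95, 0.05], random_state=1729)

# class_weight="balanced" scales sample weights inversely to class
# frequency; roc_auc is threshold-free and insensitive to the skew.
clf = LogisticRegression(class_weight="balanced", solver="liblinear")
auc = cross_val_score(clf, X, y, scoring="roc_auc", cv=5).mean()
print(round(auc, 3))
```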
Regarding python - logistic regression in Python, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/38621685/