Since the linear regression algorithm finds the best-fit line for the training data, predictions on new data should always lie on that line. How, then, can the linear regression model from sklearn predict the data nonlinearly, as shown in this figure?
![linear regression prediction](https://pythonprogramming.net/static/images/machine-learning/linear-regression-prediction.png)
import quandl, math  # the package is imported as lowercase "quandl" in current releases
import numpy as np
import pandas as pd
from sklearn import preprocessing, svm
from sklearn.model_selection import train_test_split  # cross_validation was removed in sklearn 0.20
from sklearn.linear_model import LinearRegression
import matplotlib.pyplot as plt
from matplotlib import style
import datetime

style.use('ggplot')

df = quandl.get("WIKI/GOOGL")
df = df[['Adj. Open', 'Adj. High', 'Adj. Low', 'Adj. Close', 'Adj. Volume']]
df['HL_PCT'] = (df['Adj. High'] - df['Adj. Low']) / df['Adj. Close'] * 100.0
df['PCT_change'] = (df['Adj. Close'] - df['Adj. Open']) / df['Adj. Open'] * 100.0
df = df[['Adj. Close', 'HL_PCT', 'PCT_change', 'Adj. Volume']]

forecast_col = 'Adj. Close'
df.fillna(value=-99999, inplace=True)

# forecast 1% of the dataset's length into the future
forecast_out = int(math.ceil(0.01 * len(df)))
df['label'] = df[forecast_col].shift(-forecast_out)

X = np.array(df.drop(columns=['label']))  # the positional axis argument to drop() is deprecated
X = preprocessing.scale(X)
X_lately = X[-forecast_out:]  # the most recent rows, which have no label yet
X = X[:-forecast_out]

df.dropna(inplace=True)
y = np.array(df['label'])

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)
clf = LinearRegression(n_jobs=-1)
clf.fit(X_train, y_train)
confidence = clf.score(X_test, y_test)
forecast_set = clf.predict(X_lately)

df['Forecast'] = np.nan
last_date = df.iloc[-1].name
last_unix = last_date.timestamp()
one_day = 86400
next_unix = last_unix + one_day

for i in forecast_set:
    next_date = datetime.datetime.fromtimestamp(next_unix)
    next_unix += one_day
    df.loc[next_date] = [np.nan for _ in range(len(df.columns) - 1)] + [i]

df['Adj. Close'].plot()
df['Forecast'].plot()
plt.legend(loc=4)
plt.xlabel('Date')
plt.ylabel('Price')
plt.show()
Best answer
The model produced by linear regression is linear in all of its predictive features (i.e., in X). Your model appears to have been trained on the 'HL_PCT', 'PCT_change', and 'Adj. Volume' features. The plot, however, has only one variable on its x-axis (as all 2D plots do): Date, which is not even one of your predictive features. Even if Date were one of the predictive features in X, projecting down from several dimensions to one can make the model appear nonlinear.
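The projection effect described above can be sketched with a minimal synthetic example (the feature names and data here are made up for illustration): a model that is exactly linear in two features produces predictions that jump around when plotted against only one of them.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 50
x1 = np.linspace(0.0, 1.0, n)      # the feature we will "plot against" (like Date)
x2 = rng.normal(size=n)            # a second, hidden feature
X = np.column_stack([x1, x2])
y = 3.0 * x1 + 2.0 * x2            # target is exactly linear in (x1, x2)

clf = LinearRegression().fit(X, y)
pred = clf.predict(X)

# The predictions lie exactly on the fitted hyperplane in feature space:
assert np.allclose(pred, X @ clf.coef_ + clf.intercept_)

# ...yet against x1 alone they do not trace a line: the residual
# pred - 3*x1 varies with the hidden feature x2.
spread = np.std(pred - 3.0 * x1)
print(spread)  # nonzero spread caused entirely by x2
```

So the curve in the question's figure is not evidence of a nonlinear model; it is a one-dimensional projection of a linear model that was fit in a higher-dimensional feature space.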
A similar question, "python - How does the linear regression model in sklearn make nonlinear predictions in the following code?", can be found on Stack Overflow: https://stackoverflow.com/questions/57837376/