I have found plenty of examples of fitting a linear regression with a zero y-intercept.

However, I would like to fit a linear regression with a fixed x-intercept. In other words, the regression line should start at a specific x value.

I have the following code for plotting:

import numpy as np
import matplotlib.pyplot as plt

xs = np.array([0.1, 0.2, 0.4, 0.6, 0.8, 1.0, 2.0, 4.0, 6.0, 8.0, 10.0,
              20.0, 40.0, 60.0, 80.0])


ys = np.array([0.50505332505407008, 1.1207373784533172, 2.1981844719020001,
              3.1746209003398689, 4.2905482471260044, 6.2816226678076958,
              11.073788414382639, 23.248479770546009, 32.120462301367183,
              44.036117671229206, 54.009003143831116, 102.7077685684846,
              185.72880217806673, 256.12183145545811, 301.97120103079675])


def best_fit_slope_and_intercept(xs, ys):
    # m = xs.dot(ys)/xs.dot(xs)
    m = (((np.average(xs)*np.average(ys)) - np.average(xs*ys)) /
         ((np.average(xs)*np.average(xs)) - np.average(xs*xs)))
    b = np.average(ys) - m*np.average(xs)
    return m, b


def rSquaredValue(ys_orig, ys_line):
    def sqrdError(ys_orig, ys_line):
        return np.sum((ys_line - ys_orig) * (ys_line - ys_orig))
    yMeanLine = np.average(ys_orig)
    sqrtErrorRegr = sqrdError(ys_orig, ys_line)
    sqrtErrorYMean = sqrdError(ys_orig, yMeanLine)
    return 1 - (sqrtErrorRegr/sqrtErrorYMean)


m, b = best_fit_slope_and_intercept(xs, ys)
regression_line = m*xs+b

r_squared = rSquaredValue(ys, regression_line)
print(r_squared)

plt.plot(xs, ys, 'bo')
# Normal best fit
plt.plot(xs, m*xs+b, 'r-')
# Zero intercept
plt.plot(xs, m*xs, 'g-')
plt.show()


And I would like something like the following, where the regression line starts at (5, 0).
[Figure: scatter plot of the data with the desired regression line starting at (5, 0)]

Thanks. Any and all help is appreciated.

Best Answer

I have been thinking about this for a while and have found a possible solution.

If I understand correctly, you want to find the slope and intercept of a linear regression model whose x-intercept is fixed.

In that case (assuming you want the x-intercept to be forced_intercept), it is as if you "shifted" all the points by -forced_intercept along the x-axis and then forced scikit-learn to use a y-intercept of 0. That gives you the slope. To find the y-intercept, simply isolate b in y = ax + b and require the line to pass through the point (forced_intercept, 0). Doing so gives b = -a * forced_intercept (where a is the slope). In code (note that xs is reshaped):

import numpy as np
import matplotlib.pyplot as plt
from sklearn.linear_model import LinearRegression

xs = np.array([0.1, 0.2, 0.4, 0.6, 0.8, 1.0, 2.0, 4.0, 6.0, 8.0, 10.0,
              20.0, 40.0, 60.0, 80.0]).reshape((-1,1))  # note: xs must be reshaped to 2-D, otherwise scikit-learn's fit() raises a ValueError


ys = np.array([0.50505332505407008, 1.1207373784533172, 2.1981844719020001,
              3.1746209003398689, 4.2905482471260044, 6.2816226678076958,
              11.073788414382639, 23.248479770546009, 32.120462301367183,
              44.036117671229206, 54.009003143831116, 102.7077685684846,
              185.72880217806673, 256.12183145545811, 301.97120103079675])

forced_intercept = 5 #as you provided in your example of (5,0)

new_xs = xs - forced_intercept #here we "move" all the points
model = LinearRegression(fit_intercept=False).fit(new_xs, ys) #force an intercept of 0
r = model.score(new_xs,ys)
a = model.coef_

b = -1 * a * forced_intercept #here we find the y-intercept so that the line contains (forced_intercept, 0)

print(r,a,b)
plt.plot(xs,ys,'o')
plt.plot(xs,a*xs+b)
plt.show()
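
If you prefer to avoid scikit-learn, the same constrained fit can be written in closed form: minimizing the squared error of y = a*(x - forced_intercept) gives a = sum((x - x0)*y) / sum((x - x0)^2), and then b = -a * forced_intercept as above. A minimal NumPy-only sketch, assuming the xs and ys arrays defined above (with xs flattened back to 1-D):

import numpy as np

forced_intercept = 5
x = xs.ravel()                    # 1-D view of the xs defined above
shifted = x - forced_intercept    # shift so the forced x-intercept sits at 0
a = np.sum(shifted * ys) / np.sum(shifted ** 2)  # zero-intercept least-squares slope
b = -a * forced_intercept         # y-intercept so the line passes through (forced_intercept, 0)
print(a, b)

This should give the same slope and intercept as the scikit-learn version, since LinearRegression(fit_intercept=False) solves exactly this least-squares problem on the shifted points.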


Hope this is what you were looking for.

Regarding "python - How to apply linear regression with a fixed x-intercept in Python?", a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/59493675/
