I'm running into a problem with scipy's fmin solver. I want it to optimize the parameters of my ARIMA model. When I run the code directly (without fmin) everything works fine, but inside the optimizer I get this error:
Traceback (most recent call last):
File "D:/Work/repo_python/bandwidth_estimation/estima.py", line 169, in <module>
optimum = fmin(criterion, x0)
File "C:\Users\Pigeon\AppData\Local\Programs\Python\Python36\lib\site-packages\scipy\optimize\optimize.py", line 442, in fmin
res = _minimize_neldermead(func, x0, args, callback=callback, **opts)
File "C:\Users\Pigeon\AppData\Local\Programs\Python\Python36\lib\site-packages\scipy\optimize\optimize.py", line 585, in _minimize_neldermead
fsim[k] = func(sim[k])
File "C:\Users\Pigeon\AppData\Local\Programs\Python\Python36\lib\site-packages\scipy\optimize\optimize.py", line 326, in function_wrapper
return function(*(wrapper_args + args))
File "D:/Work/repo_python/bandwidth_estimation/estima.py", line 18, in criterion
model_arima = ARIMA(train, order=parametres)
File "C:\Users\Pigeon\AppData\Local\Programs\Python\Python36\lib\site-packages\statsmodels\tsa\arima_model.py", line 988, in __new__
mod.__init__(endog, order, exog, dates, freq, missing)
File "C:\Users\Pigeon\AppData\Local\Programs\Python\Python36\lib\site-packages\statsmodels\tsa\arima_model.py", line 1009, in __init__
self._first_unintegrate = unintegrate_levels(self.endog[:d], d)
TypeError: slice indices must be integers or None or have an __index__ method
Here is the code:
from scipy.optimize import fmin
from statsmodels.tsa.arima_model import ARIMA

train = [5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.361409086288781, 5.899251779517743, 5.766256093003701, 5.795872889382786, 5.637485909370455, 5.7453759249241045, 5.066030606246879, 5.0944229577563425, 5.0944229577563425, 5.0944229577563425, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.361409086288781, 5.899251779517743, 5.766256093003701, 5.795872889382786, 5.637485909370455, 5.7453759249241045, 5.066030606246879, 5.0944229577563425, 5.0944229577563425, 5.0944229577563425, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.361409086288781, 5.899251779517743, 5.766256093003701, 5.795872889382786, 5.637485909370455, 5.7453759249241045, 5.066030606246879, 5.0944229577563425, 5.0944229577563425, 5.0944229577563425, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.361409086288781, 5.899251779517743, 5.766256093003701, 5.795872889382786, 5.637485909370455, 5.7453759249241045, 5.066030606246879, 5.0944229577563425, 5.0944229577563425, 5.0944229577563425, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.361409086288781, 5.899251779517743, 5.766256093003701, 5.795872889382786, 5.637485909370455, 5.7453759249241045, 5.066030606246879, 5.0944229577563425, 5.0944229577563425, 5.0944229577563425, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.361409086288781, 5.899251779517743, 5.766256093003701, 5.795872889382786, 5.637485909370455, 5.7453759249241045, 5.066030606246879, 5.0944229577563425, 5.0944229577563425, 5.0944229577563425, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.361409086288781, 5.899251779517743, 5.766256093003701, 5.795872889382786, 5.637485909370455, 5.7453759249241045, 5.066030606246879, 5.0944229577563425, 5.0944229577563425, 5.0944229577563425, 5.545763155811349]

def criterion(parametres):
    parametres = tuple(parametres)
    control_sum = 0
    model_arima = ARIMA(train, order=parametres)
    model_arima_fit = model_arima.fit()
    predictions = model_arima_fit.forecast(steps=24)[0]
    for i in range(0, len(predictions)):
        control_sum += (predictions[i] - values[i]) ** 2
    print(control_sum)
    return control_sum

x0 = [1, 1, 1]
optimum = fmin(criterion, x0)
As I said, if I run this code outside of the function, everything works fine. The problem only appears when the function is handed to the fmin solver.
Best answer
This happens because fmin passes the parameters as floats, while ARIMA's order has to be integers. You can fix it by casting parametres to int: parametres = parametres.astype(int).
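To see why the cast is needed, a quick probe along these lines should show that fmin hands the objective a float64 NumPy array rather than integers (probe is just an illustrative stand-in for your criterion function):

import numpy as np
from scipy.optimize import fmin

def probe(x):
    # fmin (Nelder-Mead) always passes a float64 ndarray, never plain ints
    print(type(x), x.dtype, x)
    return float(np.sum(x ** 2))

fmin(probe, [1, 1, 1], maxiter=1, disp=False)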
Here is the complete function:
from scipy.optimize import fmin
from statsmodels.tsa.arima_model import ARIMA

train = [5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.361409086288781, 5.899251779517743, 5.766256093003701, 5.795872889382786, 5.637485909370455, 5.7453759249241045, 5.066030606246879, 5.0944229577563425, 5.0944229577563425, 5.0944229577563425, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.361409086288781, 5.899251779517743, 5.766256093003701, 5.795872889382786, 5.637485909370455, 5.7453759249241045, 5.066030606246879, 5.0944229577563425, 5.0944229577563425, 5.0944229577563425, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.361409086288781, 5.899251779517743, 5.766256093003701, 5.795872889382786, 5.637485909370455, 5.7453759249241045, 5.066030606246879, 5.0944229577563425, 5.0944229577563425, 5.0944229577563425, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.361409086288781, 5.899251779517743, 5.766256093003701, 5.795872889382786, 5.637485909370455, 5.7453759249241045, 5.066030606246879, 5.0944229577563425, 5.0944229577563425, 5.0944229577563425, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.361409086288781, 5.899251779517743, 5.766256093003701, 5.795872889382786, 5.637485909370455, 5.7453759249241045, 5.066030606246879, 5.0944229577563425, 5.0944229577563425, 5.0944229577563425, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.361409086288781, 5.899251779517743, 5.766256093003701, 5.795872889382786, 5.637485909370455, 5.7453759249241045, 5.066030606246879, 5.0944229577563425, 5.0944229577563425, 5.0944229577563425, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.545763155811349, 5.361409086288781, 5.899251779517743, 5.766256093003701, 5.795872889382786, 5.637485909370455, 5.7453759249241045, 5.066030606246879, 5.0944229577563425, 5.0944229577563425, 5.0944229577563425, 5.545763155811349]

def criterion(parametres):
    parametres = parametres.astype(int)
    control_sum = 0
    print(parametres)
    model_arima = ARIMA(train, order=parametres)
    model_arima_fit = model_arima.fit()
    predictions = model_arima_fit.forecast(steps=24)[0]
    for i in range(0, len(predictions)):
        control_sum += (predictions[i] - values[i]) ** 2  # values is still undefined here, see the note below
    print(control_sum)
    return control_sum

x0 = [1, 1, 1]
optimum = fmin(criterion, x0)
Also, you use the variable values inside the function without defining it anywhere, which will also cause problems. Note: to use this function outside of fmin, you will need to pass in a numpy array rather than converting it to a tuple inside the function.
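One way to tidy both points up (a sketch only; values here just stands in for the 24 holdout observations the forecasts are compared against, which the question never shows) is to pass the two series into the objective explicitly through fmin's args parameter:

import numpy as np
from scipy.optimize import fmin
from statsmodels.tsa.arima_model import ARIMA

def criterion(parametres, train, values):
    order = tuple(int(p) for p in parametres)  # ARIMA wants integer (p, d, q)
    model_arima_fit = ARIMA(train, order=order).fit()
    predictions = model_arima_fit.forecast(steps=len(values))[0]
    return float(np.sum((np.asarray(predictions) - np.asarray(values)) ** 2))

# train is the series defined above; values is assumed to hold your holdout points
x0 = [1, 1, 1]
optimum = fmin(criterion, x0, args=(train, values))

This also makes criterion easy to call on its own, since nothing inside it relies on globals or on how fmin happens to pass its arguments.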
Regarding "python - slice indices must be integers or None or have an __index__ method", there is a similar question on Stack Overflow: https://stackoverflow.com/questions/59501401/