This post looks at how to change maxlag for ARMAX.predict; it may be a useful reference for anyone hitting the same problem.

Problem Description

I am still in the process of understanding the ARIMA source code in order to forecast some data. (I use two time series, indexed_df and external_df, with 365 data points each.)

I want to compare the forecast accuracy between ARMA and ARMAX.

The forecasting process for ARMA seems to work fine, but forecasting with one additional external variable does not:

Getting p and q values for ARMAX:

import statsmodels.api as sm  # statsmodels 0.6.1, per the traceback below

# ARMAX: ARMA(2, 0) fit on indexed_df with external_df as the exogenous regressor
arma_mod1 = sm.tsa.ARMA(indexed_df, (2,0), external_df).fit()
y = arma_mod1.params
print 'P- and Q-Values(ARMAX):'
print y

Out:

P- and Q-Values(ARMAX):
const      34.739272
0           0.000136
ar.L1.0     0.578090
ar.L2.0     0.129253
dtype: float64

Getting a predicted value (in-sample):

start_pred = '2013-12-30'
end_pred = '2013-12-30'
period = (start_pred, end_pred)

predict_price1 = arma_mod1.predict(start_pred, end_pred, exog=True, dynamic=True) 
print ('Predicted Price (ARMAX): {}' .format(predict_price1))

Out:

Traceback (most recent call last):

  File "<ipython-input-102-78b3d705d411>", line 6, in <module>
    predict_price1 = arma_mod1.predict(start_pred, end_pred, exog=True, dynamic=True)

  File "/Applications/anaconda/lib/python2.7/site-packages/statsmodels-0.6.1-py2.7-macosx-10.5-x86_64.egg/statsmodels/base/wrapper.py", line 92, in wrapper
    return data.wrap_output(func(results, *args, **kwargs), how)

  File "/Applications/anaconda/lib/python2.7/site-packages/statsmodels-0.6.1-py2.7-macosx-10.5-x86_64.egg/statsmodels/tsa/arima_model.py", line 1441, in predict
    return self.model.predict(self.params, start, end, exog, dynamic)

  File "/Applications/anaconda/lib/python2.7/site-packages/statsmodels-0.6.1-py2.7-macosx-10.5-x86_64.egg/statsmodels/tsa/arima_model.py", line 736, in predict
    start, method)

  File "/Applications/anaconda/lib/python2.7/site-packages/statsmodels-0.6.1-py2.7-macosx-10.5-x86_64.egg/statsmodels/tsa/arima_model.py", line 327, in _arma_predict_out_of_sample
    exog)

  File "/Applications/anaconda/lib/python2.7/site-packages/statsmodels-0.6.1-py2.7-macosx-10.5-x86_64.egg/statsmodels/tsa/arima_model.py", line 293, in _get_predict_out_of_sample
    X = lagmat(np.dot(exog, exparams), p, original='in', trim='both')

  File "/Applications/anaconda/lib/python2.7/site-packages/statsmodels-0.6.1-py2.7-macosx-10.5-x86_64.egg/statsmodels/tsa/tsatools.py", line 328, in lagmat
    raise ValueError("maxlag should be < nobs")

ValueError: maxlag should be < nobs

My understanding of maxlag is that (if not defined beforehand) the number of lags to be observed is calculated automatically as:

maxlag = int(round(12*(nobs/100.)**(1/4.)))
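
For reference, plugging nobs = 365 into that rule of thumb gives a value far smaller than 365; the snippet below simply evaluates the formula quoted above and is not statsmodels-specific:

# evaluate the quoted rule of thumb for this data set (365 observations)
nobs = 365
maxlag = int(round(12 * (nobs / 100.) ** (1 / 4.)))
print(maxlag)  # 17, well below nobs = 365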

But I don't see where I could change this calculation or set maxlag myself.

My understanding of nobs is that it is the number of time steps, i.e. the number of values in my time series (365 in my case).

So that means I need maxlag < 365, right?

Where can I define the number of maxlag?

The same error occurred in this question: ADF test in statsmodels in Python, but I have no clue where to set maxlag for ARMAX prediction.

Thanks for any help.

Recommended Answer

The code:

predict_price1 = arma_mod1.predict(start_pred, end_pred, exog=True, dynamic=True) 
print ('Predicted Price (ARMAX): {}' .format(predict_price1))
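
A plausible reading of the traceback: exog=True is just a Python boolean, so inside _get_predict_out_of_sample the product np.dot(exog, exparams) collapses to a length-1 array, and lagmat then sees only one observation against maxlag = p = 2. The standalone snippet below (using the exogenous coefficient printed earlier) reproduces that failure mode; it is an illustration of this reading, not code taken from statsmodels itself:

import numpy as np
from statsmodels.tsa.tsatools import lagmat

exparams = np.array([0.000136])   # the single exogenous coefficient from the fit above
x = np.dot(True, exparams)        # what exog=True boils down to: a length-1 array

try:
    # same call as in the traceback, with maxlag = p = 2 but only one "observation"
    lagmat(x, 2, original='in', trim='both')
except ValueError as err:
    print(err)  # maxlag should be < nobs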

must be changed to:

predict_price1 = arma_mod1.predict(start_pred, end_pred, external_df, dynamic=True) 
print ('Predicted Price (ARMAX): {}' .format(predict_price1))

That way it works! I compared the values with and without external_df and they were different, which I guess can be seen as proof.
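
For completeness, here is a minimal end-to-end sketch of the working comparison. The data below are synthetic stand-ins for indexed_df and external_df (the real series are not shown in the question), and it uses the old sm.tsa.ARMA API from statsmodels 0.6.x, so treat it as an illustration rather than a drop-in script:

import numpy as np
import pandas as pd
import statsmodels.api as sm

# synthetic stand-ins: 365 daily observations each
np.random.seed(0)
idx = pd.date_range('2013-01-01', periods=365, freq='D')
indexed_df = pd.Series(np.random.randn(365).cumsum() + 35.0, index=idx)
external_df = pd.Series(np.random.randn(365), index=idx)

# plain ARMA(2, 0) and ARMAX(2, 0) with one exogenous regressor
arma_mod0 = sm.tsa.ARMA(indexed_df, (2, 0)).fit()
arma_mod1 = sm.tsa.ARMA(indexed_df, (2, 0), external_df).fit()

start_pred = '2013-12-30'
end_pred = '2013-12-30'

# the exog argument takes the exogenous data itself, not a boolean flag
predict_price0 = arma_mod0.predict(start_pred, end_pred, dynamic=True)
predict_price1 = arma_mod1.predict(start_pred, end_pred, external_df, dynamic=True)
print('Predicted Price (ARMA):  {}'.format(predict_price0))
print('Predicted Price (ARMAX): {}'.format(predict_price1))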

