
ARIMA miscalculates the result and I get the error: Warning: Non-stationary starting autoregressive parameters found. Using zeros as starting parameters


I need to calculate ARIMA on several series in a loop. All results are computed successfully except the one with x = [0, 2]. Maybe two values are too few, but this is part of a loop where the number of values varies, so I can't predict in advance when there will be only two values (assuming that is the problem).

The calculated result is -9.995715227417712e-06, and I get these warnings:

    warn('Non-stationary starting autoregressive parameters'
UserWarning: Non-stationary starting autoregressive parameters found. Using zeros as starting parameters.
    warnings.warn("Maximum Likelihood optimization failed to "
ConvergenceWarning: Maximum Likelihood optimization failed to converge. Check mle_retvals

I tried inserting a third number into x ([0, 2, 0], [0, 2, 2], [0, 2, 3], etc.) and the warning goes away (except with 1, e.g. x = [0, 2, 1], which still triggers it).

How can I use x = [0, 2] (just 0 and 2) and correctly calculate the result without the warnings?

Here's how I'm using ARIMA:

from statsmodels.tsa.arima.model import ARIMA

x = [0, 2]
model = ARIMA(x, order=(1, 0, 0))  # AR(1) with the default constant trend
model_fit = model.fit()            # the warnings are raised here
forecast = model_fit.forecast(steps=1)
print(forecast[0])

I found a solution, but I'm not 100% sure why it works.

By default trend is set to 'c', a constant trend (basically an intercept). If you set it to 'n' or 'ct', the warning goes away.

If you set it to 'n', it predicts a near-zero value.

If you set it to 'ct', it essentially becomes a linear regression: it draws a straight line through your two points and predicts 4.

Code example:

model = ARIMA(x, order=(1,0,0), trend='ct')

I also got convergence errors when using 'ct' on more than 2 values if those values were in an exactly straight line.

I think this has something to do with the fact that the default settings have three unknowns but only two observations, which would explain why 'n' works: removing the trend reduces it to two parameters. But it doesn't explain why 'ct' works, since that has four parameters.

I'm looking for a different result. For example, if I use x = [1, 2, 2, 1, 2], its simple arithmetic mean is 1.6. With model = ARIMA(x, order=(1,0,0)), I get 1.38, so the result is slightly lower than the simple average. If I use x = [0, 2], the simple average is 2, so I expected to receive a result less than 2. With trend='ct' I receive 3.39, while with trend='n' I receive 9.45e-06. Is there anything that would give me a value below the simple average? – Joseph_123 Oct 28, 2023 at 22:12

@Joseph_123 Wouldn't the average of [0, 2] be 1? Can you clarify what you mean by simple average? – Nick ODell Oct 28, 2023 at 22:21

Yes, 1. Sorry, careless error. Considering that the average of [0, 2] is 1, I would like a result that is less than 1 (for example 0.65, 0.70, 0.75, or 0.80). It's the same thing that happened with the plain model = ARIMA(x, order=(1,0,0)): when I used x = [1, 2, 2, 1, 2] I received 1.38 (the average was 1.6). I would like the same for x = [0, 2] (or something similar anyway). – Joseph_123 Oct 28, 2023 at 22:29
