1. What is a Time Series?
- A time series = a sequence of data points recorded over time, usually at regular intervals.
- Examples: stock prices, weather data, energy demand, website traffic.
- Goal: understand past patterns and predict future values.
2. Time Series Forecasting
- Forecasting = using historical data to predict future observations.
- Different from regular regression:
  - Observations are ordered in time.
  - Values are often correlated with their own past values (autocorrelation).
  - Seasonality and trends must be accounted for.
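The autocorrelation mentioned above can be computed directly. A minimal pure-Python sketch (the helper name is ours, not a library function):

```python
# Lag-k autocorrelation: correlation of a series with a copy of
# itself shifted back by `lag` steps. Values near 1 mean the past
# strongly predicts the present; values near 0 mean it does not.

def autocorr(series, lag):
    n = len(series)
    mean = sum(series) / n
    var = sum((x - mean) ** 2 for x in series)
    cov = sum((series[t] - mean) * (series[t - lag] - mean)
              for t in range(lag, n))
    return cov / var
```

A steadily rising series such as [1, 2, ..., 8] gives a high lag-1 autocorrelation (0.625 here), while an alternating series gives a negative one.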
3. Components of a Time Series
- Trend – long-term increase/decrease in data.
- Seasonality – repeating patterns (e.g., daily, weekly, yearly).
- Cyclic patterns – longer, irregular fluctuations with no fixed period (e.g., business cycles).
- Noise – random variation not explained by the above.
A forecasting model tries to separate these components.
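One way to make that separation concrete: a toy additive decomposition, estimating the trend with a centered moving average and the seasonal component as the average detrended value at each position in the cycle. A sketch that assumes an odd seasonal period (in practice you would reach for something like statsmodels' seasonal_decompose):

```python
# Toy additive decomposition: series = trend + seasonal + noise.
# Assumes an odd `period` so the moving average can be centered.

def decompose_additive(series, period):
    n = len(series)
    half = period // 2
    # Centered moving average as the trend estimate (undefined at the edges).
    trend = [None] * n
    for t in range(half, n - half):
        window = series[t - half:t + half + 1]
        trend[t] = sum(window) / len(window)
    # Seasonal component: mean detrended value per position in the cycle.
    buckets = [[] for _ in range(period)]
    for t in range(n):
        if trend[t] is not None:
            buckets[t % period].append(series[t] - trend[t])
    seasonal = [sum(b) / len(b) if b else 0.0 for b in buckets]
    return trend, seasonal
```

On a synthetic series built as a straight-line trend plus a repeating period-3 pattern, this recovers both pieces exactly.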
4. Methods for Forecasting
A. Statistical Models
- ARIMA (AutoRegressive Integrated Moving Average)
  - Captures autoregression (past values), differencing (trend removal), and moving-average error terms.
- SARIMA – ARIMA + seasonality.
- Exponential Smoothing (ETS, Holt-Winters)
  - Weights recent observations more strongly.
- State Space Models (Kalman filters).
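The exponential-smoothing idea fits in a few lines. This is simple exponential smoothing only; Holt-Winters layers trend and seasonal terms on top of the same recursion (illustrative sketch, not a library API):

```python
# Simple exponential smoothing: each new level is a weighted blend of
# the latest observation and the previous level. alpha near 1 trusts
# recent data; alpha near 0 smooths heavily. The forecast is flat at
# the final level.

def ses_forecast(series, alpha):
    level = series[0]
    for y in series[1:]:
        level = alpha * y + (1 - alpha) * level
    return level
```

With alpha = 1 the forecast is just the last observation (a naive forecast); with alpha = 0 it is the first one.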
B. Machine Learning Models
- Regression-based: Use lagged variables as features.
- Tree models: Random Forests, XGBoost (need feature engineering).
- Support Vector Regression (SVR).
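The feature engineering these models need can be as simple as reshaping the series into a supervised-learning table of lagged values, which any regressor can then consume (helper name is ours):

```python
# Each row's features are the previous `n_lags` values of the series,
# and the target is the current value — the "regression with lagged
# variables" setup.

def make_lagged_dataset(series, n_lags):
    X, y = [], []
    for t in range(n_lags, len(series)):
        X.append(series[t - n_lags:t])
        y.append(series[t])
    return X, y
```

For [1, 2, 3, 4, 5] with two lags this yields rows ([1, 2] -> 3), ([2, 3] -> 4), ([3, 4] -> 5).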
C. Deep Learning Models
- RNNs (Recurrent Neural Networks): Good for sequential data.
- LSTMs/GRUs: Handle long-term dependencies.
- Temporal CNNs.
- Transformers (e.g., Informer, TFT): State-of-the-art for large-scale forecasting.
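What "good for sequential data" means mechanically: an RNN carries a hidden state forward through time, so each output can depend on everything seen so far. A toy scalar cell with hand-picked weights (purely illustrative; real RNNs use learned weight matrices, and LSTMs/GRUs add gates):

```python
import math

# One recurrent step: the new hidden state mixes the previous hidden
# state (memory) with the current input, squashed through tanh.

def rnn_step(h_prev, x, w_h=0.5, w_x=1.0, b=0.0):
    return math.tanh(w_h * h_prev + w_x * x + b)

# Run the cell across a whole series, threading the state through.
def rnn_forward(series):
    h = 0.0
    for x in series:
        h = rnn_step(h, x)
    return h
```

Because w_h < 1 and tanh saturates, the influence of old inputs fades, which is exactly the long-term-dependency problem LSTM gates were designed to ease.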
5. Forecasting Process
- Exploratory Data Analysis (EDA): plot, check stationarity, autocorrelation (ACF/PACF).
- Preprocessing:
  - Handle missing values.
  - Normalize or scale the data.
  - Transform if needed (e.g., a log transform to stabilize variance).
- Model Selection: choose ARIMA, LSTM, etc.
- Training: fit model on historical data.
- Validation: use rolling window or walk-forward validation (not random split).
- Forecasting: predict the next k steps (point forecasts or prediction intervals).
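The walk-forward scheme from the validation step, sketched as index generation. This is the expanding-window variant; a rolling window would additionally drop the oldest points (function name is ours):

```python
# Walk-forward validation: every split trains only on the past and
# tests on the single step that follows. A random train/test split
# would leak future information into training.

def walk_forward_splits(n, min_train):
    # (train indices, test index) for each forecast origin.
    return [(list(range(t)), t) for t in range(min_train, n)]
```

For a series of length 5 with a minimum training size of 3, this produces two splits: train on points 0-2, test on 3; then train on 0-3, test on 4.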
6. Example (Simple)
Suppose monthly sales:
[120, 135, 150, 160, 145, 170, …]
- Trend: upward.
- Seasonality: higher sales in holiday months.
- Forecast: next month ~ 180 (based on model).
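The ~180 ballpark combines trend and seasonality; the trend part alone can be checked by fitting a least-squares line through the six sample points and extrapolating one month (a sketch, not the notes' actual model):

```python
# Ordinary least-squares line through (t, y) pairs, then extrapolate.
# Trend only — a full forecast would add the seasonal component.

def linear_forecast(series, steps_ahead=1):
    n = len(series)
    t_mean = (n - 1) / 2
    y_mean = sum(series) / n
    sxy = sum((t - t_mean) * (y - y_mean) for t, y in enumerate(series))
    sxx = sum((t - t_mean) ** 2 for t in range(n))
    slope = sxy / sxx
    intercept = y_mean - slope * t_mean
    return intercept + slope * (n - 1 + steps_ahead)

sales = [120, 135, 150, 160, 145, 170]
next_month = linear_forecast(sales)  # about 176 from the trend alone
```

The trend line alone gives about 176; a holiday-month seasonal bump would push the estimate toward the ~180 in the notes.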
7. Applications
- Finance: stock and currency price prediction.
- Energy: electricity load forecasting.
- Retail: demand forecasting.
- Healthcare: patient flow, disease outbreaks.
- Web: server load, traffic prediction.
8. Challenges
- Non-stationarity: mean/variance changes over time.
- Noise and anomalies: unpredictable shocks (COVID-19, market crashes).
- Multiple series: need hierarchical or multivariate forecasting.
- Accuracy vs Interpretability: deep learning may be accurate but hard to explain.
Summary:
Time series forecasting = predicting future values based on past data. Traditional methods (ARIMA, ETS) handle trend/seasonality well, while machine learning and deep learning (LSTM, Transformers) are powerful for complex patterns. Core challenges include non-stationarity, noise, and choosing the right validation scheme.
