Time Series Analysis [MA3411]

A time series is a collection of observations, each recorded at a specific point in time. Such data sets are ubiquitous: examples include the daily average temperature in Munich, the population growth over the last hundred years, the win/loss record of a football club, the monthly number of claims received by an insurance company, and the price of a financial asset measured in milliseconds. The aim of this course is to develop mathematical tools and statistical methods to model and analyse such time series.

The main topics of this lecture are the following:

  • Modelling: How do we obtain a plausible mathematical description for a given time series?
  • Analysis: What are the theoretical properties of the model?
  • Estimation: How do we estimate model parameters based on observations?
  • Testing: How do we test a data set for specific properties?
  • Prediction: What can be said about the future development of the time series based on present and historical information?

Chapter I: The lecture starts with a brief discussion of various types of time series and lays the foundation for their mathematical treatment in the subsequent chapters. We discuss examples of time series arising in econometrics, physics, biology and other disciplines. Furthermore, we recall some familiar stochastic models such as the random walk and the Bernoulli process, and introduce the moving average process as a fundamental building block that will accompany us throughout the lecture.
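As a first illustration, the following sketch simulates these three basic models in R; the sample size and all parameter values are arbitrary choices made for illustration only.

  ## Minimal simulations of the basic models mentioned above
  ## (sample size and parameters are illustrative choices).
  set.seed(42)
  n <- 200

  ## Bernoulli process: i.i.d. observations with values in {0, 1}
  bern <- rbinom(n, size = 1, prob = 0.5)

  ## Random walk: cumulative sum of i.i.d. steps
  rw <- cumsum(rnorm(n))

  ## Moving average MA(2): X_t = Z_t + 0.6 Z_{t-1} - 0.3 Z_{t-2}
  Z  <- rnorm(n + 2)
  ma <- stats::filter(Z, filter = c(1, 0.6, -0.3),
                      method = "convolution", sides = 1)[-(1:2)]

  par(mfrow = c(3, 1))
  plot.ts(bern, main = "Bernoulli process")
  plot.ts(rw,   main = "Random walk")
  plot.ts(ma,   main = "MA(2) process")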

Chapter II: The first step in analysing a time series is to separate deterministic components from stochastic influences. For instance, many time series contain a deterministic trend or a seasonal component, and we learn methods for extracting them from a given time series. The most interesting and mathematically challenging part, however, is the modelling of the stochastic influences, that is, of what remains after the deterministic parts have been removed. Here the important concept of stationarity comes in: this remainder, often only after a suitable transformation, has stochastic properties that do not change over time.
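As a preview of these techniques, the sketch below applies the classical decomposition into trend, seasonal component and remainder to R's built-in monthly CO2 series; the choice of data set and of the particular methods is purely illustrative.

  ## Decomposition of the built-in monthly Mauna Loa CO2 series
  ## into trend, seasonal component and stochastic remainder.
  plot(decompose(co2))                   # moving-average based decomposition
  plot(stl(co2, s.window = "periodic"))  # loess-based alternative

  ## Differencing, another standard way to remove a trend
  plot(diff(co2), main = "Differenced CO2 series")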

After clarifying different notions of stationarity, the central object in studying stationary time series will be the autocovariance function. It captures the dependence structure of a stationary sequence and distinguishes it from the simple case of an i.i.d. sequence. We prove a full characterization of autocovariance functions within the class of real-valued functions. Moreover, we investigate the stationarity of two key examples: the AR (autoregressive) and the MA (moving-average) process.
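To see the autocovariance structure in action, the following sketch compares the sample autocorrelation of a simulated AR(1) path, whose dependence decays geometrically, with that of an MA(1) path, whose autocovariance vanishes beyond lag 1; the parameters are chosen for illustration.

  set.seed(1)
  n <- 2000

  ## AR(1): X_t = 0.8 X_{t-1} + Z_t, geometrically decaying dependence
  ar1 <- arima.sim(model = list(ar = 0.8), n = n)

  ## MA(1): X_t = Z_t + 0.8 Z_{t-1}, autocovariance zero beyond lag 1
  ma1 <- arima.sim(model = list(ma = 0.8), n = n)

  par(mfrow = c(1, 2))
  acf(ar1, lag.max = 20, main = "AR(1), phi = 0.8")
  acf(ma1, lag.max = 20, main = "MA(1), theta = 0.8")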

Chapter III: In this chapter, we investigate linear models for stationary sequences. By the so-called Wold decomposition, every stationary sequence is the sum of a linear process and a deterministic, perfectly predictable sequence. A thorough understanding of linear processes is therefore indispensable. After establishing basic properties of linear processes, the class of ARMA (autoregressive moving-average) processes is examined in more detail. As an extension of the AR and MA models, ARMA processes form a rich model class with convenient analytical properties.
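The sketch below simulates a causal ARMA(1,1) process and prints the first coefficients of its MA(infinity) representation, that is, of the linear process it defines; the parameter values are again arbitrary.

  set.seed(2)
  ## Causal ARMA(1,1): X_t - 0.7 X_{t-1} = Z_t + 0.4 Z_{t-1}
  x <- arima.sim(model = list(ar = 0.7, ma = 0.4), n = 500)
  plot.ts(x, main = "ARMA(1,1) sample path")

  ## First coefficients psi_j of the linear process representation
  ## X_t = sum_{j >= 0} psi_j Z_{t-j}
  ARMAtoMA(ar = 0.7, ma = 0.4, lag.max = 10)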

Chapter IV: The goal of this chapter is to develop tools for estimating the mean and the autocovariance function of a stationary sequence. For both, we derive estimators and prove their consistency (that is, a law of large numbers) and asymptotic normality (that is, a central limit theorem). The methods involved go beyond those of standard probability courses since stationary processes are usually not i.i.d. sequences.
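A minimal sketch of these two estimators, applied to a simulated AR(1) path whose theoretical autocovariance gamma(h) = phi^|h| sigma^2 / (1 - phi^2) is known in closed form:

  set.seed(3)
  x <- arima.sim(model = list(ar = 0.5), n = 1000)

  ## Sample mean as an estimator of the process mean (here 0)
  mean(x)

  ## Sample autocovariances up to lag 10 (1/n normalisation)
  gamma.hat <- drop(acf(x, lag.max = 10, type = "covariance", plot = FALSE)$acf)
  round(gamma.hat, 3)

  ## Theoretical values for AR(1) with phi = 0.5 and sigma^2 = 1
  round(0.5^(0:10) / (1 - 0.5^2), 3)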

Chapter V: Next, we proceed to parameter estimation and prediction theory. First, we investigate the Yule-Walker equations for AR processes and prove the asymptotic normality of the corresponding estimators. Second, different prediction methods for stationary sequences are presented. It is known that the conditional expectation is the best predictor in the mean-square sense. However, it is usually difficult to compute, which is why linear predictors are considered instead. These are indeed tractable, and we explicitly calculate the best linear predictor for general stationary sequences. The resulting formulas may nevertheless be numerically inefficient, so it is advisable to compute the best linear predictor recursively. Two such algorithms, the Durbin-Levinson algorithm and the innovations algorithm, are treated and applied to ARMA processes.
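To make this concrete, here is a compact, unoptimised implementation of the Durbin-Levinson recursion together with a Yule-Walker fit via the built-in ar.yw; the function name durbin.levinson and all parameter values are our own illustrative choices.

  ## Durbin-Levinson recursion: given gamma(0), ..., gamma(p) (stored in
  ## the vector gamma with gamma[k + 1] = gamma(k)), compute the
  ## coefficients of the best linear one-step predictor based on p past
  ## observations, and the associated mean squared prediction error.
  durbin.levinson <- function(gamma) {
    p   <- length(gamma) - 1
    v   <- gamma[1]                         # v_0 = gamma(0)
    phi <- gamma[2] / v                     # phi_{1,1}
    v   <- v * (1 - phi^2)
    if (p >= 2) for (n in 2:p) {
      phi.nn <- (gamma[n + 1] - sum(phi * gamma[n:2])) / v
      phi    <- c(phi - phi.nn * rev(phi), phi.nn)
      v      <- v * (1 - phi.nn^2)
    }
    list(phi = phi, v = v)
  }

  ## Comparison with the Yule-Walker estimator on a simulated AR(2) path
  set.seed(4)
  x <- arima.sim(model = list(ar = c(0.5, 0.2)), n = 2000)
  ar.yw(x, aic = FALSE, order.max = 2)$ar   # close to (0.5, 0.2)

  g <- drop(acf(x, lag.max = 2, type = "covariance", plot = FALSE)$acf)
  durbin.levinson(g)$phi                    # essentially the same solution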

Chapter VI: The Wold decomposition theorem ensures that every stationary stochastic process (after removing deterministic components) has a representation as a moving average process with white noise innovations. This noise, however, has a distributional structure that can be highly relevant, in particular for the prediction of financial time series. Stylized facts of financial time series include conditional heteroscedasticity, that is, a conditionally (on the past) time-varying and even stochastic variance. Modelling the white noise by a stochastic volatility model has led to a major breakthrough in the modelling and prediction of financial time series. First examples of such stochastic volatility models include regime-switching models and self-exciting threshold autoregressive models.
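A minimal sketch of a self-exciting threshold AR(1) model, in which the autoregressive coefficient switches depending on the sign of the previous observation; the two regime coefficients 0.3 and 0.8 are arbitrary illustrative values.

  set.seed(5)
  n <- 500
  x <- numeric(n)    # starts at x[1] = 0
  z <- rnorm(n)
  for (t in 2:n) {
    phi  <- if (x[t - 1] > 0) 0.3 else 0.8  # regime chosen by the past value
    x[t] <- phi * x[t - 1] + z[t]
  }
  plot.ts(x, main = "SETAR(1) sample path")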

Chapter VII: In this chapter, we investigate in detail the GARCH (generalized autoregressive conditionally heteroscedastic) processes introduced by Bollerslev, which generalize the ARCH model of the Nobel prize winning economist Robert Engle. In particular, we cover stationarity and ergodicity results, discuss the moment structure and investigate the relationship to ARMA processes.
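The recursion below simulates a GARCH(1,1) path; note the volatility clustering in the series itself and the pronounced autocorrelation of the squared observations. The parameter values are illustrative, chosen with alpha + beta < 1 so that a stationary solution exists.

  ## GARCH(1,1): X_t = sigma_t Z_t,
  ##             sigma_t^2 = omega + alpha X_{t-1}^2 + beta sigma_{t-1}^2
  set.seed(6)
  n <- 1000
  omega <- 0.1; alpha <- 0.15; beta <- 0.8  # alpha + beta < 1
  z    <- rnorm(n)
  x    <- numeric(n)
  sig2 <- numeric(n)
  sig2[1] <- omega / (1 - alpha - beta)     # stationary variance as start value
  x[1]    <- sqrt(sig2[1]) * z[1]
  for (t in 2:n) {
    sig2[t] <- omega + alpha * x[t - 1]^2 + beta * sig2[t - 1]
    x[t]    <- sqrt(sig2[t]) * z[t]
  }
  par(mfrow = c(2, 1))
  plot.ts(x, main = "GARCH(1,1): volatility clustering")
  acf(x^2, main = "ACF of squared observations")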

Chapter VIII: The final chapter investigates prediction and estimation methods for GARCH models. Particular attention is given to maximum likelihood methods.
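As a glimpse of this, the sketch below implements Gaussian quasi-maximum likelihood estimation for the GARCH(1,1) model, reusing the path x simulated in the previous sketch; the initialisation of sigma_1^2 by the sample variance and the box constraints passed to optim are common but ad hoc choices.

  ## Negative Gaussian quasi-log-likelihood of GARCH(1,1), up to constants,
  ## as a function of the parameter vector par = (omega, alpha, beta).
  neg.loglik <- function(par, x) {
    omega <- par[1]; alpha <- par[2]; beta <- par[3]
    n    <- length(x)
    sig2 <- numeric(n)
    sig2[1] <- var(x)                       # ad hoc initialisation
    for (t in 2:n)
      sig2[t] <- omega + alpha * x[t - 1]^2 + beta * sig2[t - 1]
    0.5 * sum(log(sig2) + x^2 / sig2)
  }

  ## Minimise over a box; x is the GARCH(1,1) path from the previous sketch
  fit <- optim(c(0.05, 0.1, 0.5), neg.loglik, x = x,
               method = "L-BFGS-B",
               lower = rep(1e-6, 3), upper = c(1, 1, 1 - 1e-6))
  fit$par    # should be close to the true (0.1, 0.15, 0.8)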


References

  1. P.J. Brockwell and R.A. Davis. Time Series: Theory and Methods. Springer, New York, 2nd edition, 1991.
  2. C. Francq and J.-M. Zakoïan. GARCH Models. Wiley, Chichester, 2010.