About this book

This book is aimed at the reader who wishes to gain a working knowledge of time series and forecasting methods as applied to economics, engineering and the natural and social sciences. It assumes knowledge only of basic calculus, matrix algebra and elementary statistics. This third edition contains detailed instructions for the use of the professional version of the Windows-based computer package ITSM2000, now available as a free download from the Springer Extras website. The logic and tools of time series model-building are developed in detail. Numerous exercises are included and the software can be used to analyze and forecast data sets of the user's own choosing. The book can also be used in conjunction with other time series packages such as those included in R. The programs in ITSM2000, however, are menu-driven and can be used with minimal investment of time in the computational details.

The core of the book covers stationary processes, ARMA and ARIMA processes, multivariate time series and state-space models, with an optional chapter on spectral analysis. Many additional special topics are also covered.

New to this edition:

A chapter devoted to Financial Time Series
Introductions to Brownian motion, Lévy processes and Itô calculus
An expanded section on continuous-time ARMA processes

Table of Contents

1. Introduction

Abstract
In this chapter we introduce some basic ideas of time series analysis and stochastic processes. Of particular importance are the concepts of stationarity and the autocovariance and sample autocovariance functions. Some standard techniques are described for the estimation and removal of trend and seasonality (of known period) from an observed time series. These are illustrated with reference to the data sets in Section 1.1. The calculations in all the examples can be carried out using the time series package ITSM, the professional version of which is available at http://extras.springer.com. The data sets are contained in files with names ending in .TSM. For example, the Australian red wine sales are filed as WINE.TSM. Most of the topics covered in this chapter will be developed more fully in later sections of the book. The reader who is not already familiar with random variables and random vectors should first read Appendix A, where a concise account of the required background is given.
Peter J. Brockwell, Richard A. Davis
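As a concrete illustration of one quantity the abstract highlights, here is a minimal pure-Python sketch of the sample autocovariance function; it uses the standard divisor n (rather than n − h) as in most time series texts, and the data values are made up for the example.

```python
def sample_acvf(x, max_lag):
    """Sample autocovariance gamma_hat(h) for h = 0, ..., max_lag,
    computed with divisor n (not n - h)."""
    n = len(x)
    mean = sum(x) / n
    dev = [v - mean for v in x]
    return [sum(dev[t + h] * dev[t] for t in range(n - h)) / n
            for h in range(max_lag + 1)]

# Example on a short series: gamma_hat(0) is the sample variance.
g = sample_acvf([1.0, 2.0, 3.0, 4.0], max_lag=2)
```

In practice these estimates would be computed (and plotted) by ITSM itself; the sketch only shows the definition in executable form.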

2. Stationary Processes

Abstract
A key role in time series analysis is played by processes whose properties, or some of them, do not vary with time. If we wish to make predictions, then clearly we must assume that something does not vary with time. In extrapolating deterministic functions it is common practice to assume that either the function itself or one of its derivatives is constant. The assumption of a constant first derivative leads to linear extrapolation as a means of prediction. In time series analysis our goal is to predict a series that typically is not deterministic but contains a random component. If this random component is stationary, in the sense of Definition 1.4.2, then we can develop powerful techniques to forecast its future values. These techniques will be developed and discussed in this and subsequent chapters.
Peter J. Brockwell, Richard A. Davis

3. ARMA Models

Abstract
In this chapter we introduce an important parametric family of stationary time series, the autoregressive moving-average, or ARMA, processes. For a large class of autocovariance functions γ(·) it is possible to find an ARMA process {X_t} with ACVF γ_X(·) such that γ(·) is well approximated by γ_X(·).
Peter J. Brockwell, Richard A. Davis
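The simplest member of this family is the MA(1) process X_t = Z_t + θZ_{t−1} with {Z_t} white noise of variance σ², whose ACVF has a well-known closed form; the following is a small sketch of that standard result.

```python
def ma1_acvf(theta, sigma2, h):
    """Theoretical ACVF of the MA(1) process X_t = Z_t + theta * Z_{t-1},
    where {Z_t} is white noise with variance sigma2."""
    h = abs(h)  # the ACVF is an even function of the lag
    if h == 0:
        return sigma2 * (1.0 + theta ** 2)
    if h == 1:
        return sigma2 * theta
    return 0.0  # MA(1) is 1-correlated: gamma(h) = 0 for |h| > 1
```

The abrupt cutoff after lag 1 is what distinguishes a moving-average ACVF from the geometrically decaying ACVF of an autoregression.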

4. Spectral Analysis

Abstract
This chapter can be omitted without any loss of continuity. The reader with no background in Fourier or complex analysis should go straight to Chapter 5. The spectral representation of a stationary time series {X_t} essentially decomposes {X_t} into a sum of sinusoidal components with uncorrelated random coefficients. In conjunction with this decomposition there is a corresponding decomposition into sinusoids of the autocovariance function of {X_t}. The spectral decomposition is thus an analogue for stationary processes of the more familiar Fourier representation of deterministic functions. The analysis of stationary processes by means of their spectral representation is often referred to as the “frequency domain analysis” of time series or “spectral analysis.” It is equivalent to “time domain” analysis based on the autocovariance function, but provides an alternative way of viewing the process, which for some applications may be more illuminating. For example, in the design of a structure subject to a randomly fluctuating load, it is important to be aware of the presence in the loading force of a large sinusoidal component with a particular frequency to ensure that this is not a resonant frequency of the structure. The spectral point of view is also particularly useful in the analysis of multivariate stationary processes and in the analysis of linear filters. In Section 4.1 we introduce the spectral density of a stationary process {X_t}, which specifies the frequency decomposition of the autocovariance function, and the closely related spectral representation (or frequency decomposition) of the process {X_t} itself. Section 4.2 deals with the periodogram, a sample-based function from which we obtain estimators of the spectral density. In Section 4.3 we discuss time-invariant linear filters from a spectral point of view and in Section 4.4 we use the results to derive the spectral density of an arbitrary ARMA process.
Peter J. Brockwell, Richard A. Davis
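The periodogram mentioned in the abstract has a direct executable form; the sketch below evaluates I(ω_j) = (1/n)|Σ_t x_t e^{−itω_j}|² at all n Fourier frequencies ω_j = 2πj/n (the data are illustrative only).

```python
import cmath

def periodogram(x):
    """Periodogram I(omega_j) = (1/n) |sum_t x_t exp(-i t omega_j)|^2
    at the Fourier frequencies omega_j = 2*pi*j/n, j = 0, ..., n-1."""
    n = len(x)
    return [abs(sum(x[t] * cmath.exp(-2j * cmath.pi * j * t / n)
                    for t in range(n))) ** 2 / n
            for j in range(n)]

x = [2.0, 4.0, 6.0, 8.0]
I = periodogram(x)
# By Parseval's identity, the ordinates sum to sum_t x_t**2.
```

A large ordinate I(ω_j) signals a strong sinusoidal component near frequency ω_j, which is exactly the information needed in the resonance example above.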

5. Modeling and Forecasting with ARMA Processes

Abstract
The determination of an appropriate ARMA(p, q) model to represent an observed stationary time series involves a number of interrelated problems. These include the choice of p and q (order selection) and estimation of the mean, the coefficients {ϕ_i, i = 1, …, p}, {θ_i, i = 1, …, q}, and the white noise variance σ². Final selection of the model depends on a variety of goodness-of-fit tests, although it can be systematized to a large degree by use of criteria such as minimization of the AICC statistic as discussed in Section 5.5. (A useful option in the program ITSM is Model>Estimation>Autofit, which automatically minimizes the AICC statistic over all ARMA(p, q) processes with p and q in a specified range.)
Peter J. Brockwell, Richard A. Davis
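The AICC statistic that Autofit minimizes has the closed form −2 ln L + 2(p + q + 1)n/(n − p − q − 2), where L is the Gaussian likelihood of the fitted ARMA(p, q) model and n is the sample size; a minimal sketch of that formula:

```python
def aicc(loglik, n, p, q):
    """AICC statistic for a fitted ARMA(p, q) model:
        -2 ln L + 2 (p + q + 1) n / (n - p - q - 2).
    The penalty term is a small-sample bias correction of AIC's 2(p+q+1)."""
    return -2.0 * loglik + 2.0 * (p + q + 1) * n / (n - p - q - 2)

# Order selection then amounts to computing the maximized log-likelihood
# for each candidate (p, q) pair and keeping the pair with smallest AICC.
```

ITSM's Autofit performs exactly this search over the user-specified range of p and q, so the sketch only makes the criterion itself explicit.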

6. Nonstationary and Seasonal Time Series Models

Abstract
In this chapter we shall examine the problem of finding an appropriate model for a given set of observations {x_1, …, x_n} that are not necessarily generated by a stationary time series.
Peter J. Brockwell, Richard A. Davis

7. Time Series Models for Financial Data

Abstract
In this chapter we discuss some of the time series models which have been found useful in the analysis of financial data. These include both discrete-time and continuous-time models, the latter being used widely, following the celebrated work of Black, Merton and Scholes, for the pricing of stock options. The closing price on trading day t, say P_t, of a particular stock or stock-price index, typically appears to be non-stationary while the log asset price, X_t := log(P_t), has observed sample-paths like those of a random walk with stationary uncorrelated increments, i.e., the differenced log asset price, Z_t := X_t − X_{t−1}, known as the log return (or simply return) for day t, has sample-paths resembling those of white noise.
Peter J. Brockwell, Richard A. Davis
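The transformation from prices to log returns described above is one line of arithmetic; here is a minimal sketch with made-up closing prices.

```python
import math

def log_returns(prices):
    """Log returns Z_t = log(P_t) - log(P_{t-1}) = log(P_t / P_{t-1})
    from a sequence of closing prices."""
    return [math.log(prices[t] / prices[t - 1])
            for t in range(1, len(prices))]

# Hypothetical closing prices for three trading days:
z = log_returns([100.0, 110.0, 104.5])
```

It is this return series {Z_t}, not the price series itself, to which the stationary models of the chapter (such as GARCH-type models in discrete time) are fitted.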

8. Multivariate Time Series

Abstract
Many time series arising in practice are best considered as components of some vector-valued (multivariate) time series {X_t} having not only serial dependence within each component series {X_ti} but also interdependence between the different component series {X_ti} and {X_tj}, i ≠ j. Much of the theory of univariate time series extends in a natural way to the multivariate case; however, new problems arise.
Peter J. Brockwell, Richard A. Davis

9. State-Space Models

Abstract
In recent years state-space representations and the associated Kalman recursions have had a profound impact on time series analysis and many related areas.
Peter J. Brockwell, Richard A. Davis
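For a flavor of the Kalman recursions the abstract refers to, here is a sketch for the simplest scalar state-space model X_{t+1} = aX_t + V_t, Y_t = X_t + W_t, with V_t and W_t white noise of variances q and r; the parameter names and data are illustrative, and the book's general multivariate recursions reduce to these two-line predict/update steps in this case.

```python
def kalman_filter_1d(ys, a, q, r, x0, p0):
    """Scalar Kalman filter for the state-space model
        X_{t+1} = a * X_t + V_t,  V_t ~ WN(0, q)   (state equation)
        Y_t     = X_t + W_t,      W_t ~ WN(0, r)   (observation equation)
    starting from prior mean x0 and prior variance p0.
    Returns the filtered estimates of X_t given Y_1, ..., Y_t."""
    x, p = x0, p0
    filtered = []
    for y in ys:
        # Update step: fold in the new observation with Kalman gain k.
        k = p / (p + r)
        x = x + k * (y - x)
        p = (1.0 - k) * p
        filtered.append(x)
        # Prediction step: propagate the estimate one time step ahead.
        x = a * x
        p = a * a * p + q
    return filtered

# With a = 1 and q = 0 the state is constant, and the filter reduces to
# a running average of the prior mean and the observations.
est = kalman_filter_1d([1.0, 1.2, 0.9], a=1.0, q=0.0, r=1.0, x0=0.0, p0=1.0)
```

The same pair of recursions, written with matrices in place of scalars, delivers prediction, filtering and smoothing for the general state-space models of this chapter.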

10. Forecasting Techniques

Abstract
We have focused until now on the construction of time series models for stationary and nonstationary series and the determination, assuming the appropriateness of these models, of minimum mean squared error predictors. If the observed series had in fact been generated by the fitted model, this procedure would give minimum mean squared error forecasts. In this chapter we discuss three forecasting techniques that have less emphasis on the explicit construction of a model for the data. Each of the three selects, from a limited class of algorithms, the one that is optimal according to specified criteria.
Peter J. Brockwell, Richard A. Davis

11. Further Topics

Abstract
In this chapter we touch on a variety of topics of special interest. In Section 11.1 we consider transfer function models, designed to exploit for predictive purposes the relationship between two time series when one acts as a leading indicator for the other. Section 11.2 deals with intervention analysis, which allows for possible changes in the mechanism generating a time series, causing it to have different properties over different time intervals. In Section 11.3 we introduce the very fast growing area of nonlinear time series analysis, and in Section 11.4 we discuss fractionally integrated ARMA processes, sometimes called “long-memory” processes on account of the slow rate of convergence of their autocorrelation functions to zero as the lag increases. In Section 11.5 we discuss continuous-time ARMA processes which, for continuously evolving processes, play a role analogous to that of ARMA processes in discrete time. Besides being of interest in their own right, they have proved a useful class of models in the representation of financial time series and in the modeling of irregularly spaced data.
Peter J. Brockwell, Richard A. Davis