# Methods of information recovery

## Probability Theory and Statistical Inference - Econometric Theory - An Introduction to Applied Econometrics - Econometric Foundations

Aris Spanos's latest book is conceived as a development of his earlier well-regarded econometrics textbook. That book was distinguished most particularly by its serious treatment of underlying probability theory as a firm basis for an approach that clearly distinguished theoretical and statistical aspects to modelling. The new textbook is "a more ripened elucidation" concentrating in greater depth on the statistical roots of econometrics.

Central to Probability Theory and Statistical Inference's line of argument is the contention that what most distinctly characterises econometrics is the nature of the generation of economic data, in particular its non-experimental aspect, which necessitates careful attention to the stochastic processes behind the explanatory variables. This, of course, is a feature common to many disciplines, including biology, physical geography, meteorology and other social sciences such as sociology and political science. In the latter examples, as in economics, the data is generated by the choices of agents. It could be argued that econometrics is also distinctive in its connection to economic theory as a way of conceptualising these agents' decisions.

A clear methodological viewpoint is advanced here that strongly favours prioritising the statistical over the theoretical perspective on model development. This requires construction of a statistically adequate model devoid of intrinsic economic theoretical content as a basis for testing economic theory. Statistical models underlying regression are developed by conditioning on an appropriate information set, to leave a derived error term corresponding to the deviation from the conditional expectation for the dependent variable. This is in contrast with many econometric practices.
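The derived-error construction described above can be sketched in a few lines (my notation, not Spanos's):

```latex
% Given the joint distribution of (y, X), condition on the information
% set generated by X and define the error as the deviation from the
% conditional expectation:
\[
  u \;\equiv\; y - \mathrm{E}[\, y \mid X \,],
\]
% so that, by construction rather than by assumption,
\[
  \mathrm{E}[\, u \mid X \,] = 0
  \qquad\text{and hence}\qquad
  \mathrm{E}[\, X u \,] = 0 .
\]
```

The error's properties are thus derived from the probabilistic structure of the data, rather than postulated for an autonomous shock appended to a theory-driven equation.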

The focus of this book is not on the details of particular econometric techniques but on the basic concepts of probability theory. The coverage is comprehensive, and the text can serve as a useful and detailed reference on many topics such as precise forms of distributions, varieties of limit theorems and so on. The material is developed not only mathematically but also with attention to the historical background, including a treatment of philosophical issues behind interpretation of statistical statements. Reference to historical controversies is used to illuminate issues, for instance in contrasting the familiar Neyman-Pearson approach to hypothesis testing with that of Fisher, with an endorsement of the advantages of the latter in some contexts, such as misspecification testing. Courses in econometrics do not always go into such depth, but the book could serve as a useful complement to a full introductory course in probabilistic foundations of econometrics, and it contains material that would be useful for students on other courses.

Ron Mittelhammer, George Judge and Douglas Miller have written a textbook, Econometric Foundations, that, though more interested than Spanos's in the specifics of particular econometric procedures, still has a strong methodological flavour. The book begins with a clear statement of the authors' views on the role of econometrics. Modelling is seen as a process of information recovery, an "inverse problem with noise" in which sampled observations are used to recover information on unobservable aspects of an econometric model. The roles of economic theory and of sampling and probability models are more closely interwoven than in Spanos's textbook. The authors are less inclined to specify a regression model by adding an autonomous noise component to a partly theory-driven systematic component for the explained variable and then making the assumptions necessary to equate this with the required conditional expectation.

The book has an innovative approach to integrating electronic resources and software applications into its presentation. Statistical background material of the sort covered in depth by Spanos and usually relegated to the appendices of econometrics textbooks is placed in electronic chapters on an accompanying CD-Rom, which also provides software for implementing procedures discussed in the text. (This CD-Rom was unavailable with the advance uncorrected proofs provided for review, so it is impossible to offer any judgement on the material included.) Furthermore, the authors are not shy of including software code directly in the body of the text where they wish to emphasise the ease of applicability of the procedures discussed.

The main text assumes familiarity with fairly advanced statistical concepts. It freely introduces notions such as complete sufficient statistics, uniformly most powerful tests, moment-generating functions and so on with only a reference to the CD-Rom for explanation. Students not readily conversant with such ideas will have difficulty following the argument, especially if they are not reading the book with access to the CD-Rom or an appropriate text to hand.

The ordering of topics and the manner of discussion is unconventional. All chapters start with a comprehensive tabular presentation outlining the features of the probability model to be assumed throughout the chapter. The book starts with the normal linear model with fixed regressors, estimation being treated by maximum likelihood methods and testing by generalised likelihood ratio methods. It then goes on to consider relaxation of the normality assumption, where least squares is introduced as an estimation method and other test procedures make an appearance. The techniques applied to these models are then shown to be special cases of the general class of extremum estimators, and inference methods and further elaborations of the models are developed. Stochastic regressors are not covered until the tenth chapter.

The book's richest section is an extended unifying discussion of approaches including quasi-maximum likelihood, empirical likelihood and generalised method of moments under a general framework of estimating equations. I found a careful reading of this stimulating, and it also yielded genuinely fresh insights. Econometric Foundations has a bold, novel perspective that is often rewarding. A lot of interesting and sensible things are said, but accessibility to students without a solid grounding in statistical preliminaries may be limited. It is not introductory reading, but it can enhance a more advanced course.
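The unifying idea of the estimating-equations framework can be indicated schematically (an illustration in my own notation, not necessarily the authors' presentation): an estimator solves the sample analogue of a population moment condition,

```latex
% Population condition and its sample analogue:
\[
  \mathrm{E}\,[\, g(z_i, \theta_0) \,] = 0
  \qquad\Longrightarrow\qquad
  \frac{1}{n} \sum_{i=1}^{n} g(z_i, \hat{\theta}) = 0 .
\]
% Maximum likelihood is the special case in which g is the score,
%   g(z, \theta) = \partial \log f(z; \theta) / \partial \theta,
% while generalised method of moments stacks (possibly more) moment
% functions and minimises a quadratic form in their sample means.
```

Seen this way, quasi-maximum likelihood, empirical likelihood and generalised method of moments differ chiefly in their choice of moment functions and weighting, which is the unification the book develops.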

Time-series issues are almost completely eschewed by Mittelhammer, Judge and Miller, with no mention, for example, of cointegration theory. These topics are central to James Davidson's textbook, Econometric Theory. As with Spanos, the preferred methodological viewpoint presents errors as derived rather than autonomous. The concentration is on statistical modelling of conditional expectations, with a focus on sequential conditioning in the time-series setting. Regressors are treated throughout as stochastic. The book has a sensible discussion of model selection issues; much of the discussion centres on linear models, with some discussion of non-linear models in a section on extremum estimators. Maximum-likelihood methods are treated consistently as an example of this general approach to estimation, and test procedures are developed completely for the general case. This is a clear and rewarding approach.
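Davidson's emphasis on sequential conditioning amounts, roughly, to the following decomposition (my sketch, not the book's notation):

```latex
% Condition each observation on its own past, the sigma-field F_{t-1}:
\[
  y_t = \mathrm{E}[\, y_t \mid \mathcal{F}_{t-1} \,] + \varepsilon_t ,
  \qquad
  \varepsilon_t \equiv y_t - \mathrm{E}[\, y_t \mid \mathcal{F}_{t-1} \,],
\]
% so the derived error is a martingale difference by construction:
\[
  \mathrm{E}[\, \varepsilon_t \mid \mathcal{F}_{t-1} \,] = 0 .
\]
```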

The book is strong on linear dynamic modelling of time series and has an excellent coverage of recent developments in econometrics for non-stationary time series. Cointegration theory is given a comprehensive and clear treatment, including an exposition of the underlying probability background - stochastic processes on function spaces, Brownian motion and so on - which I found to enhance understanding considerably. This will be a useful book, particularly to those teaching advanced courses in time-series econometrics. Overall, it is a fine and well-written piece of work.

Kerry Patterson's An Introduction to Applied Econometrics is in the same tradition and acknowledges the influence of the more innovative recent contributors to time-series econometrics. This is a book with a much stronger applied focus and stress on accessibility to those with less interest in more recondite statistical or econometric theory. The author aims to avoid matrix algebra where possible, though this is a forlorn cause when dealing with much of the recent multivariate cointegration theory that motivates the book.

Procedures are explained in accessible terms, and Patterson shows a helpful ability to explain things simply without compromising accuracy, though explaining the complex in simplified terms can make for a lengthy exposition. Proofs of the technical fundamentals tend to be omitted, and, while this can make the text more approachable, it can also make a satisfactory understanding more difficult - for me, reading this book together with Davidson's made it easier to follow some of the more potentially abstruse sections, such as that on endogeneity in cointegrated systems. The text is replete with references to actual data, and a hefty section is given over to detailed exploration of four macroeconomic applications. The laudable aim is to bring the advances of the past 20 years in time-series econometrics to the attention of the prospective applied economist, and it is an aim the book deserves to achieve.

Ian Preston is senior lecturer in economics, University College London.

## Probability Theory and Statistical Inference: Econometric Modeling with Observational Data. First Edition

Author - Aris Spanos
ISBN - 0 521 41354 0 and 42408 9
Publisher - Cambridge University Press
Price - £60.00 and £24.95
Pages - 815
