Good econometric practice

Dynamic Econometrics

May 26, 1995

David Hendry's work provides a lively and valuable challenge to scepticism about the value of econometrics. He is sharply critical of much published empirical work in economics but argues forcefully that the appropriate response to widespread predictive failure and the multiplication of conflicting "corroborated" models should not be to retreat from econometric enquiry but rather to do it better. He and his colleagues have evolved a distinctive approach to the practice of time series econometrics which aims to place results on a "credible evidential basis". This book elaborates and brings together his views on good practice in econometric methodology.

Hendry stresses his commitment to the potentially scientific status of the discipline, grounded in the willingness to confront economic ideas with empirical evidence. The form in which empirical knowledge is accumulated and consolidated is in models, and this book concerns itself in a critical manner with the process of model discovery and evaluation in empirical economics. Although the book has the appearance of an econometrics textbook, and there is much here in the appendices and in the body of the text that could be useful in this regard, the approach is in sharp contrast with the standard approach, which assumes prior knowledge of the form of economic relationships and therefore concentrates on methods for estimation of parameters. Actual economic enquiry typically operates at a level where our ignorance extends beyond this to the appropriate specification of the relationships we seek to investigate. The purpose of econometric enquiry is to discover the shape of the appropriate model rather than just to fill in the numerical gaps in one already known. The difficulty of this is not disguised: there exist no realistic sufficient conditions for the discovery of good models beyond perhaps "clever ideas, creative insights, and good luck." However, econometric tools can identify poor models, and that at least permits the formulation of procedures for eliminating those that are not convincing contenders.

Two salient ideas constitute the most distinctive organising framework of the book: the theory of reduction and the theory of encompassing. The theory of reduction deals with the modelling process, characterised as a journey from the most general possible formulation of the mechanism underlying the generation of the econometrician's data to a simplified parsimonious representation. It seeks to enumerate all possible steps that can be taken along the way and to provide criteria for their validity. Different routes will lead to different (sometimes seemingly incommensurable) models, and the theory of encompassing deals with criteria for comparison. The book succeeds in bringing out the intimacy of the links between the two theories: if all reductions are valid, in the sense of involving no loss of information on relationships of interest, then the resulting model should encompass all rivals.

The economy is an enormous, multi-dimensional and interrelated system (the book's analogy is with a waterbed, all parts of which move when any part is disturbed), and economic data are generated by the imposition onto that economy of a measurement system. In the extremely general exposition favoured by Hendry, reduction begins from a postulated representation of the joint density of the complete set of measured economic variables relevant to the economy under investigation over the chosen time span (what he calls the "Haavelmo distribution"). In practice this is, of course, unmanageably large, and attention is in any case focussed on some issue of interest, typically defined by prior theoretical concern. The first reduction is therefore a restriction on the dimensionality of the mechanism, made on the basis of prior, untested theoretical ideas about the relevance of variables. In the macroeconomic time series context it is at this stage that the analyst marginalises with respect to all microeconomic variation. The potential for information loss here is huge, yet some loss is inevitable.
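In outline, and in notation adapted freely for illustration (the symbols below are not Hendry's exact typography), the sequential factorisation of the Haavelmo distribution and the first marginalising reduction look like this:

```latex
% Sequential factorisation of the joint density of all measured
% variables X^1_T = (x_1, \dots, x_T) -- the "Haavelmo distribution" --
% given initial conditions X_0 and parameters \theta:
\[
  \mathsf{D}_X\!\left(X^1_T \mid X_0, \theta\right)
  = \prod_{t=1}^{T} \mathsf{D}_x\!\left(x_t \mid X_{t-1}, \theta\right).
\]
% Partition each x_t into retained variables w_t and discarded
% variables v_t, and factorise accordingly:
\[
  \mathsf{D}_x\!\left(x_t \mid X_{t-1}, \theta\right)
  = \mathsf{D}_{v \mid w}\!\left(v_t \mid w_t, X_{t-1}, \theta_a\right)\,
    \mathsf{D}_w\!\left(w_t \mid X_{t-1}, \theta_b\right).
\]
% Discarding v_t loses no information on the parameters of interest
% \psi only if \psi is recoverable from \theta_b alone; otherwise
% this first reduction is invalid.
```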

Standard estimators of econometric relationships perform very differently when applied to series in which the effects of the past are permanent and to those in which they are transient. For such "integrated" variables the absence of any genuine relationship is no barrier to high empirical correlation - indeed there are well-attested circumstances in which unrelated variables are more likely to appear strongly associated than not. Application of conventional statistical methods requires the elimination of any such "nonsense" combinations of variables from the analysis, and Hendry lays stress on the advisability of transforming as early as possible to variables which can be shown to be stationary (that is, not integrated). Here the book gives an excellent introduction to recent developments in cointegration techniques designed to identify stationary linear combinations of integrated variables. The necessity of applying such techniques at the system level is one important justification for beginning with the joint density of the variables under consideration in the dynamic context with which the book deals.
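The "nonsense" correlation phenomenon is easy to reproduce by simulation. The sketch below (hypothetical code, not from the book) regresses one pure random walk on another, independent one; the levels regression typically reports a large R-squared despite the complete absence of any genuine relationship, while the same regression on first differences correctly finds almost none.

```python
import numpy as np

# Simulate two independent random walks (integrated, I(1), series).
rng = np.random.default_rng(0)
T = 500
y = np.cumsum(rng.standard_normal(T))  # random walk 1
x = np.cumsum(rng.standard_normal(T))  # random walk 2, independent of y

# OLS of y on a constant and x: the classic "nonsense regression".
X = np.column_stack([np.ones(T), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
r2 = 1 - resid.var() / y.var()
print(f"levels: slope = {beta[1]:.3f}, R^2 = {r2:.3f}")  # R^2 often large

# Differencing restores stationarity; the association disappears.
dy, dx = np.diff(y), np.diff(x)
dX = np.column_stack([np.ones(T - 1), dx])
dbeta, *_ = np.linalg.lstsq(dX, dy, rcond=None)
dresid = dy - dX @ dbeta
dr2 = 1 - dresid.var() / dy.var()
print(f"differences: slope = {dbeta[1]:.3f}, R^2 = {dr2:.3f}")  # R^2 near 0
```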

Several other arguments are presented for joint modelling even when the focus of interest is on determination of a single variable. Concentrating on the conditional distribution of a variable of interest given a set of regressors is convenient and avoids problems associated with misspecification of the marginal processes generating the explanatory variables. One consequence of the non-experimental nature of economic data may however be an inability to ignore these marginal processes without loss of information on the relationship of interest. The situations in which conditioning is justified as a further step in reduction are the subject of the theory of exogeneity to which Hendry has made telling contributions.
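The central factorisation in that theory is easy to state, again in illustrative notation: conditioning on a set of variables z_t is a valid reduction when z_t is weakly exogenous for the parameters of interest, the concept associated with Hendry's work in this area.

```latex
% Factorise the joint density of (y_t, z_t) into a conditional and a
% marginal component:
\[
  \mathsf{D}\!\left(y_t, z_t \mid X_{t-1}, \theta\right)
  = \mathsf{D}\!\left(y_t \mid z_t, X_{t-1}, \phi_1\right)\,
    \mathsf{D}\!\left(z_t \mid X_{t-1}, \phi_2\right).
\]
% z_t is weakly exogenous for the parameters of interest \psi when
% \psi is a function of \phi_1 alone and (\phi_1, \phi_2) are
% variation free; only then can the marginal process for z_t be
% ignored without loss of information on \psi.
```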

Further reductions are simplifications through restrictions on lag lengths, functional form and parameter values. Here theory has a role as a "creative stimulus" offering hints towards further simplification subject to testing. "Theory consistency" provides a further check on empirical outcomes, requiring "no evaluation conflict between the model and the theory interpretation."
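A standard illustration of such a theory-guided reparameterisation (in generic notation, not a model taken from the book) is the rewriting of a first-order autoregressive-distributed-lag equation in error-correction form, which separates short-run dynamics from a long-run relation of the kind theory typically suggests:

```latex
% An ADL(1,1) model (intercept suppressed for brevity):
\[
  y_t = \beta_0 x_t + \beta_1 x_{t-1} + \lambda y_{t-1} + \varepsilon_t,
  \qquad |\lambda| < 1,
\]
% can be rewritten, without imposing any restriction, as
\[
  \Delta y_t = \beta_0 \Delta x_t
  - (1-\lambda)\left(y_{t-1} - K x_{t-1}\right) + \varepsilon_t,
  \qquad K = \frac{\beta_0 + \beta_1}{1-\lambda},
\]
% where y = Kx is the implied long-run equilibrium and the term in
% parentheses measures last period's deviation from it.
```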

A sequence of valid reductions will, by definition, produce a model which has lost no information on parameters of interest relative to the initial general formulation. No relevant results of the general model will defy explanation, yet the reduced model will have the advantage of simplicity. In the preferred terminology such a "congruent" model is said to "parsimoniously encompass" the original. The limit to reduction is the most parsimonious congruent model. Such a model can "play the role of" the data generation process and ought therefore to be capable also of explaining the empirical features of rival models arising from invalid reductions. This is the principle of "encompassing" as a model selection criterion. The recommended methodology, properly followed, is presented as an antidote to "data mining", a widely condemned but ill-defined practice given specific interpretation here as the suppression of evidence of non-congruency. Here the ideas begin to assemble into a solid conceptual framework, though one more developed on paper than in practice. Instances of encompassing applications remain uncommon, and even Hendry's impressive illustrative concluding chapter desists for want of formal rival models in the context chosen.
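The formal idea can be stated compactly; the following is a sketch in generic notation rather than the book's own statement:

```latex
% Let rival models M_1 and M_2 have parameters \alpha and \delta, and
% let \delta(\alpha) denote the value of \delta implied for M_2 when
% M_1 is true (the "pseudo-true" value). M_1 encompasses M_2 when
\[
  \delta_0 = \delta(\alpha_0),
\]
% i.e. M_1 correctly predicts what M_2's estimates will be, so M_2
% can reveal nothing that M_1 does not already explain. Parsimonious
% encompassing adds the requirement that M_1 be a reduction (a
% special case) of M_2.
```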

Ian Preston is a lecturer in economics, University College, London, and a research fellow of the Institute for Fiscal Studies.

Dynamic Econometrics

Author - David F. Hendry
ISBN - 0 19 828317 2 and 0 19 828316 4
Publisher - Oxford University Press
Price - £50 and £25
Pages - 869
