European monetary policy

How statistics are used


Building a credible economic policy for the euro area has depended on good statistics, a solid analytical framework and sound judgment. There have been challenges, and more lie ahead.

Central bankers, like all other policymakers, operate in an environment of high uncertainty regarding the functioning of the economy, as well as its prevailing state and future development. In addition, the second half of the 1990s and the first years of this century have been characterised by structural changes, some of them on a global scale, others confined to Europe, all of which have added to the normal sources of uncertainty. (…) Disentangling the shocks that continually hit the euro area economy and assessing their impact on the risks to price stability in real time remains a very demanding task, in spite of the progress made in statistical data compilation, economic theory and econometrics over the past decades.

In such a complex environment, a single model or a limited set of key indicators is not a sufficient guide for monetary policy. Instead, an encompassing and integrated set of data is required. The development of statistics for the euro area and the future priorities for further enhancements reflect this requirement. A rich set of timely statistical data is a necessary, but not sufficient, precondition for sound monetary policymaking. Only if the information is structured and analysed in a consistent way will monetary policymakers be in a position to take the most appropriate decisions to attain their policy goals. (…)

Monetary policy and data

If we lived in the world of macroeconomic textbooks, a few simple models with a limited set of variables would be a sufficient basis for monetary policymaking and the statistical requirements could be kept to a minimum. As we all know, the real world is much more complex and, therefore, information needs are much more elaborate. (…) The economy is never at rest. A multitude of disturbances of a diverse nature affect the economy all the time: financial shocks, demand shocks, supply shocks, etc., and these cannot easily be distinguished in real time, let alone foreseen.

In their attempts to identify disturbances and track how they spread through the economy, central banks are assisted by models. But too often, existing models are not sufficiently sophisticated instruments to identify shocks. For one thing, their focus may be too narrow, in a way that makes them unduly selective. A partial representation of the economic structure can only partially help in monetary policy decision-making, notably in real time. (…)

Another example is provided by the many pitfalls that come with using “synthetic” indicators of inflationary pressure. The most famous of these “synthetic” indicators is the so-called output gap. The output gap can be defined as a measure of the deviation of the aggregate output of an economy from the maximum level that would be consistent with price stability. This maximum level is then the “potential output”. When current output is above potential, the pressure on scarce resources should then translate into an increase in prices, that is, inflation. (…)

Unfortunately, potential output cannot be observed directly. The best we can do is to try to estimate the output gap by using observations of many other correlated variables. We all know how imprecise and model-dependent this estimated measure can be. It would then be dangerous to derive monetary policy decisions from such an indicator. (…)
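The model-dependence of output-gap estimates can be made concrete with a small numerical sketch. This is purely illustrative and not from the speech: the data are synthetic, and both definitions of “potential output” (a linear trend and a moving average) are assumptions chosen for simplicity, not the methods actually used by central banks.

```python
import numpy as np

# Illustrative sketch: two equally plausible definitions of "potential
# output", applied to the same synthetic data, yield different output gaps.
rng = np.random.default_rng(0)
# Hypothetical quarterly log real GDP: 0.5% trend growth per quarter plus noise.
log_gdp = 10.0 + np.cumsum(0.005 + 0.01 * rng.standard_normal(40))
t = np.arange(len(log_gdp))

# Model 1: potential output as a fitted linear trend in log GDP.
slope, intercept = np.polyfit(t, log_gdp, 1)
gap_trend = 100.0 * (log_gdp - (intercept + slope * t))  # gap in per cent

# Model 2: potential output as a centred 8-quarter moving average.
window = 8
ma = np.convolve(log_gdp, np.ones(window) / window, mode="valid")
offset = window // 2
gap_ma = 100.0 * (log_gdp[offset:offset + len(ma)] - ma)

# Compare the two estimates over the quarters where both are defined.
disagreement = np.max(np.abs(gap_trend[offset:offset + len(ma)] - gap_ma))
print(f"largest disagreement between the two gap estimates: {disagreement:.2f} pp")
```

The two “potential output” series disagree by construction, so the same quarter can show a positive gap under one model and a negative one under the other — exactly the imprecision the text warns against.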

A second reason why models – and the synthetic theoretical constructs that accompany them – may often be elusive guides for policy lies in the fact that they are subject to rapid obsolescence, as the structure of the economy is subject to permanent change. We central bankers, like private agents, need to learn constantly about the environment in which we operate. (…)

While this is essential for all central banks, it is particularly true for the ECB, which has been put in charge of the monetary policy of an entirely new economic entity.

The creation of EMU represents a major structural change for the European economy and a great challenge to policymakers. Nobody could tell, at the outset, to what extent the introduction of the single currency would affect the functioning of the single market of goods and services, or the very nature of financial markets or price and wage-setting behaviour across the euro area. (…) Such a situation has called for a cautious interpretation of model results and, even more importantly, for a broad information basis in order to cross-check the interpretation of various pieces of information. This implies a need for detailed and high-quality statistics for the euro area. (…)


Statistics for the euro area

Today, euro area statistics compare favourably with those of other major countries in many respects. Concrete examples are the monthly balance of payments for the euro area and the availability of a timely flash estimate of the euro area Harmonised Index of Consumer Prices (HICP). Moreover, the ECB compiles an elaborate and timely set of monthly statistics on monetary developments in the euro area and interest rates. (…)

First and foremost, the achievements have been possible thanks to the intensive and fruitful co-operation and co-ordination with other statistical agencies. (…) Secondly, I would like to stress the importance of close co-operation with users, which should guarantee that the statistics are "fit for purpose". (…)

A third important success factor in the design of statistical systems is a strong legal basis. (…) Legal instruments are necessary to achieve satisfactory standards and equal treatment across Member States, and to harmonise the methodologies used by the various countries for the collection and production of statistics. This enables us, for example, to compile meaningful area-wide aggregates.

The need for the harmonisation of statistical concepts also applies at an international level. The application of the international standards is, first of all, a way to ensure that the statistics remain independent of the policy users. Secondly, application of the international standards allows for meaningful comparisons and aggregation. Given that cross-country comparability of official statistics is key to their usefulness and credibility, all countries across the globe should want to implement worldwide standards in their statistics. (…)

Finally, perhaps the most important lesson of all, the independence of statistical institutes is key to the quality and integrity of the underlying statistics. Recent incidents involving government finance statistics have demonstrated this very clearly. The compilation and reporting of statistics must not be vulnerable to political and electoral cycles. (…)

Despite the significant achievements and the good quality of euro area statistics in general, further improvements and enhancements are planned. (…) Important items on this agenda are: a full system of euro area quarterly accounts for each institutional sector, more comprehensive statistics for the monitoring and analysis of financial stability, the further development of external statistics, promoting the compilation of Principal European Economic Indicators, including the application of the first-for-Europe principle, and an increasing focus on the various quality dimensions of European statistics. (…)

The analytical framework

Statistics, as well as additional information such as synthetic indicators, model forecasts and anecdotal evidence, provide economic analysts with the raw material to derive a consistent and timely judgment about the true prevailing economic circumstances and the position of the economy in its business cycle. At the same time, just as a good meal requires not only high-quality ingredients but also an excellent cook and recipe, high-quality monetary policy analysis also requires excellent staff and an appropriate analytical framework. The importance of the latter can hardly be overstated. While central bankers around the world are “data fiends” in their heroic attempts to minimise errors of inference and to make robust decisions amidst uncertainty about the true structure of the economy, the availability of a vast wealth of raw data from diverse sources means that policymakers risk being constantly bombarded by conflicting signals.

It is against this background that the ECB decided to adopt an analytical framework within which all possible sources of information – statistical as well as judgemental – can be brought together in a coherent fashion, while at the same time allowing for alternative and diverse models and perspectives of the workings of the economy. In our view, this framework enables a wealth of information to be digested routinely without compromising the ultimate objective: to maintain a clear sense of direction.

In order to give a structure to the diverse sources of information, a structure that is consistent with our view of the monetary policy transmission mechanism, we have organised our analytical framework into two “pillars”. These consist of two complementary perspectives on the determination of price developments. One perspective, which we refer to as the “economic analysis”, is grounded on the belief that price developments over the short to medium term are largely influenced by the interplay of supply and demand in the goods, services and factor markets. (...) The ECB and the national central banks use a variety of models for macroeconomic analysis and forecasting. (…) While the information synthesised from various indicators serves as an important input into the decision-making process, the Governing Council of the ECB does not take decisions only on the basis of projections. (…) In the end, monetary policy requires judgement on the part of the policymaker to assess not only the plausibility of all possible scenarios, but also the nature of the shocks and the best policy reaction in order to deal with this uncertain environment. (...)

In order not to lose sight of the low frequency developments that may influence inflation over longer horizons than are used in the forecasts, the ECB has reinforced its analytical framework with a monetary perspective. This “pillar” looks at price formation from a medium to long-term standpoint and is intended to purge monetary policy of the risk of becoming unduly short-sighted and overreacting to the latest economic news. By constantly reminding the central bank that in the long run, prices and money stock increases are correlated, the monetary pillar is a standing support to the ECB’s commitment to price stability at all horizons that are relevant for economic decisions. (…)
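The long-run link between money growth and prices that the monetary pillar draws on is conventionally summarised by the quantity equation — a textbook identity, not a formula spelled out in the speech:

```latex
% Quantity equation: money stock (M) times velocity (V)
% equals the price level (P) times real output (Y).
M V = P Y
% Taking growth rates (log-differences):
\Delta m + \Delta v = \pi + \Delta y
% With velocity roughly stable over long horizons (\Delta v \approx 0),
% inflation tracks money growth net of real output growth:
\pi \approx \Delta m - \Delta y
```

On this reading, persistent money growth well in excess of real output growth eventually shows up in inflation, which is why monetary developments serve as a cross-check on the shorter-horizon economic analysis.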

The European Central Bank has played a leading role in the general trend towards greater transparency and openness. It has, in fact, set standards of transparency in the practice of monetary policy. The ECB was the first major central bank to display its diagnosis immediately after its decisions and to hold regular press conferences after each of its monetary policy meetings. It is still the only one that does so. (…)

This extract is from Mr Trichet’s speech to the OECD World Forum on Statistics. A full nine-page version can be downloaded from the ECB website under “speeches and interviews”, and will be published shortly in our conference proceedings. For more information on the ECB, visit the ECB website.

©OECD Observer No 246/247, December 2004-January 2005


NOTE: All signed articles in the OECD Observer express the opinions of the authors
and do not necessarily represent the official views of OECD member countries.
