Concluding Remarks at the 2016 ECB Statistics Conference

Speech by Danièle Nouy, Chair of the Supervisory Board of the Single Supervisory Mechanism,
Frankfurt am Main, 6 July 2016

1. Welcome to participants

Ladies and gentlemen,

Let me warmly thank you for your attendance at, and valuable contributions to, the eighth ECB Statistics Conference. Please allow me to make a few concluding remarks before the end of this successful event. It is my pleasure to participate in this conference for the second time as Chair of the Supervisory Board and I greatly appreciate this opportunity to address such a distinguished audience.

2. Central Bank Statistics: moving beyond the aggregates

About a year and a half ago, banking supervision was brought from the national to the European level. The ECB now directly supervises 129 of the largest banks and banking groups in the euro area – or, measured in terms of assets, about 82% of the banking sector.

The establishment of a European Banking Supervisor was the biggest advancement in European financial integration since the introduction of the euro itself. European banking supervision adequately reflects the realities of an integrated banking system and a single currency. In this context, and as we have seen during the conference, we also see a paradigm shift in competent authorities' statistics.

Data at the individual entity level, known as micro data, have gained considerably in importance. The recent financial crisis has shown how important micro data are in understanding complex economic relationships. Many of the new questions raised can only be answered using such data, as only they provide detailed information on distributions and links, thus making it possible to examine how the decisions of individual players affect aggregate variables.

You have had the opportunity to discuss how granular or disaggregated data can increase the flexibility of the data used for aggregate macroeconomic and monetary analysis, and how, with granular data, policy needs can be met with a minimum of delay. Granular data have thus become more relevant in recent years. One example was the comprehensive assessment in the run-up to the SSM, and more specifically the asset quality review (AQR). In what still seems like an impossibly short period of time, banks and national competent authorities had to learn how to supply, receive, process, validate and analyse huge amounts of loan-by-loan data. This highly demanding exercise heralded a new style of supervision of financial institutions, one that is more data-driven than we were used to previously.

You have also witnessed an increased density of data requests and deep-dive analyses compared with previous decades. Competent authorities sought more detailed reporting on capital requirements and liquidity, on resolution topics (e.g. reporting required under the aegis of the BRRD) and on statistical topics (e.g. AnaCredit). The recent introduction of new reporting schemes, such as the minimum requirement for own funds and eligible liabilities (MREL), the leverage ratio, liquidity ratios, money market statistical reporting (MMSR) and G-SIB disclosures, together with the enrichment of the existing reporting on own funds and risk-weighted assets, to mention just a few, has placed the banking system in an unprecedented situation. A new statistical frontier is revealing itself with each passing year: not only statistical information, but also figures concerning the resolution and prudential frameworks, are now requested on a single-deal basis.

You have also addressed the question of whether micro data are a push for transparency. Our focus on data dissemination and transparency has been further heightened: the crisis exposed glaring needs for additional data to better understand the build-up of risks in the financial sector, cross-border financial linkages and the vulnerability of domestic economies to shocks. While there has been a significant expansion in data dissemination, the focus now needs to shift towards making more granular data publicly available, beyond traditional aggregates. We are committed to pushing for more data transparency and very much welcome the IMF's Data Standards Initiatives.

The world is evolving and the banking system is quickly adapting to this evolution. Data and statistics are the basic material for assessing these changes. If statistical practices do not evolve, we will find ourselves unable to properly analyse reality.

Although aggregate data can provide indications of real economic activity or of banks experiencing major funding difficulties, for example, only data at the level of individual credit relationships between borrowers and banks permit a more robust analysis, taking into account detailed information on loan requests (including those completely rejected), the enterprise's economic situation, its creditworthiness and the bank's financial situation. In the current environment, micro data can be used to analyse whether banks' risk appetite is increasing and what conclusions should be drawn from this when setting monetary policy.

However, micro data play a role not only in monetary policy analysis but also in the implementation of monetary policy. Furthermore, micro data play a key role in analysing financial stability. In order to assess whether the failure of individual institutions might threaten the functioning of the entire system, information is required on the scale of financial linkages between institutions within a country or across jurisdictions. This is the only way to trace, simulate and forecast the different transmission channels as well as the mutual strengthening or dampening mechanisms. The impact of the actions of systemically important entities on individual financial institutions, on sectors and on entire countries or currency areas can only be examined using micro data.

Let us not forget that one goal of the micro data initiatives is that data should, where possible, be collected only once and then used to compile a variety of statistics. Although such a paradigm shift entails initial investment costs, both for the reporting entities and for the competent authority processing the data, these are offset by manifold information gains and potential future savings as a result of consolidating, or even replacing, existing reporting requirements for traditional statistics.

In the years to come, we will face the challenge of structuring the upcoming paradigm shift away from mainly providing aggregated data towards providing much more micro data with multidimensional uses in a single process. The aim of collecting data only once and subsequently using them to create a wide variety of statistics requires a fundamental rethink of the existing reporting systems and will take time. Different statistical fields and users of statistics will have to harmonise and standardise their information needs and translate these into new reporting requirements. Efficiency gains can be achieved, however, if new data aggregates are derived from available information rather than requiring special surveys to be conducted or additional data to be collected. In this context, the vision of a European Reporting Framework (ERF) covering all data collections for supervision as well as for monetary policy and macro-prudential purposes is worth pursuing and should be pursued.

In that sense, we also have to strengthen banks' risk data aggregation capabilities and internal risk reporting practices, one of our supervisory priorities. I want to thank Statistics in particular for its contribution to the thematic review of banks' compliance with the Basel Committee on Banking Supervision's principles for effective risk data aggregation and risk reporting, and in general for its important work on data quality.

I have said that we are committed to pushing for more data transparency. At the ECB, we have been publishing supervisory statistics on the balance sheets, profitability and minimum capital requirements of significant institutions. The published information so far includes aggregated data on banks' financial positions, profits and losses, non-performing loans and regulatory capital adequacy. We are currently conducting a broad review and will significantly increase the data we make available to the public, including further breakdowns and more key risk indicators. Our aim is to increase the granularity step by step.

3. Conclusions

I have repeatedly said that I cannot promise that the ECB will once and for all eliminate the risk of another financial crisis. But the ECB is equipped to minimise this risk, and statistics play a crucial role here. It is worth recalling that the inability to correctly measure and analyse the risks associated with banking activity was one of the hurdles to identifying both the causes of, and possible solutions to, the financial crisis.

Developing and communicating accurate and timely statistics is essential to avoiding a repetition of this failure. For that reason, all of us involved in the banking statistical process, as individuals and institutions, reporters, regulators, statisticians and supervisors alike, share a common responsibility towards society. Let us keep working on the construction of a more solid basis for the financial system of the future.

Thank you very much for your interest, attention and attendance.