Remarks of Richard Berner at the Financial Data Summit
Published: March 30, 2016
Remarks of Richard Berner, Director, Office of Financial Research, at the Financial Data Summit: Data Standards Mean Business! hosted by the Data Transparency Coalition March 29, 2016, Washington, D.C.
Good morning. Thank you, Bryce, for that kind introduction. Thanks also to Hudson Hollister and the Data Transparency Coalition for sponsoring this second annual Financial Data Summit and for inviting me here again this year.
In my remarks last year, I focused on data standards as a key to creating appropriate, quality, and accessible data. That story has not changed. Today, I will restate and illuminate the benefits of fit-for-purpose, quality, and accessible data and why data standards are critical to attaining them. And I’ll talk about what we can do together to adopt and implement them.
Common Goals and Progress
We in the policy community and you in industry have common goals. We both need financial data that are fit for purpose, high quality, and accessible. We need them to assess and monitor risks across the financial system. Firms need them to manage risk and to reduce costs.
We have made significant progress:
On data scope – We have been working with industry to begin to fill critical data gaps, like those in bilateral repo, securities lending, and transactions data for reference rates.
On data quality – The development of the global legal entity identifier, or LEI, system is a good case study on the value of shared goals and cooperation. Government regulators and private financial firms both recognized the tremendous benefit that the LEI could bring. Industry provided the technical expertise and officials solved the “collective action” problems that were hurdles to adoption. That model has proved highly successful.
On data access – Regulators engaged in securely sharing critical data, such as those used for stress tests, are working on ways to share aggregated data with firms.
But there is much more to do:
Significant data gaps remain in a variety of areas, including those in asset management activities, insurance, and even in the trading of Treasury securities.
The use of data standards to assure data quality is far from ubiquitous, and some data collections are proceeding without appropriate standards. Swap data reporting is still fraught.
Understandably, especially in an age of cybersecurity threats, both regulators and firms are wary of information sharing. But deciding not to share in the face of threats is simply the wrong approach. Secure sharing is the answer.
None of this is news to this audience. We have common goals, and we agree that we should create and appropriately share the right high-quality data. So why are we still talking about how hard it is to do that?
Obstacles to Success
There are at least three obstacles to success. All are consequential, but none is going to stop us from what we need to do.
First, we are aiming at a moving target. The growing volume of information, the migration of financial activity, financial innovation, and the complexity of that activity all create challenges. These challenges are legitimate. Volume is high and growing rapidly, for example, in corporate debt issuance. In addition, activity is migrating to new areas of the financial system, invalidating static collection procedures. Examples include short-term wholesale funding and securities financing transactions, such as repurchase agreements, or repo — a significant source of short-term funding in the financial system; the electronification of trading activities; and the rapid growth of alternative trading platforms.
The chains of financial activity became more complex before the crisis, obscuring risks. Although many of these chains went dormant in the wake of the crisis, new ones have arisen, for example in the plumbing of the financial system, where central clearing has become prevalent.
These aren’t sufficient reasons to give up. Our capacity to capture and analyze information has more than kept pace with these developments. So, we should have a better and more complete understanding of the financial system.
Second is the need for data security. Here, too, there are legitimate issues. Balance is needed between security and access.
Third is the frequent lack of alignment between the interests or incentives of industry and those of regulators. This obstacle is the most challenging and it takes several forms. Regulators often collect data that do not reflect the true nature of the risks in activity as seen by market participants. Data on private funds are a good example. Market participants use granular data for risk management that don’t reflect the true nature of their risk as seen by regulators. Counterparty data are good examples of that.
Aligning those public and private interests is a first key step to collecting the right data once, adopting standards for our mutual benefit, and balancing security needs with appropriate access to data. Aligning public and private interests will facilitate measuring financial activity in a consistent way across different markets and sources.
There is no technical obstacle to this objective. The only obstacles are organizational and institutional. We just have to do it.
What Does Success Look Like?
In the remainder of my time I’ll briefly discuss how to overcome these obstacles. I’ll discuss best practices for collection, standards, and sharing. These are the three legs of the financial data stool; all are essential and interdependent. For example, the application of standards assures quality and supports the effective sharing of financial data, as they enable precise conversations about the data to be shared.
I’ll close with the need for public and private collaboration and partnership.
Data must be comprehensive for a broad view across the financial system, as well as granular to help us identify tail risks during periods of stress.
Filling data gaps involves three steps:
- Decide on the key questions to answer.
- Identify the data needed to answer them.
- Assess whether existing data match those needs.
Best practices help align regulators’ and private interests. We undertake data collections as necessary, only after consultation with our fellow member agencies of the Financial Stability Oversight Council, or FSOC, to ensure we don’t duplicate efforts and create undue reporting burdens. As regulatory data demands have become increasingly extensive and granular, we are mindful of the burden on industry, and we continually seek new ways of gaining information.
Already, we have discussed with FSOC member agencies ways that standards organizations and regulators can exchange ideas before regulators put forth specifications on data collections. For example, the SEC’s security-based swap reporting provided data specifications and a map to FpML and to FIX.
We are committed to conducting pilot projects whenever possible before undertaking permanent data collections, to refine our methodology and maximize opportunities to align our interests with those of reporting companies.
Scope Examples: Data Collection Pilots
At last year’s summit, I noted our collaboration with the Federal Reserve and Securities and Exchange Commission, or SEC, on a pilot project to fill gaps in data describing repurchase agreements, or repo — in this case, bilateral repo, which accounts for about half of the $3.4 trillion daily U.S. repo market activity.
We have now finished collecting the pilot data from companies on a voluntary basis. In January, we published an OFR brief with aggregated data from the survey to provide greater transparency into the bilateral repo market for participants and policymakers.
We now have a second, related pilot project underway to fill gaps in data related to securities lending, in which securities owners lend stocks or bonds to other parties. We are planning permanent collections based on these pilot projects, in collaboration with FSOC member agencies.
You all know that data must be high quality to inform good policy decision-making and good risk management. As the financial crisis demonstrated, high-quality data are essential for the systemwide analysis needed for effective oversight. Quality makes financial data usable and transparent.
Although data standards do not guarantee quality, they are a key to achieving the uniqueness, accuracy, consistency, and completeness that data quality demands. Comparing, aggregating, and analyzing disparate datasets for financial stability analysis are nearly impossible without standards.
Unlike other industries, the financial services industry has been slow to define and implement consistent data standards and formats. Solving this “collective action problem” to speed the development of data standards is our job.
The LEI initiative did just that, and it offers a roadmap for other initiatives. It involves three steps:
- Engage with industry.
- Publish and get comments on a white paper based on the results of that engagement.
- Publish a policy statement that announces criteria for picking the solution, and use them to select the winner.
This process engages with industry, aligns our interests with yours, and decides on a solution.
Quality Examples: LEI, UPI, UTI, Reference Data
You all know that the LEI is the centerpiece of the OFR’s work on data standards and the foundation for further achievement. As I mentioned, the process used to establish the global LEI system is a model for aligning the incentives of the public and private sectors, and thereby facilitating the collaboration necessary for success. The OFR has led the global, public-private collaboration that got the LEI system up, running, and growing in just a few short years.
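The LEI itself shows how a standard builds quality in at the point of entry: an ISO 17442 LEI is 20 alphanumeric characters whose final two characters are ISO 7064 MOD 97-10 check digits, so any receiving system can catch a mistyped identifier locally, before it pollutes a dataset. A minimal sketch in Python (the sample entity prefix is made up for illustration):

```python
def _to_digits(s: str) -> str:
    """Map letters to numbers (A=10 ... Z=35); decimal digits pass through."""
    return "".join(str(int(c, 36)) for c in s)

def lei_check_digits(base18: str) -> str:
    """Compute the two ISO 7064 MOD 97-10 check digits for an 18-char LEI base."""
    n = int(_to_digits(base18.upper() + "00"))
    return f"{98 - n % 97:02d}"

def is_valid_lei(lei: str) -> bool:
    """A well-formed LEI is 20 alphanumeric chars and satisfies mod-97 == 1."""
    if len(lei) != 20 or not lei.isalnum():
        return False
    return int(_to_digits(lei.upper())) % 97 == 1

# "HYPO00EXAMPLEENT01" is a fabricated 18-character base, not a real LEI.
base = "HYPO00EXAMPLEENT01"
lei = base + lei_check_digits(base)
```

Because the check digits are derived from the body of the identifier, a single transposed or mistyped character fails validation, which is exactly the kind of uniqueness and accuracy guarantee a shared standard can deliver.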
We are also building on the LEI with two of our international partners — the Committee on Payments and Market Infrastructures and the International Organization of Securities Commissions, known collectively as CPMI-IOSCO — through a global working group to create a unique product identifier, or UPI, and a unique transaction identifier, or UTI. We are also collaborating to define common descriptions of critical elements for derivatives data.
The UTI and UPI are important for precisely identifying each derivatives product and transaction reported to derivatives data repositories. They will enable the clear definition and communication of data in standardized formats across jurisdictions, and will help achieve successful global aggregation and sharing of derivatives data.
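To see why shared identifiers matter for aggregation, consider a sketch of what cross-repository aggregation might look like once every report carries a UTI and a UPI. The field names and record shape below are illustrative, not the CPMI-IOSCO specification: the point is that a common transaction identifier lets an aggregator deduplicate the same trade reported to different repositories, and a common product identifier lets it sum exposures by product.

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class SwapReport:
    uti: str          # unique transaction identifier (shared by both counterparties)
    upi: str          # unique product identifier
    lei: str          # LEI of the reporting counterparty
    notional: float   # notional amount of the trade
    repository: str   # trade repository that received this report

def aggregate_by_product(reports):
    """Deduplicate reports by UTI across repositories, then sum notional by UPI."""
    seen = {}
    for r in reports:
        seen.setdefault(r.uti, r)  # keep one report per transaction
    totals = defaultdict(float)
    for r in seen.values():
        totals[r.upi] += r.notional
    return dict(totals)
```

Without the UTI, the same trade reported in two jurisdictions would be double-counted; without the UPI, there would be no consistent product key to aggregate on.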
The OFR’s data quality program includes the development of what we call “reference data solutions.” These solutions will include a project to address our mandate under the Dodd-Frank Act to develop and publish a financial instrument reference database — an authoritative source for precise, common definitions, and for descriptive data, known as metadata.
The third part of our three-pronged approach to improving financial stability data involves exploring ways to make them accessible. Data must be accessible to those who need to look at risk systemwide.
Sharing data appropriately among regulators and industry helps to align interests, reduce the reporting burden, support risk management, and facilitate cross-market analysis. One prominent opportunity for such cross-market analysis would entail expanded collaboration and data sharing among our various regulators and the OFR in cash, options, and futures markets across a variety of asset classes.
During the financial crisis, the inability to access or share certain data prevented market participants and regulators from fully understanding the size and scope of risks throughout the financial system. The need to securely share data in a timely way remains paramount — for example, for monitoring the system during periods of relative calm or for forensic analysis after the breakout of a disruption to the system.
Accessibility Examples: Metadata Catalogs and Best Practices
We believe metadata catalogs are important to facilitate sharing. Metadata catalogs inform parties about the data — what they are and where they are located — without having to first grant access to the actual datasets. We are promoting the use of metadata catalogs by enhancing our own catalog and linking it to other agencies’ catalogs. If other agencies do not have catalogs, we plan to assist in creating and linking them.
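The core idea of a metadata catalog can be sketched very simply: an entry describes a dataset and points to it, and a search runs over those descriptions rather than the data itself. Everything below is illustrative — the entry fields, dataset name, and location string are invented for this sketch, not the OFR's actual catalog schema.

```python
# Hypothetical catalog entries: each describes a dataset without exposing it.
CATALOG = [
    {
        "dataset": "bilateral_repo_pilot",
        "description": "Voluntary pilot collection of bilateral repo trades",
        "steward": "OFR",
        "location": "internal://collections/repo-pilot",  # a pointer, not the data
        "access": "restricted; memorandum of understanding required",
        "fields": ["lei", "collateral_type", "rate", "tenor"],
    },
]

def search_catalog(term, catalog=CATALOG):
    """Return names of datasets whose description or field names mention the
    term. Only metadata is consulted; no access to the underlying data occurs."""
    term = term.lower()
    return [entry["dataset"] for entry in catalog
            if term in entry["description"].lower()
            or any(term in field for field in entry["fields"])]
```

A researcher can thus discover that a relevant dataset exists, who stewards it, and what the access terms are, and only then begin the process of negotiating access, which is the sequencing the catalog is meant to enable.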
In designing our approach to metadata catalogs, we are using established technologies and best practices used successfully in other industries. We also plan to create and promote a set of best practices for data sharing among regulators and between regulators and the public. For example, we are working to standardize memorandums of understanding that set the terms for timely and appropriate access to nonpublic data for the regulatory community.
We have made the financial system substantially more resilient since the crisis, but we still have more work to do. This work requires both engagement and collaboration, and I welcome your commitment and partnership as we move forward. Thank you again for having me here today. I would be happy to respond to questions.