Public Appearances


Remarks by OFR Acting Director James Martin at OFR’s Conference on Climate-related Risks and Financial Stability

Remarks of James Martin, Acting Director, Office of Financial Research, at the OFR's Climate Implications for Financial Stability Conference on September 9, 2022.

Trust and the Global Coffeehouse: Digital Verification and the Legal Entity Identifier in a Modern Financial Market

Remarks by Dino Falaschetti, Director, Office of Financial Research, at the Global Legal Entity Identifier Foundation Forum on "Accelerating into a Digital Future: Simplifying Entity Identification for the Digital Age"

Remarks of Dino Falaschetti at the GovDATAx Summit

Remarks of Dino Falaschetti, Director, Office of Financial Research, at the Data Coalition’s GovDATAx summit, October 30, 2019, Washington, D.C.

Globalization and Financial Stability

Remarks of Richard Berner at the IMF 18th Jacques Polak Annual Research Conference on "The Global Financial Cycle"

Reducing the Regulatory Reporting Burden

Remarks of Richard Berner at the Financial Data Summit hosted by the Data Transparency Coalition

Remarks of Richard Berner at the Power of Transparency Speaker Series hosted by the Atlantic Council and Thomson Reuters

The OFR exercises the power of transparency by improving the quality, scope, and accessibility of financial data.

Remarks of Richard Berner at a conference on Big Data in Finance

The OFR and the Center on Finance, Law, and Policy at the University of Michigan hosted the conference.

The Interdisciplinary Approach to Financial Stability Analysis

Remarks of Richard Berner at the Conference on the New Pedagogy of Financial Regulation at the Columbia Law School

Remarks of Richard Berner at the conference on the Interplay Between Financial Regulations, Resilience, and Growth

The Federal Reserve Bank of Philadelphia, the Wharton Financial Institutions Center, Imperial College Business School, and the Journal of Financial Services Research sponsored the conference.

Remarks of Richard Berner at the conference on New Research and Outlook on Credit Markets

Conference hosted by S&P Global Market Intelligence, New York University Salomon Center and Stern School of Business, and S&P Global Institute.

Is the Current Credit Structure Conducive to Financially Stable Recovery?

Remarks of OFR Director Richard Berner: Is the Current Credit Structure Conducive to Financially Stable Recovery? Delivered at the 25th Annual Hyman P. Minsky Conference on the State of the U.S. and World Economies

Remarks of Richard Berner at the Financial Data Summit

The Data Transparency Coalition hosted the event, "Data Standards Mean Business!"

The Risk Outlook and Financial Stability

Remarks by OFR Director Richard Berner at the 2015 Annual Meeting and Public Policy Forum of the American Academy of Actuaries

Remarks by OFR Director Richard Berner at the conference on Interdisciplinary Approaches to Financial Stability

The OFR and the Center on Finance, Law, and Policy at the University of Michigan sponsored the conference, with support from the Smith Richardson Foundation, the University of Michigan College of Engineering, and its Ross School of Business.

Remarks by OFR Director Richard Berner at the Third Annual Workshop on Financial Interconnectedness

The Bank for International Settlements, the Netherlands Bank, the Deutsche Bundesbank, and the Review of Finance hosted the workshop.

Remarks by OFR Director Richard Berner at the Brookings Institution: Can the Financial Sector Promote Growth and Stability?

Good morning. It’s a pleasure for me to be here. I want to thank Martin Baily and Doug Elliott for organizing this event and for inviting me to join you.

I’m delighted that Martin and Doug persevered after this event was snowed out back in March, because the topic is clearly central to discussions about financial stability. I will start by noting that the views I express here are solely my own and do not reflect those of the Department of the Treasury.

You framed the topic in the question, “Can the financial system promote growth and stability?” My answer is a strong yes — but that answer depends on the resilience and proper governance of the financial system.

As the financial crisis of 2007-2009 clearly demonstrated, when those critical ingredients are lacking, shocks can be highly disruptive to growth and stability. Thanks to strong intervention, the financial system and the economy have recovered, but challenges to build and assure resilience remain.

To this audience, the proposition that growth depends on a strong and stable financial system may now seem obvious. Yet a decade ago, many of you would not have asked whether the financial system could promote growth and stability. The consensus was that monetary policy had achieved price stability — low and stable inflation — and by reducing the inflation risk premium, had unlocked the door to strong, sustainable economic growth. To be sure, nagging questions lingered about financial booms and bond-market conundrums. But 20 years of the “Great Moderation” fed a lack of caution.

In my remarks this morning, I will try to look at these issues in greater detail and answer three related questions:

  • Has the legacy of the crisis held back economic recovery?
  • Are there tradeoffs between stability and growth?
  • What have we done to assure financial stability, what is left to do, and what are the current threats?

The short answers to those questions are yes; yes and no; a lot, a lot, and three that I will highlight. In answering the last question, I will of course describe what we at the Office of Financial Research are doing to promote financial stability.

Regarding the first question, the slow healing of the financial system and of its capacity to provide credit continues to be a headwind to growth.

This is not news. The crisis impaired the availability of credit by forcing financial firms, other businesses, and households to deleverage; that is, to reduce their debt. However, deleveraging also helped advance U.S. financial healing. As evidence, household debt and debt-service ratios have declined to, or below, sustainable levels and some forms of credit are now growing along with the economy.

But healing is incomplete. For example, U.S. mortgage credit availability is still constrained and that’s a factor behind our subpar economic performance. Slow financial healing is a key reason why central banks globally continue to deploy both conventional and unconventional monetary policies, and why real, long-term interest rates are at such low levels.

Are there tradeoffs between stability and growth? In the short run, some of the efforts to increase resilience may require adjustments that could raise intermediation costs and hence be temporary headwinds to growth. But in the spring and summer of 2009, the incentives of the stress tests and future TARP repayment requirements enabled banks to raise capital on a massive scale. That stabilized the system and enabled banks to resume lending to support the recovery.

That there may be some short-run trade-offs should hardly be surprising. Analogously, between 1979 and 1982, achieving price stability incurred significant short-run economic costs. But the long-run gains were considerable, and I think the long-run gains in this case will also be significant.

Yet concerns are arising that regulation may perversely be contributing to more permanent adjustments that could impair market functioning. For example, several developments since the financial crisis have altered trading liquidity in securities markets and the ways investors redeem holdings to get cash. Some of these developments are by design: Regulations imposing tighter restrictions on bank leverage have increased the cost of securities financing activities — even against low-risk securities, such as Treasuries — and have reduced incentives to maintain these activities and the portfolios behind them. Bank trading books require more capital, and the Volcker rule requires banks and their affiliates to refrain from proprietary trading.

However, regulations are far from the only factors at play. Some causes are cyclical, such as changes in the supply of — and demand for — collateral, and changes in risk preferences. Other causes appear to be structural, such as changes in the investor base, in securities markets, and in the development of new financial products. In addition, changes in market structure, such as the spread of high-frequency trading and algorithmic trading to fixed-income markets, may be at work in the sharp movements in prices we observe from time to time.

Some traditional indicators, such as bid-ask spreads, suggest that reasonable market liquidity persists. But others do raise concerns. Since the crisis, market liquidity has become more fragmented in a few markets, such as those for sovereign bonds in emerging markets and U.S. corporate bonds. Signs of bifurcation or fragmentation include the concentration of dealer inventories in high-quality liquid assets, declines in trading turnover relative to market size, declines in the size of average trades, and increased settlement failures.

Perhaps more importantly, liquidity appears to have become increasingly brittle, even in the world’s largest bond markets. Although liquidity in these markets looks adequate during normal conditions, it seems to disappear abruptly during episodes of market stress, contributing to disorderly price changes. In some markets, these episodes are occurring with greater frequency. Examples include the mid-2013 sell-off in U.S. fixed-income markets, the October 2014 dislocation in U.S. Treasuries and futures markets, and the sharp moves in euro-area government bonds in early May of this year and in the past few days. None of these episodes disrupted U.S. financial stability, nor do we yet sufficiently understand their causes. But together they highlight a potential weakness in markets that could amplify the impact of financial shocks.

Research to identify causes is a cottage industry. For example, in a recent OFR working paper, our researchers explored patterns that connect daily liquidity conditions across a broad range of financial markets to financial conditions in the aggregate, both in normal times and under stress. This basic research represents a potential framework for monitoring market liquidity, to extract signals to warn of impending disruptions.

What about long-term trade-offs between stability and growth? I think the benefits of stability for growth will far outweigh any costs. In fact, financial stability is essential for sustainable growth. To quote former Fed Chairman Bernanke, “Even in (or perhaps, especially in) stable and prosperous times, monetary policymakers and financial regulators should regard safeguarding financial stability to be of equal importance as — indeed, a necessary prerequisite for — maintaining macroeconomic stability.”

Let me now move to my final questions and my final answers.

The questions were, “What have we done to assure financial stability, what is left to do, and what are current threats to it?” And my answers were, “a lot, a lot, and three that I will highlight.” To elaborate on those answers, I want to provide some context.

Out of the crisis came a widespread appreciation for a different approach to policymaking. Financial stability is now a widespread policy objective. Policy analysis is focused on assessing threats to financial stability, and policymakers are creating more tools to combat those threats — developing what we call the macroprudential toolkit. Macroprudential is a fancy word that means we now look across the entire financial system, not just at individual institutions or markets, to assess and mitigate threats to financial stability.

The crisis also exposed the need to improve the quality and scope of financial data to monitor activity across the financial system. Before the crisis, the data available to measure financial activity and exposures were too aggregated, too limited in scope, too out of date, or otherwise incomplete. No wonder regulators and policymakers poorly understood the extent of leverage, liquidity, and maturity transformation, the growth of nonbank activity, and exposures. The data failed to show them.

As you know, the Dodd-Frank Act established the Financial Stability Oversight Council (FSOC or Council) to develop and implement the toolkit, and created the Office of Financial Research to fill the gaps in data and analysis. The Council is charged with assessing and monitoring threats to financial stability, developing remedies for those threats, and restoring market discipline by eliminating too big to fail. We at the OFR have a mission to help promote financial stability by collecting and improving the quality of financial data and developing tools to evaluate risks to the financial system. Simply put, our work supports economic growth by helping to strengthen the financial infrastructure that growth requires.

Our new macroprudential toolkit needs to assess the fundamental sources of vulnerability, to be more forward-looking, and to test the resilience of the financial system to a wide range of events and incentives. Assuring financial stability is not about predicting, much less preventing, the next financial crisis. Rather, the toolkit must be aimed at improving financial system resilience to withstand the next crisis and assure system functionality under stress.

In the past five years, federal financial regulators have taken important steps to make the financial system more resilient. Since the crisis, officials have conceived and put in place new capital requirements for banks. Bank regulators also agreed on key components of liquidity regulation and minimum requirements for firms’ holdings of liquid assets. In addition, two tools have dramatically changed the approach to increasing resilience. The first is stress testing, which helps calibrate resilience and thus capital requirements. The second is a new regime to resolve large, complex, and troubled financial institutions in an orderly way.

These are consequential achievements that have made the banking system stronger. But vulnerabilities are still present outside the banking perimeter and across the financial system. We need tools to address them, and to develop the tools, we need to analyze and measure the vulnerabilities. That’s especially important as financial activity migrates to more opaque and potentially less resilient parts of the financial system.

Here too, there is progress. Work is ongoing to assess risks in aspects of so-called shadow banking and to develop tools to limit them. For example, there is agreement that minimum floors on haircuts can strengthen secured, short-term wholesale funding markets. New regulations are also in place to strengthen derivatives markets and make them more transparent. Because these initiatives must cut across the financial system, close collaboration among U.S. financial regulators is critical for their success. The Council and the OFR can each play important roles in such collaboration.
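The minimum haircut floors mentioned above work through simple arithmetic: the haircut caps how much can be borrowed against a given piece of collateral, and a floor keeps that cap from eroding in exuberant markets. A minimal sketch, with illustrative numbers rather than any actual regulatory values:

```python
def max_loan(collateral_value: float, haircut: float) -> float:
    """Maximum secured loan against collateral after applying a haircut."""
    return collateral_value * (1.0 - haircut)

def effective_haircut(quoted_haircut: float, floor: float) -> float:
    """Apply a regulatory minimum: the binding haircut never falls below the floor."""
    return max(quoted_haircut, floor)

# In a boom, a lender might quote a 1% haircut; a 6% floor limits the
# leverage a borrower can build against $100 of collateral.
h = effective_haircut(0.01, 0.06)
print(max_loan(100.0, h))  # 94.0
```

The floor binds only when quoted haircuts fall below it, so it constrains leverage in booms without affecting already-conservative lending.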

In the past five years, we have improved our understanding of how the financial system functions, and our ability to measure financial activity and spot vulnerabilities. But we need to do more to understand how the financial system fails to function under stress, to spot vulnerabilities in the shadows, and to gather and standardize the data needed for our critical analysis and policymakers’ responses to identified threats. We know that financial innovation and the migration of financial activity create a moving target, so fully eliminating gaps in data and analysis will always elude us. But we will continue to fill the most important ones.

At the OFR, we are looking across the financial system to fill gaps in financial data and analysis. I’ll give you a few examples of our work.

First are the data initiatives, which distinguish us from other macroprudential authorities:

  • We are filling gaps in bilateral repo data in collaboration with the Federal Reserve. This project promises to improve our measurement and understanding of a key short-term funding market. A repurchase agreement, or repo, is essentially a collateralized loan — when one party sells a security to another party with an agreement to repurchase it later at an agreed price. Of the $3.8 trillion in funding the U.S. repo market provides daily, about half is in bilateral transactions, but data on such repos are scant. Because the repo market remains vulnerable to runs and asset fire sales, obtaining more information about these transactions will fill an important data gap.

  • The OFR is helping the Commodity Futures Trading Commission and other regulators improve data quality in registered swap data repositories. These repositories are designed to be high-quality, low-cost collection points for data that are critical to understand exposures and connections across the financial system. We and the CFTC are jointly working to enhance the quality, types, and formats of data collected. This work is inherently global, so we are each collaborating on it with our counterparts at the Bank of England and the European Central Bank.

  • The OFR is improving the quality of financial data by developing and promoting the use of data standards. We have led the initiative among governments and private industry worldwide to establish a global Legal Entity Identifier or LEI — a data standard that is like a bar code for precisely and uniquely identifying parties to financial transactions. If the LEI system had been in place in 2008, the industry, regulators, and policymakers would have been better able to trace the exposures and connections of Lehman Brothers and others across the financial system. The LEI initiative has become fully operational in just a few years. But ubiquity is needed to realize its full benefits, so I have called for mandating its use for regulatory reporting.

In our Research and Analysis Center, we are developing new tools to assess and monitor vulnerabilities. For example:

  • Our Financial Stability Monitor helps us assess risks in five functional areas — macroeconomic, market, credit, funding and liquidity, and contagion — instead of in institutions or markets. By so doing, the monitor helps us look across the financial system and spot threats wherever they arise.

  • We are developing tools to assess risks in each of these five categories. For example, we are using agent-based models to assess contagion risks in financial networks. These models have been used to study the spread of epidemics and ways to mitigate them. Likewise, they hold great promise for understanding the dynamics of fire sales, the spillovers from the default of a major counterparty in a central clearing counterparty, and other complex chains of events.

  • We supplement our financial stability analysis at the OFR with market intelligence. In February, we launched a Financial Markets Monitor that summarizes major developments and emerging trends in global capital markets. By making it public, we aim to increase transparency, to enhance the availability of financial information, and to facilitate timely reactions by the private sector to emerging risks and thereby to defuse them.
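The network view of contagion behind the agent-based models described above can be illustrated with a deliberately simplified default cascade: each bank holds a capital buffer against its interbank exposures, a default imposes losses on the defaulter's creditors, and those losses can trigger further defaults. The bank names, exposures, and capital levels below are invented for illustration, and the zero-recovery assumption is a worst case, not a feature of any OFR model:

```python
def default_cascade(capital, exposures, initial_defaults):
    """Propagate defaults through an interbank network.

    capital:   dict bank -> equity buffer
    exposures: dict (creditor, debtor) -> amount lent
    A defaulted debtor imposes a full loss on each creditor's exposure
    (a worst-case, zero-recovery assumption).
    """
    capital = dict(capital)          # don't mutate the caller's data
    defaulted = set(initial_defaults)
    frontier = set(initial_defaults)
    while frontier:
        next_frontier = set()
        for (creditor, debtor), amount in exposures.items():
            if debtor in frontier and creditor not in defaulted:
                capital[creditor] -= amount
                if capital[creditor] < 0:
                    next_frontier.add(creditor)
        defaulted |= next_frontier
        frontier = next_frontier
    return defaulted

capital = {"A": 10, "B": 4, "C": 8}
exposures = {("B", "A"): 6, ("C", "B"): 5}  # B lent 6 to A; C lent 5 to B
print(sorted(default_cascade(capital, exposures, {"A"})))  # ['A', 'B']
```

Here A's failure wipes out B's thin buffer, but C's larger buffer absorbs the second-round loss and the cascade stops — the kind of amplification-versus-absorption question such models are built to explore.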

Before I close this morning, I want to elaborate on the response to the final hard question that I posed at the beginning of my remarks: What are the current threats to financial stability? In our 2014 Annual Report, published in December, we said the U.S. financial system has continued to recover and strengthen. Compared with the period just before the financial crisis, threats to financial stability are moderate. But we noted that the relatively benign backdrop is no cause for complacency because several financial stability risks increased during the previous year.

Six months later, that’s still true. In my remarks today, I have touched on two of those risks. The first is vulnerabilities associated with market liquidity and the second is the migration of financial activities toward opaque and less-resilient corners of the financial system. A third major risk stems from excessive risk-taking in some markets during the extended period of low interest rates and low volatility.

Someone asked me recently what risks keep me up at night. I worry most about the risks I understand the least. Where are our blind spots? Has the continuous evolution and innovation in the system caused a build-up of risks in a part of the system where we are not looking? The unknown risks are what keep me up at night.

To sum up, the legacy of the financial crisis lingers and we are just emerging from its effects on economic performance. Growth and financial stability can not only coexist; financial stability is a predicate for sustainable prosperity. We have done a lot to build a more resilient financial system, but we have much more to do. Assuring financial stability will always be an ongoing challenge.

Thank you for your attention. I will be happy to take your questions.

Remarks by OFR Director Richard Berner at the SIFMA OPS 2015 Operations Conference & Exhibition

Good morning, it’s a pleasure to be here. I want to thank SIFMA for sponsoring this conference and for inviting me.

I also want to express my special thanks for SIFMA’s strong support for the legal entity identifier, or LEI, initiative and for continuing support for future initiatives to promote high-quality financial data through the use of data standards.

Developing and promoting standards for financial data are central to the mission of the Office of Financial Research. I believe that our engagement with you in industry to develop and promote the global LEI system is a model for cooperation between government and industry on a worldwide scale — a model that I hope will serve as a blueprint for progress in the future.

This morning I will focus on the OFR’s data standards agenda — past, present, and future. Relatedly, I will discuss how we at the OFR are meeting the critical needs to share data and to fill data gaps. I will conclude by outlining some current threats to financial stability and explaining how our work illuminates their assessment and monitoring.

As you know, the OFR was created by the Dodd-Frank Act in 2010 to help promote financial stability by delivering high-quality financial data and analysis for the benefit of the Financial Stability Oversight Council — FSOC — and the public.

Financial stability monitoring, analysis, and research certainly are not new. But the financial crisis that began in 2007 changed the conversation. The crisis exposed critical gaps in our analysis and understanding of the financial system, in the data and metrics used to measure and monitor financial activities, and in the policy tools available to mitigate potential threats to financial stability. These gaps — in analysis, data, and policy tools — contributed to the crisis and hampered efforts to contain it.

These three gaps are interconnected, like links in a chain. Weakness in any of the three links could impair our overall ability to spot and address weaknesses in the financial system. We need good analysis to make good policy. And we need good data — solid, reliable, granular, timely, and comprehensive data — to conduct good analysis and monitoring. In other words, good data are the foundation for success in our work and for effective risk management in financial companies.

It may seem obvious to all of you in this room, but I cannot overemphasize the importance of quality in making financial data usable and transparent. When Lehman Brothers failed six years ago, its counterparties could not assess their total exposures to Lehman. Financial regulators were also in the dark because there were no industry-wide standards for identifying and linking financial data representing entities or instruments.

Fortunately, we have made progress in improving both the scope and the quality of financial data. However, gaps persist and it’s our job to fill them.

The global LEI is the cornerstone for financial data standards. As you know, the LEI is like a bar code for precisely identifying parties to financial transactions.
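Part of what makes the LEI machine-checkable, like a bar code, is its structure: under ISO 17442 it is a 20-character alphanumeric code whose final two characters are check digits computed with the ISO/IEC 7064 MOD 97-10 scheme, the same scheme IBANs use. A sketch of that validation, assuming only the published structure (the sample code below is fabricated for illustration, not a registered LEI):

```python
def lei_to_number(lei: str) -> int:
    """Map letters A-Z to 10-35, keep digits, and read the result as one integer."""
    return int("".join(str(int(c, 36)) for c in lei))

def lei_check_digits(base18: str) -> str:
    """Compute the two check digits for an 18-character LEI prefix."""
    return f"{98 - lei_to_number(base18 + '00') % 97:02d}"

def lei_is_valid(lei: str) -> bool:
    """A well-formed LEI is 20 alphanumeric characters whose MOD 97-10 value is 1."""
    return len(lei) == 20 and lei.isalnum() and lei_to_number(lei) % 97 == 1

base = "ABCDEF0123456789XY"          # hypothetical 18-character prefix
lei = base + lei_check_digits(base)
print(lei_is_valid(lei))  # True
```

Because any single-character change shifts the number's remainder mod 97, a transcription error is caught before a bad identifier propagates through reporting systems.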

The LEI has gone from conception to nearly full-fledged operational system in just a few years. Currently, the OFR’s Chief Counsel serves as chairman of the Regulatory Oversight Committee, which oversees the LEI system.

Had the LEI system been in place in 2008, the industry, regulators, and policymakers would have been better able to trace Lehman’s exposures and connections across the financial system. The LEI system also generates efficiencies for financial companies in internal reporting and in collecting, cleaning, and aggregating data. In addition, I expect it will ease companies’ regulatory reporting burdens by reducing — and eventually eliminating — overlap and duplication.

Within SIFMA and across the financial services industry, support for the LEI has been strong. In fact, SIFMA and other major trade groups have called for government regulators to mandate its use — a rare example of industry asking for more regulation.

The global LEI system is up, running, and growing. Like any network, the LEI system has benefits that will grow as the system grows.

As many of you know, the OFR is working to accelerate adoption by calling for regulators to require broader use of the LEI in regulatory reporting. We have also called for broad adoption of standards for instruments and products as they become available. And I repeat those calls today.

Regulators have begun to respond. The Securities and Exchange Commission required the use of the LEI in rules announced a couple of months ago for reporting data related to securities-based swaps. More recently, the Federal Reserve Board announced a proposal to require banking organizations to include their existing LEIs on certain regulatory reporting forms.

At the OFR, we recognize that the LEI system is only the start. There is much more to be done. On that foundation, we are helping to build other standards — that’s where we need your continuing help and support.

The LEI gives us insight into “Who is who?” among legal entities. A second standard can help us identify “Who owns whom?” — a standard for hierarchies, or corporate structures, for easily identifying firms’ subsidiaries. This standard promises to help us make continued progress in tracing the interconnections in the financial system.

We are also working on a spectrum of other identifiers to help us answer the question, “Who owns what?”

Over the past year, we have begun to develop plans for a reference database for financial instruments, as we are required to do by law. Rather than start from scratch, we want to leverage work already done by others. For example, private firms, nonprofits, and academics offer products for instrument identification and analytics. We don’t want to compete with them; we want to include them in the design so that their systems can talk to each other. The reference database would thus connect these components together.

As we further develop our plans for this reference database, we expect to consult with interested parties and invite comments on how best to take advantage of existing work, while providing a coherent, systemwide reference. We hope this initiative will result in new opportunities for collaboration, research, and innovation across the financial system.

One example of the systemwide approach to data quality improvement is in derivatives markets. Financial reform sought to improve transparency in derivatives markets by requiring that data related to transactions in swaps be reported to swap data repositories. Swap data are critical to understand exposures and connections across the financial system, and the repositories are designed to be high-quality, low-cost data collection points.

We at the OFR and our colleagues at the Commodity Futures Trading Commission — CFTC — both want to promote the use of data standards in swap data reporting to assure data quality and utility. A year ago, we began a joint project to enhance the quality, types, and formats of data collected from registered swap data repositories. Together, we are aggressively moving forward to address key data quality issues and inconsistencies in how data are reported across repositories. We are also collaborating on developing uniform global unique transaction identifiers and unique product identifiers.

OFR collaboration on data standards includes not only U.S. regulators but also regulators overseas. For example, the OFR joined with the Bank of England and the European Central Bank in mid-January to convene a forum entitled, “Setting Global Standards for Granular Data,” the first of two workshops. Discussions in this workshop built on our work with the CFTC and on work by the Committee on Payments and Market Infrastructures, the International Organization of Securities Commissions, and the Financial Stability Board to identify core standards needed to share and use data on over-the-counter swaps on a global basis.

Improving the ways we securely share sensitive data, both among authorities within the same jurisdiction and across borders, is critical to improving the quality of financial data, reducing or eliminating duplication, and carrying out our work. Data sharing is essential because none of us — no single regulator, company, or industry segment — possesses or has access to all of the data needed to paint a complete picture of threats to financial stability. The financial system is complex and ever-changing, so even if we put all of our data together in one place, significant gaps would remain and new ones would emerge.

A perfect understanding of how to fill those gaps will always elude us. But by working together, we can fill many of them. Our goals of sharing and standardizing financial data across the globe certainly face hurdles, including legal barriers, data security concerns, and confidentiality restrictions. These challenges are legitimate and potentially daunting. But we do not want to risk the potential consequences if future financial shocks were to trigger another crisis simply because we lacked the benefit of high-quality data to illuminate financial system vulnerabilities and possible ways to mitigate them.

With collaboration, we can make critical information available to decision makers, while finding ways to secure the information, protect confidentiality, and honor legal requirements.

So, how do we approach data security? We know that the critical need to maintain the security of confidential data is an obstacle to data sharing. Bad outcomes can result if highly sensitive data fall into the wrong hands. Government organizations that collect and maintain the data have well-established and time-tested security measures in place. Taking the data outside the sphere of these protective measures could potentially introduce grave risks.

The remedy for this obstacle is that data sharing must occur with controls and safeguards every bit as rigorous as those the sources of the data have employed and refined over time.

To share data, regulators must work out legal agreements and determine the technology and standards to use for exchanging the data. At the OFR, we are committed to protocols and procedures for collecting, storing, and appropriately sharing data that meet or exceed the strict standards of our data-sharing partners. In fact, the OFR has developed world-class data security procedures. And we have defined and adopted a data security classification scheme and matching controls to assure secure data handling and sharing.

Through bilateral data-sharing agreements among FSOC member agencies, all participants can be assured that shared data will be protected, secured, and treated consistently.

We are already sharing data under such agreements. Examples include our access to the Securities and Exchange Commission’s detailed data about hedge funds and other private funds in Form PF, and their detailed money market fund data in Form N-MFP. Using the Form PF data, the OFR is evaluating leverage across different hedge fund strategies, with a particular interest in sources of leverage.

To form a complete picture of the financial system, we must also fill gaps by collecting new data from firms and markets. Our partnership with the Federal Reserve to fill gaps in data describing repurchase agreements, or repo, is a good example of such initiatives. In October, we announced a pilot project to understand how to fill these gaps, and today, we are well underway.

As you know, a repo is essentially a collateralized loan: one party sells a security to another party with an agreement to repurchase it later at an agreed price. Repos are an important source of short-term funding for the financial system. The U.S. repo market provides an estimated $3.8 trillion in funding daily. However, the repo market can also contribute to risks to financial stability.
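The arithmetic behind that description can be sketched in a few lines. This is purely illustrative: the function name and dollar figures below are invented for the example, and it assumes the common money-market actual/360 day-count convention rather than anything specific to the pilot.

```python
# Hypothetical sketch of repo pricing: a party sells a security at one
# price and agrees to repurchase it later at a higher price; the price
# difference, annualized, is the implied interest rate on the loan.

def implied_repo_rate(sale_price: float, repurchase_price: float,
                      term_days: int, day_count: int = 360) -> float:
    """Annualized rate implied by the two legs of a repo trade,
    using an actual/360 money-market day-count convention."""
    return (repurchase_price - sale_price) / sale_price * day_count / term_days

# Example: an overnight repo, selling at $10,000,000 and repurchasing
# the next day at $10,000,500 (illustrative numbers).
rate = implied_repo_rate(10_000_000, 10_000_500, term_days=1)
print(f"{rate:.4%}")  # prints 1.8000%
```

The same formula runs in reverse when a dealer quotes a rate and a term: the repurchase price is simply the sale price grown at that rate over the term.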

The repo market consists of two parts: the triparty repo market, in which transactions are centrally settled by two large clearing banks, and the bilateral market, where repo transactions are cleared and settled privately between two firms. (The General Collateral Financing Repo [GCF Repo®] Service, in which the Fixed Income Clearing Corporation acts as a central counterparty, also settles on the triparty platform.)

Information and data on the triparty market are published regularly, but information about bilateral repos is scant.

The project is designed to fill the gaps in bilateral repo data, and it marks the first time the OFR is going directly to industry to collect financial market information. Participation in the pilot project is voluntary, and participating companies have provided input on what data should be gathered and what templates should be used for data collection. This pilot is intended to inform future collection efforts, which we hope to initiate quickly. Aggregated data from the pilot will be published to provide greater transparency into the bilateral repo market for participants and policymakers.

This repo pilot project is one example of how the OFR looks across the financial system to fill gaps in analysis and financial data. We have also begun a related initiative — a first cousin to the repo project — to fill gaps in securities lending data. Working with the Fed and the SEC, we are reaching out to lenders and borrowers to understand where the gaps are. And we are developing templates for a pilot project similar to the repo pilot, to collect data on loans, terms, and collateral uses.

Let me conclude by discussing our assessment of current threats to financial stability. Compared with the period just before the financial crisis, I think such threats are moderate. But vigilance remains critical because we see some risks to financial stability increasing, such as excessive risk-taking in some markets during the extended period of low interest rates and low volatility, vulnerabilities associated with declining market liquidity in a few markets, and the migration of financial activities toward opaque and less resilient corners of the financial system.

Let’s step back for a moment. I started my remarks this morning by recalling how the financial crisis exposed deficiencies in data, in our understanding of the functioning of the financial system, and in tools to promote financial stability. Thanks to strong intervention, the financial system and the economy have both recovered, but there is more work to do.

To this audience, the point may seem simplistic but is still worth noting: Economic growth depends on a strong and stable financial system. At the OFR, our work supports such growth by helping to promote the resilience of the financial infrastructure that growth requires.

Yet our current environment of slow growth is raising questions about whether tradeoffs exist between resilience and growth. My answer: Only in the short run. Efforts to increase resilience may require adjustments that could be temporary headwinds to growth.

To be sure, there are concerns that more permanent adjustments could impair market functioning. A moment ago, I mentioned my concerns about liquidity in some financial markets. Indeed, several developments since the financial crisis have altered the amount of liquidity available in the financial system and the ways investors redeem holdings to get cash.

By design, regulation is intended to strengthen the capital positions of financial institutions and limit excessive risk-taking. As a consequence of some of these measures, balance-sheet constraints have increased the cost of securities financing activities and reduced incentives to maintain them.

However, regulations are far from the only factors at play. Some causes are cyclical, such as a decline in available collateral and changes in risk preferences. Other causes appear to be structural, such as changes in the investor base, in the development of new financial products, and in the structure of securities markets, such as the spread of high-frequency trading and algorithmic trading to fixed-income markets.

Traditional indicators do not support excessive concern about market liquidity. But, since the crisis, market liquidity has become more fragmented in a few markets, such as those for sovereign bonds in emerging markets and U.S. corporate bonds. Signs of bifurcation or fragmentation are evident in the concentration of dealer inventories in high-quality liquid assets, declines in trading volumes, declines in the size of average trades, and settlement failures.

The fragile nature of liquidity was especially evident during the sell-off in fixed-income markets in mid-2013 and during market dislocations in September and October 2014. Neither development was widespread nor severe enough to prompt outsized price declines or to force firms to reduce their debt, but these developments bear watching, measuring, and analyzing.

Someone asked me recently what risks keep me up at night. The truth is, the risks that worry me the most are the risks I understand the least. Where are our blind spots? Are there significant risks in the system we are not seeing? Has the continuous evolution and innovation of the financial system caused a build-up of risks in a part of the system where we are not looking? The unknown risks are what keep me up at night, and because the financial system is constantly changing, new risks are constantly emerging.

In short, I love my job, but I’ve learned to accept that sleepless nights come with it.

Thank you again for having me here this morning. I will be happy to take your questions.

Remarks by OFR Director Richard Berner at the Financial Regulation Summit: Data Transparency Transformation

Good afternoon, it’s a pleasure to be here. I want to thank Hudson Hollister and the Data Transparency Coalition for sponsoring this important Summit, and for inviting me to discuss how standards can improve financial data and thus transparency.

As you know, developing and promoting standards for financial data are central to the mission of the Office of Financial Research. Likewise, I know that financial data standards are important to everyone in this room, and I appreciate your interest in, and support for, their use — now and in the future.

Today I will focus on the OFR’s data standards agenda: past, present, and future. Relatedly, I will discuss how we at the OFR are meeting the critical needs to share data and to fill data gaps.

As you know, the OFR was created by the Dodd-Frank Act in 2010 to help promote financial stability by delivering high-quality financial data and analysis for the benefit of the Financial Stability Oversight Council — FSOC — and the public.

Financial stability monitoring, analysis, and research certainly are not new. But the financial crisis that began in 2007 changed the conversation. The crisis exposed critical gaps in our analysis and understanding of the financial system, in the data and metrics used to measure and monitor financial activities, and in the policy tools available to mitigate potential threats to financial stability. These gaps — in analysis, data, and policy tools — contributed to the crisis and hampered efforts to contain it.

These three gaps are interconnected, like links in a chain. Weakness in any of the three links could impair our overall ability to spot and address weaknesses in the financial system. We need good analysis to make good policy. And we need good data — solid, reliable, granular, timely, and comprehensive data — to conduct good analysis and monitoring. In other words, good data are the foundation for success in our work and for effective risk management in financial companies.

It may seem obvious to all of you in this room, but I cannot overemphasize the importance of quality in making financial data usable and transparent. When Lehman Brothers failed six years ago, its counterparties could not assess their total exposures to Lehman. Financial regulators were also in the dark because there were no industry-wide standards for identifying and linking financial data representing entities or instruments.

Standards are needed to produce high-quality data. Transparency follows from quality; for data, transparency means that all users understand what the data represent.

Fortunately, we have made progress in improving both the scope and the quality of financial data. However, gaps persist and it’s our job to fill them.

The global legal entity identifier, or LEI, is the cornerstone for financial data standards. As you know, the LEI is like a bar code for precisely identifying parties to financial transactions.
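For those curious how that "bar code" is constructed, here is a minimal sketch. The LEI, standardized as ISO 17442, is a 20-character alphanumeric code whose last two characters are check digits verified with the ISO/IEC 7064 MOD 97-10 scheme (the same scheme IBANs use). The helper names below and the sample prefix used for testing are hypothetical; the check-digit arithmetic itself follows the standard.

```python
# Sketch of LEI (ISO 17442) check-digit handling. Letters map to numbers
# (A=10 ... Z=35); the resulting digit string, read as one large integer,
# must leave remainder 1 when divided by 97.

def lei_check_digits(prefix18: str) -> str:
    """Compute the two check digits for an 18-character LEI prefix."""
    digits = "".join(str(int(ch, 36)) for ch in prefix18.upper()) + "00"
    return f"{98 - int(digits) % 97:02d}"

def lei_is_valid(lei: str) -> bool:
    """True if a 20-character LEI passes the MOD 97-10 check."""
    if len(lei) != 20 or not lei.isalnum():
        return False
    digits = "".join(str(int(ch, 36)) for ch in lei.upper())
    return int(digits) % 97 == 1
```

A single transposed or mistyped character almost always breaks the modulus test, which is what makes the identifier robust in automated regulatory reporting pipelines.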

The OFR has led the global LEI initiative from the start, and it has gone from conception to a nearly full-fledged operational system in just a few years. Currently, the OFR’s Chief Counsel serves as chairman of the Regulatory Oversight Committee, which oversees the LEI system.

I don’t have to tell this audience that the case for ubiquitous adoption of this data standard is strong.

Had the LEI system been in place in 2008, the industry, regulators, and policymakers would have been better able to trace Lehman’s exposures and connections across the financial system. The LEI system also generates efficiencies for financial companies in internal reporting and in collecting, cleaning, and aggregating data. In addition, I expect it will ease companies’ regulatory reporting burdens by reducing — and eventually eliminating — overlap and duplication.

The financial services industry has strongly supported the LEI initiative. In fact, major trade groups have called for government regulators to mandate its use — a rare example of industry asking for more regulation.

The global LEI system is up, running, and growing. Like any network, the LEI system has benefits that will grow as the system grows.

To accelerate adoption, the OFR has been calling for regulators to require broader use of the LEI in regulatory reporting. We are also calling for broad adoption of standards for instruments and products as they become available.

Regulators have begun to respond. The Securities and Exchange Commission required the use of the LEI in rules announced a month ago for reporting data related to securities-based swaps. Last week the Federal Reserve Board announced a proposal to require banking organizations to include their existing LEIs on certain regulatory reporting forms.

I don’t have to dwell on how these data standards can help you, your companies, and your agencies. But I would like to outline what you could do to help us all realize their full benefits.

In particular, we’d like to call on you in the Data Transparency Coalition as a collaborator in pursuing data standards. That would be appropriate given your past work in helping regulatory agencies understand and benefit from more effective use of data standards, in particular XBRL.

As you know, the Federal Deposit Insurance Corporation adopted XBRL as the data standard for commercial bank financial statement reports, often referred to as Call Reports. Although this project has streamlined the call report process by reducing errors, cutting processing times, decreasing costs for banks, and improving the comparability of the data, different federal agencies are using different standards for identifying entities in XBRL data. The lack of a common standard limits XBRL reporting efficiency and raises costs. Adopting the LEI for entity identification across agencies would improve efficiency and the quality of the reported data. The time has come to do just that, and we’d like your support for it.

At the OFR, we recognize that the LEI is a critical and foundational data standard. But we also recognize that it is only the start and there is much more to be done. On that foundation, we are helping to build other standards — that’s where we also need your help and support.

The LEI gives us insight into “Who is who?” among legal entities. A second standard can help us identify “Who owns whom?” — a standard for hierarchies, or corporate structures, for easily identifying firms’ subsidiaries. This standard promises to help us make continued progress in tracing the interconnections in the financial system.
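As a rough illustration of what such a hierarchy standard enables, consider walking child-to-parent links to find each entity's ultimate parent and then aggregating exposures by corporate family. The identifiers and amounts below are invented for the example, not real LEI relationship records.

```python
# Illustrative "Who owns whom?" roll-up: child -> direct parent links,
# keyed by (made-up) entity identifiers.

direct_parent = {
    "SUB-A": "HOLDCO-1",
    "SUB-B": "HOLDCO-1",
    "HOLDCO-1": "GROUP-TOP",
}

def ultimate_parent(entity: str) -> str:
    """Follow parent links until reaching an entity with no parent."""
    seen = set()
    while entity in direct_parent:
        if entity in seen:            # guard against cyclic records
            raise ValueError(f"cycle at {entity}")
        seen.add(entity)
        entity = direct_parent[entity]
    return entity

# Roll up exposures to individual entities into one figure per family.
exposures = {"SUB-A": 40.0, "SUB-B": 25.0, "GROUP-TOP": 10.0}
by_family: dict[str, float] = {}
for entity, amount in exposures.items():
    top = ultimate_parent(entity)
    by_family[top] = by_family.get(top, 0.0) + amount
print(by_family)  # -> {'GROUP-TOP': 75.0}
```

The point of a hierarchy standard is precisely that every regulator and firm walking these links arrives at the same family totals.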

The OFR is also working on a spectrum of other identifiers to help us answer the question, “Who owns what?”

Over the past year, we have begun to develop plans for a reference database for financial instruments, something that we are required to do by law. We do not want to start from scratch; rather we want to leverage work already done by others. For example, private firms, nonprofits, and academics offer products for instrument identification and analytics. We don’t want to compete with them; we want to include them in the design so that their systems can talk to each other. The reference database would thus connect these components.

We expect that as we further develop our plans for this reference database, we will consult with interested parties and invite comments on how best to take advantage of existing work while providing a coherent, system-wide reference. We would thus expect this initiative to result in new opportunities for collaboration, research, and innovation across the financial system.

One example of the system-wide approach to data quality improvement is in derivatives markets. Financial reform sought to improve transparency in derivatives markets by requiring that data related to transactions in swaps be reported to swap data repositories. Swap data are critical to understand exposures and connections across the financial system, and the repositories are designed to be high-quality, low-cost data collection points.

We at the OFR and our colleagues at the Commodity Futures Trading Commission — CFTC — both want to promote the use of data standards in swap data reporting to assure data quality and utility. A year ago, we began a joint project to enhance the quality, types, and formats of data collected from registered swap data repositories. Together, we are aggressively moving forward to address key data quality issues and inconsistencies in how data are reported across repositories. We are also collaborating on developing uniform global unique transaction identifiers and unique product identifiers.

OFR collaboration on data standards includes not only U.S. regulators but also regulators overseas. We view the international cooperation behind the LEI as a blueprint for future progress on other global financial data standards.

For example, the OFR joined with the Bank of England and the European Central Bank in mid-January to convene a forum entitled “Setting Global Standards for Granular Data,” the first of two workshops. Discussions in this workshop built on our work with the CFTC and on work by the Committee on Payments and Market Infrastructures, the International Organization of Securities Commissions, and the Financial Stability Board to identify core standards needed to share and use data on over-the-counter swaps on a global basis.

Data standardization is essential, but not sufficient, to improve the quality of financial data, to reduce or eliminate duplication, and to carry out our work. We also need to improve ways to securely share sensitive data, both among authorities within the same jurisdiction and across borders. Data sharing is essential because none of us — no one regulator or company alone — possesses or has access to all of the data needed to paint a complete picture of threats to financial stability. The financial system is complex and ever-changing, so even if we put all of our data together in one place, significant gaps would remain and new ones would emerge. It is a puzzle with many interlocking types and pieces of data.

A perfect understanding of how to fill those gaps will always elude us. But by working together, we can fill many of them. Your companies and agencies hold some of those puzzle pieces. It is essential for us to collaborate appropriately to put them together, to see where the data gaps are, and to fill them. Equally, sharing is critical for the regulatory community to reduce duplication and overlaps in data collection.

You may be thinking that’s easier said than done. We all realize that our goals of sharing and standardizing financial data across the globe face hurdles, including legal barriers, data security concerns, and confidentiality restrictions. These challenges are legitimate and potentially daunting. But we do not want to risk the potential consequences if future financial shocks were to trigger another crisis simply because we lacked the benefit of high-quality data to illuminate financial system vulnerabilities and possible ways to mitigate them.

We believe the efforts underway nationally and internationally demonstrate that, with collaboration, we can make critical information available to decision makers, while finding ways to secure the information, protect confidentiality, and honor legal requirements. For example, the recent report from the Irving Fisher Committee on Central Bank Statistics contains an excellent summary of recommendations and best practices.

As I noted, the critical need to maintain the security of confidential data is an obstacle to data sharing. Bad outcomes can result if highly sensitive data fall into the wrong hands. Government organizations that collect and maintain the data have well-established and time-tested security measures in place. Taking the data outside the sphere of these protective measures could potentially introduce grave risks.

The remedy for this obstacle is that data sharing must occur with controls and safeguards every bit as rigorous as the controls and safeguards that the sources of the data have employed and refined over time.

To share data, regulators must work out legal agreements and determine the technology and standards to use for exchanging the data. Now is the time for agencies to work through the legal and technological issues. During times of crisis, we will have neither the latitude nor the time to ensure that we can share data safely and quickly.

At the OFR, we are committed to protocols and procedures for collecting, storing, and appropriately sharing data that meet or exceed the strict standards of our data-sharing partners. In fact, the OFR has developed world-class data security procedures. And we have defined and adopted a data security classification scheme and matching controls to assure secure data handling and sharing. Using them, supervisors can know that their data are just as secure with us as they are in their own systems.

We have also sponsored exploratory research and discussions on the use of cryptographic tools to protect data. And we are routinely communicating with providers of information about how to anonymize information to ensure that confidentiality is protected, whether information is viewed alone or in combination with other information.

Through bilateral data-sharing agreements among FSOC member agencies, all participants can be assured that shared data will be protected, secured, and treated consistently.

We are already sharing data under such agreements. Examples include our access to the Securities and Exchange Commission’s detailed data about hedge funds and other private funds in Form PF, and their detailed money market fund data in Form N-MFP. Using the Form PF data, the OFR is evaluating leverage across different hedge fund strategies, with a particular interest in sources of leverage.

As we all know, sharing alone will not fill all of the data gaps we need to fill to form a complete picture of the financial system. We must also fill gaps by collecting new data from firms and markets.

Our partnership with the Federal Reserve to fill gaps in data describing repurchase agreements, or repo, is a good example of such initiatives. In October, we announced a pilot project to understand how to fill these gaps, and today, we are well underway.

As you know, a repo is essentially a collateralized loan: one party sells a security to another party with an agreement to repurchase it later at an agreed price. Repos are an important source of short-term funding for the financial system. The U.S. repo market provides an estimated $3.8 trillion in funding daily. However, the repo market can also contribute to risks to financial stability.

The repo market consists of two parts: the triparty repo market, in which transactions are centrally settled by two large clearing banks, and the bilateral market, where repo transactions are cleared and settled privately between two firms. (The General Collateral Financing Repo [GCF Repo®] Service, in which the Fixed Income Clearing Corporation acts as a central counterparty, also settles on the triparty platform.)

Information and data on the triparty market are published regularly, but information about bilateral repos is scant.

The project is designed to fill the gaps in bilateral repo data, and it marks the first time the OFR is going directly to industry to collect financial market information. Participation in the pilot project is voluntary, and participating companies have provided input on what data should be gathered and what templates should be used for data collection. This pilot is intended to inform future collection efforts, which we hope to initiate quickly. Aggregated data from the pilot will be published to provide greater transparency into the bilateral repo market for participants and policymakers.

Let me sum up: Promoting data standards and filling data gaps are core elements of the OFR mission. Without good data, we can’t do good research and provide the good analysis needed by policymakers to protect financial stability. Lacking good data, firms will have much more difficulty assessing and managing their risks. Your collaboration is an essential ingredient in making good data available to all of us.

Thank you again for the opportunity to join you here today. I would be happy to respond to your questions.

Measurement Challenges in Macroprudential Policy Implementation: Essential Data Elements for Preserving Financial Stability

Remarks by OFR Director Richard Berner at the 2014 Financial Stability Conference co-hosted by the Federal Reserve Bank of Cleveland and the OFR

Who is who? Who owns whom? And who owns what?

Remarks by OFR Chief Counsel Matthew Reed at the CUSIP Annual Industry Summit

Financial Stability: Progress and Challenges

Remarks by OFR Director Richard Berner at the Money Marketeers of New York University

The Financial Industry in a Post-Crisis World Symposium

Remarks by OFR Director Richard Berner at the Joint Conference of the Center for Financial Policy at the Robert H. Smith School of Business at the University of Maryland and The Clearing House

Remarks by OFR Chief Data Officer Linda Powell at the Object Management Group Technical Meeting

Good morning. Thank you for inviting me to be with you here today.

I would like to say a special thanks to Dennis Wisnosky of the EDM Council for extending the invitation for me to join you, as well as Dr. Richard Soley for his leadership of this important group. I would also like to recognize Mike Atkin, also of the EDM Council and a member of our Financial Research Advisory Committee at the Office of Financial Research, or OFR.

Dennis asked me to speak for a few minutes today about, “The Importance of Standards and Semantics Across the Regulatory Landscape” and I am very happy to do that.

Data standardization is a key part of the mandate of the OFR, so we spend a good deal of time talking about the importance of standards. It is exciting to be in a roomful of people who have an interest in that subject and who share my passion for data standards.

The law that created the OFR — the Dodd-Frank Act — directs the Office to standardize the types and formats of data reported and collected. Although the title of our Office refers to research and not data, we like to point out that you can’t do good research without good data. That’s where standards come in — making data fit for the purposes of aggregating, comparing, and sharing.

Good data and good research can help us gain insights about the financial system, about its vulnerabilities, and about ways that shocks can spread across markets. By better understanding how these forces work, we can make headway in pursuing our mission of promoting financial stability.

So, I would like to talk about good data this morning by describing some of the OFR’s data standards initiatives, then discuss the benefits for financial regulators and for industry of uniform standards for financial data.

At the OFR, the centerpiece of our initiatives on financial data standards is the legal entity identifier, or LEI, which is like a bar code for identifying entities that engage in financial market transactions. The LEI is a linchpin for making connections in the massive volumes of financial data that course through the international economy every day.

The LEI is taking hold on a global scale and its governance structure is nearing completion. Already, a dozen early-stage registrars have issued more than 220,000 identifiers that are being used for regulatory reporting in North America and Europe.

As the use of the LEI continues to grow, we at the OFR are increasingly turning our attention to other facets of financial data standards, such as the proposed universal mortgage identifier, or UMI. The need for such a standard is pressing in the U.S. because debt related to home mortgage loans represents 70 percent of household liabilities. A single UMI that protects personal privacy would bring coherence to fragmented data and would significantly benefit households, industry, regulators, and researchers.

The latest OFR research working paper, which reflects substantial input from several other federal agencies, discusses the characteristics that a UMI should have and criteria for implementation.

Another area of focus for the OFR is the need for standards for data held by swaps and trade repositories. Dodd-Frank required for the first time that derivatives trades be reported to these repositories. This requirement promised transparency in derivatives markets and keener insight into the types and levels of exposure throughout the financial system.

However, this promise is not yet realized because the data are currently fragmented, with different trade repositories in different jurisdictions collecting different information in different ways. This fragmentation is keeping us from developing a complete picture of the market.

By collaborating with the industry, the repositories, and the international regulatory community, we can establish standards for reporting, so that data can be aggregated and analyzed to promote the stability of the global financial system. The OFR is working on this issue with the Commodity Futures Trading Commission and, internationally, as part of the Financial Stability Board.

A final data standard that I will mention relates closely to the LEI. During the fall of Lehman Brothers in the early days of the financial crisis, one of the most vexing problems was the inability to identify counterparty transactions, not only with Lehman, but with Lehman’s subsidiaries. Understanding and documenting corporate structures, or hierarchies, has been part of the global LEI since the G-20 directed the Financial Stability Board to develop the LEI framework. Incorporating hierarchies in the LEI system promises valuable information for tracking the often complex structure of legal entities.

Coupled with the LEI, information about corporate hierarchies will give financial regulators deeper insights into how large financial institutions are structured and how they are connected to each other. We are addressing this need through both the Global LEI System and in individual countries.

So, how do these OFR initiatives relate to semantics? As all of you know, the theme for today’s technical meeting is, “Semantics – Crossing the Chasm.”

I think a pretty strong case can be made that the expanding adoption of the LEI demonstrates that financial standards in general are crossing the chasm from early adopters to more widespread acceptance. We expect this movement toward consistent data standards to lay the groundwork for greater use of semantic technologies.

Historically, the financial industry has not been at the forefront of standardization. Other industries embraced standardization decades, or even centuries, ago. But data were not as central to financial services businesses then as they are today. We now live in a world that is data-driven as never before. A consensus is emerging within the financial industry and among policymakers across the globe that data standards are essential for effective risk management, analysis, monitoring, and supervision.

Without standards to harmonize all of the data, we have only an enormous amount of noise — and an enormous expense. Industry groups have estimated that the world’s largest banks spend more than $1 billion per year on integrating disparate data sources.

Semantics is a key part of any discussion about standards because data cannot be aggregated, compared, and shared if those data do not share a common language. Semantics provides the bedrock of definitions for standards.

An example of the importance of semantics would be asking someone at today’s meeting what OMG stands for. You would get a very different answer than the response you would receive from my children.

In financial terms, when you talk about capital and owner’s equity, does everyone have the same understanding? By applying semantic technology, we can ensure that we are all on the same page, speaking the same language.

This quest for common meaning is essential for government and for industry.

With a shared language and an understanding of industry terminology, government regulators can know how to ask industry for data and reports. Industry representatives will then know how best to comply with those requests, will understand how regulators measure compliance, and will be able to define the underlying data required for compliance.

A related concept is an ontology, or a standard way to define relationships between entities.

I mentioned earlier that Mike Atkin is a member of the OFR’s Financial Research Advisory Committee. Just last month, the committee presented the OFR with a half-dozen recommendations, including a proposal that “the OFR adopt the goal of developing and validating a comprehensive ontology for financial instruments as part of its overall effort to meet its statutory requirement to ‘prepare and publish’ a financial instrument reference database.”

The committee proposed that the OFR work with industry and standards bodies to evaluate how ontology might contribute to transparency and financial stability analysis. The committee also recommended that the OFR consider playing a leadership role in governance for an industrywide ontology.

A key part of our mission at the OFR is to strive for the identification and adoption of standards that will improve the quality and utility of financial data. As a result, we strongly support exploring the benefits of standard semantics and ontology in the financial industry.

Government can act as a catalyst for continued progress, but cannot succeed alone. For that, we need strong collaboration from all of you at this meeting and others like you. We need industry and government regulators to work toward consensus on uniform semantics and an ontology useful for both business purposes and regulatory reporting. The benefits are inviting and success is well within our sights.

Thank you again for having me here today.

Remarks by OFR Chief Data Officer Linda Powell to the GS1 Global Forum 2014

Good afternoon. Thank you for inviting me to join you here today.

I would like to offer special thanks to Tim Smucker, Ravi Mather, Donna Alexander, Ken Traub, and the nominees to the Board of Directors of the Global LEI Foundation who are here today, especially Gerard Hartsink.

Data standards are a passion of mine and it is exciting to be in a roomful of people who share my passion. It is also great to be in this international forum, which is one more sign that data standards have gone global.

As you can see from your program for this conference, the topic of my remarks is, “Progress on the Global LEI System – Mandates and Milestones.” I am very happy to share with you the latest information about the Legal Entity Identifier, or LEI, and discuss the impressive progress that the global community has made on this important standard. In just a few short years, we have moved from concept to worldwide acceptance and now a global rollout.

I would also like to take this opportunity to put the LEI into the larger context of standards in general and financial standards in particular.

The law that created the Office of Financial Research, where I work, directs the OFR generally to standardize the types and formats of data reported and collected. The law also calls for the creation of certain public-facing databases, such as a legal entity identifier database. Because the LEI must be universal to be successful, the solution for that database is not U.S.-centric, but global — and that is the path we have all chosen for the LEI.

The LEI is currently the centerpiece of the OFR’s initiatives on financial data standards because it is the foundational standard upon which others will build. But, it is not the only one. Today, I will tell you what else we are working on.

First, I would like to set the stage by discussing the development of financial data standards. Historically, the financial industry has not been at the forefront of standardization. Other industries embraced standardization decades ago or even centuries ago.

Decades ago, data were not as central to the world of financial services businesses as they are today. We now live in a world that is data-driven as never before. As data become increasingly important, momentum continues to build to find ways to make our data better — and that’s where standards play a valuable role.

As a result of this realization, the financial industry and the regulatory community have awakened to the vast benefits of standardization. Standards for financial data are an important tool for companies to manage their risk and for government regulators to analyze data related to financial stability. The rapid proliferation of early-stage LEIs around the world is unmistakable evidence of that.

As I mentioned, data standardization is part of the mandate of the OFR, so we spend a good deal of time talking about the importance of standards. We have a couple of favorite examples and I would like to share them with you today.

The first example is the standard shipping container, which can be loaded onto trucks, ships, and trains across the world. Before the mid-1960s, shipping containers came in many sizes. Goods were loaded, reloaded, stored, and stocked at ports and depots across the world. All of this packing and repacking made goods vulnerable to theft and increased transportation time. After standardization, a container full of freight could be locked securely at its departure point and transported faster and at lower cost. In this way, standardized containers streamlined the flow of commercial goods across the globe. Whether globalization drove standardization in shipping or the other way around is the subject of debate, but clearly globalization of our financial markets is driving us toward standards in the financial industry.

Another example is the fire hydrant. Unfortunately, it is an example not only of the benefits of standardization, but also of the pitfalls of failing to achieve universal adoption. In the U.S., the impetus for standardization of fire hydrants came from the Great Baltimore Fire of 1904, which burned for 30 hours and destroyed 1,526 buildings over 70 city blocks.

Fire crews from nearby Washington, D.C., responded to Baltimore to help fight the fire, but their fire hoses did not fit Baltimore’s fire hydrants. Although the U.S. National Fire Protection Association later adopted a national standard for fire hydrant connections, the standard was never government-mandated or universally adopted. As recently as 1991, a replay of the Baltimore tragedy occurred during a fire in Oakland, California, where 25 people died.

Those are both great examples of the benefits of standardization, but neither is my personal favorite. The example that really strikes a chord with me is the example of the bar code. My personal connection with the bar code is not related to the role GS1 has played in the evolution of the bar code over the last 40 years, or even that the bar code is featured so prominently in the brochure for this conference, on the bottle of water I drank this morning, or on the ticket for my flight here.

No, my reason is more personal. As a teenager, I worked in a food market. One of my most vivid memories of this job is how much trouble we had managing the inventory of food — before the advent of the bar code. Store managers were continuously checking the inventory to determine what they needed to order. As manager of the delicatessen section, I based my ordering on intuition and experience, rather than data. I learned to analyze trends in consumer buying behavior with minimal empirical information. I suspect that this experience paved the way for my interest in data standards and data quality.

Because of bar codes, the checkout process became faster and stores could keep better track of their inventories. The creation of standards that began in food markets and spread across the retail sales industry revolutionized the way stores operate, making them more efficient and cost effective — although perhaps a bit less fun for red-haired teenagers in the deli section.

At the OFR, we like to say that the LEI is like a bar code for identifying entities that engage in financial market transactions. It is a linchpin for making connections in the massive volumes of financial data that course through the international economy every day.

The sad truth is that nothing accelerates progress like a crisis. When Lehman Brothers failed in 2008, the financial industry held its collective breath as the fallout ensued. Financial market participants were unable to assess their total exposures to Lehman. Neither they nor government regulators could quickly determine their exposure to the network of Lehman firms. The recent financial crisis, and the lessons learned from it, have propelled progress on the LEI in the U.S. by underscoring the long-standing need for a global system to identify and link data.

The LEI promises a wide array of benefits. It is expected to save enormous sums that the financial industry spends on cleaning, mapping, and aggregating disparate data and on reporting data to regulators. Precise identification of counterparties would also give firms a clearer picture of their exposures in the marketplace.

For financial regulators, such identification would provide insight into ways shocks can spread across financial markets and would help in identifying vulnerabilities in the financial system.

The Director of the OFR — Dr. Richard Berner — is sometimes asked whether the LEI is taking too long to put in place. He responds by saying that, in fact, the LEI already exists, and full implementation is moving at lightning speed. For a global standard to progress from being only a “gleam in the eye” to where we are today in only about three years is remarkable. The LEI success story is a tribute to standards supporters around this room and around the world.

In 2010, during the aftermath of the financial crisis, regulators began discussing how to create an LEI. The OFR issued a policy statement calling for the establishment of an LEI. The statement provided impetus to efforts by regulators and industry. The financial industry responded with a proposed solution and two U.S. regulators, the Securities and Exchange Commission and the Commodity Futures Trading Commission, helped to spur adoption by proposing swaps rules that required use of an LEI. Meanwhile, in Europe, central bankers and others began calling for a common identification system.

Late in 2011, the G-20 directed the Financial Stability Board to begin developing a framework for a global LEI standard. The adoption of ISO 17442, which is a technical standard developed by the International Organization for Standardization, came a few months later.

Another milestone for the LEI came in June 2012, when the G-20 endorsed an FSB report that contained the blueprint for the LEI system. The report outlined a three-tiered public-private governance framework that would protect the public interest, while meeting private sector needs.

Overseeing the LEI system is a Regulatory Oversight Committee, or ROC. OFR Chief Counsel Matthew Reed serves as ROC Chair, and members from the Japan Financial Services Agency and Banque de France are Vice Chairs. I work closely with Matt and others on the ROC, particularly on technical issues, as we roll out the LEI.

The ROC has established committees to set up the rest of the governance framework for the global LEI system. The center tier of the framework is the Global Legal Entity Identifier Foundation, organized in Switzerland. As I mentioned, several of the nominees to the foundation’s Board of Directors are here with us today, including the nominated Chair, Gerard Hartsink, and Tim Smucker, both of whom bring exceptional leadership resumes. The board will direct the organization, which is entrusted with building the technology infrastructure of the LEI system and ensuring adherence to governing principles and standards, including reliability, quality, and uniqueness.

In a few months, we hope to have the board membership established. In the meantime, the nominees have been hard at work organizing the foundation and preparing to get it under way.

The third tier of the LEI system is an international network of Local Operating Units to register entities, assign LEIs, validate and maintain the reference data associated with each LEI, and make the data continuously available to the public and regulators, free of charge.

Today, a dozen early-stage registrars have issued about 189,000 LEIs in 169 countries. The ROC has an interim system to recognize these early-stage registrars, so that the codes they issue can be used for regulatory reporting all over the globe.

As LEI adoption continues to grow, OFR Director Berner has been calling on regulators in the U.S. and around the world to help accelerate progress by requiring the use of the LEI in regulatory reporting. Already, regulators in the U.S., Canada, Europe, and parts of Asia have imposed such requirements for some reporting related to swaps, insurance, and banking.

As the use of the LEI continues to snowball — an appropriate term given the harsh winter we are having back in Washington — we at the OFR are increasingly turning our attention to the next steps for the LEI that promise to compound the benefits of the LEI.

Turning back the clock again to 2008, one of the most vexing issues during the fall of Lehman Brothers was the inability to identify counterparty transactions not only with Lehman, but with Lehman’s subsidiaries. Understanding and documenting corporate structures, or hierarchies, has been part of the global LEI since the G-20 directed the FSB to develop the LEI framework.

Incorporating hierarchies in the LEI system promises valuable insights to track the often complex structure of legal entities.

Coupled with the LEI, information about corporate hierarchies will give financial regulators deeper insights into how large financial institutions are structured and how they are connected to each other. We are addressing this need through both the Global LEI System and in our individual countries.
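To make the analytical payoff concrete, consider a sketch of an exposure roll-up. Everything here is hypothetical — the entity identifiers, the parent-child links, and the exposure figures are invented — but it shows why hierarchy data matter: with parent-to-subsidiary links keyed by identifier, a firm or regulator can sum its exposure to a parent and every entity beneath it, rather than to each legal name in isolation.

```python
# Hypothetical sketch: aggregate exposure across a corporate hierarchy
# given parent -> children links keyed by entity identifier.
def total_exposure(root, children, exposures):
    """Sum exposures to the root entity and every entity beneath it."""
    total, stack = 0.0, [root]
    while stack:
        entity = stack.pop()
        total += exposures.get(entity, 0.0)
        stack.extend(children.get(entity, ()))
    return total

# Invented example data: a parent with two subsidiaries, one of which
# has a subsidiary of its own.
children = {"PARENT": ["SUB-A", "SUB-B"], "SUB-A": ["SUB-A1"]}
exposures = {"PARENT": 10.0, "SUB-B": 5.0, "SUB-A1": 2.5}
```

The traversal itself is trivial; the hard part, and the point of the LEI hierarchy work, is assembling reliable `children` links in the first place.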

Another data standards initiative of interest to the OFR is the proposed universal mortgage identifier, or UMI. The need for such a standard is particularly pressing in the U.S., where debt related to home mortgage loans represents 70 percent of the liabilities of households. A single UMI that protects personal privacy would bring coherence to fragmented data and would significantly benefit households, industry, regulators, and researchers.

The latest OFR research working paper, which reflects substantial input from several other U.S. government agencies, discusses the characteristics that a UMI should have and criteria for implementation.

I would like to turn now to one other key area of focus for the OFR — a focus that we share with the international standards community — and that is the need for standards for data held by swaps and trade repositories.

We are all familiar with the huge exposure to credit default swaps that pushed American International Group, or AIG, to the brink of collapse during the financial crisis.

Before the crisis, these types of derivatives were traded between parties with no central record of who was trading with whom. The size of the market and the exposures within the network of trading connections were hidden from view.

After the crisis, the U.S. financial reform law — the Dodd-Frank Act — required for the first time that derivatives trades be reported to centralized data warehouses known as “swap data repositories.” This requirement holds the promise of transparency for our derivatives markets and keener insight into the types and levels of exposure throughout the financial system.

However, this promise has not yet been realized. Currently, the data are fragmented, with many different trade repositories in different jurisdictions, collecting different information in different ways. This fragmentation is keeping us from putting the information together to develop a full picture of the market.

At the OFR, we are fully committed to rolling up our sleeves and tackling the obstacles to progress. By working with the industry, the repositories, and the international regulatory community, we can establish standards for reporting, so that data can be aggregated and analyzed to promote the stability of the global financial system.

The OFR and the U.S. Commodity Futures Trading Commission, or CFTC, are participating in a Financial Stability Board initiative to design data standards for aggregating data across trade and swap data repositories. We are also collaborating with the CFTC on standards to improve the quality of data collected from these repositories.

Like other financial data standards initiatives across the world, these efforts are in their early stages. We are only at the beginning of financial data standardization and, although we have had some encouraging successes, many challenges remain.

At the OFR, a key part of our mission is to strive for the identification and adoption of standards that will improve the quality and utility of financial data.

Government can act as a catalyst for continued success, but cannot succeed alone. For that, we need strong collaboration from everyone at this forum today and others like you across the globe. For both industry and government regulators, the incentive to act is strong, the benefits are promising, and success is within our sights.

Thank you again for having me here today. I would be happy to respond to your questions.

Remarks by OFR Director Richard Berner to the Exchequer Club of Washington, D.C.

Thank you, Chancellor Ryan, for that introduction and for your kind invitation. It’s an honor to follow the prominent officials and thought leaders who have joined you at lunches like this one. Many of them have discussed the financial crisis and the responses to it, which were designed to strengthen our financial system and our economy.

In that tradition, today I want to tell you about the Office of Financial Research and our role in promoting financial stability.

Financial stability occurs when the financial system functions to provide its basic services, even under stress. Clearly, many threats to financial stability emerged in the crisis.

Out of the crisis has come a widespread appreciation for a different approach to policymaking. Financial stability is now a statutory policy objective for every federal financial regulator, policy analysis is focused on assessing threats to financial stability, and policymakers are creating more tools to combat those threats — developing what we call the macroprudential toolkit. Macroprudential is a fancy word meaning that we must now look across the entire financial system, not just at a few institutions or markets, to assess and mitigate threats to financial stability.

This different perspective is essential because, as we saw in the years preceding 2008, the standard tools and data used to measure risks provided little indication of the vulnerabilities that were growing in the financial system. Market participants and regulators broadly misperceived the extent of leverage and maturity transformation. They did not see the migration of such activities to the so-called shadow banking system, or the economic exposures of supervised firms to these activities. And they collectively underestimated how disruptions could spread across interconnected companies and markets, and impair the functioning of the financial system, with severe consequences for the economy.

The new analytical toolkit needs to assess these fundamental sources of vulnerability, to be more forward-looking, and to test the resilience of the financial system to a wide range of events and incentives.

Likewise, there is no mistaking the need to improve the quality and scope of financial data to monitor activity across the financial system. Available data measuring financial activity and exposures prior to the crisis were too aggregated, too limited in scope, too out of date, or otherwise incomplete. No wonder leverage and maturity transformation, the growth of non-bank activity, and exposures were poorly understood — the data failed to show them.

The Dodd-Frank Act recognized that new institutions were needed to fill these gaps in tools and data. The Act created the Financial Stability Oversight Council and the Office of Financial Research. The Council is charged with assessing and monitoring threats to financial stability, developing remedies for those threats, and restoring market discipline. Our work at the OFR is aimed at supporting the needs of the Council by collecting and improving the quality of financial data and developing tools to evaluate risks to the financial system.

With this institutional framework in place, I want to discuss five critical challenges that we in the OFR face, how we are confronting them, and what we have accomplished so far.

The five challenges are:

  1. Improving financial stability analysis and monitoring
  2. Prioritizing and meeting data needs for analysis and monitoring
  3. Developing standards to improve the quality and utility of financial data
  4. Balancing confidentiality with data sharing, and
  5. Developing the macroprudential toolkit

1. Improving financial stability analysis and monitoring

Improving financial stability analysis and monitoring requires a strong framework. Our framework begins by looking to the six basic services provided by the financial system:

  1. Credit allocation, in which lenders are matched with borrowers, such as companies, households, and governments.
  2. Maturity transformation, in which a lender obtains short-term funds to invest or loan at longer duration.
  3. Risk transfer, in which intermediaries, for a price, accept or insure against risks, such as credit or duration risk, from those who do not want to hold them.
  4. Price discovery, in which the interaction of buyers and sellers determines fair market prices for financial assets.
  5. Liquidity provision, which facilitates transacting in timely fashion without materially affecting prices.
  6. Facilitation of payments, or the services through which transactions and payments are made, cleared, and settled.

Financial stability occurs when these services function smoothly. But each of these functions may be vulnerable to shocks when market participants take them to extremes and markets fail. Such vulnerabilities are threats to financial stability — and the Council and the OFR have complementary mandates to identify them and assess how they might transmit or amplify such shocks across the financial system. To do that, we must understand how vulnerabilities develop.

We know, for example, that the low volatility and cheap funding prevalent in times of market calm can prompt return-seeking market participants to take on leverage, maturity transformation, and credit risk. This increased risk-taking can promote vulnerabilities in short-term funding markets that promote runs and fire sales.

We know that threats can arise in large, complex financial institutions. Supervisors are using so-called stress tests to model losses that such institutions could suffer when exposed to an unpredictable variety of shocks. Test results help to calibrate their needs for capital and liquidity. Our mandate at the OFR is to evaluate the capacity of such tests to assess overall financial stability. This capacity is limited because current stress tests do not focus on the transmission of stress across institutions and markets. The OFR is researching the frontiers of stress testing, as evidenced by three recent Working Papers and by an all-day research workshop at the Treasury last year. We also assess and monitor financial stability indicators with exotic names like CoVaR, DIP, and SES that may indicate systemwide sources of stress in financial companies.

We know that threats can arise in nonbank financial companies in what has come to be known as the shadow banking system. That system includes activities such as dealer-intermediated finance, securitization, and funding in wholesale markets. It includes companies such as broker-dealers, hedge funds, private equity funds, asset managers, and insurance companies. These firms may engage in activities involving maturity, liquidity, and credit risk through reinvestment of cash collateral in securities lending or providing tail-risk insurance in credit markets.

Our recent report about the asset management industry, produced at the Council’s request, examines some of these activities. It identifies ways that asset management activities could create vulnerabilities, including through redemption risk, use of leverage, cash collateral reinvestment, and crowded trades.

Public attention has been understandably focused on what happens next for asset management. That is up to the Council, not the OFR. Having delivered the report to the Council, our focus will be on continuing to support its work with appropriate data and analysis.

2. Prioritizing and meeting data needs for analysis and monitoring

High-quality data are critical for the work of the Council and the OFR. We can’t analyze what we can’t measure. To measure, we need solid, reliable, granular, timely, and comprehensive data. Let me emphasize: there are still significant gaps in our financial data. The OFR’s job is to fill those gaps by prioritizing and meeting data needs for analysis and monitoring.

Three steps are involved in filling data gaps. First, prioritize critical questions, and decide which data are needed to illuminate and answer them. Second, assess gaps by taking stock of data already available and comparing them with needs. To further that process, we created and maintain the first inventory of the data held by all Council member agencies. Finally, prioritize and decide on how to fill the gaps.

For example, we are working to fill gaps in repo market data. In a repo — or repurchase — agreement, a dealer obtains short-term funding for securities by selling them to an investor with an agreement to buy them back. Because repo markets represent critical sources of funding for securitization, their impairment during the crisis had adverse consequences for the entire financial system. The scarcity of repo data in 2007 and 2008 masked these vulnerabilities and made assessment of possible policy responses problematic. The New York Fed has improved the quality and scope of repo data since the crisis, but significant gaps remain. We at the OFR are collaborating with Fed staff to fill them.
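The economics of a repo can be sketched in a few lines. The figures below are hypothetical, and I assume the standard ACT/360 money-market day count: the dealer sells securities for cash today and buys them back later at a slightly higher price, and that price difference implies the financing rate on the cash.

```python
# Sketch of repo economics, assuming an ACT/360 day-count convention.
# All figures are hypothetical.
def repurchase_price(sale_price: float, repo_rate: float, days: int) -> float:
    """Price at which the dealer agrees to buy the securities back."""
    return sale_price * (1 + repo_rate * days / 360)

def implied_repo_rate(sale_price: float, repurchase: float, days: int) -> float:
    """Financing rate implied by the sale/repurchase price pair."""
    return (repurchase / sale_price - 1) * 360 / days

# Hypothetical trade: $100 of collateral, 30-day term, 2 percent repo rate.
buyback = repurchase_price(100.0, 0.02, 30)
```

Seeing the trade as a collateralized loan also makes clear what was missing before the crisis: without transaction-level data on prices, terms, and collateral, neither the size of the market nor the haircuts being charged could be observed.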

3. Developing and promoting standards to improve the quality and utility of financial data

Filling data gaps is critical; it’s equally important to have high-quality, easily accessible data. Our third challenge is thus to develop and promote standards to improve the quality and utility of financial data.

Data that are standardized are critical for analysis. Without standards, we cannot aggregate. Without standards, we cannot compare. We need data that can be aggregated and linked with other datasets for analytical comparisons.

Data standards help us collect more and better data, while reducing the reporting burden for industry. They enable us to aggregate and to compare data on an apples-to-apples basis. They provide regulators and policymakers with a more accurate view of the financial system, including the interconnections between companies and markets.

The financial crisis highlighted the need for data standards. When Lehman Brothers collapsed in 2008, many market participants were unaware of their total exposures to Lehman because that name did not appear on all of their contracts.

The global legal entity identifier, or LEI, offers a solution for this problem.

The LEI is like a bar code or unique ID for parties in financial transactions. It is a 20-character alphanumeric code, connected to basic business card information, that clearly and uniquely identifies companies participating in global financial markets. It promises to provide major benefits to financial market participants and government regulators worldwide.
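The code’s structure is itself standardized: under ISO 17442, the final two characters are check digits computed with the ISO 7064 MOD 97-10 scheme, the same family of checks used for IBANs, so a mistyped identifier can be caught mechanically. Below is a minimal sketch of that check; the 18-character base code used in the example is made up and is not a real LEI.

```python
# Sketch of LEI check-digit validation per ISO 17442 / ISO 7064 MOD 97-10.
# Letters map to two-digit numbers (A=10 ... Z=35); a well-formed LEI,
# read as one large integer, leaves remainder 1 modulo 97.
def lei_checksum_ok(lei: str) -> bool:
    if len(lei) != 20 or not lei.isalnum():
        return False
    digits = "".join(str(int(ch, 36)) for ch in lei.upper())
    return int(digits) % 97 == 1

def with_check_digits(base18: str) -> str:
    """Append the two MOD 97-10 check digits to an 18-character base."""
    digits = "".join(str(int(ch, 36)) for ch in base18.upper() + "00")
    return base18.upper() + f"{98 - int(digits) % 97:02d}"

# Hypothetical base code, not a real registered LEI.
sample = with_check_digits("ABCDEFGHIJKLMNOPQR")
```

The check digits guard against transcription errors, not fraud; the reference data behind each code still has to be validated by the registrars.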

Although standards entail up-front implementation costs for industry, the benefits over time promise to dwarf those costs by enabling firms to report the same data to regulators as they use to manage risks and run their management information systems. Standards also reduce industry costs for collecting, cleaning, and aggregating data. So it’s hardly surprising that businesses are eager to embrace the LEI.

Thanks to officials and businesses globally, the LEI system is beginning to take hold. Organizations in several countries are issuing “pre-LEIs,” which are designed to be compatible with the final LEI system. To date, almost 100,000 pre-LEIs are in use. Registration authorities called local operating units are in place in six countries; they constitute the bottom tier of governance. The middle tier is a foundation in Switzerland whose board is being assembled by authorities around the globe to ensure that all parties implementing the LEI adhere to governing principles and standards, including reliability, quality, and uniqueness. And a Regulatory Oversight Committee provides overall oversight.

We are proud to have contributed to that effort, to have helped shape the LEI governing structure, and that OFR’s Chief Counsel chairs the LEI Regulatory Oversight Committee.

But using the LEI is still largely voluntary. And like it or not, standards work best when they are universal, like the bar code system. Thus, in my view, the time has come for regulators around the world to require companies to use the LEI in reporting financial data. True, in the U.S. and Europe, regulators require that the LEI be used for swap reporting. To be effective, however, the LEI should be ubiquitous.

The LEI is the cornerstone of the OFR’s data standards efforts. But the Office is also pursuing other critical standards initiatives in partnership with the Council and our global counterparts. These initiatives include ownership hierarchies to define the relationships among entities in corporate structures, protocols for data sharing, and standards for collection and aggregation of data in the new swap data repositories.

4. Balancing confidentiality with data sharing

The Dodd-Frank Act prohibits the OFR from publishing confidential data, but also requires the Office to make data appropriately available to the public to promote both market transparency and research on the financial system. That requires balancing confidentiality with the need to share data.

We can envision data sharing as three levels of concentric circles. The innermost circle entails data sharing between the OFR and Council member agencies. We are collaborating through the Council and its committees on data sharing agreements, protocols, and safeguards.

The middle circle represents data sharing with academic researchers under protocols that balance the need for access with heightened safeguards to protect confidential data.

The third, outermost circle involves sharing data with the public, again with the most stringent safeguards, for transparency, accountability, and communication about our work.

At the OFR, we are exploring how best to share data at each of these three levels and striking the appropriate balance. For example, our latest research Working Paper discusses balancing transparency and confidentiality by using new technologies, such as cryptography.

5. Developing the macroprudential toolkit

Ultimately, our objective is to strengthen the financial system so it can reliably provide the fundamental financial services I highlighted earlier. The final challenge, then, is to develop the macroprudential toolkit to promote financial stability.

The macroprudential toolkit is the collection of policy instruments available to financial policymakers and supervisors. It should help them address systemwide vulnerabilities early to decrease the odds of another financial crisis, and respond to threats to lessen the impact of any crisis that occurs.

We are beginning an important transition, turning from a bank-centric discussion of prudential tools to one that is truly macroprudential. The framework to guide our efforts is outlined in recent research papers, underscored in the OFR’s inaugural research conference in December 2011, and discussed in our first Annual Report. We must focus simultaneously on the specific channels — across institutions, activities, and markets — through which threats to financial stability typically surface, such as default, runs, and fire sales. And we must seek a corresponding set of policy instruments with the potential to reduce or neutralize these threats, such as capital requirements, liquidity requirements, and minimum haircuts. In that regard, I am heartened by recent global support for further exploring how to construct and implement this potentially important policy tool.

The macroprudential toolkit should have a distinct tool to address each threat or channel of transmission. We must select the right financial stability tool for the job, while being mindful of the interplay among them and with other policy tools.

The OFR is not empowered to make policy, but the Dodd-Frank Act directs the Office to conduct studies and provide advice on the impact of policies related to threats to financial stability. Helping to construct and evaluate the macroprudential toolkit is part of that mandate.

Fundamental uncertainties are the hallmark of threats to financial stability, and the financial system is vast, complex, and evolving. The goal of fully capturing every threat with our analysis will always elude us. But we will keep on trying. Better data, research, and analysis can help us improve market discipline, regulation, and the shock absorbers and guardrails needed to make the financial system more resilient.

The list of challenges I discussed today is daunting. However, the OFR team relishes challenges and rises to them. Each of our 180-plus employees — and counting — can take pride in what we have accomplished so far. And we benefit from strong collaboration. Yet all of us — together — are only at the beginning of this venture. There is much more to come, and I look forward to the next opportunity to give you an update on our progress and success.

Thank you again for having me here today. I would be happy to respond to your questions.

Remarks by OFR Chief Counsel Matthew Reed at the Industry Forum & Working Groups Event, International Securities Industry Organization for Trade Communication

Thank you for inviting me to join you here today.

I would like to say a special thank you to Karla McKenna for extending the invitation for me to speak, as well as for her work on the standard for the legal entity identifier, or LEI, and her invaluable technical support for the LEI system’s Regulatory Oversight Committee, on which I serve as Chair.

Karla told me you would be interested in hearing about the latest developments in the implementation of the global LEI system and I will be glad to provide that update today. I will also provide you with some context for the environment that has sustained the LEI as it took root, began to grow, and became poised to flourish on a worldwide basis.

I think we all agree we have reached a time—finally—when data standards have arrived. A consensus is emerging among policymakers across the globe that standards are essential for the effective monitoring, supervision, and understanding of the financial system, and the LEI is recognized as the cornerstone for future global financial standards.

For a strong signal that financial standards have moved into the regulatory limelight, look no further than the fact that promoting data standards is ingrained in the mandate of the Office of Financial Research, or OFR. The Dodd-Frank Act listed seven items under the “purposes and duties” of the OFR. Second on the list was “standardizing the types and formats of data reported and collected.”

Grasping the importance of standards is a lesson that has been learned many times in history—often the hard way, after mounting problems or sobering calamities made the need for standards painfully obvious.

On my desk at work is a blue book with colorful but unsightly page markers sticking out of the sides. The book by Marc Levinson is called, “The Box: How the Shipping Container Made the World Smaller and the World Economy Bigger.” It is a book about how the standardization of shipping containers lowered costs, saved time, and streamlined the flow of commercial goods across the globe. Before the mid-1960s, shipping containers came in many sizes. Goods were loaded, reloaded, stored, and stocked at ports and depots across the world. The amount of handling and spotty security made these goods vulnerable to theft. After standardization finally took hold, a container full of freight could be locked up securely at its departure point and transported faster and at lower cost by trucks, trains, and ships across the world.

Another success story for standardization is the railroad track gauge, or the distance between the metal rails on a track bed. Before the Civil War, no standard gauge existed, so rail freight had to be unloaded and repacked from one train to the next to proceed to its destination. In 1864, a standard gauge was mandated for use on the Transcontinental Railroad and that gauge became the U.S. standard by 1886.

One final example of standardization came after the Great Baltimore Fire of 1904, which burned for 30 hours and destroyed 1,526 buildings over 70 city blocks. The catastrophic damage might not have been so bad if the fire crews from Washington had been able to hook up their fire hoses to Baltimore’s fire hydrants, but their hoses did not fit. The National Fire Protection Association later adopted a national standard for fire hydrant connections. However, the standard has not been government-mandated or universally adopted, and a replay of the Baltimore tragedy occurred as recently as 1991 during a fire in Oakland, California, where 25 people died, including a police officer and a firefighter.

These stories teach us that standards and their universal adoption are critical for building infrastructure. Roads, rails, water lines, and nuts-and-bolts would all be messy and mismatched without standards.

Unfortunately, the financial industry and its regulators have been slow to grasp the importance of standards in building the infrastructure necessary to support today’s global financial system.

Like the fire here in Baltimore 110 years ago, the financial crisis of 2008 was an economic wildfire that spread across much of the world, wiping out trillions of dollars of wealth, sending millions of workers into unemployment lines, and pushing the global economy to the brink of disaster.

The crisis highlighted the seriousness of the problem of identifying financial connections and underscored once again the urgent need for a global system to identify and link information about financial transactions for insights on risk exposures across the system. The linchpin for these links is the ability to identify with precision the entities doing the transacting. Missing from our financial infrastructure—but necessary—is a legal entity identifier, or LEI.

In a moment, I’ll discuss the latest news about our efforts to develop the LEI. But let me briefly describe some other standards-related work that has become part of the rebuilding of financial system infrastructure, a task that is now on the minds and agendas of policymakers.

For example, the OFR coordinates the work of the Data Committee of the FSOC—the Financial Stability Oversight Council—which is focused on a wide range of standards-related initiatives, including ownership hierarchies and data sharing protocols. We are also working on improving data standards for swaps data repositories.

Across the federal government, other data standards activities preceded the LEI work, and have ushered in this new era of policymakers’ focus on standards. For example, look at the use of the Extensible Business Reporting Language by the federal banking and securities regulators to collect financial data from the industry. These efforts were implemented in the last decade and have now become part of the financial disclosure nomenclature alongside substantive standards like GAAP.

Another more recent example is the Financial Management Service, an office of the Treasury Department, promoting standards for sharing data across the federal government. In the same way that XBRL promises to allow apples-to-apples comparisons of reports by public companies and banks, these FMS efforts seek to normalize financial management reports and information produced by agencies.

Although many of these initiatives preceded the recent financial crisis, the crisis nudged this work closer to the front of the agenda. Title VII of the Dodd-Frank Act asked regulators to focus on swaps data and create repositories to hold those data so that we could better understand the derivatives markets. Those provisions carried with them the implicit requirement that standards be implemented for reporting the data in a usable fashion. Title IV contemplated reporting for private funds and the sharing of the resulting data with the FSOC, again carrying the implicit requirement that the data be presented in useful form. And the OFR was charged with “standardizing the types and formats of data to be reported.” In this modern age of global electronic markets, the only logical way to fulfill those mandates is to make standards an early part of the conversation.

Financial standards can take hold in one of three basic ways. They can evolve organically when a single player in a market becomes dominant, such as Microsoft and Apple producing the dominant operating systems for personal computers. They can emerge from cooperation within industry organizations, like the National Fire Protection Association, which coordinated the standards for fire hydrants at the beginning of the last century. Or standards can be set through government involvement, particularly when there are high implementation costs or collective action problems that can arise from proprietary interests or dispersed benefits. At the OFR, we think about each approach as we consider whether standards are needed and how they might be encouraged. Regulatory compulsion played a role in many standards development efforts because a collective action problem needed solving. The LEI is no exception, but it has also benefitted from important contributions by voluntary consensus-based organizations and the private sector.

To frame the problem that the LEI is designed to solve, I have used the example of City National Bank, a national bank headquartered in Los Angeles. There are 14 banks in the U.S. called City National, and 10 times more that use a variant of the name. So an identification code is a natural way to distinguish one bank from another. City National Bank has several. The bank has an RSSD ID, a unique identifying number assigned by the Federal Reserve to all financial institutions, main offices, and branches. It also has a unique FDIC certificate number, a central index key from the SEC, a code from the Society for Worldwide Interbank Financial Telecommunication, commonly known as SWIFT, and other proprietary identification numbers from vendors.

The tangle of identification schemes and names applies not only to banks. Separate identifiers also exist for securities firms and insurance companies.

Given all of these different identification codes and the potential for name confusion, the need for a unique, precise, globally recognized identifier is evident.

The LEI is like a bar code—a unique ID for companies participating in global financial markets. When the LEI system is fully implemented globally, the OFR and other government entities will have a powerful tool to help in assessing potential threats to financial stability. This capability lies at the heart of the OFR’s mission.

For financial firms, the LEI will provide a clearer view of their risks and interconnections. It will also reduce costs for collecting, cleaning, aggregating, and reporting data. Industry groups have estimated that the world’s largest banks spend more than $1 billion per year on standardizing disparate data sources.

A hybrid of industry support, government action, and international cooperation has carried the LEI forward to where it is today. The essential element has been the leadership of the Group of 20 nations and their finance ministers, the Financial Stability Board and other regulators working with the Board, and U.S. government entities that include the OFR.

The private sector has participated in this partnership every step of the way, most recently through a Private Sector Preparatory Group that is providing advice and expertise.

Throughout the global process, the OFR has played a key role by leading work streams and collaborating with other regulators and the industry to provide recommendations to the G-20 to guide the governance, development, and implementation of a global LEI system.

These combined efforts have brought the LEI and other financial standards from the back office to the forefront of policy discussions. The reason is simple: the global financial system contains enormous—and growing—volumes of data. Without standards to harmonize these data, we have a gargantuan amount of noise.

One of the guiding principles in establishing the LEI system has been that this crucial international standard must be freely available to the public, to businesses, and to authorities. Vendors will provide add-on services to the LEI, which will further promote its use and demonstrate its utility in the marketplace. However, the LEI system will not sanction profiteering or facilitate monopolistic private gain. The LEI is recognized as a public good because it would help public authorities not only to identify threats to the world’s financial system, but also to assess them and respond to them more effectively to promote stability. For that reason, authorities concluded that collective action problems needed solving and that the identification of key participants in our financial markets needed to be returned to the public domain so that regulators could be more effective in identifying threats to financial stability, fighting financial crime and terrorism, and conducting micro-prudential regulation and market oversight.

So, where is the LEI? The answer is that it is here—and its use is rapidly spreading. In fact, more than 80,000 pre-LEIs are already in use. That total includes codes issued by the DTCC as CFTC Interim Compliant Identifiers, or CICIs, which conform to the LEI standard.

More pre-LEIs are being assigned every day and the international community has embraced a framework for global acceptance of pre-LEIs to underpin an interim LEI system for producing fully standardized codes until the LEI system is fully up and running.

The OFR is working with other federal financial regulators to include use of the LEI in rules for reporting data to government agencies, just as the Commodity Futures Trading Commission is already requiring use of the LEI in swap data reporting. This reporting has become a critical piece of our financial infrastructure.

We are all eager for this keystone standard to permeate financial data across the global marketplace. But we have to admit that, for an international collaborative undertaking, we have come a long way at lightning speed.

Regulators began discussing how to create an LEI in 2010. The OFR issued a policy statement in November of that year, calling for the establishment of an LEI and providing impetus to efforts by regulators and industry. The financial industry responded with a proposed solution.

The SEC and CFTC each proposed swaps rules that required the use of an LEI as a way to spur adoption.

Late the following year, the G-20 directed the FSB to begin to develop a framework for the LEI standard. A few months later came the adoption of a technical standard developed by the International Organization for Standardization, better known as ISO. Karla McKenna was critical to that effort and continues to provide valuable guidance.

That standard is ISO 17442, which specifies a 20-character alphanumeric code. The LEI links to basic “business card” information, such as the company name and address, the date the LEI was assigned, and the date of the most recent update to the registry.
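To make that structure concrete, the code's self-checking property can be sketched in a few lines. The last two characters of an LEI are check digits computed under the ISO/IEC 7064 MOD 97-10 scheme (the same arithmetic used for IBANs), so any code can be verified offline. This is a minimal sketch, and the 18-character base used below is purely illustrative, not an assigned prefix or a registered code.

```python
def _to_number(s: str) -> int:
    # Map 0-9 to themselves and A-Z to 10-35, then read the result as one integer.
    return int("".join(str(int(c, 36)) for c in s.upper()))

def lei_check_digits(base18: str) -> str:
    # ISO/IEC 7064 MOD 97-10: append "00", take the value mod 97, subtract from 98.
    return "%02d" % (98 - _to_number(base18 + "00") % 97)

def lei_is_valid(lei: str) -> bool:
    # A well-formed LEI is 20 alphanumeric characters whose numeric value mod 97 equals 1.
    return len(lei) == 20 and lei.isalnum() and _to_number(lei) % 97 == 1

# Illustrative 18-character base (hypothetical, not a real prefix or entity).
base = "549300EXAMPLE00001"
lei = base + lei_check_digits(base)
```

Because 97 is prime, changing any single character changes the code's value modulo 97, so validation catches the error.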

The LEI project reached another high point in June 2012, when the G-20 endorsed an FSB report that provided a blueprint for the LEI system. The G-20 envisioned a three-tiered public-private governance system designed to protect the public’s interest in the system, while ensuring that it meets private sector needs.

At the top level overseeing the system is a Regulatory Oversight Committee, or ROC. The ROC met for the first time in January of this year, when I was selected Chair. Global authorities from more than 50 countries and jurisdictions attended the meeting. Members from the Japan Financial Services Agency and the Banque de France were selected as Vice Chairs.

Since then, we have fully developed the ROC, approved bylaws, and established committees that have undertaken the work of setting up the rest of the governance framework for the global LEI system. The middle tier of the framework is the Central Operating Unit, organized as a foundation in Switzerland. We have been putting the pieces in place through the Swiss legal system to establish the Global Legal Entity Identifier Foundation. By next month, we plan to have selected the foundation’s board of directors.

In the middle, the Central Operating Unit will ensure that all parties that implement the LEI adhere to governing principles and standards, including reliability, quality, and uniqueness, so that we can achieve our shared goal for “one golden standard” for the LEI.

The third and final tier is an international network of Local Operating Units, or LOUs, that will register entities and assign the LEIs. The LOUs will validate and maintain the reference data associated with each LEI, and make these data continuously available to the public and regulators, free of charge. At present, four pre-LOUs in the U.S. and Europe are issuing codes. The ROC is negotiating an interim system that will recognize the pre-LOUs seeking to join the system, so that the codes they issue can be used for regulatory reporting under rules all over the globe. In all, 13 utilities have expressed an interest in issuing codes, and as I noted, four have already issued more than 80,000 codes.

The worldwide phase-in of the LEI is being driven by the legislative and rulemaking processes of each jurisdiction requiring the use of the LEI and by the adoption of the LEI by firms for risk management and reporting.

As use of the LEI spreads throughout the world financial system, we expect the LEI to become increasingly valuable. We also expect the trend of adoption of the LEI to spur further adoption in a reinforcing cycle that will make the LEI ubiquitous in financial reporting and data management throughout the world.

That is the vision; it is an exciting one; and with the continued help of people around the world—and in this room today—we will make it a reality.

Thank you again for the opportunity to be here. I would be glad to respond to your questions.

Financial Stability Analysis: Using the Tools, Finding the Data

Remarks by OFR Director Richard Berner at the Joint Conference of the Federal Reserve Bank of Cleveland and the OFR

Remarks by OFR Director Richard Berner at the 14th Annual Risk Management Convention of the Global Association of Risk Professionals

Thank you for inviting me to join you; it’s a pleasure to be here. This morning I want to discuss the outlook for financial risk from our perspective at the Office of Financial Research.

First, I want to thank Rich Apostolik, the GARP Board of Trustees, and many of the chief risk officers in the audience for engaging with us on our mutual interests, such as helping to create a culture of risk awareness within organizations—from entry level to board level—and promoting best practices for financial risk management.

The financial crisis highlighted deficiencies in the understanding of the risks in the financial system and limitations in financial data. Market participants and regulators broadly misperceived the extent of leverage and maturity transformation. They did not see the migration of such activities to the so-called shadow banking system, or the economic exposures of supervised firms to these activities. And they collectively underestimated how disruptions could spread across interconnected companies and markets, and impair the functioning of the financial system, with severe consequences for the economy.

The crisis spurred financial reform, including the creation of the Financial Stability Oversight Council and the Office of Financial Research. Our work at the OFR is aimed at collecting and improving the quality of financial data and developing tools to evaluate risks to the financial system.

Out of the crisis has come a widespread appreciation for a different approach to policymaking. Financial stability is now a statutory policy objective for every federal financial regulator, policy analysis is focused on assessing threats to financial stability, and policymakers are creating more tools to combat those threats—developing what we call the macroprudential toolkit.

A different perspective is essential because, as we saw in the years preceding 2008, the standard data and tools used to measure risks provided little indication of the vulnerabilities that were growing in the financial system.

There is thus a need to improve the quality and scope of financial data to monitor activity across the financial system. The analytical toolkit needs improvement to assess these fundamental sources of vulnerability and instability in the financial system. It needs to be more forward-looking, and to test the resilience of the financial system to a wide range of events and incentives.

Likewise, parallel improvements in financial risk management are also needed to expand the range of data and the tools used to uncover risks and vulnerabilities in institutions, individually and collectively, across the financial system.

Our work on assessing risks and promoting best practices for financial risk management dovetails with yours. We can help weave macro considerations, namely, risks that spread across the financial system, into your work. We can also weave micro factors that focus on individual firms into our work. Such joint efforts promise to pay dividends for financial stability and risk managers around the globe.

Against that backdrop, I want to assess the outlook for financial risk by outlining risks and vulnerabilities in the financial system, to offer suggestions for best practices in risk management, and to propose some opportunities for collaboration between the OFR and you in the risk management community.

The Outlook for Financial Risk

Policymakers took aggressive steps during the recent crisis to repair financial markets and institutions, and both have recovered significantly. Stock prices have reached new highs, implied volatility has declined to pre-crisis lows, and risk appetite has clearly returned. But that recovery does not inevitably imply a new and destructive cycle of risk-taking. In fact, the adoption of financial reforms here and abroad has, in my view, brightened the financial risk outlook, and I am more optimistic as a result.

Among the reforms:

  • Financial markets and companies are more transparent.
  • Our financial system is significantly less leveraged, reducing our vulnerability to a future crisis. U.S. banks have raised their capital levels to approximately $1 trillion, up 75 percent from three years ago.
  • Liquidity requirements will also increase, to enable companies to fund themselves from cash reserves for at least a month.
  • A new framework is in place to protect the financial system, the economy, and taxpayers from the consequences of the failure of a large, complex financial company.
  • And finally, important market reforms—especially in the market for OTC derivatives—have helped to improve the outlook for financial stability.

The Financial Stability Oversight Council and the OFR

Beyond these improvements, regulators are also better equipped to monitor and respond to threats to the financial system. The work of the Financial Stability Oversight Council, its member agencies and organizations, and the OFR has been instrumental to that end.

Let me explain two ways that our work on data and analysis informs the Council’s deliberations and can be helpful to you as risk managers to limit the buildup of risk.

First, we are working to expand the scope and quality of data.

A key part of the OFR mission is to fill the gaps in existing data, by prioritizing key analytical questions about threats to financial stability and assessing data needs to answer them.

Accurate assessment of data gaps and sharing data appropriately should enable us as policymakers to get more bang for our data collection buck, reduce the industry’s reporting burden, and provide more and better information for you as risk managers. To pinpoint gaps, the OFR has completed an initial inventory of purchased and collected data among Council member agencies, and an inventory of internally developed data is under way. To improve the scope of data available to policymakers and to help us minimize duplication, the OFR has established data-sharing agreements with a number of Council member agencies and continues to work on new ones as needed.

To improve data quality, the OFR is working to establish standards for data collection and usage. Standards enable us to aggregate and compare data on an apples-to-apples basis. They enable firms to report the same data to us as they use to manage risks and run their management information systems, thereby reducing reporting burdens. Standards also reduce industry costs for collecting, cleaning, and aggregating data.

Accordingly, the OFR is playing a leadership role in the initiative to establish a global Legal Entity Identifier (LEI). The LEI is a code that uniquely identifies parties to financial transactions. These identifier codes are like the barcodes that uniquely identify products. The OFR’s Chief Counsel was recently named Chair of the LEI Regulatory Oversight Committee. With the planned launch of the global system this month, the goal of standardizing the identification of these entities will move closer to reality. As a result, financial company CROs and financial regulators worldwide will gain a better view of true exposures and counterparty risks across the global financial system.

We plan to follow up on our work on the global LEI system by developing best practices for other data standards, such as in the hierarchies that illustrate relationships among entities.

To fulfill our mission, we must manage, analyze, and safeguard large volumes of data. The OFR is required by statute to protect the integrity of these financial data with a robust security framework and we take that requirement very seriously. No goal is more important than collecting data in a secure manner and safeguarding the data held.

The second way our work informs the Council’s deliberations and can be helpful to you as risk managers is by improving our analytical toolkit.

As with our data initiatives, we are complementing, not duplicating the work of others, be they regulators, academics, or practitioners.

  • We are leveraging our expertise through partnerships and collaboration in what we call a virtual research community. The OFR has launched an array of initiatives to meet its mandate to foster a network of outside researchers, academics, industry groups, and risk managers.
  • We have issued six papers in the OFR’s Working Paper Series to foster debate on key issues, including risk management.
  • We convened seminars and workshops to evaluate tools like stress tests; we have also held annual conferences to evaluate the macroprudential toolkit and to assess the evolving nature of financial intermediation.
  • We formed a Financial Research Advisory Committee of 30 distinguished professionals in economics, finance, financial services, data management, information technology, and—of particular interest today—risk management to advise us. In fact, five members of the Committee are, or have served as, chief risk officers.
  • We are supporting the Financial Stability Oversight Council by monitoring and conducting research on key risks, including those in money market funds and credit default swaps. We are also providing data and analysis to inform the Council’s work on the designation of nonbanks for enhanced prudential standards and supervision by the Federal Reserve.

To sum up, the regulators’ work to strengthen the financial system, and our work to help the Council assess and monitor threats to financial stability, are starting to pay off.

Remaining Risks and Vulnerabilities

However, other developments make me less sanguine, partly because they embody risks and vulnerabilities that are neither immediately evident nor easily monitored in markets. Today, the signals from financial markets are relatively benign; and it’s of course legitimate to think that periods of low market volatility and rising risk appetite like this one may simply reflect recovery.

More broadly, low volatility, interest rate spreads, CDS spreads, and repo haircuts are all traditionally viewed as signs of low financial market risks. Eventually, however, just the opposite is likely to be true: these developments often signal rising market risks, because they give investors and risk managers the incentives and wherewithal to take on leverage.

Analysts traditionally view such indicators as exogenous barometers of risk. More likely, they are endogenous indicators of risk appetite and investor sentiment.

You might say that anyone who has spent a week on a trading desk could have told you that. But recognition of that dynamic in either academic or policy analysis is only starting to appear. A recent paper by Danielsson, Shin, and Zigrand argues that leverage and volatility are endogenously co-determined, and that low volatility promotes increased leverage and risk.1 Similarly, Fed Governor Jeremy Stein recently observed that low volatility gives market participants incentives to write deep, out-of-the-money puts to enhance returns, and in ways that hide risk.2 That’s because one can, and I quote, “beat the benchmark simply by holding [it] and stealthily writing puts against it, since this put-writing both raises the mean and lowers the measured variance of the portfolio.” By stealthily, Governor Stein means that generally our measurement systems don’t adequately capture the low-probability future risks that such strategies introduce. Those gaps in our measurement system at the firm level are multiplied many times across the financial system.

This reality should change our thinking about early warning indicators, asset allocation, and our macroprudential toolkit. It should also change our thinking about risk management. As my colleague Rick Bookstaber puts it, “[Treating such indicators as exogenous means that] higher leverage and risk taking in general will be apparently justified by the lower volatility of the market and by the greater ability to diversify as indicated by the lower correlations.”3

Promoting Best Practices in Risk Management

I would like to turn to a discussion of best practices in risk management, which will highlight some of the remaining risks and vulnerabilities in our financial system.

I think we agree that effective risk management relies on a combination of quantitative tools, data management, and governance procedures that enable us to recognize and address risks and vulnerabilities. Our second OFR Working Paper covers this subject in more detail. In case you haven’t seen this paper, it is entitled, “Forging Best Practices in Risk Management” and it is posted on our website. I think we would all agree with the conclusions expressed there—that gaps in risk management remain, that they represent potential threats to your firms, and I would add, that they represent vulnerabilities and threats to financial stability.

Risk Governance and Incentives

Key elements of a strong risk culture include adequate resources and independence for the risk function, a board of directors with the proper information and expertise to understand the firm’s risk-taking, and compensation schemes that align the risks taken by individual units with the long-term objectives of the firm. Despite some progress, vulnerabilities clearly remain.

Liquidity Risk Management

Excessive reliance on short-term funding amplifies shocks to firms and to the financial system. Repo markets, money market funds, asset-backed commercial paper, securities lending, and rehypothecation—the reuse of collateral by a broker to borrow for its own use—all came under stress during the financial crisis, and firms with the greatest reliance on these funding sources were among those at greatest risk. We still see structural vulnerabilities in money-market funds and in repo markets, and future cyclical risks that today’s low-volatility, low-interest-rate environment may hide.

Data and Information Technology

The financial crisis has highlighted the varied level of integration that firms have achieved in their risk management infrastructures. Some of the firms that fared best had developed a firm-wide view of their risks, aggregated across diverse lines of business. Most large complex financial institutions have not yet fully developed this capability.

Market Risk and Credit Risk

These are the traditional focal areas of risk management and, in many respects, they are the best developed aspects of the field. Yet, in periods of stress, it is virtually impossible to distinguish between market risk and credit risk. Portfolio models typically rely on correlations estimated over short-term horizons in normal conditions, but under stress, correlations with equities and among the other asset classes rise, increasing both the volatility of the portfolio and its beta-sensitivity. This “dediversification” usually occurs just when investors have taken on more risk and leverage.
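
A two-asset example makes the arithmetic of dediversification concrete (the weights, volatilities, and correlations here are hypothetical, chosen only to illustrate the mechanism):

```python
import numpy as np

# Two-asset portfolio, equal weights, 20 percent annual volatility each.
w = np.array([0.5, 0.5])
vol = np.array([0.20, 0.20])

def portfolio_vol(rho):
    """Portfolio volatility for a given pairwise correlation."""
    corr = np.array([[1.0, rho], [rho, 1.0]])
    cov = np.outer(vol, vol) * corr      # covariance matrix
    return float(np.sqrt(w @ cov @ w))

print(portfolio_vol(0.2))   # calm regime: about 0.155
print(portfolio_vol(0.8))   # stressed regime: about 0.190
```

Nothing about the individual assets has changed; the rise in correlation alone pushes portfolio volatility toward the undiversified 20 percent, and it does so exactly when diversification is needed most.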

Using value-at-risk analysis beats no analysis for sure. But aside from the well-known shortcomings of VAR analysis—that it depends on contemporaneous volatility, underestimates worst-case loss, and may not capture correlations across a portfolio or firm—low VAR creates incentives for more leverage. Likewise, in such an environment, even rigorous stress tests may look deceptively good. Thus, an important lesson of the financial crisis is the need to build longer horizons into the measurement of market risk and credit risk to capture the behavior of financial markets under a range of business conditions.
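
The dependence of VAR on contemporaneous volatility can be seen in a small simulation (the return parameters are hypothetical): a 99 percent VAR estimated from a calm sample is breached far more often than one percent of the time once volatility rises.

```python
import numpy as np

rng = np.random.default_rng(1)

# Calm regime: low-volatility returns used to estimate VAR.
calm_returns = rng.normal(0, 0.005, 500)
var_99 = -np.quantile(calm_returns, 0.01)   # 99 percent one-day VAR estimate

# Stressed regime: volatility triples; the old VAR estimate is stale.
stressed_returns = rng.normal(0, 0.015, 500)
exceedances = (stressed_returns < -var_99).mean()

print(f"99% VAR from calm sample:     {var_99:.4f}")
print(f"exceedance rate under stress: {exceedances:.1%} (expected 1%)")
```

Estimating VAR over a longer window that spans both regimes would soften this failure, which is one motivation for building longer horizons into market and credit risk measurement.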

Operational Risk

As highlighted in the Council’s 2012 Annual Report, strong cyber security and mitigants to a broad range of operational risks are key elements of protecting financial stability and an ongoing challenge for financial institutions. The Flash Crash on May 6, 2010—when the Dow Jones Industrial Average plunged nine percent and then recovered within minutes—pointed to the new types of operational risk that emerge from high-speed trading and highlighted the importance of a sound infrastructure. Preventing unauthorized trading and fraud should also remain a priority for operational risk management.

The Micro-Macro Interface

Firm-level risk management focuses on risks to a single institution. But actions that a single institution may take to mitigate its risks—withdrawing funding, selling impaired assets, or exiting a market—can amplify risks in the system as a whole when undertaken simultaneously by many firms, as in the case of a classic bank run. Indeed, risk management practices that may seem sound in isolation can have destabilizing, procyclical effects when widely adopted.

My colleague Jonathan Sokobin yesterday offered a detailed analysis of how the OFR’s broader mandate can help you as risk managers understand the macroprudential implications of firm-level practices. He discussed the issues of aggregation and feedback in risk management, and of new tools such as agent-based modeling, to assess emerging risks. I commend his remarks to your attention.

Stress Testing

The discipline of stress testing has undergone a renaissance since the crisis. But many questions remain about its further evolution. We have devoted some effort to answering them. Two of our published working papers, a third paper that will be released shortly, and our 2012 Annual Report discuss stress test methodologies. As I noted, we also held a workshop on stress testing; participants included a mix of practitioners, academics, and supervisors.

From that workshop, we framed a three-part research agenda:

  1. What should be the conceptual foundations of regulatory stress testing, that is, what are the objectives of stress tests and where do they fit with broader supervisory objectives?
  2. What are the modeling needs, with respect to defining scenarios, for example? And,
  3. How can stress tests be made more useful for macroprudential supervision?

Opportunities for Collaboration

Let me conclude by discussing collaboration. I see three areas for fruitful collaboration between us that will help the OFR achieve its mission and help you better manage risks.

First, we welcome collaboration and suggestions to help us assess gaps in the data that we both need to monitor and manage risks. And we welcome your support for the adoption of data standards that will enable all of us to have confidence that the data we employ in our analysis and monitoring have integrity and can be relied on to provide accurate signals and comparisons.

Second, we welcome dialogue about improving the risk management toolkit. For example, consider the use of standard or benchmark portfolios, which your organization is supporting. Having the banks run their risk systems against pre-defined sets of underlying financial instruments might provide a transparent basis for comparing bank approaches to risk measurement. It might give us a way to view the banks’ risk approaches on an “apples-to-apples” basis.

In addition, as indicated in our mandate, our working papers, and our sponsorship of workshops, we welcome work on improving stress tests at the micro and macro levels. Enhancing stress tests by accounting for the interactions across firms and markets will represent a major advance in these tools.

Finally, we can collaborate on promoting best practices in risk management. That brings me full circle back to where I started today: Working together to help create a culture of risk awareness within organizations, from entry level to board level, with a marriage of the micro risks and macro risks, will help us forge those best practices and reduce risks to financial stability.

These are daunting challenges. To help meet them, I look forward to continued engagement in the months and years ahead.


  1. Jon Danielsson, Hyun Song Shin, and Jean-Pierre Zigrand, “Procyclical Leverage and Endogenous Risk,” October 2012. 

  2. Jeremy C. Stein, “Overheating in Credit Markets: Origins, Measurement, and Policy Responses,” remarks at the “Restoring Household Financial Stability after the Great Recession: Why Household Balance Sheets Matter” research symposium sponsored by the Federal Reserve Bank of St. Louis, St. Louis, Missouri, February 7, 2013. 

  3. Richard Bookstaber, “The Volatility Paradox,” December 12, 2011.