Remarks by OFR Director Richard Berner at the conference on Interdisciplinary Approaches to Financial Stability
Published: October 22, 2015
The OFR and the Center on Finance, Law, and Policy at the University of Michigan sponsored the conference, with support from the Smith Richardson Foundation, the University of Michigan College of Engineering, and its Ross School of Business.
Good morning. It’s great to be back in Ann Arbor. Thank you, Michael, for your kind introduction. Thanks also for your key role in the financial reform that led to the establishment of the Office of Financial Research and the Financial Stability Oversight Council.
Thanks as well to our partners in sponsoring and organizing this important conference. There’s a long list of partners: The University of Michigan and its Center on Finance, Law, and Policy; its College of Engineering; and its Ross School of Business; as well as the Smith Richardson Foundation.
Setting the stage
We are here to discuss and learn from each other about interdisciplinary approaches to financial stability. Let me be clear on what we mean: Financial stability is not about constraining market volatility, nor can we predict or prevent financial shocks. Rather, financial stability is about resilience. We want to be sure that when shocks hit, the financial system will continue to provide its basic functions to facilitate economic activity. In the OFR’s first annual report, we identified six such functions: (1) credit allocation and leverage, (2) maturity transformation, (3) risk transfer, (4) price discovery, (5) liquidity provision, and (6) facilitation of payments.
Threats to financial stability arise from vulnerabilities in the financial system — failures in these functions that are exposed by shocks. Resilience has two aspects:
- Does the system have enough shock-absorbing capacity so it can still function? and
- Are incentives, such as market discipline or transparent pricing of risk, aligned to limit excessive risk taking?
Both aspects matter. Shock absorbers are needed to buffer hits, while what I call guard rails — or incentives that affect behavior — are needed to increase the cost of — and thereby constrain — the risk taking that can create financial vulnerabilities.
Resilience and, conversely, threats to financial stability are systemwide concepts. To measure, assess, and monitor them, we must look across the financial system. We must examine both institutions and markets to appreciate how threats propagate from one institution or market to others, and to evaluate ways to mitigate those risks.
The financial crisis exposed critical gaps in our analysis and understanding of the financial system, in the data used to measure and monitor financial activities, and in the policy tools available to mitigate potential threats to financial stability. These gaps — in analysis, data, and policy tools — contributed to the crisis and hampered efforts to contain it. Filling those gaps is crucial to assessing and monitoring threats to financial stability, and to developing what we call the macroprudential toolkit to make the financial system resilient.
Thanks to heroic efforts by Professor Barr and many others, the Dodd-Frank Act was signed five years ago. Since then, we have improved our understanding of how the financial system functions, and our ability to measure financial activity and spot vulnerabilities. But we need to do more to understand how the financial system fails to function under stress, to spot vulnerabilities in the shadows, and to gather and standardize the data needed for our critical analysis and policymakers’ responses to identified threats.
When Michael called to see if we were interested in cosponsoring this conference, it took me only a split second to say yes. Traditionally, financial policymaking is the purview of financial economists and lawyers. But I can say with authority — because I am one — that the Ph.D. economists of the world have not cornered the market on good ideas. I’ll let the lawyers speak for themselves.
That’s why a conference like this one is so important. I know there are a lot of great ideas out there among other attendees at this conference: the lawyers; mathematicians; risk managers; statisticians; engineers; biologists; and data, computer, and other scientists. Most important, we have the opportunity in the next two days to kick off a process to share those ideas and learn from each other’s successes and mistakes.
This morning I will start by focusing on four of the challenges we face in prosecuting our agenda.
First, our regulatory framework tends to be based on rules rather than on principles. As Paul Tucker reminds us, a static rulebook invites initiatives aimed at getting around it. So-called regulatory arbitrage often involves the migration of financial activity toward presumably more opaque and less-resilient corners of the financial system. Thus, making the framework flexible, adaptable, resistant to gaming, and forward-looking is a challenge worth meeting.
Second, threats occur and are transmitted and amplified across the financial system, but our regulatory infrastructure and the collection of data by primary regulators are focused on individual entities. We need to manage their legitimate confidentiality concerns through appropriate safeguards that facilitate collaboration and sharing insights and information.
Third, markets and institutions are global, but our authorities as policymakers are national. To achieve our shared mission, we must take a shared approach that is collaborative, cross-border, and global.
Finally, we know that financial innovation and the migration of financial activity create a moving target. So our goals to eliminate gaps in data and analysis, and to devise ideal shock absorbers and guard rails, will always elude us. All the more reason to collaborate to fill the most important gaps and work on the toolkit together.
Tools from other disciplines
Let me turn to discussing four financial stability analysis tools that are derived from and shared with other disciplines. They are: (1) data visualization, including heat maps, (2) stress testing, (3) network analysis, and (4) agent-based modeling.
Let’s start with data visualization. Picturing patterns in large datasets, now popularly known as “big data,” can be worth a thousand words — or more. Our work is focused on tail risk, rather than means or modes in normal times, so large, granular and diverse datasets are intrinsic to it. Visual tools are thus especially important for spotting and communicating tail risks in a sea of data.
Drawing conclusions simply from observing patterns in data is risky, however. I’m a fan of coherent granular data that animate an analytical framework. But measurement completely without theory, much less testable hypotheses, can be a slippery slope, one we should be wary of. As with children, small datasets mean small problems, while big datasets can mean very big ones.
Accordingly, as with most tools, visualizations work best when they are carefully adapted to the specific tasks to be performed. A recent OFR working paper emphasized that visual techniques that would be ideal for one task would be inappropriate for another. For example, the accountability requirements of financial rulemaking imply that fixed visuals are preferable, as is the case with engineering blueprints submitted for approval for compliance with building codes. On the other hand, like a medical sonogram, or real-time functional magnetic resonance imaging, the exploratory nature of an analyst’s discovery process during a financial stress event favors interactive visualizations.
At the OFR, the heat map in our Financial Stability Monitor is a key visualization tool. It depicts a framework that looks across the financial system at five categories of risk: macroeconomic, market, credit, funding and liquidity, and contagion. The monitor enables us to measure and track risks in each category wherever in the financial system they occur — in banks, shadow banks, other nonbanks, and markets. We update it and its supporting data semi-annually on our website.
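To make the heat-map idea concrete, here is a stylized sketch (not OFR code) of how raw risk indicators might be turned into comparable heat scores across the monitor's five categories. The indicator values, histories, and min-max normalization are all illustrative assumptions.

```python
# Stylized sketch: scaling raw risk indicators into heat map scores.
# Category names follow the five in the OFR Financial Stability Monitor;
# the indicator readings and histories below are hypothetical.

def heat_score(value, history):
    """Scale a reading to [0, 1] relative to its historical range."""
    lo, hi = min(history), max(history)
    if hi == lo:
        return 0.5  # no historical variation: neutral score
    return (value - lo) / (hi - lo)

categories = {  # name: (latest reading, historical readings)
    "macroeconomic":         (2.1, [0.5, 1.0, 3.0]),
    "market":                (1.4, [0.2, 0.8, 2.0]),
    "credit":                (0.9, [0.1, 1.5, 2.5]),
    "funding and liquidity": (1.8, [0.4, 2.0, 0.6]),
    "contagion":             (0.6, [0.3, 0.9, 1.8]),
}

heat = {name: heat_score(v, hist) for name, (v, hist) in categories.items()}
```

A real monitor aggregates many indicators per category and uses more careful scaling, but the principle is the same: a common [0, 1] scale makes risks comparable across very different corners of the financial system.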
But this is only a start. We’d love to compare notes with you on similar tools.
Stress testing has well-known roots in engineering and medicine, from tests of the resilience of bridges and other infrastructure to tests of cardiovascular function under stress.
In my view, regular financial stress testing is one of the best tools available both for assessing potential sources of vulnerabilities and for calibrating microprudential requirements, such as for capital based on firms’ idiosyncratic risks. I think stress tests might also be used to calibrate macroprudential tools, including those aimed at building resilience across the system. And of course, it is an important tool for risk management at financial firms.
At the OFR, we are required by statute to evaluate stress tests and similar tools. We are engaged in extensive dialogues to obtain access to the data used to conduct stress tests, and to suggest ways to improve them, including for nonbank institutions and systemwide risk assessment. Some key areas of our research related to stress testing include better ways to consider risk propagation or contagion in stress testing. In this regard, network approaches and agent-based modeling can be helpful interdisciplinary methods to move stress tests toward a systemwide framework. I’ll discuss each of those momentarily.
In our work, we are already borrowing from other disciplines. We’ve published several research papers on stress testing, including two on selecting stress test scenarios. These two papers use variants of Monte Carlo methods, which simulate uncertain scenarios to determine the distributions — including the tails — of outcomes. Another OFR paper published just this month extended techniques from engineering to quantify fundamental economic uncertainty and applied the method to an example of portfolio stress testing.
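As a stylized illustration of the Monte Carlo idea (not the methodology of the papers just mentioned), the following sketch simulates many joint shocks to a small portfolio and reads a tail loss off the simulated distribution. The exposures, shock volatilities, and 99th-percentile cutoff are all illustrative assumptions.

```python
# Stylized Monte Carlo sketch for portfolio stress scenarios:
# simulate many joint shocks, then examine the tail of the loss
# distribution. All numbers here are hypothetical.
import random

random.seed(7)  # fixed seed for reproducibility

exposures = {"equities": 40.0, "credit": 35.0, "rates": 25.0}
shock_vol = {"equities": 0.20, "credit": 0.10, "rates": 0.05}

def simulate_loss():
    """One scenario: draw a shock per risk factor, sum the losses."""
    return sum(exp * random.gauss(0.0, shock_vol[factor])
               for factor, exp in exposures.items())

losses = sorted(simulate_loss() for _ in range(10_000))
var_99 = losses[int(0.99 * len(losses))]  # 99th-percentile loss
```

In practice, scenario-selection research is largely about doing better than this baseline, for example by modeling correlated, fat-tailed shocks and by searching systematically for the scenarios that matter most for a given portfolio.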
Network analysis in financial stability work has clearly borrowed from other disciplines; for example, Gary Gorton and Andrew Metrick famously described the run on repo during the crisis in terms analogous to epidemiology. Network analysis was born in applied mathematics, but it has since been used to study vulnerabilities in computer networks, highway systems, and other areas.
The pre-crisis growth of finance was plainly evident. But until the crisis, the parallel increase in the connections and complexity of the global financial system was less well-understood. We now use network analysis for detailed and precise insights about the channels through which contagion spreads through the financial system during a crisis. The most recent research conference by the Bank for International Settlements and Financial Stability Board was devoted entirely to interconnectedness.
Interconnectedness creates both economic benefits and financial vulnerabilities. The process of financial intermediation can be simple, but to satisfy complex client needs, firms may innovate, turn to new products, and connect to specialized firms to obtain them. The rapid growth of securitization and the derivatives markets before the crisis was a good example. Such services reduced the costs of intermediation and helped to diversify or share risks. Other services provide the processing required for completing financial transactions. In these ways, connections can contribute to a more efficient and resilient financial system.
However, interconnections can also act as channels for transmitting or amplifying financial shocks. Liquidity or credit shocks in one part of the financial system may spread to other parts, resulting in runs and fire sales. Interconnected systems or networks also tend to be more opaque. The opacity of exposures among firms and markets can trigger individually rational but collectively procyclical behavior, amplifying the effects of an initial shock. And more concentrated, highly interconnected systems with a few key players can be particularly vulnerable to shocks.
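The amplification channel just described can be illustrated with a toy default cascade on an interbank exposure network. The four banks, exposures, and capital buffers below are entirely hypothetical; the point is only that one failure can propagate along the chain of connections.

```python
# Toy default cascade on an interbank exposure network.
# A lender defaults when its losses on defaulted borrowers
# exceed its capital buffer. All figures are hypothetical.

exposures = {  # exposures[lender][borrower] = amount owed to lender
    "A": {"B": 30.0},
    "B": {"C": 40.0},
    "C": {"D": 10.0},
    "D": {},
}
capital = {"A": 25.0, "B": 35.0, "C": 50.0, "D": 20.0}

def cascade(initial_default):
    """Return the set of banks that ultimately default."""
    defaulted = {initial_default}
    changed = True
    while changed:  # iterate until no new defaults occur
        changed = False
        for bank, owed in exposures.items():
            if bank in defaulted:
                continue
            loss = sum(amt for borrower, amt in owed.items()
                       if borrower in defaulted)
            if loss > capital[bank]:
                defaulted.add(bank)
                changed = True
    return defaulted
```

In this toy network, a default by D stops immediately because C's buffer absorbs the loss, but a default by C topples B and then A: the same shock size has very different systemwide consequences depending on where it hits, which is precisely what network analysis is designed to reveal.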
Constructing and analyzing financial networks is complicated. We are mapping significant portions of the financial system to help simplify and understand real-world interconnections among institutions and markets. For example, we have mapped sources and uses for liquidity and funding in securities financing transactions, highlighting the roles of broker-dealers, hedge funds, and other borrowers, and the roles of lenders such as money funds and asset managers. We are developing, or will use, analogous maps for collateral use and payments, clearing, and settlement systems to track operational and other risks.
Agent-based modeling (models with heterogeneous agents)
Agent-based modeling, or modeling with heterogeneous agents, is a tool used in many disciplines. It has been used to attempt to explain social interactions, traffic flow and other crowd dynamics, the spread of epidemics — even the behavior of birds in flight. The view of financial calamities as the aggregated actions and interactions of individuals is the essence of agent-based modeling, and it offers intriguing insights for financial stability research. Just like the birds in a flock, individual market participants make individual decisions and react to the behavior of other participants during a crisis. To understand the dynamics of a crisis, agent-based modeling examines and aggregates the actions and interactions of the participants.
In late 2012, we published a working paper about agent-based modeling that launched what has become a significant line of research for the OFR. The paper explained that agent-based models seek to incorporate the complexity of behavior among financial firms and the tendency of firms to change behavior during a crisis. The paper described how the behavior of individual firms or “agents” can affect outcomes in complex systems.
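A minimal fire-sale dynamic illustrates the agent-based idea (this is a stylized sketch, not the model in the working paper). Heterogeneous agents each sell once the price falls below their own trigger, and each sale pushes the price down further, so a small initial shock can cascade. The thresholds, price impact, and initial shock are illustrative assumptions.

```python
# Stylized agent-based fire-sale sketch: heterogeneous agents sell
# when the price falls below their individual threshold, and each
# sale depresses the price further. All parameters are hypothetical.

def fire_sale(thresholds, price, impact_per_sale):
    """Return (final_price, number_of_sellers) after the cascade."""
    sold = set()
    changed = True
    while changed:  # iterate until no agent's trigger is newly hit
        changed = False
        for i, trigger in enumerate(thresholds):
            if i not in sold and price < trigger:
                sold.add(i)
                price -= impact_per_sale  # each sale depresses the price
                changed = True
    return price, len(sold)

# A small shock (price 100 -> 98) can set off a cascade of sales:
thresholds = [99, 97, 95, 90, 80]  # heterogeneous sell triggers
final_price, sellers = fire_sale(thresholds, 98.0, 3.0)
```

At the starting price of 100, no agent sells; after a two-point shock, four of the five agents sell in sequence and the price falls much further than the initial shock alone would imply. That amplification of individually rational behavior is the phenomenon agent-based models are built to capture.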
As I noted, network approaches and agent-based modeling can help move stress testing toward a systemwide framework by capturing risk propagation, or contagion. One example is a recent OFR working paper that presents a dynamic macroprudential stress testing framework. And our contributions to a forthcoming Basel Committee working paper suggest that incorporating second-round effects through network methods materially affects the assessment of risks.
Essential data science
I’d like to conclude with a focus on data. Gaps persist in the scope and quality of the data needed to measure financial activity, and filling them is essential. Without high-quality data that accurately measure financial activity across the system, effective oversight of global markets and firms, and effective assessment and monitoring of financial stability risks, are impossible. Obviously, that’s equally true in the physical sciences, and in engineering and medicine. To fill these gaps, we at the OFR have several initiatives underway:
OFR projects are underway to collect critical, transaction-level data on bilateral repo and securities lending activities, where vulnerabilities and data gaps remain. The markets for these critical short-term funding instruments continue to be vulnerable to runs and asset fire sales.
We are helping the Commodity Futures Trading Commission and other regulators improve data quality in registered swap data repositories. These repositories are designed to be high-quality, low-cost collection points for data that are critical to understanding exposures and connections across the financial system. To ensure global consistency in these data, we are also collaborating with our global counterparts.
We are also improving the quality of financial data by developing and promoting the use of data standards. We have led a foundational initiative among governments and private industry worldwide to establish a global Legal Entity Identifier or LEI — a data standard that is like a bar code for precisely and uniquely identifying parties to financial transactions. If the LEI system had been in place in 2008, the industry, regulators, and policymakers would have been better able to trace the exposures and connections of Lehman Brothers and others across the financial system. The LEI initiative has become fully operational in just a few years. But ubiquity is needed to realize its full benefits, so I have called for mandating its use for regulatory reporting.
I have outlined some ingredients for good data and good analysis. Of course, the ultimate goal is good policy. What tools should we put into the macroprudential toolkit and how should we calibrate them? In the past five years, the policy toolkit to address vulnerabilities has improved substantially. The adoption of new rules for capital and liquidity strengthened banking systems globally. However, much more remains to be done, especially outside banks and across the rest of the financial system. Vulnerabilities can arise in nonbank financial intermediaries and in markets. Both micro- and macroprudential tools will likely be needed to address them.
Work to identify risks in nonbank entities should identify the activities that can give rise to vulnerabilities. Use of derivatives, secured funding, illiquid asset concentrations, counterparty credit concentrations, and the obligations of central counterparty (CCP) membership may all contribute to the interconnectedness of these firms in ways that could be relevant to financial stability. An activities-based approach will also help target policy measures — such as counterparty concentration limits, large position monitoring, and liquidity tools to manage redemption risks. Finally, an activities-based approach may require policy tools aimed at markets rather than at entities, such as minimum floors for repo haircuts to reduce excessive reliance on short-term, wholesale funding.
What will happen after this conference ends tomorrow afternoon? When we go back to work on Monday, will we do anything differently? We should challenge ourselves to find ways to operationalize what we learn here and take action to foster the subject of this conference.
Some ideas might include:
- Establishing interdisciplinary centers of excellence at universities to offer coursework and promote interdisciplinary research.
- Promoting coursework and research on data science across many disciplines, to help elevate this work to its rightful place as an essential tool for decisionmaking.
- Collaborative efforts among financial policymakers to learn from other disciplines about resilience and crisis management.
Over the next two days, I look forward to learning from you.
Thanks for your attention. I would be happy to answer some questions.