How Best to Collect Data from Financial Companies
Published: May 10, 2016
Today, the Office of Financial Research released its first paper in a new Viewpoint Series. This first viewpoint describes best practices for financial regulatory data collections.
Developing and promoting such best practices is part of our mission. By following best practices, financial regulators can align their interests with those of industry and improve the quality of the data collected. Regulators can thus make the process of data collection smoother, more efficient, and less costly for themselves and for the financial companies that report information to the authorities.
In short, best practices help ensure better data and a lighter regulatory reporting burden.
The Dodd-Frank Act, which established the OFR, created two OFR pillars: one for data and one for research. Both pillars matter, but the data part of our mission is what really makes us unique. Other government entities conduct financial stability research and collect financial data. But the OFR has a singular mandate and authority to collect financial data, and to make them usable for assessing and monitoring threats to financial stability and for informing policies to mitigate those threats.
To achieve its mission, the OFR is working to improve the scope, quality, and accessibility of financial data. We need good data — detailed, consistent data — to assess and monitor vulnerabilities that may arise from leverage, complexity, interconnectedness, risk concentration, and other potential sources of threats.
The OFR viewpoint paper describes the thoughtful preparation required for an effective data collection, the importance of designing a clear, well-specified collection template, and the best ways to securely transmit data. It also discusses pitfalls and regulatory considerations in data collection.
A particularly important best practice is to run a pilot data collection to engage with market participants and obtain their input. This practice helps ensure the success of the final collection, especially when the data involve new products or activities, new data reporters, or new technologies.
That’s the approach we used starting in 2014 when we partnered with the Federal Reserve System, with input from the Securities and Exchange Commission, on a pilot data collection covering the market for bilateral repurchase (or repo) agreements. A repo is essentially a collateralized loan, in which one party sells a security to another party with an agreement to repurchase it later at an agreed price. Repos, an important source of short-term funding for the financial industry, came under stress during the financial crisis.
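The repo economics described above can be illustrated with a small worked calculation. This is a hedged sketch with invented figures; the function name and the dollar amounts are hypothetical and are not drawn from the OFR pilot data.

```python
# Illustrative repo arithmetic (hypothetical numbers, not OFR data).
# In a repo, the cash borrower sells a security and agrees to repurchase it
# later at a higher price; the price difference implies an interest rate.

def implied_repo_rate(sale_price: float, repurchase_price: float, days: int) -> float:
    """Annualized rate implied by the two legs, using an actual/360 day count."""
    return (repurchase_price - sale_price) / sale_price * (360 / days)

# Example: sell at $10,000,000 and repurchase the next day at $10,000,139
# (an overnight repo). The $139 difference is the lender's interest.
rate = implied_repo_rate(10_000_000, 10_000_139, days=1)
print(f"{rate:.4%}")  # roughly 0.50% annualized
```

The actual/360 day count shown here is a common money-market convention, but conventions vary by market; the point is only that the repurchase premium functions as interest on a collateralized loan.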
The repo market remains vulnerable to runs and asset fire sales that can threaten financial stability. However, until now, data on bilateral repos were scant. Using our pilot data collection, we estimate that bilateral transactions constitute about half of the overall U.S. repo market, which provides more than $3 trillion in funding every day.
We are also working with the same agencies on a second pilot to fill gaps in data describing securities lending activities, in which securities owners lend stocks or bonds to other parties. These loans are secured by collateral. During the recent financial crisis, some securities lenders suffered large losses on cash collateral reinvested in other securities. Those losses were one of the reasons the government provided assistance to prevent the bankruptcy of the insurance company American International Group.
Our first Viewpoint Series paper offers useful guidance on how best to collect the data essential for our important work and that of financial regulators around the globe — and through those efforts, lessen the burdens on industry.
Richard Berner is Director of the Office of Financial Research.