Defining Success for the FDTA

Posted Monday, July 1

By Campbell Pryde, President and CEO, XBRL US

The standardization of regulatory requirements called for in the FDTA is of critical importance to us all. Why?

Many in Congress and the courts increasingly view the cost of government regulation as a burden and look to limit the scope of regulatory agencies.

At the same time, the data collected by regulatory agencies is becoming more important to the functioning of a modern economy. As geopolitical, environmental, market, liquidity, counterparty, and political risks grow, timely and comprehensive information is essential to navigate them. In many cases this information can only be made available by the government.

The cost and burden of collecting this data must be reduced, and the accessibility of this data must be improved. Both can be achieved through standardization. Many regulators dislike standardization, believing it limits their flexibility to regulate as they see fit. This is a trap that must be avoided. Digital standardization enforces a disciplined and structured approach, resulting in a regulatory framework that is clear and robust, and not perceived as petty, irrelevant, or enforced to meet bureaucratic targets.

As regulators consider implementation of the FDTA, and as stakeholders respond to the cross-agency rule proposal that we expect to see any day now, it is important to keep in mind what constitutes success: 

  1. BETTER DATA: Data needs to be reliable, unambiguous, machine-readable, timely, and interoperable.
  2. REDUCED COST: There must be a reduction in reporting burden and cost across all stakeholders. 
  3. FLEXIBILITY: Regulators must have the ability to adapt to changes in reporting needs and technology over time. 

This may sound like a big “ask”, but it has already been done in hundreds of successful data standards programs rolled out by regulators worldwide, including the Securities and Exchange Commission and the FDIC, both agencies that fall under the FDTA.

We cannot let the FDTA be shortchanged by inertia, by concerns that collaboration and reaching a consensus are too complicated or simply not feasible, that it is too restrictive for regulators, or that it is too much of a disruption to the market.

Over the last 120 years, standardization across industry and government has driven massive increases in productivity and wealth creation. No one argues for eliminating standards that are already in place. Yet the adoption of new standards always encounters strong resistance: in many cases, the parties who must adopt them incur the cost of adoption and, more importantly, may lose influence. The benefits, though far greater, are distributed across a broad spectrum of society.

Standards that we use every day overcame many of the same hurdles we hear raised today against the FDTA. A few of the complaints we’ve heard include:

  • “It will disrupt established practices” – this same argument was used against standardized time zones.
  • “It is an unfunded mandate, too expensive” – this argument was used against the standardization of railroad track gauges.
  • “It is too much of a regulatory burden” – this was the case made against the adoption of accounting standards.
  • “Markets function fine as they are today, standards are not necessary” – this argument was made against the barcode.
  • “It is too disruptive and you cannot get everyone to adopt it” – this was the argument made against containerized shipping.

Imagine if those arguments had won and we had no standardized time zones, no accounting standards, no consistent railroad track gauges, and no barcodes today.

We know that the FDTA can meet its goals if agencies adopt standards like the legal entity identifier and XBRL. Anything less, whether waiting until machine learning/AI can accurately, cheaply, and consistently translate PDF files into granular financial data, or giving agencies the latitude to create their own custom standards and identifiers, will be more expensive and will not accomplish what Senators Crapo and Warner envisioned with the FDTA.
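Part of what makes a standard like the legal entity identifier valuable is that it is mechanically checkable: ISO 17442 specifies a 20-character identifier whose last two digits are ISO 7064 MOD 97-10 check digits, so any system can reject a mistyped identifier before it pollutes a dataset. A minimal sketch of that validation (not an official GLEIF implementation; the sample prefix in the usage note is hypothetical):

```python
def _to_digits(s):
    # ISO 7064 MOD 97-10: letters map to two-digit numbers, A=10 ... Z=35;
    # int(c, 36) yields exactly that mapping (digits map to themselves).
    return "".join(str(int(c, 36)) for c in s)

def lei_check_digits(prefix18):
    """Compute the two check digits for an 18-character LEI prefix."""
    remainder = int(_to_digits(prefix18.upper() + "00")) % 97
    return f"{98 - remainder:02d}"

def is_valid_lei(lei):
    """A well-formed 20-character LEI yields remainder 1 under MOD 97-10."""
    lei = lei.upper()
    if len(lei) != 20 or not lei.isalnum():
        return False
    return int(_to_digits(lei)) % 97 == 1
```

For example, given a hypothetical prefix such as `5493001KJTIIGC8Y1R`, `lei_check_digits` produces the two digits that make `is_valid_lei` return True for the full 20-character string, and any single-digit transcription error in the check digits makes it return False.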

Standardized digital models that reflect US GAAP and other broadly adopted data standards, supported by agency-wide standard identifiers for legal entities, ABS, mortgages, and securities, will result in:

  • Lower costs to collect, prepare, and analyze data
  • Data that is interoperable and can be easily shared, commingled, and inventoried
  • The ability to adapt quickly and inexpensively to reporting changes driven by new accounting standards or industry needs
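The interoperability claim above is concrete: when a fact is tagged against a shared taxonomy, any program can read it without a bespoke, filer-specific parser. A minimal sketch, assuming a simplified XBRL-style instance with one tagged US GAAP fact (the taxonomy namespace URI shown is illustrative; real us-gaap namespaces vary by taxonomy year):

```python
import xml.etree.ElementTree as ET

# A hypothetical, heavily simplified XBRL-style instance: one fact tagged
# with a standard concept name, a reporting context, and a currency unit.
INSTANCE = """<?xml version="1.0"?>
<xbrl xmlns="http://www.xbrl.org/2003/instance"
      xmlns:us-gaap="http://fasb.org/us-gaap/2024">
  <unit id="usd"><measure>iso4217:USD</measure></unit>
  <us-gaap:Assets contextRef="c1" unitRef="usd" decimals="0">1250000</us-gaap:Assets>
</xbrl>"""

def extract_facts(xml_text):
    """Pull every us-gaap-tagged fact into a plain dict any program can use."""
    root = ET.fromstring(xml_text)
    ns = "{http://fasb.org/us-gaap/2024}"  # illustrative taxonomy namespace
    facts = {}
    for el in root:
        if el.tag.startswith(ns):
            concept = el.tag[len(ns):]  # e.g. "Assets"
            facts[concept] = {
                "value": int(el.text),
                "unit": el.get("unitRef"),
                "context": el.get("contextRef"),
            }
    return facts
```

Because the concept name, unit, and context travel with the value, the same few lines work on any filing tagged to the same taxonomy, which is precisely what makes standardized data cheap to collect, share, and commingle.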

We should not settle for anything less. 

If the proposed rule limits the broad adoption of standards and the agencies attempt to gut standardization in the name of regulatory flexibility, I implore everyone to contact the agencies and Congress and make it very clear that minimizing the impact of standardization is a huge mistake.

Statement presented at the Data Foundation FDTA Forum 2024: Defining Success, June 27, 2024