Derivatives data reporting – the lynchpin of a far greater objective

By Allan Grody, president of Financial InterGroup Advisors

On the way to regulating the formerly bilateral, dealer-managed global over-the-counter (OTC) derivatives market, regulators stumbled on a pervasive issue that has stymied global financial services firms for generations: how to aggregate a disparate set of financial transactions across a global market to see a consistent, timely view of their effect on systemic risk.

In the US, legislators and regulators have defined broad principles for regulating this market, embodied in the 2010 Dodd-Frank Act (DFA). In Europe, the European Parliament has had a similar effect through passage of the European Market Infrastructure Regulation (Emir), as has comparable legislation elsewhere, bringing formerly unregulated OTC derivatives transactions under regulatory scrutiny.

That we are attempting to do this in one transaction category, OTC derivatives, through a new regulatory regime, and are having trouble with its implementation, masks a much bigger issue. That issue, set to be addressed through regulation in 2018, concerns a myriad of other asset categories due to traverse a similar processing path, even though that path has yet to be completed and has so far proved dysfunctional.

Reporting of the granular data components of OTC derivatives transactions to multiple regulators has been implemented without first giving data standards and technology their rightful priority. Treating these as an afterthought at best, elected officials and their regulators rushed to set a framework for reform, declared the problem solved, then proceeded to implement the framework-level objectives. They were met with what can only be described as a challenge to their thoughtless disregard for the plumbing that would first have to be put in place to provide any chance of organising granular transaction data.

On this point of prematurely establishing regulations, the Commodity Futures Trading Commission conceded that it did not have an informed view at the time it passed its rules and had therefore hoped the industry would take coordinated steps toward standardising reporting to regulators.

The result is that regulators still find themselves without the capability to receive, access, store, analyse or aggregate this data for any purpose, least of all the primary objective of systemic risk assessment.

That the plumbing has yet to be put in place for just one market, OTC derivatives, when all other tradeable asset categories are set to follow this same path beginning in 2018, is a red flag for any prudent observer of, or participant in, this effort. It certainly defies all tenets of good systems design, for that is what new regulation at the implementation level is all about.

Implementing a comprehensive system to accommodate regulators' framework-level objectives just for OTC derivatives has now been underway for over six years. While a lot has been accomplished, much more is still left to be done.

The Global Legal Entity Identifier Foundation (GLEIF) has assigned a half million legal entity identifier (LEI) codes, which are being used mainly in reporting OTC derivatives transactions to trade repositories. However, nearly one-third of these have not been renewed annually as required. No study of aggregating LEIs across multiple trade repositories has been conducted, owing to the failure to describe and then standardise transaction data elements. A pilot study is currently underway to gather relationship data, starting with immediate and ultimate parent LEI data, so that counterparties can be aggregated up through their parent entities. This process is essential to the main objective in regulating the OTC derivatives markets: analysing systemic risk.

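To make the aggregation idea concrete, here is a minimal sketch of rolling counterparty exposures up through immediate-parent records to an ultimate parent. The LEIs, field names and figures are entirely hypothetical, not actual GLEIF data.

```python
# Illustrative sketch: roll per-counterparty exposures up to the
# ultimate parent entity using LEI relationship records.
# All LEIs and amounts below are hypothetical.

# child LEI -> immediate parent LEI (no entry = no reported parent)
IMMEDIATE_PARENT = {
    "5299000AAAA000000028": "5299000BBBB000000055",
    "5299000BBBB000000055": "5299000CCCC000000047",
}

def ultimate_parent(lei: str) -> str:
    """Walk the immediate-parent chain until an entity with no parent."""
    seen = set()
    while lei in IMMEDIATE_PARENT and lei not in seen:
        seen.add(lei)  # guard against cyclic relationship records
        lei = IMMEDIATE_PARENT[lei]
    return lei

def aggregate_by_parent(exposures: dict) -> dict:
    """Sum per-counterparty exposures under each ultimate parent LEI."""
    totals = {}
    for lei, amount in exposures.items():
        parent = ultimate_parent(lei)
        totals[parent] = totals.get(parent, 0.0) + amount
    return totals

print(aggregate_by_parent({
    "5299000AAAA000000028": 120.0,  # subsidiary
    "5299000BBBB000000055": 80.0,   # intermediate parent
}))
# -> {'5299000CCCC000000047': 200.0}
```
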
The pilot is to last six months. It is currently in its second month and, while it is early days, about one-third of the reporting data intermediaries, known as Local Operating Units (LOUs), have yet to begin reporting relationship data. Thereafter, an evaluation will be conducted to determine whether objectives have been met, whether any flaws have appeared, whether any modifications are indicated and, most importantly, what the timing should be for a 'go/no-go' decision.

The LEI is the most fundamental of the identifiers, as it is to be paired with both the Unique Transaction Identifier (UTI) and the Unique Product Identifier (UPI) to enable aggregation of financial transactions by asset class and counterparty, a prerequisite for analysing systemic risk. The LEI is also central to establishing the uniqueness of the financial transaction data to be reported to trade repositories and, ultimately, to regulators.

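As a stylised illustration of that pairing, consider a handful of hypothetical trade reports: the UTI lets a repository count a doubly submitted trade once, while the UPI and LEI provide the axes for aggregation by product and counterparty. This is a sketch of the concept, not any repository's actual schema.

```python
# Stylised sketch: UTI de-duplicates reports, UPI groups by product,
# LEI groups by counterparty. Records and field names are hypothetical.
from collections import defaultdict

reports = [
    {"uti": "UTI-0001", "upi": "IR-Swap-Fixed-Float", "lei": "LEI-A", "notional": 10e6},
    {"uti": "UTI-0001", "upi": "IR-Swap-Fixed-Float", "lei": "LEI-A", "notional": 10e6},  # duplicate submission
    {"uti": "UTI-0002", "upi": "FX-Forward-EURUSD", "lei": "LEI-B", "notional": 5e6},
]

# Keying on the UTI makes each transaction count exactly once.
unique_trades = {r["uti"]: r for r in reports}

# Aggregate notional along the two axes systemic risk analysis needs:
# product (UPI) and counterparty (LEI).
totals = defaultdict(float)
for r in unique_trades.values():
    totals[(r["upi"], r["lei"])] += r["notional"]

print(dict(totals))
# -> {('IR-Swap-Fixed-Float', 'LEI-A'): 10000000.0,
#     ('FX-Forward-EURUSD', 'LEI-B'): 5000000.0}
```
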
Other activities underway but not yet accomplished include agreement on universal code construction techniques for both the UTI and the UPI; a UPI assignment platform for OTC derivatives; and a corresponding UTI assignment platform. A further requirement is to first identify and then harmonise (that is, standardise) all the data elements associated with OTC derivatives.

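By way of illustration, one construction technique discussed in the harmonisation work pairs the 20-character LEI of the entity generating the UTI with a suffix that entity guarantees is unique, within an overall cap of 52 alphanumeric characters. The sketch below assumes that scheme; the helper names are illustrative, not drawn from any standard.

```python
# Hedged sketch of one discussed UTI construction: the generating
# entity's 20-char LEI plus a locally unique suffix, 52 chars maximum.
import re

# 20-char LEI prefix plus a 1-32 char suffix => 21-52 alphanumerics
UTI_PATTERN = re.compile(r"^[A-Z0-9]{21,52}$")

def make_uti(generator_lei: str, local_id: str) -> str:
    """Concatenate the generator's LEI and a locally unique suffix."""
    uti = (generator_lei + local_id).upper()
    if len(generator_lei) != 20 or not UTI_PATTERN.match(uti):
        raise ValueError("UTI must be a 20-char LEI plus 1-32 alphanumerics")
    return uti

print(make_uti("5299000CCCC000000047", "TRADE000187"))
# -> '5299000CCCC000000047TRADE000187'
```
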
The Bank of England recently reported on its attempt to use the data already reported in just one sub-segment of the OTC derivatives market, foreign exchange derivatives, held in a single trade repository, DTCC's Trade Repository, and found significant data quality issues with the newly created UTIs, UPIs and LEIs.

Notwithstanding this, European Union regulators are proceeding to mandate an even more comprehensive set of data requirements for most other tradeable instruments, built on the work to date on OTC derivatives requirements. New rules, referred to as the 'no LEI, no trade' rule and the 'no LEI, no admission to trading' rule, are to go into effect this coming January. The rules require an LEI to be present within an order's data elements before a trade is placed on a trading venue, and for the issuer of an instrument (stock, bond, et al) likewise to obtain an LEI before the instrument is admitted to trading. The trading venues are required to submit granular instrument-level reference data to the newly built Financial Instrument Reference Data System.

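At the systems level, the 'no LEI, no trade' rule implies a simple gate in front of order submission. The sketch below assumes a hypothetical order layout; the structural test itself follows ISO 17442, under which an LEI is 20 characters ending in two check digits that satisfy the ISO 7064 MOD 97-10 checksum.

```python
# Sketch of a 'no LEI, no trade' gate. The order layout is hypothetical;
# the LEI structure check follows ISO 17442 / ISO 7064 MOD 97-10.
import re

_LEI_RE = re.compile(r"^[A-Z0-9]{18}[0-9]{2}$")

def lei_is_valid(lei: str) -> bool:
    """20 chars, two numeric check digits, MOD 97-10 checksum of 1."""
    if not _LEI_RE.match(lei):
        return False
    # Map letters to numbers (A=10 ... Z=35), then test mod 97 == 1.
    digits = "".join(str(int(c, 36)) for c in lei)
    return int(digits) % 97 == 1

def accept_order(order: dict) -> dict:
    """Reject any order lacking a structurally valid counterparty LEI."""
    if not lei_is_valid(order.get("client_lei", "")):
        raise ValueError("no LEI, no trade: order rejected")
    return order  # in a real system, forward to the trading venue here

try:
    accept_order({"instrument": "XS0000000000", "client_lei": ""})
except ValueError as err:
    print(err)  # -> no LEI, no trade: order rejected
```
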
While all this is going on, heading towards a January 2018 implementation, new distributed ledger technology (DLT), of which blockchain is the best-known form, and smart contracts are being promoted in a new vision for data assembly, validation and storage. This vision has the backing of a consortium of global financial institutions. The vision, now finding its way into operative systems, could potentially eliminate data intermediaries and market utilities, most readily in the standardisation of identity data. One such solution has already been proposed as a proof of concept. Another practical implementation was presented recently by the CEO of the GLEIF.

The mindset of hurrying to pass regulations and then figuring out how to implement them has so far led to incremental, mainly partial and unproven implementations, each built around legacy best practices. The history so far for OTC derivatives implementations, the precursor of the underlying plumbing for all systems yet to come, is a dysfunctional set of unproven data standards and trade repositories that still lack the ability to aggregate any of these transactions, now numbering well into the billions. Imposing potentially millions of additional LEIs and UPIs, and the data elements comprising each newly reported financial transaction, on a yet-to-be-tested system is not prudent.

This should be foremost in the minds of regulators and, especially, industry groups, as the major implementation milestone is only six months away.
