BlackRock calls for blockchain to fix futures processing snags

The asset manager wants the industry to move faster in adopting a “single source of truth” model.

Distributed ledger technology (DLT) should be used to mend the operational breakdowns in futures processing seen during the pandemic-induced meltdown of 2020, according to the world’s largest asset manager.

“A DLT-type solution is a perfect example of how you would distribute data and a single source of truth. And that allows us to move away from a model where we essentially have every single participant across the industry replicating and reconciling, and ultimately burning a lot of calories,” said Tony Ashraf, global head of derivatives operations, collateral management and clearing at BlackRock.

“Moving to a model where we accept a single source of truth and we synchronize from it, we have a clear view in terms of the status of our trades at any point in time. I think that’s the first piece that we need to think about,” he said.
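By way of illustration only (the trade records and field names below are invented, not anything BlackRock or the panel described), the difference between today's replicate-and-reconcile model and a single source of truth can be sketched roughly as follows:

```python
# Hypothetical sketch: "replicate and reconcile" vs. "single source of truth".
# The trade records and field names are invented for illustration only.

# Today: each participant keeps its own copy of the trade, and breaks are
# found by reconciling those copies against each other.
asset_manager_view  = {"trade_id": "T1", "qty": 500, "status": "cleared"}
broker_view         = {"trade_id": "T1", "qty": 500, "status": "pending"}
clearing_house_view = {"trade_id": "T1", "qty": 500, "status": "cleared"}

def reconcile(views: list[dict]) -> list[str]:
    """Return the fields on which the participants' copies disagree."""
    breaks = []
    for field in views[0]:
        values = {view[field] for view in views}
        if len(values) > 1:
            breaks.append(f"{field}: {values}")
    return breaks

print(reconcile([asset_manager_view, broker_view, clearing_house_view]))
# e.g. ["status: {'cleared', 'pending'}"] -> a break someone has to chase.

# Single source of truth: every participant synchronizes from one shared
# record, so the status of the trade is unambiguous at any point in time.
shared_ledger = {"T1": {"qty": 500, "status": "cleared"}}
print(shared_ledger["T1"]["status"])  # all parties read the same state
```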

Ashraf was speaking on behalf of the $10 trillion asset manager during a panel at Eurex’s Derivatives Forum in Frankfurt today (May 25).

The problems encountered in 2020 caused sleepless nights for back-office staff, as many futures trades suffered breaks, resulting in missed margin payments and positions left on the books of the wrong brokers.

The debacle led dealers to consider creating a new utility under the auspices of the Futures Industry Association.

Nick Solinger, chief executive of FIA Tech, a for-profit subsidiary of the trade body that processes much of the position transfer – or give-up – activity at clearing houses around the world, said volumes during the peak of Russia’s invasion of Ukraine were 50% higher than at the outbreak of the Covid pandemic, though backlogs proved less significant.

During Covid, delays largely occurred in the back offices of clearing brokers, which had connectivity issues with clearing houses. During the Russian invasion in February, the majority of delays stemmed from “executing brokers getting allocations from clients, processing them before the close of clearing and getting them out,” according to Solinger.

Since March 2020, the FIA has been coordinating work with the industry to improve processes in trade allocation. Market participants have already resolved some bandwidth issues with exchanges, while the FIA has set about creating a universal set of procedures for the trading and clearing lifecycle.

A working group of about 40 firms across the buy side and sell side has been convened to ensure better interoperability between systems. The group is focused on standardizing the data that vendor systems, including Bloomberg, FIS and Ion, use to communicate with each other. The group has begun to create a golden source of specifications for data used to identify products and brokers to minimize the risk of trade breaks, as well as harmonizing data coming out of clearing houses.

“I certainly recognize that we’ve made progress. I think we need to move faster, to be perfectly honest,” said BlackRock’s Ashraf. “There’s been so much talk about digital assets and DLT… I think the benefits that [DLT] offers is essentially what we need today.”

Speaking on the same panel, Per Haga, global head of prime derivatives services product at Barclays, said bilateral point-to-point connections between clearing houses and members had been expanded but more work was needed on connections between brokers and buy-side clients “to create as robust a process as possible”.

Before a technology solution is chosen, Haga favors an interim step of a “central repository we can all consume from”.

He added this would act as a golden source for trade data, “making it much easier for the buy side to add another clearing broker, as there’s less integration work to be done”.

Solinger agreed on the need for a golden source of data, calling the last generation of middleware platforms “ossified”, but he warned they are deeply entrenched in the system.

“You can’t really move away from them because they automated the differences as opposed to creating a new market standard.”

On the same panel, Melanie Weber, senior vice-president for derivatives clearing design at Eurex Clearing, said average pricing for split orders was a focus for the clearing house: “We have seen in the allocation process, it might be one of the pain points. So we started a journey on what can be improved in our average pricing offering.”

In 2020, processing congestion was exacerbated by buy-side traders needing to work futures orders in pieces over a whole trading session, before allocating average prices across hundreds or thousands of sub-funds.

This led to calls to standardize the allocation flow from the buy side to futures commission merchants (FCMs), as well as for greater standardization of clearing house support for average pricing.
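As a rough, hypothetical illustration of that workflow (the fills, prices and sub-fund names below are invented), a parent order worked in pieces is assigned a single volume-weighted average price, which is then applied uniformly across the sub-fund allocations:

```python
# Hypothetical sketch of average pricing for a parent order worked in pieces.
# Fills, sub-fund names and sizes are invented for illustration only.

fills = [  # (lots, price) for each partial execution over the session
    (200, 101.25),
    (500, 101.50),
    (300, 101.10),
]

total_lots = sum(lots for lots, _ in fills)
avg_price = sum(lots * price for lots, price in fills) / total_lots
print(f"average price across {total_lots} lots: {avg_price:.4f}")

# Allocate the parent order across sub-funds at the single average price,
# instead of booking each sub-fund a different mix of fills.
allocations = {"FUND_A": 400, "FUND_B": 350, "FUND_C": 250}
assert sum(allocations.values()) == total_lots

for fund, lots in allocations.items():
    print(f"{fund}: {lots} lots @ {avg_price:.4f}")
```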

Eurex has plans to introduce a new average pricing offering by year-end.
