If it ain’t broke, break it: Back-office tech reform may benefit front-office returns

Better data visibility across multiple systems could provide a driver for technological change in the world of post-trade.

If the post-trade world has a mantra, it would be, “If it ain’t broke, don’t fix it.” If the data is flowing, trades are settling smoothly, payments are going to the right place, clients are getting the right reports and settlement notifications, then don’t mess with it. Don’t introduce any more complexity. Don’t roll out anything new that might interfere with processes that are running fine.

Behemoth back-office infrastructures are highly fragile and sensitive to change. The “If it ain’t broke, don’t fix it” approach may keep critical post-trade processes running smoothly, but it fosters stagnation and a false sense of security, and it inhibits firms’ ability to implement meaningful change elsewhere in their organizations. After all, any enterprise-wide change must include the back office, and some argue it should be driven by it.

“Banks and asset managers are starting to realize that if we can reduce our costs in the middle and back office, that could give the front-office guys some new ways to either cut costs or find additional alpha,” says Nick Gordon, CEO and co-founder of London-based tech startup Adnitio.

He says the post-trade environment has traditionally been treated as a cost center, while the front office receives greater focus because of its status as a revenue generator, but this is starting to change.

Gordon was one of the original founders of transaction monitoring and analytics provider Velocimetrics (now known as Beeks Group). Adnitio’s middle- and back-office real-time tracking tool is based on Velocimetrics’ network monitoring and packet capture and analysis solutions, which were originally developed for low-latency and high-frequency trading clients.


Why would latency-monitoring tools be relevant to post-trade processes? Brad Bailey, head of market intelligence at broker-dealer Clear Street, says the post-trade world is beset by barriers to data flows. “If we consider the post-trade ecosystem holistically across functions and firm types, the main choke points often stem from the same fundamental problems: poor or inaccessible data, manual processes, and highly fragmented, antiquated technology. From a functional perspective, these issues create choke points around trade processing, allocations, corporate actions, collateral access and movement, and settlement,” he says.

And parts of the industry seem content to leave things that way. In some cases, this is because of the potential cost of making changes. In others, it’s the sheer complexity involved in keeping all the disparate systems and processes that comprise a post-trade environment—everything from clearing and settlement to asset servicing, custody, and reporting—running smoothly.

One reason for the fragility of these environments is that each system is often woven into many legacy upstream and downstream systems. According to a study conducted by Broadridge Financial Solutions and Firebrand Research, most sell-side firms have siloed infrastructures across the range of asset classes, often maintaining separate middle- and back-office operations and technologies for equities, fixed income, and derivatives.

“Some medium-sized organizations have, on average, nine post-trade solutions. You’ve got organizations working in siloes across asset classes and business lines,” says Danny Green, head of international post-trade at Broadridge. “So what happens when you want to try and increase your levels of efficiency? When you launch an initiative, you’ve got to impact, on average, nine different ecosystems. Therefore, change becomes difficult to implement, and quite expensive.”

Despite the cost and complexity, firms are beginning to realize that they need to innovate. For example, BNP Paribas Securities Services is investing in its back-office processes, particularly within corporate actions, and Societe Generale is leading a consortium of banks to solve data management issues using privacy-enhancing technologies.

However, that’s where firms run into another challenge: these innovation projects are run separately in disparate business areas, such as corporate actions, which sits under asset servicing.

Mark Wootton, regional head of local custody and clearing for Asia-Pacific at BNP Paribas Securities Services, recently described the bank’s back-office overhaul to WatersTechnology, highlighting how all participants play a significant part throughout the lifecycle of a single corporate action event. Yet while each participant understands its own process well, there is no consolidated view of the challenges the others face across the same lifecycle.

“Part of me thinks the industry is still comfortable with the process that has existed for 20 years,” Wootton says.

That’s where vendors like Adnitio could help the industry kick-start meaningful change: Gordon says he set up Adnitio to give banks visibility of their end-to-end processes across the middle and back offices by processing each data event in real time and providing users with a live, up-to-date view of traffic and activity.

“The whole idea behind the company is that you can get all your business data out in real time, tracking it end-to-end with zero impact on your underlying systems. What that means is you don’t have to go through any change management; you’re using existing applications to basically stitch together the lifecycle of your transaction as it moves across your systems from the post-execution point, all the way through all your settlements, funding, process payments, FX, netting, and so on,” Gordon says. “It’s a highly complex space, and [seeing how that data moves across systems] has always been the challenge.”

For example, Adnitio provides dashboards that can show where exactly a trade is stuck. “Say for your number one client, you’ve got a trade that is worth several million and it’s been stuck in clearing and needs to go through a manual process. Maybe you want to clear that so that they can do another trade because they’re waiting for that liquidity to be released. That’s where it’s all changing,” he says.
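
To make the idea concrete, the kind of lifecycle stitching and stuck-trade detection Gordon describes could be sketched along the following lines. This is a minimal, purely illustrative example, not Adnitio’s actual design: the stage names, tolerances, and class names are assumptions.

```python
from __future__ import annotations

from dataclasses import dataclass, field
from datetime import datetime, timedelta

# Hypothetical post-execution lifecycle stages; real workflows vary by firm and asset class.
STAGE_TOLERANCE = {
    "captured": timedelta(minutes=5),
    "allocated": timedelta(minutes=30),
    "confirmed": timedelta(hours=1),
    "cleared": timedelta(hours=4),
}
TERMINAL_STAGE = "settled"


@dataclass
class TradeState:
    stage: str                                    # current lifecycle stage
    entered_at: datetime                          # when the trade entered that stage
    history: list = field(default_factory=list)   # (stage, timestamp) audit trail


class LifecycleTracker:
    """Stitches events from different systems into one view, keyed by trade ID."""

    def __init__(self) -> None:
        self.trades: dict[str, TradeState] = {}

    def record(self, trade_id: str, stage: str, ts: datetime) -> None:
        """Record that a trade reached a stage, whichever system reported it."""
        state = self.trades.get(trade_id)
        if state is None:
            self.trades[trade_id] = TradeState(stage, ts, [(stage, ts)])
        else:
            state.history.append((stage, ts))
            state.stage, state.entered_at = stage, ts

    def stuck_trades(self, now: datetime) -> list[tuple[str, str, timedelta]]:
        """Flag trades that have sat in a non-terminal stage beyond its tolerance."""
        flagged = []
        for trade_id, state in self.trades.items():
            if state.stage == TERMINAL_STAGE:
                continue
            waited = now - state.entered_at
            if waited > STAGE_TOLERANCE.get(state.stage, timedelta(hours=1)):
                flagged.append((trade_id, state.stage, waited))
        return flagged
```

A dashboard of the kind Gordon describes could then poll stuck_trades() periodically and surface the results, for example highlighting a multimillion trade sitting in clearing beyond its tolerance.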

Adnitio pre-processes and links data across these systems, collecting it from messaging-queue tools such as Apache Kafka and IBM MQSeries, from databases, and by monitoring APIs within applications with an overhead that Gordon says is measured in nanoseconds.

“That’s when the teams are asking to confirm that they’ve got [the data] in their system of record, so that their analysts and quants can act on it. The quants are getting involved in this process to see how they can add value to the trading in the front end,” Gordon says.
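
The collection side, reading the same messages production systems already publish rather than modifying those systems, might look something like the sketch below, which feeds the hypothetical tracker from the earlier example. It uses the open-source kafka-python client; the topic name, broker address, consumer group, and event fields are all assumptions for illustration.

```python
import json
from datetime import datetime

from kafka import KafkaConsumer  # open-source kafka-python client

# LifecycleTracker is the illustrative class sketched earlier in this article.
tracker = LifecycleTracker()

consumer = KafkaConsumer(
    "post-trade-events",                      # hypothetical topic name
    bootstrap_servers="kafka.internal:9092",  # placeholder broker address
    group_id="lifecycle-tap",                 # separate consumer group for the passive tap
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    auto_offset_reset="latest",
    enable_auto_commit=True,
)

for message in consumer:
    event = message.value
    # Assume each event carries a trade ID, a lifecycle stage, and an ISO-8601 timestamp.
    tracker.record(
        trade_id=event["trade_id"],
        stage=event["stage"],
        ts=datetime.fromisoformat(event["timestamp"]),
    )
```

Running under its own consumer group means the tap reads a copy of the message stream without touching the offsets of the applications that already consume it, which is one way to approach the “zero impact on underlying systems” idea Gordon describes.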

Beyond contributing to revenue generation or cost cutting, there’s another reason for firms to strengthen their post-trade processes: regulation. For example, the UK’s Financial Conduct Authority (FCA) has issued its rules and guidance on requirements to strengthen operational resilience in the financial services sector. Firms have until March 31, 2025, to show the FCA that their critical systems can perform within impact tolerances and that they have made the necessary investments for those systems to operate consistently.

That doesn’t necessarily require firms to switch out a legacy system for a new one, but it may prompt them to better understand how data flows across existing systems, which in turn will enable them to focus any change projects where real problems already exist, rather than create new ones by introducing a completely new architecture. As part of that process, gaining complete visibility of their systems could serve as a baseline against which firms then implement any major change to middle- and back-office systems.

Still, even regulators are having a difficult time changing hearts and minds when it comes to post-trade.

Gordon recalls an end-of-life project he was tasked with 20 years ago. “They said it would only take a year to take this small application out of the bank. Six years later, they still couldn’t take it out, because what they hadn’t realized is that it had been providing data to lots of upstream and downstream processes,” he says. “So, until you’ve got your benchmark numbers and know exactly what’s going on, you’re going to struggle.”
