FRTB forces banks to rethink entire data management infrastructure

Data mapping and getting historical time series data are among the challenges banks face in conducting the calculations necessary for FRTB. But they have help.

Right now, risk managers across the world are crunching an endless stream of numbers—and perhaps at the same time popping an endless stream of aspirins—as they scramble to prepare for the implementation of the long-awaited Fundamental Review of the Trading Book.

FRTB was first introduced by the Basel Committee on Banking Supervision (BCBS) in the aftermath of the 2008 global financial crisis. It was an avenue for regulators to rethink how capital charges for market risk are calculated to ensure banks could absorb potential trading book losses in the face of extreme market conditions.

After conducting consultation exercises in 2013 and 2014, BCBS introduced an initial set of standards in 2016. But since then, the implementation deadline has been delayed numerous times, though the firm(ish) deadline is now set for January 1, 2025, in most countries.

Through this review, the BCBS requires banks to have a more granular view of their trading book data and to apply more granular assessments of their risk models. As Sri Ganesan, global head of financial services at Ness Digital Engineering, said during a panel discussion last month at WatersTechnology’s North American Financial Information Summit, “The dataset is going to be almost nine times what you produce today for market risk.”

But dealing with large datasets is just one of the data challenges banks will confront. They have two choices for compliance—the standardized approach (SA) and the internal models approach (IMA)—and between them, the two approaches pose two distinct challenges: data mapping and sourcing historical time series data.

Meeting FRTB requirements presents a huge data management undertaking for most banks. But this is also where risk and data vendors can help, and in some cases it warrants a complete rethink of banks’ infrastructure needs and of their use of the cloud. Because of a combination of legacy infrastructure and the high volumes of data banks host on-premises, reorganizing that data and implementing workflows to access and sort it is a complicated task.

But to better understand the specific data challenges, one needs to dissect the two different approaches.

Under the SA, banks are required to calculate risk sensitivities—delta, vega, and curvature—across all desks and asset classes. The BCBS’s 127-page FRTB document, entitled Fundamental Review of the Trading Book: A revised market risk framework, explains that this approach should provide a “method for calculating capital requirements for banks with business models that do not require a more sophisticated measurement of market risk.” Banks also need the relevant reference data to identify the positions in scope and map them to predefined regulatory “buckets,” which group together instruments that exhibit similar risk characteristics.
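
To give a flavor of what those sensitivity calculations involve, below is a minimal sketch of how net delta sensitivities within a single bucket might be risk-weighted and aggregated. The function name, risk weights, and correlation are illustrative assumptions, not the BCBS’s prescribed parameters.

```python
# Minimal sketch of aggregating delta sensitivities within one SA bucket.
# Risk weights and the correlation are illustrative placeholders, not the
# official BCBS parameters.
import math

def bucket_delta_capital(sensitivities, risk_weights, rho):
    """Risk-weight net delta sensitivities and aggregate them for one bucket.

    sensitivities: dict of risk factor -> net delta sensitivity
    risk_weights:  dict of risk factor -> prescribed risk weight
    rho:           assumed intra-bucket correlation between risk factors
    """
    ws = {k: risk_weights[k] * s for k, s in sensitivities.items()}
    total = sum(w * w for w in ws.values())
    factors = list(ws)
    for i, a in enumerate(factors):
        for b in factors[i + 1:]:
            total += 2 * rho * ws[a] * ws[b]
    return math.sqrt(max(total, 0.0))

# Hypothetical bucket with two equity risk factors.
print(bucket_delta_capital(
    sensitivities={"EM_EQUITY_A": 1_000_000, "EM_EQUITY_B": -400_000},
    risk_weights={"EM_EQUITY_A": 0.55, "EM_EQUITY_B": 0.55},
    rho=0.15,
))
```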

NeoXam, an enterprise data management (EDM) software provider to buy- and sell-side firms, is one vendor that helps banks collect the necessary data from market data providers.

Tim Versteeg, Asia-Pacific managing director at NeoXam, explains that under the SA, a firm will collect the market data from the various data vendors, construct golden copies, and put them into the predefined FRTB buckets. Some examples of these buckets—which essentially represent a bank’s universe—are equity, FX, credit, interest rates, and commodities. Additionally, there are different instrument types—spots, volatilities, and options.

“Basically, anything that is traded and that represents a risk factor needs to be bucketed,” he says.

This process of instrument sorting is typically done once, as the bucket definitions don’t change—unless the regulator comes up with new scenarios, which could happen, Versteeg says.

“If they come up with a new set of scenarios, your entire universe needs to be put into all of the buckets for that stress scenario. Besides the scenario, there are also shocks being applied. For example, we apply a shock to the bucket of emerging market equity of -5%.”

Potential shocks are also predefined, according to the FRTB regulatory paper. The paper itself says that when calculating expected shortfall, instantaneous shocks must be equal to an n-business-day movement in risk factors, with n defined by the liquidity characteristics of the risk factor being modeled.

“The scenarios often have multiple shocks, so it might be -5%, -10%, 0%, +5%, +10%,” Versteeg adds.
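
As an illustration of the shock ladder Versteeg describes, the sketch below applies a set of predefined shocks to bucketed exposures. The bucket names, exposure figures, and shock sizes are assumed for the example, and a production system would fully reprice positions rather than scale them linearly.

```python
# Minimal sketch of applying a predefined shock ladder to bucketed exposures.
# Bucket names, exposures, and shock sizes are illustrative assumptions.
shocks = [-0.10, -0.05, 0.0, 0.05, 0.10]

bucketed_exposures = {
    "emerging_market_equity": 2_500_000,
    "developed_market_equity": 7_000_000,
}

# For each shock, revalue every bucket and record the hypothetical P&L.
scenario_pnl = {
    shock: {bucket: exposure * shock for bucket, exposure in bucketed_exposures.items()}
    for shock in shocks
}

for shock, pnl in scenario_pnl.items():
    print(f"shock {shock:+.0%}: {pnl}")
```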

Eugene Stern, business manager for enterprise risk at Bloomberg, explains that the differences between the two approaches have shifted with successive Basel frameworks. Both the IMA and the SA were available under Basel II.5 and remain so under FRTB, but under Basel II.5 the IMA gave banks slightly more leeway compared to the more formula-driven SA. Under FRTB, however, the IMA has become far more complex and risk-sensitive than it was previously, and Stern says many more banks—with the exception of the largest tier-ones—are now going with the more palatable SA.

“What you are seeing is a large migration across the industry from internal models under Basel II.5 to a standardized approach under FRTB.”

A billion—maybe more—data points

Under the IMA, banks need historical market data going back at least a decade to conduct specific calculations based on expected shortfall, a measure that focuses on potential losses under extreme market conditions, says Franck Rossi, vice president of product at trading and risk management solutions provider Numerix.

“You need to generate expected shortfall, stressed expected shortfall, [and] you also need to source the data to feed the expected shortfall and stressed expected shortfall with up to 10 years of historical data,” he says.
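
For readers unfamiliar with the measure, the sketch below shows a bare-bones expected shortfall calculation over a historical P&L series. The 97.5% confidence level matches FRTB’s ES measure, but the P&L data here is simulated, and the real calculation layers liquidity horizons and stressed-period calibration on top of this.

```python
# Minimal expected shortfall sketch over a historical P&L series.
# The P&L numbers are simulated; a real FRTB ES adds liquidity horizons
# and a stressed-period calibration on top of this basic tail average.
import numpy as np

def expected_shortfall(pnl, level=0.975):
    """Average loss in the tail beyond the given confidence level."""
    losses = -np.asarray(pnl, dtype=float)   # express losses as positive numbers
    cutoff = np.quantile(losses, level)      # VaR-style threshold at 97.5%
    return losses[losses >= cutoff].mean()   # mean of the worst 2.5% of outcomes

rng = np.random.default_rng(0)
daily_pnl = rng.normal(0, 1_000_000, size=2_500)  # roughly 10 years of daily P&L
print(f"97.5% ES: {expected_shortfall(daily_pnl):,.0f}")
```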

This could result in a much bigger dataset than the earlier estimate from Ness Digital Engineering’s Ganesan. Even for banks that already have a system in place to pull disparate data together, the volumes will be enormous, according to NeoXam’s Versteeg.

“An equity has basically one price per day; that’s simple. But an equity volatility surface has, let’s say, 300 points for every day. If you have thousands of these volatilities, you’re talking about billions of points,” he says.

Then banks would need the one-day price difference and the 10-day price difference. Taking the equity volatility example with its 300 points per day, multiplied by, say, 200 business days in a year, then by 10 years of historical data, and then by two for the one-day and 10-day price differences, that equates to 1.2 million price differences for a single volatility surface. Multiply that by thousands of surfaces and the total quickly reaches the billions of points Versteeg describes.
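
Spelled out, that back-of-the-envelope arithmetic looks like this; the thousand-surface scaling factor at the end is an assumption, included only to show how the totals reach the billions of points Versteeg mentions.

```python
# Back-of-the-envelope version of Versteeg's data-volume arithmetic.
points_per_surface_per_day = 300   # points on one equity volatility surface
business_days_per_year = 200       # Versteeg's rough figure
years_of_history = 10
horizons = 2                       # one-day and 10-day price differences

differences_per_surface = (points_per_surface_per_day * business_days_per_year
                           * years_of_history * horizons)
print(f"{differences_per_surface:,}")          # 1,200,000 per surface

# Scaling to thousands of surfaces (an assumption) pushes the total into
# the billions of points Versteeg describes.
print(f"{differences_per_surface * 1_000:,}")
```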

To some banks, IMA may seem more attractive as it can lead to lower capital requirements. However, the data required for IMA—historical data going back 10 years—involves some hefty work.

Charlie Browne, head of market data, risk, and quant solutions at GoldenSource, notes that the process for implementing the IMA is so time-consuming, rigorous, and data-heavy that it is driving some firms to reorganize their entire data-mapping infrastructure.

“If you’re addressing FRTB, and FRTB is that wide-ranging and you’re looking at things like market data and market value and model ranges and everything else, then you really need to look at your entire infrastructure,” Browne says.

Saved by the cloud?

This is where the use of cloud can be a game-changer.

“Cloud is definitely a must,” says Numerix’s Rossi. The advantage of using cloud technology is that banks can dynamically bring up additional CPUs and memory to run these calculations, shut them off when done, and only pay for the additional capacity used.

However, most banks are keen on a hybrid setup, as a lot of the data is considered sensitive.

While sources spoken to for this article say many banks are opting to use the SA, it’s not an “all or nothing” regulation, Rossi says. “Let’s assume at your bank you have 10 desks. You can select IMA or the standardized approach—it’s not IMA or SA for everyone. This is at the desk level,” he says.

It’s worth noting that banks still have to keep the SA in their back pockets in the event regulators decide a certain bank isn’t eligible to use the IMA. Banks will also need to implement a two-factor risk model to compute the default risk charge (DRC), which captures potential losses should an issuer default. On top of that, SA numbers must be calculated in parallel, essentially doubling the work.

“Why? Because you need to prove to your regulator that the numbers you are generating with the IMA are in line with your model capabilities. And you have to do it on a monthly basis,” Rossi says. “So, should the regulator say, ‘This month you cannot go [with] IMA, there are some tests and rationales to support this,’ then you have to revert to the standardized approach. … That’s double the work.”

Deadlines, guidelines, and guesswork

Today, sources say that most banks across Europe and Asia-Pacific are adequately prepared to meet the requirements under the SA. But because local jurisdictions can set their own timelines, progress toward FRTB implementation worldwide is fragmented.

Across Asia-Pacific, the UK, and the EU, timelines for full implementation range from 2024 to 2025. The US remains an outlier: the Federal Reserve has indicated that it intends to follow through on FRTB implementation, but its schedule for publishing an initial draft of the Basel III rules has slipped. As it stands now, the Fed must publish a notice of proposed rulemaking (NPR), which is the official document that puts a US timeline in place.

Bloomberg’s Stern says that in lieu of an NPR, US banks are operating on assumptions for what the timing will be and what the rule will look like.

“The general assumption here in the US is that it’s not going to differ much from the original framework. They’re going to aim to conform to the global timing, but that’s an assumption and not set in stone,” he says. “Banks in the US are, by definition, still preparing because they don’t know what the timing and the final rules are going to be.”

Even while banks tackle the dissonance between local jurisdictions’ FRTB implementation guidelines and deadlines, one thing remains certain: managing their data for FRTB will be a Herculean effort, one that requires a hard look at their data infrastructure and a helping hand from their vendor ecosystem.

Additional reporting by Rebecca Natale

If the map fits

While EDM providers are helping some banks with the data mapping process, firms have further options among software vendors. Trading, risk, and post-trade operations software vendor Murex has a roster of clients that do all their data mapping and management within its system.

“They have a direct feed of securities information directly into Murex from a data provider,” says Timothy Clarsen, head of risk management business solutions for Asia-Pacific at Murex.

Murex has mapping logic and tools that allow banks to harmonize their data for other data management use-cases, and FRTB happens to fit the bill. “It’s part of the project to assess the data and cleanse it, and make sure that you are mapping it correctly. It’s exactly what you would do in an EDM system,” he says.

For those that use their EDM provider to do all that work, Clarsen says the provider would then import that data into Murex, for example, to perform the next phase of calculations.

“An EDM system is not expensive for nothing. They have a lot of sophisticated capabilities to take multiple data sources and create rules. We don’t offer that. What we can say is that you receive information for a bond, for example, that says its industry is ‘ABC’. ABC isn’t a one-to-one mapping with an FRTB bucket, but the logic can say, ‘If it’s ABC, it maps into bucket five,’ and we can then provide tools to manage this part of the mapping logic. We can handle the important part of that mapping process for the FRTB calculation specifically,” he says.

Meanwhile, NeoXam’s FRTB module has mapping rules that translate all the values a bank would get from the data vendors into the values prescribed by the regulator.

For example, the regulator distinguishes between buckets for emerging markets and developed markets, but that’s not a classification every data vendor provides. Often, they’ll just give the actual exchange. NeoXam does the mapping between the exchanges that belong to developed markets and the ones that belong to emerging markets. To put that visually, it’s “basically a lot of mapping tables where you might have the codes that you get externally on the left side, and the codes that need to be used for the buckets on the right side, and then the definition of how that gets mapped,” says Tim Versteeg, Asia-Pacific managing director at NeoXam.
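
The kind of mapping table Versteeg describes might look something like the sketch below, with vendor-supplied exchange codes on the left and regulatory buckets on the right. The codes, bucket labels, and market classifications are illustrative assumptions.

```python
# Minimal sketch of a vendor-code-to-FRTB-bucket mapping table.
# Exchange codes, market classifications, and bucket labels are illustrative.
EXCHANGE_TO_MARKET = {
    "XNYS": "developed",  # New York Stock Exchange
    "XLON": "developed",  # London Stock Exchange
    "XSHG": "emerging",   # Shanghai Stock Exchange
    "XBOM": "emerging",   # BSE, Mumbai
}

MARKET_TO_BUCKET = {
    "developed": "equity_bucket_dm_large_cap",
    "emerging": "equity_bucket_em_large_cap",
}

def frtb_bucket(exchange_code: str) -> str:
    """Translate a vendor-supplied exchange code into a regulatory bucket."""
    market = EXCHANGE_TO_MARKET.get(exchange_code)
    if market is None:
        # Unmapped codes would be flagged for a data steward to classify.
        return "unmapped"
    return MARKET_TO_BUCKET[market]

print(frtb_bucket("XBOM"))  # equity_bucket_em_large_cap
```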

The difficulties in getting the right data and helping with the mapping challenge have also led to some data vendors enriching their datasets. Some of these vendors, like Bloomberg, have an FRTB bucket classification so that a bond, for example, and any trades that go with that bond, go into a specific bucket.

Bloomberg’s business manager for enterprise risk, Eugene Stern, says the data requirements are very specific for the SA. He adds that across equities, credit and mortgages, banks need good metadata on individual securities and rules to turn that raw information into the relevant FRTB buckets. Banks also need to be geographically flexible, because rules can change in different jurisdictions.
