Mainframes still mainstream: How financial markets are embracing and evolving 'legacy' IT

Tech giant IBM is targeting security, AI, and portability in the modernization of the mainframe as firms report they are still retaining “the workhorse of the back office.”

It’s the 1930s: Franklin Delano Roosevelt has been elected president, the Securities Act of 1933 and the Glass-Steagall Act have been enacted, and a Harvard University mathematician named Howard Aiken is leading a group of IBM engineers in designing and building the Harvard Mark I, one of the world’s first large-scale computers. The Mark I, known at IBM as the Automatic Sequence Controlled Calculator, would begin running in 1944, assisting the war effort and contributing to the development of the atomic bomb for the Manhattan Project.

The mainframe computer has a long history of handling large-scale processing for enterprises, and the financial services industry has long relied on it for critical workloads. But will the explosion of cloud computing across banks, asset managers, and vendors squeeze out a workhorse like the mainframe?

The answer, right now, appears to be no.

Earlier this month, a whitepaper from research and advisory firm Celent found that 60% of those surveyed still use a mainframe, and 44% of those don’t plan on retiring their mainframe(s). Of the world’s top 50 banks, 88% still use a mainframe, though it should be acknowledged that most of those use cases sit in retail banking operations. Among study participants, buy-side firms had the lowest rate of mainframe retention.

Among those surveyed, the reasons to keep mainframes included the low cost of running them, trust in their security, and efforts by mainframe manufacturers such as IBM to modernize the machines to run alongside the cloud.

“Some of our clients say, ‘We’ll never get off the mainframe’,” says one industry consultant. “For certain workloads, the mainframe is very inexpensive to run. And they’re built like battleships—they never go down. So, for some workflows, is it worth refactoring all the code to move to a distributed model that’s not less expensive and may introduce risk?” In short, considering the inherent risks that can accompany any migration project, if it ain’t broke, why fix it?

Security is just one factor in the ongoing modernization of the mainframe, says John Duigenan, general manager of global financial services at IBM. With cybersecurity top of mind, as well as the eventual capabilities of quantum computing, IBM designed its new z16 mainframe with both threats in view.

Rolled out in April of this year, the z16 has been branded as the first quantum-safe system. With built-in quantum-safe encryption, Duigenan tells WatersTechnology, a user would be protected against the kind of data breach that firms continually worry about from state actors.

It also includes the Telum processor. “We have built the AI acceleration directly onto the chip,” Duigenan says. “Why does that matter? Because across financial services, there is the potential to use AI across decision making.” The processor allows for real-time AI that can be applied directly to workloads like securities processing.

“If I can take a trade confirmation at the time I receive it and predict its likelihood to fail clearance and settlement, I can save myself days of operational headache,” Duigenan says. “Lots of securities processing systems run on the mainframe today. They are decades-old but they’re also going through this modernization process.” He says the ability to do these predictions could potentially prove a cost-cutter for large securities processing firms.
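
To make Duigenan’s example concrete, here is a minimal sketch in Python of at-receipt scoring for settlement-failure risk. Everything in it is hypothetical: the features, weights, and threshold are illustrative stand-ins for a trained model, and a production system would run inference through the Telum chip’s on-chip accelerator rather than in plain Python.

```python
# Hypothetical sketch: score a trade confirmation for settlement-failure risk
# at the moment it is received. Feature names and weights are invented for
# illustration; a real system would use the firm's own trained model.
import math
from dataclasses import dataclass


@dataclass
class TradeConfirmation:
    notional: float                # trade size in base currency
    counterparty_fail_rate: float  # counterparty's historical fail rate, 0-1
    days_to_settle: int            # settlement cycle, e.g. 2 for T+2
    is_cross_border: bool          # cross-border settlement adds risk


# Made-up logistic-regression weights standing in for a trained model.
WEIGHTS = {
    "notional": 1e-9,
    "counterparty_fail_rate": 4.0,
    "days_to_settle": 0.1,
    "is_cross_border": 0.5,
}
BIAS = -3.0


def fail_probability(conf: TradeConfirmation) -> float:
    """Logistic score: estimated probability the trade fails settlement."""
    z = (
        BIAS
        + WEIGHTS["notional"] * conf.notional
        + WEIGHTS["counterparty_fail_rate"] * conf.counterparty_fail_rate
        + WEIGHTS["days_to_settle"] * conf.days_to_settle
        + WEIGHTS["is_cross_border"] * float(conf.is_cross_border)
    )
    return 1.0 / (1.0 + math.exp(-z))


def route(conf: TradeConfirmation, threshold: float = 0.5) -> str:
    """Flag risky confirmations for operations review before settlement."""
    return "REVIEW" if fail_probability(conf) >= threshold else "STRAIGHT_THROUGH"


if __name__ == "__main__":
    trade = TradeConfirmation(
        notional=250_000_000,
        counterparty_fail_rate=0.12,
        days_to_settle=2,
        is_cross_border=True,
    )
    print(route(trade), f"p_fail={fail_probability(trade):.2f}")
```

The point of scoring at receipt, as Duigenan frames it, is latency: a trade flagged the moment its confirmation arrives can be routed to operations days before a failure would otherwise surface.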

IBM is also one of the largest cloud providers by revenue, and the company is looking to let users run cloud techniques like Kubernetes, DevOps, containerization, and APIs on the mainframe with no detectable technical difference, so that they can operate a hybrid cloud model.

“So, all of the existing Cobol and Assembler that’s been around for 50 years, alongside that I can put the most modern cloud-native software,” Duigenan says. “Run it side-by-side and leverage the scale of the platform to have existing code and brand-new customer experience code work side-by-side in harmony.”
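
As a rough illustration of what running old and new code “side-by-side” can mean in practice, the sketch below shows a cloud-native Python service treating a legacy Cobol transaction as an ordinary REST endpoint. The gateway host, path, and response shape are hypothetical; fronting a Cobol/CICS program with an API layer (z/OS Connect is one such product) is a common hybrid pattern, not a specific IBM recipe confirmed here.

```python
# Illustrative hybrid pattern: a containerized service calls a decades-old
# Cobol program that has been exposed as a REST API on the mainframe.
# The URL and payload are hypothetical.
import requests

MAINFRAME_API = "https://zos-gateway.example.com/settlement/v1/positions"


def get_position(account_id: str) -> dict:
    """Fetch a position from the legacy Cobol/CICS system over plain HTTPS.

    To the caller this is just another JSON API; that the backing program
    is 50-year-old Cobol running on z/OS is invisible.
    """
    resp = requests.get(
        f"{MAINFRAME_API}/{account_id}",
        headers={"Accept": "application/json"},
        timeout=5,  # fail fast if the gateway is unreachable
    )
    resp.raise_for_status()
    return resp.json()


if __name__ == "__main__":
    print(get_position("ACCT-0001"))
```

The design point is that the calling service neither knows nor cares where the program runs: the mainframe participates in a Kubernetes-era architecture through the same JSON contract as any microservice.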

Blast from the past

Read this story from 2007 about how Credit Suisse and State Street were working to modernize their mainframe footprints.

Hybrid cloud strategies remain the most popular way of deploying cloud as firms weigh what percentage of their operations and workloads will sit in the public cloud. Earlier this year, Royal Bank of Canada’s senior vice president of technology infrastructure, Jikin Shah, told WatersTechnology the bank was running some 600 applications on a private–public hybrid model, with 80% of them using a private architecture.

But, Shah said, that number will shift. “In the next three to four years, we will see roughly an equal balance between on-premises and multi-cloud, and maybe more at 40:60 in four or five years,” he said.

Despite the mainframe’s long history, many do not consider it a piece of legacy technology, Duigenan says. “Our mainframe footprint is growing, not shrinking,” he says. And while the mainframe may evoke the image of an old-fashioned platform, Duigenan points to its continued innovation as evidence to the contrary.

Both the mainframe and legacy technology more broadly have a branding problem, as firms will stick with running what is tried and tested, says Monica Summerville, head of capital markets technology research at Celent. “There’s just ranges of legacy: there’s legacy that’s manageable, and then legacy that causes a real problem.”

“In some cases, it’s working really well, it’s stable, there are reasons we like it. So we are going to take this legacy thing and then surround it with new technology to let us integrate,” she says. “From a technology point of view, if it fits into the architecture, you should consider it. You can run mainframes on the cloud and you can run cloud on a mainframe. So it’s not this dusty thing in the basement anymore.”
