AI has sparked a new round of conversation about Cobol, with tools emerging that claim to translate legacy code and, with it, solve the modernisation challenge.

It is worth being precise about what that means and what it does not, writes Rob Thomas, senior vice-president and chief commercial officer of IBM Software.

Translating code is one thing. Modernising a platform is something else entirely.

The two are not the same, and the gap between them is where most enterprises run into trouble.

The value the IBM mainframe delivers has nothing to do with Cobol. It has to do with what the platform is: an architecture purpose-built, from silicon through the operating system, for transactional resilience, security, performance, and efficiency at a scale no distributed environment has been able to match.

Whether the application is written in Cobol, Java or any other language, the platform provides the same guarantees. The language is not the source of that value. The platform is.

Here is where the translation argument falls short, and why it matters:

Translation captures almost none of the actual complexity. The modernisation challenge is not a Cobol language problem; it is everything the application runs on and integrates with. Enterprise Cobol on IBM Z sits inside a vertically integrated stack: z/OS, CICS, IMS, Db2, RACF, MQ, Parallel Sysplex, and IBM Z Cyber Vault with DS8000 storage. That stack is what enables 25 billion encrypted transactions per day on a single system, 450 billion AI inferences per day at 1ms response time, up to eight nines of availability, quantum-safe encryption, and sustained 100 percent utilisation without impacting SLAs. Translating the Cobol does not move any of that.
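The point can be made concrete with a small sketch. The fragment below is purely illustrative (the Cobol snippet, field names, and scanner are invented for this article, not drawn from any real system): even a trivial Cobol paragraph embeds `EXEC SQL` and `EXEC CICS` blocks, and simply finding them shows what a syntax-level translation leaves behind — each one is a dependency on Db2 or CICS transaction services that must be re-engineered, not merely converted.

```python
import re

# Hypothetical Cobol fragment. The business logic (one ADD statement) is
# trivial; the EXEC SQL and EXEC CICS blocks are calls into platform
# services (Db2 data access, CICS syncpoint processing) that a
# line-by-line language translation cannot carry across on its own.
COBOL_SOURCE = """
       EXEC SQL
           SELECT BALANCE INTO :WS-BALANCE
           FROM   ACCOUNTS WHERE ID = :WS-ID
       END-EXEC.
       ADD WS-AMOUNT TO WS-BALANCE.
       EXEC CICS
           SYNCPOINT
       END-EXEC.
"""

# Embedded platform statements follow the EXEC <subsystem> ... END-EXEC shape.
PLATFORM_CALL = re.compile(r"EXEC\s+(CICS|SQL|DLI)\b.*?END-EXEC", re.DOTALL)

def platform_dependencies(source: str) -> list[str]:
    """Return the subsystem named by each embedded platform call."""
    return [m.group(1) for m in PLATFORM_CALL.finditer(source)]

# What remains to re-engineer after the Cobol syntax itself is converted.
print(platform_dependencies(COBOL_SOURCE))  # ['SQL', 'CICS']
```

A real translation tool faces this in every program: the surrounding transaction semantics, recovery behaviour, and security context live in the platform, not in the statements a converter rewrites.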

The real work is data architecture redesign, runtime replacement, transaction processing integrity, and non-functional requirements baked into the platform itself. That is system-level engineering, not language conversion.

Decades of hardware-software integration cannot be replicated by moving code. Cobol on IBM Z has been optimised over decades of tight coupling between software and hardware. An analogy is iOS and the iPhone: someone could build an alternative, but it is unlikely to displace a billion iPhones. The performance comes from that coupling: processor-level acceleration, I/O subsystem optimisation, and decades of performance tuning.

AI strengthens the mainframe case; it does not weaken it. Code refactoring, DevOps modernisation, knowledge preservation, and quality-of-service improvements are all on-platform opportunities that AI accelerates. It compresses timelines and addresses the skills gap as experienced Cobol developers retire. Each of these is an argument for doing more on IBM Z, not less.

A SaaS-only solution does not hold up under scrutiny. Given the depth of on-premises dependencies, it is difficult to see how a SaaS-only offering could replace the Cobol applications on the mainframe while meeting the demands of the enterprise. And given everything happening around digital sovereignty and data residency, would an organisation make its most critical transactions dependent on a provider operating in a jurisdiction it does not control?

Some of this conversation is not about the mainframe. Roughly 40% of Cobol runs on Windows, Linux, and other distributed platforms. A large portion of the AI-and-Cobol story is a distributed-systems problem that has been folded into a mainframe headline. The two challenges require different approaches, and conflating them leads to the wrong solution.

The tools that translate Cobol are solving a real problem – just not the one that matters most for enterprises running IBM Z. Most of the headlines are about the code, but the engineers doing this work know the code is the starting point, not the destination. What the application runs on, how it scales, how it recovers, how it is encrypted, and how it integrates with everything around it – that is the real modernisation work. Understanding that difference is where the work actually starts.

This isn’t theoretical. Clients are already proving these points:

  • Royal Bank of Canada – used watsonx Code Assistant for Z to proactively identify the dependencies, data flows, structure, and organisation of existing applications, creating an in-depth blueprint for modernising and managing changes to core system applications.
  • National Organisation for Social Insurance – observed up to a 94% reduction in the time needed to analyse and locate superfluous Cobol code and routines using watsonx Code Assistant for Z, cutting identification time from approximately eight hours to about 30 minutes.
  • ANZ Bank – uses modern DevOps tools to reduce manual operations by 60% and accelerate application modernisation.

In summary, the AI opportunity is real, and AI for code will drive substantial value creation – but that value should not be lost in translation.