So, we started with a report suggesting that "legacy" COBOL systems still in use represent a security risk, since the writers of those systems have retired and are dying off. We wanted a quick sanity check and ran into a wall of naivete pretty quickly. One technology journalist we spoke to, for example, reckoned there might be a few machines out there running Windows XP, but only a handful, and nothing older.
At the personal computing level this might be right. At the corporate level, looking at the systems on which many people depend for their daily functions, it couldn't be more wrong. The study we were looking at, from SSRN, covered the American IRS and the US Office of Personnel Management, which was breached in 2015; both run on old COBOL systems. Asking around, we found the same was true of British systems and of banks internationally.
So if no-one’s able to update these things, is the security automatically compromised?
Simon Bevan, who worked for IBM for 25 years from the 1960s, was at pains to point out that "legacy" should refer only to systems that are no longer maintained. This didn't mean they were no longer fit for purpose, however. He was one of the developers behind IPARS, the original airline reservation code, from which most of the modern equivalents have evolved. He says:
Needless to say, there are not many of us left who could help if things went wrong. Whatever the case, the core reservations code is extremely stable and running in a stable and well maintained environment.
Reassuringly, he and his team gave some thought to the longevity of the systems they were developing at the time (he was also part of the first online banking system, for the Bank of Montreal). He added:
Knowing the complexity and cost of designing these core systems, I assumed that they would continue to exist at the centre of any subsequent extensions of the original applications.
The development priority in the then foreseeable future would be to capitalise on the original systems by extending their capabilities rather than rewriting the core functions. At that time, it was IBM’s strategy to protect the customers’ development investment by guaranteeing system compatibility into the future.
Others saw potential issues. Oliver Kraus, IT consultant, said:
You have two main security concerns: one is the technology itself, and the second is your own code. Since the first kind is generic, there is a greater chance someone will know of a vulnerability, but also a greater chance there will be a solution or a 'patch' to cover it. Because of the high cost of developing the software from scratch, the need to teach all of the developers a new technology, and the risk of creating bugs or security vulnerabilities, many CTOs prefer to keep the old technology.
Job for life
The ageing developer population remains an issue for many, however. Anthony Peake, managing director at Software Solved, said the public sector is one of the few areas in which the notion of the “job for life” still retains any clout. This is why it’s only now coming to light that there is an ageing system problem. He added:
Those just entering the software profession, understandably, don't want to learn code and systems that are decades old. Equally, those who do understand and have worked with COBOL and other legacy systems such as Delphi are now coming to the end of their careers, or have simply forgotten how to work with the tools. There are specialists, but they cost a lot.
So is there an actual risk? He thinks so.
A number of government bodies simply bury their heads in the sand. Budget restrictions or lack of skills simply means that everything stays the same. This, of course, opens them up to other risks. COBOL and Delphi were secure at the time of implementation, but the world has moved on considerably and new threats such as injection attacks are simply not covered by legacy software.
The sheer size of the risk has also increased. Systems might now be on the public internet or part of a wider corporate network, again something not considered when the original systems were implemented, putting public data at risk.
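The injection attacks Peake mentions are a good example of a threat class that simply did not exist when these systems were designed. A minimal, self-contained sketch (the table and data here are invented for illustration) shows how string-built SQL lets crafted input rewrite a query, and how a parameterised query avoids it:

```python
import sqlite3

# Hypothetical in-memory database, for illustration only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (name TEXT, balance INTEGER)")
conn.execute("INSERT INTO accounts VALUES ('alice', 100), ('bob', 200)")

# Classic injection payload: closes the string literal, then adds a
# condition that is always true.
user_input = "alice' OR '1'='1"

# Vulnerable pattern: user input concatenated straight into the SQL text.
query = f"SELECT * FROM accounts WHERE name = '{user_input}'"
vulnerable_rows = conn.execute(query).fetchall()   # leaks every row

# Safe pattern: a parameterised query treats the input as data, not SQL.
safe_rows = conn.execute(
    "SELECT * FROM accounts WHERE name = ?", (user_input,)
).fetchall()                                       # no account has that name

print(vulnerable_rows)
print(safe_rows)
```

Code written before such attacks were commonplace tends to look like the first pattern, which is exactly why "secure at the time of implementation" offers little comfort now.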
John Walker, visiting professor at the School of Computing and Informatics, Nottingham Trent University, and owner and MD of Secure-Bastion Ltd, puts it pretty bluntly:
At the end of the day it is a case of 'security through obscurity' versus 'devoid security through confusion'. On one hand, we have the inferred security represented by, say, non-routable protocols, or mainframe partitioning such as LPARs. On the other side of the security coin we have unpatched, outdated systems such as NT4.0 residing inside virtualisation. No matter: the outcomes are the same. Confusion, and a very big potential for unknown unknowns of insecurity to reside within the operational environment.
Bevan goes back to fundamentals to address the issue: buy systems from established vendors who have some longevity and who maintain their systems. Patch everything, of course, and design your own software in a modular fashion so that you can incorporate new functions and features easily.
Treat data as you would treat money. Always control access to data and always encrypt databases, but be aware that this is protection against casual copying, not against insider activity.
And don’t forget that technology doesn’t exist in any sort of vacuum, adds Bevan:
Continuously update the application logic to accommodate changes in the real world. Regularly review, enhance and document the component functions. Review the technologies, environment and supporting facilities upon which your software depends.
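The modular design Bevan recommends can be sketched in miniature: a stable core that dispatches to pluggable handlers, so new functions land as new modules rather than as rewrites of the core. Every name below is illustrative, not drawn from any real system:

```python
# Registry of feature handlers; the core never needs to know them in advance.
HANDLERS = {}

def register(name):
    """Decorator that plugs a new handler in without touching core code."""
    def wrap(fn):
        HANDLERS[name] = fn
        return fn
    return wrap

def process(kind, payload):
    """Stable core: routing logic stays unchanged as features accumulate."""
    if kind not in HANDLERS:
        raise ValueError(f"no handler for {kind!r}")
    return HANDLERS[kind](payload)

@register("deposit")
def deposit(payload):
    return {"ok": True, "balance": payload["balance"] + payload["amount"]}

# A later feature arrives as a new module; process() is untouched.
@register("withdraw")
def withdraw(payload):
    if payload["amount"] > payload["balance"]:
        return {"ok": False, "balance": payload["balance"]}
    return {"ok": True, "balance": payload["balance"] - payload["amount"]}
```

The point is the shape, not the language: the IPARS-era systems Bevan describes survived precisely because extensions could be bolted onto a stable core rather than forcing a rewrite.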
And if you have an older developer in your workforce, be nice to them. They may turn out to be extremely important…