Why old school thinking?

By John Forrester March 28, 2023

For many years, according to a GovLoop report in the US, government agencies have applied a “hodgepodge of cyber-security strategies” to counter both internal and external risks. Many relied on firewalls to manage external activity and potential threats. The GovLoop authors conclude that legacy firewalls tend to focus on activity attempting to penetrate the perimeter defenses and do not monitor internal activity within the local network.

Firewalls are also, according to the report, traditionally weak when it comes to blocking sensitive or personal information from leaving a network: they assume that everything coming from within the security perimeter is already authorized. Firewalls can also create bottlenecks, funnelling online traffic through fixed network points so that packets can be verified. Unfortunately, these bottlenecks tend to slow down networks and significantly impact their performance.

The executive order published by the Biden administration in May 2021, which directs federal agencies to move towards a Zero-Trust architecture and adopt a “never trust, always verify” approach to user identification and authentication, is an important step forward, according to the authors of the report, in dealing with an “evolving threat landscape” in which legacy tools are no longer sufficient against the variety of threats that have appeared in recent years. The Zero-Trust security approach is also gaining popularity in Europe as a mainstream way of implementing such a strategy, and Forrester recently published a report highlighting the status of Zero-Trust adoption there. While this move towards a Zero-Trust architecture and an identity-based approach to security is certainly a much-needed step forward, legacy systems are still around “because they have to be and are often at the centre of an agency’s operations”. Since legacy systems usually have a dedicated task of some kind, many agencies keep them for a variety of reasons, ranging from the lack of a suitable replacement system, to budgetary constraints, to the fear of losing key data during a migration to other systems, which could cripple the organisation.
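Returning to the “never trust, always verify” principle mentioned above, the short Python sketch below illustrates the contrast with perimeter-based trust: every request must present a valid, short-lived signed token that is checked on each call, and the caller’s network location is deliberately ignored as a trust signal. The shared secret, token format and function names are hypothetical examples; a real Zero-Trust deployment would rely on an identity provider and standard protocols rather than this hand-rolled check.

import hmac
import hashlib
import time

# Hypothetical shared secret; a real deployment would use an identity
# provider (e.g. OIDC tokens), not a static key embedded in code.
SECRET_KEY = b"example-shared-secret"
TOKEN_LIFETIME_SECONDS = 300


def issue_token(user_id: str) -> str:
    """Issue a short-lived, signed token after the user has authenticated."""
    issued_at = str(int(time.time()))
    payload = f"{user_id}:{issued_at}"
    signature = hmac.new(SECRET_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return f"{payload}:{signature}"


def verify_request(token: str, source_ip: str) -> bool:
    """Verify identity on EVERY request; the source network is never trusted."""
    try:
        user_id, issued_at, signature = token.rsplit(":", 2)
    except ValueError:
        return False

    payload = f"{user_id}:{issued_at}"
    expected = hmac.new(SECRET_KEY, payload.encode(), hashlib.sha256).hexdigest()

    # 1. The signature must be valid, regardless of where the call came from.
    if not hmac.compare_digest(signature, expected):
        return False
    # 2. The token must not have expired; the check is repeated per request.
    if time.time() - int(issued_at) > TOKEN_LIFETIME_SECONDS:
        return False
    # Note: source_ip is logged but deliberately NOT used as a trust signal.
    print(f"verified {user_id} from {source_ip}")
    return True


if __name__ == "__main__":
    token = issue_token("alice")
    print(verify_request(token, source_ip="10.0.0.5"))        # True: valid token
    print(verify_request("tampered", source_ip="10.0.0.5"))   # False: rejected

The point of the sketch is simply that authorisation is re-established on every interaction, which is the opposite of a legacy firewall assuming that anything already inside the perimeter is trustworthy.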

Undoubtedly, legacy systems need to be part of an organisation’s security risk programs. In particular, knowledge of how to operate and maintain a legacy system should, where possible, be spread among as many employees as possible to minimize the impact of losing the one person who manages it. At a minimum, segregating legacy systems, both externally from the Internet and internally from the rest of the network, will help lessen the risk of operating them.
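As a rough illustration of what such segregation means in practice, the minimal Python sketch below checks whether a caller sits inside an explicitly allowed network segment before it may reach a legacy host. The subnet values are assumptions chosen for the example; in reality this policy would be enforced by firewall rules or VLAN segmentation at the network layer rather than in application code.

import ipaddress

# Hypothetical allow-list: only a dedicated admin segment may reach the
# legacy host; the wider corporate LAN and the Internet are denied.
ALLOWED_SUBNETS = [
    ipaddress.ip_network("10.20.30.0/28"),   # assumed management subnet
]


def may_reach_legacy_host(source_ip: str) -> bool:
    """Return True only if the caller is inside an explicitly allowed segment."""
    addr = ipaddress.ip_address(source_ip)
    return any(addr in subnet for subnet in ALLOWED_SUBNETS)


if __name__ == "__main__":
    print(may_reach_legacy_host("10.20.30.5"))    # True: admin segment
    print(may_reach_legacy_host("192.168.1.40"))  # False: general office LAN
    print(may_reach_legacy_host("203.0.113.7"))   # False: external address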

As one blog posting put it, there are three main reasons why companies hold on for dear life to old technologies. First, the “if it ain’t broke, don’t fix it” mindset. Sometimes the technology just works, and since it’s doing what it’s supposed to do, it’s cheaper and less time-consuming to keep it that way (many payroll systems are like this: they just keep going). Once in a while the IT team has to deal with a couple of quirks, but that’s more manageable than having to retrain your team and migrate everything to a new system.

The second reason is what the blog called a “snowball effect”. The longer a system has been running, the more data it accumulates and the more it becomes part of an organisation’s culture. This alone makes it less probable that the organisation would even think about replacing it.

The third reason has to do with bureaucracy. Banks and government agencies have stuck with age-old systems because those are the ones that have been officially endorsed, tested time and again, and proven secure, and so have been approved for those specific uses. Adopting new technologies would require a new approval process that could take years. Certainly, as the blog posting underscored, there is no “straight answer”. However, the longer you stick with legacy hardware or software, the more expensive it will be in the long run. In the meantime, since legacy systems often include software and services that are important to an organisation, cyber-security professionals should always include them in any security risk assessment.