CISO Says... Clean Up!
I recently visited the Risk and Resilience Festival at the University of Twente. One of the keynote speakers was Maarten van Aalst, Director of the international Red Cross Red Crescent Climate Centre and professor at the University of Twente. He told a story about Bangladesh and how the frequent flooding led to an increasing number of casualties. It was too simple, he said, to blame climate change. A much bigger factor was the growing population: the more people living in the Ganges Delta, the bigger the exposure. It reminded me that exposure is often something we can influence.
A common approach in IT risk management is to carry out a threat and vulnerability assessment (TVA). In cyber security, a threat is a possible danger that might exploit a vulnerability to breach security and thereby cause harm. A threat can be "intentional" (e.g. hacking by an individual or a criminal organization), "accidental" (e.g. the possibility of a computer malfunctioning, or of a natural disaster such as an earthquake, a fire, or a tornado), or otherwise any circumstance, capability, action, or event with the potential to cause harm[1]. A vulnerability is a weakness that can be exploited by a threat actor, such as an attacker, to perform unauthorized actions within a computer system[2].
To exploit a vulnerability, an attacker must have at least one applicable tool or technique that can connect to a system weakness. In this frame, vulnerability is also known as the attack surface or exposure. In most TVAs, however, exposure does not get much attention. When it is evaluated at all, exposure is usually estimated on a high-medium-low scale and then combined with ‘severity’ in a heatmap[3]. The recommended way to minimize exposure is usually either to patch a system to remove the vulnerability or to move a vulnerable system into a separate network to hide it from an attacker.
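To make that concrete: such a heatmap is often nothing more than a lookup over two qualitative ratings. Here is a minimal sketch in Python, assuming a simple 1-to-3 scale; the thresholds and labels are my own illustration, not a standard:

```python
# Illustrative sketch only: a minimal high/medium/low heatmap lookup,
# assuming severity and exposure have each been rated low/medium/high.
# The 1-3 scale and the thresholds are assumptions, not a standard.

RATING = {"low": 1, "medium": 2, "high": 3}

def risk_level(severity: str, exposure: str) -> str:
    """Combine qualitative severity and exposure ratings into a risk label."""
    score = RATING[severity] * RATING[exposure]
    if score >= 6:
        return "high"
    if score >= 3:
        return "medium"
    return "low"

# Example: a severe vulnerability on a barely exposed system.
print(risk_level("high", "low"))  # -> medium
```

Note that exposure is an explicit input here: lower it, and the risk label drops with it. That is exactly the lever this column is about.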
While the increase in human casualties in Bangladesh is mainly due to the rising population, the increasing number of hacks is (also) due to the increasing number of assets. Assets, in this context, can be hardware, applications, platforms, and data. The more you have, the higher your exposure.
Although it is commonly believed that ‘what you don’t own you don’t have to protect’, organizations do not seem to prioritize the inventory of assets. When performing a very basic assessment of the security status of an organization, I prefer to use the CIS security controls[4] as a baseline. The CIS identifies six basic controls. The top three are:
- Inventory and Control of Hardware Assets
- Inventory and Control of Software Assets
- Continuous Vulnerability Management
The top two are about being aware of what you own, both software and hardware. In my experience, most organizations have no complete overview; in fact, it is often far from complete. Basic questions regarding purpose and function remain unanswered, the common reason given being that organizations find it ‘impossible’ to keep track of so many different assets. If that is the case, those organizations cannot manage the vulnerabilities in the various assets either, and it is safe to assume there are many vulnerabilities present. A high number of vulnerabilities combined with a large exposure equals a big risk.
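Getting started does not have to be hard. A first pass can be as simple as comparing what a network scan actually finds against what the asset register claims you own. A minimal sketch, assuming two CSV exports that each have a hostname column (the file names and column name are made up for illustration):

```python
# Minimal sketch: reconcile scanned hosts against a registered inventory.
# The file names and the "hostname" column are assumptions for illustration.
import csv

def load_hostnames(path: str) -> set[str]:
    """Read a CSV export and return the set of hostnames it contains."""
    with open(path, newline="") as f:
        return {row["hostname"].strip().lower() for row in csv.DictReader(f)}

registered = load_hostnames("cmdb_export.csv")   # what you think you own
discovered = load_hostnames("scan_results.csv")  # what is actually on the network

# Unknown assets are pure exposure: nobody is patching them.
for host in sorted(discovered - registered):
    print(f"unregistered asset: {host}")
```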
One way to mitigate this risk is to try to manage the vulnerabilities, so organizations implement a vulnerability scanner. This automates the inventory of vulnerabilities, but fixing the detected vulnerabilities still requires manual effort. It is not uncommon to find multiple vulnerabilities on one server, and even when scanning only a few hundred IP addresses you can easily end up with an excessively lengthy report. Moreover, a scanner cannot detect and report old or uncommon assets, and monitoring network traffic is difficult. Of course, it is possible to sniff the network, but relying on artificial intelligence to discern genuine traffic from malicious traffic would be quite a naïve mistake.
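That said, even a lengthy report can be tamed with a little triage. Here is a sketch of one way to condense scanner output into a per-host priority list; the CSV layout with host, cve, and cvss columns is an assumed export format, not any particular product’s:

```python
# Sketch: condense a lengthy scanner report into a short priority list.
# The CSV layout (host, cve, cvss columns) is an assumed export format.
import csv
from collections import defaultdict

findings = defaultdict(list)
with open("scanner_export.csv", newline="") as f:
    for row in csv.DictReader(f):
        findings[row["host"]].append((float(row["cvss"]), row["cve"]))

# Rank hosts by their single worst finding, so effort goes where it hurts most.
for host, issues in sorted(findings.items(), key=lambda kv: -max(kv[1])[0]):
    worst_score, worst_cve = max(issues)
    print(f"{host}: {len(issues)} findings, worst {worst_cve} (CVSS {worst_score})")
```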
It may not be feasible in Bangladesh to diminish the exposure by moving people out of the Ganges Delta, but in IT we can. If we want to decrease the risk, the best way is to remove as many assets as possible from the hypothetical risk scenario. This decreases the exposure and reduces the effort needed to perform vulnerability management. The best security investment you can make might be to get rid of your legacy applications.
How do you convince your organization?
I know from experience that removing legacy is not a sexy subject. Of course, no one will really object to cleaning up; who wants to be against that? However, removing legacy usually requires replacing one or more old applications with a new, up-to-date one. This can be very costly, and it may affect the way of working employees are used to. When faced with the choice between implementing a new application and a project to clean up legacy, the latter will be the second choice.
One way to create some urgency is to make the risk visible. A breach in a legacy application may lead to severe damage. We recommend that organizations perform a quantitative risk analysis on their legacy systems and then calculate the risk. It is important to remember that this initiative will likely save significant capital in the long term.
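One classic way to do that calculation is annualized loss expectancy (ALE): the estimated cost of a single incident multiplied by how often you expect it to occur per year. The figures below are invented purely for illustration:

```python
# Sketch of a classic quantitative calculation: annualized loss expectancy.
# ALE = SLE (cost of one incident) x ARO (expected incidents per year).
# All figures below are invented for illustration.

sle = 250_000   # single loss expectancy: estimated damage of one breach (EUR)
aro = 0.2       # annual rate of occurrence: one breach every five years

ale = sle * aro
print(f"Annualized loss expectancy: EUR {ale:,.0f}")  # EUR 50,000

# If replacing the legacy application costs less than the ALE accumulated
# over its remaining lifetime, the clean-up pays for itself.
```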
Sources:
[1] Taken from Wikipedia: https://en.wikipedia.org/wiki/Threat_(computer)
[2] Taken from Wikipedia: https://en.wikipedia.org/wiki/Vulnerability_(computing)
[3] E.g. https://www.sans.org/reading-room/whitepapers/auditing/overview-threat-risk-assessment-76
[4] https://www.cisecurity.org/controls/cis-controls-list/