ICS Security, Medical Devices and the Accidental Bogeyman
Hacked pacemakers, insulin pumps, cars, industrial facilities, satellites and breached power grids… For years now, cybersecurity researchers have been warning that black hat hackers could hurt or kill people through their exploits, often with demonstrations of how they could do it.
But the sensationalism that often accompanies the topic can obscure the true level of risk, while drawing attention away from common attack vectors and vulnerabilities such as outdated operating systems, unpatched or buggy software, misconfigured networks and the like.
In general, the level of cyber risk is high with industrial control systems, which are used for an array of applications, from controlling satellites to oil-and-gas equipment to automation equipment in factories. There are reports of computer sabotage causing chaos stretching back decades. True, it is difficult to verify, say, whether the United States deployed Trojan malware on computing equipment used to control the flow of gas in a Trans-Siberian pipeline, prompting a massive explosion. Thomas C. Reed, a former Air Force secretary in the Reagan administration, claimed as much in the book “At the Abyss.”
But attacks on connected industrial systems are common now. A number of cybersecurity vendors, from IBM Managed Security Services to Kaspersky Lab, have tracked an uptick in ICS security attacks in recent years. Just this March, Norsk Hydro, one of the largest aluminum producers globally, struggled with cyber-induced production stoppages in operations in both Europe and the United States.
To deal with the problem, industrial giant Siemens and TÜV SÜD, the international testing, inspection and certification firm, joined forces on what they call “a new approach to digital safety and security.” “Attacks against industrial environments are increasing at an exponential pace,” said Leo Simonovich, vice president and global head for industrial cyber and digital security at Siemens. “However, unlike in IT, where the primary concern is data loss, cyberattacks targeting operational technology can lead to a potential shutdown or worse.” The two companies will thus collaborate to offer what they are terming “digital safety and security assessments” to help energy customers, in particular, assess and manage cyber risk.
Simonovich pointed to the potentially catastrophic Triton malware, which the cybersecurity firm Dragos discovered in Saudi Arabia in 2017. Researchers recently found the code at a second facility.
“What was remarkable about [the Triton] attack, was the ease with which attackers traversed from IT to OT to safety systems,” Simonovich said.
Indeed, this is a recurring theme across sectors where cybersecurity breaches pose a potential safety risk. Despite all of the research demonstrating esoteric and often borderline-implausible types of attacks at cybersecurity events, it is easy to overlook the risk posed by, say, an “air-gapped” Windows XP computer or dated malware such as Kwampirs, a trojan discovered in 2015, or Conficker, first discovered in 2008. While hackers could, say, modify CT scans to create fake cancer cells, as researchers have demonstrated, it is more likely that a hospital will get hit with a commodity type of attack than that a pacemaker patient will be targeted by a cyber-hitman. “People always laugh when I say: ‘Even though I consider myself a cybersecurity expert, there are easier ways to hurt people,’” said Stephanie Preston Domas, vice president of research and development at MedSec. “All of these fancy custom exploits designed against medical devices are not pointing to the real problem. The real problem is things like Kwampirs still work. Things like Conficker still work.”
And then there is WannaCry, the 2017 ransomware attack that Europol said was unprecedented in its scope. Infecting some 200,000 computers, WannaCry hit industrial and medical facilities alike. Nissan had to stop production at a facility in the United Kingdom. Renault was forced to halt production at multiple sites. Germany’s train company Deutsche Bahn was a victim. A similar piece of malware, NotPetya, caused hundreds of millions of dollars’ worth of damage to shipping giant Maersk.
But as large as the ICS security impact was, WannaCry also had an outsized impact on the UK’s National Health Service, resulting in damages of nearly £100 million while leading to the cancellation of 19,000 medical appointments.
It’s possible that WannaCry, or a similar commodity attack, could lead to death or injury by, say, delaying a heart surgery, although it is generally difficult to prove a direct connection, said Leon Lerman, chief executive officer of Cynerio.
Attacks like WannaCry also illustrate the risk of exploits developed by nation states leaking and unwittingly empowering adversaries to attack the U.S. and allies. WannaCry and NotPetya both used an exploit known as EternalBlue, which the U.S. National Security Agency likely developed. The New York Times recently reported Chinese intelligence agents used “key parts of [the United States’] cybersecurity arsenal” to carry out attacks. Incidentally, the piece also reports that the sophisticated NSA-developed Stuxnet malware used to target Iranian nuclear centrifuges caused damage to U.S. businesses including Chevron.
Domas is more worried about generic malware or simple carelessness playing a role in cyberattacks with safety consequences. “I still see too much sensationalism focused on bad guys causing patient harm,” she said. “I would love to see more of a shift toward understanding that if patient harm has happened [as a result of a cyberattack], it’s probably accidental. It is probably a side effect of something else that they were trying to do on the system.”
Researchers who examine cyberattacks on industrial control systems see a similar pattern, Simonovich said. “Most have some level of human error associated with the breach.”
Relatedly, faulty software on industrial systems and medical devices sits at the intersection of safety and cybersecurity. The recent saga of the Boeing 737 MAX underscores this point. As security expert Bruce Schneier wrote of the problem: “Technically, this is safety and not security; there was no attacker. But the fields are closely related.”
Domas agrees with that sentiment, citing, for example, the case of a hospital worker who caused an anesthesia machine to seize up abnormally after plugging a cellphone into it. It is easy to overlook the risk of such everyday occurrences, which don’t involve a cyberattacker.
Similarly, the types of headline ICS security and medical device stories that tend to garner the most media coverage make it easy to overlook the risk of insider threat. But “insider threat makes up the overwhelming majority of [attacks in the industrial sector],” Simonovich said.
Ultimately, to help address risk in ever-more connected industrial and medical environments, the people who work within them need to clearly understand the risk such connected systems can pose, whether exploited on purpose or inadvertently. “I think the people who are technology savvy are becoming more aware, but I really don’t see a huge kind of uptick in understanding or appreciation for the people who are not,” Domas said.
And then there is the matter of gauging cyber risk, which can be fiendishly difficult. One useful tool is threat modeling, a subset of risk management. “There are so many cyber risks in any system, and you can’t fix all of them,” Domas said. “So that’s why you have to pair it with things like threat modeling and figure out which ones you need to be most concerned about,” she added. “Honestly, a lot of times, you find things that trickle through your ranking system and you say: ‘You know what? I’m okay with that risk,’ and you do nothing to fix it, even though you know there’s a cybersecurity issue there,” Domas said. “You have to strategize. You can’t fix everything.”
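The ranking process Domas describes can be sketched in a few lines of code. The example below is a hypothetical illustration, not anything from the article or from MedSec: it scores each finding by likelihood and impact, sorts the results, and consciously accepts whatever falls below a chosen threshold. The finding names and the numeric scales are invented for the sketch.

```python
from dataclasses import dataclass

@dataclass
class Finding:
    """One item surfaced by a threat-modeling exercise (illustrative only)."""
    name: str
    likelihood: int  # 1 (rare) .. 5 (expected)
    impact: int      # 1 (nuisance) .. 5 (safety-relevant)

    @property
    def score(self) -> int:
        # Classic risk-matrix scoring: likelihood x impact.
        return self.likelihood * self.impact

# Hypothetical findings, for illustration.
findings = [
    Finding("Unpatched Windows XP workstation", 4, 4),
    Finding("Default credentials on plant HMI", 3, 5),
    Finding("Theoretical RF attack on telemetry", 1, 4),
]

ACCEPT_BELOW = 8  # risks scoring under this threshold are accepted, not fixed

for f in sorted(findings, key=lambda f: f.score, reverse=True):
    action = "fix" if f.score >= ACCEPT_BELOW else "accept"
    print(f"{f.score:>2}  {action:<6} {f.name}")
```

The point of the threshold is exactly the trade-off Domas names: some known issues deliberately go unfixed because their ranked risk is tolerable.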