"There's a Patch for that" (or maybe not)
Yesterday's story on delayed patching or situations where patching is blocked by policy created a lot of discussion, and I thought it was worth another go, from a different perspective.
There are lots of things we use daily that have an OS, applications and security issues that we NEVER patch. Sometimes it's because we don't think of it, sometimes because regulations prevent it. Very often we don't patch them because the manufacturer treats them as throwaway devices - there simply are no patches.
What especially brings this to mind is that after yesterday's story, I was explaining the concept of "malware" to my son (he's 10). My explanation was that it's software that someone wrote to make a system do something that it wasn't intended to do. Pretty much straight out of my SEC504 notes, come to think of it (thanks, Ed!)
Anyway, that brought a few examples to mind - I'll list a few:
Windows (and other) hosts in the Pharmaceutical industry:
Machines used in pharmaceutical manufacturing need to be "re-certified" after every change. This confuses me somewhat, since the owner of the unit defines the testing procedure for re-certification (things like "copy a file, do a transaction" etc.), so it should be easy, right? Long story short, this recert process tends to freeze things in time on devices that are directly involved in the manufacturing of pharmaceuticals. I cringe whenever I walk past that Windows 95 machine at one customer of mine.
Embedded LINUX (and *nix) OS devices:
We tend to think of these the same way we think of light switches, but in most cases they run a full Linux OS. Nothing too critical, you know - trivial things like elevator controls, security cameras and HVAC (Heating/Ventilation/Air Conditioning) systems come to mind, for instance.
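Just how much of a "real computer" these things are is easy to demonstrate. Here's a quick sketch (the address is hypothetical, and plenty of these devices still leave telnet wide open) that grabs a service banner from such a device - more often than not an embedded Linux greeting comes back:

    # Quick banner grab against an embedded device - hypothetical address,
    # port 23 (telnet) is still left open on a surprising number of these boxes.
    import socket

    DEVICE_IP = "192.168.1.50"   # hypothetical camera / HVAC controller address
    PORT = 23

    with socket.create_connection((DEVICE_IP, PORT), timeout=5) as sock:
        banner = sock.recv(1024)
        # On many embedded Linux devices this looks something like
        # "BusyBox ... built-in shell (ash)" - a full OS, not a light switch.
        print(banner.decode(errors="replace"))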
Embedded Devices in Healthcare (both Windows and Linux):
Again, we think of these as devices rather than computers. Things like IV pumps, controls for X-ray and CAT scan machines, ultrasounds and the like. There have been very public disclosures (and responses to yesterday's post) about Conficker and other malware running on gear of this type, and as far as I can tell neither the manufacturers nor the regulators are too-too excited about it. I think they should be - the hospital system administrators sure aren't happy about it.
Prosthetics are getting more and more complex - huge advances in prosthetic limbs, hearing and sight aids all involve computers embedded in the device.
And even simple devices like pacemakers are re-programmed remotely (and wirelessly). When my dad told me how cool getting his unit re-calibrated was, I couldn't help but see the downside (but didn't discuss it with him). Do you want to take bets on how many heads of state, or CEOs for that matter, have a pacemaker? Or how much a well-placed "cardiac incident" might influence global or financial affairs?
It's a good thing that there's no direct transport for malware across the silicon / carbon unit boundary. One day we'll go to the hospital for a simple procedure, and instead of worrying about MRSA or C. diff, we'll be worrying about catching CONFICKER-YYZ!
And a lot closer to home ... Did you drive to work today?
Aside from your entertainment system, your car has a fully documented, >>unsecured<< network and operating system with an open and documented API (google "OBD-II" sometime). Even better, by law this unsecured network and OS has a wireless link into it - your tire pressure sensors are short-range, remotely activated wireless transmitters. No risk there if someone else starts a remote control session on your car between the house and the grocery store, right? This might seem over the top, but not by too much.
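If that sounds far-fetched, consider how little code it takes to talk to that API. This is just a sketch - it assumes the open-source python-OBD library ("pip install obd") and one of those cheap ELM327-style adapters plugged into the OBD-II port - but it shows that the car answers anyone who asks, no credentials required:

    # Minimal sketch - assumes the third-party python-OBD library and an
    # ELM327-style adapter plugged into the car's OBD-II port.
    import obd

    connection = obd.OBD()   # auto-detects the adapter's serial port

    # Query a couple of standard PIDs - engine RPM and vehicle speed
    for cmd in (obd.commands.RPM, obd.commands.SPEED):
        response = connection.query(cmd)
        print(cmd.name, response.value)

Reading sensor values is the benign end of the spectrum, of course - the same in-car network carries the messages that control far more interesting things than the dashboard display.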
We talk about protecting our nation's critical infrastructure, but I think we're missing the boat on loads of critical infrastructure that doesn't involve generating electricity, pumping oil or running water systems. Remember that definition of malware above, and remember (it wasn't too far back) that STUXNET was targeted and written specifically to make nuclear plant systems "do something they weren't intended to do".
We don't need to think much harder to come up with a long, long list of critical systems that we'd have a hard time dealing with if they stopped working properly.
Again, I invite you, our readers, to comment - describe any devices or systems that we deal with on a daily basis that we wouldn't normally patch or update, or cannot patch or update. Extra points for critical-type devices, but if your toaster has a USB port, that's sure interesting as well (I want one!)
=======================
Rob VandenBrink
Metafore