Saturday, March 19, 2016

Hacked off: a manifesto

I haven't blogged about computer (in)security for a while -- but not for any lack of material. Certainly the confrontation between the FBI and Apple over unlocking the iPhone used by one of the San Bernardino shooters has been all over the news. And because that story is everywhere, there's little point in my adding my two cents' worth. I'll wait at least until there's a court decision on the matter to comment upon.

I take it back. I will comment upfront on one aspect of the situation. Apple is to be commended for building a product that's actually secure -- a praiseworthy technical and managerial achievement, no matter which side of the legal controversy you happen to be on. Keep reading to get an inkling of how rare such an achievement is.

Remember how (apparently) the US and Israel once impeded the Iranian uranium-enrichment program with the Stuxnet worm? Remember how the attack on the Iranian centrifuges was deemed so sophisticated that technologically advanced nation-states had to be involved? This next item may not count as progress, but it is news: "An Easy Way for Hackers to Remotely Burn Industrial Motors." To wit:

... Now a researcher has found an easy way for low-skilled hackers to cause physical damage remotely with a single action—and some of the devices his hack targets are readily accessible over the Internet.

and also:

... At least four makers of variable-frequency drives all have the same vulnerability: they have both read and write capability and don’t require authentication to prevent unauthorized parties from easily writing to the devices to re-set the speed of a motor. What’s more, the variable drives announce the top speed at which motors connected to them should safely operate, allowing hackers to determine the necessary frequency to send the device into a danger zone.

Not good design. Flat out, not good.
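To make concrete just how low the bar is, here is a minimal sketch -- in Python, with a hypothetical register number and rated limit, not any vendor's actual code -- of the difference between a drive that accepts whatever setpoint arrives over the network and one that at least checks who is asking and what is asked for:

RATED_MAX_HZ = 60.0                  # top speed the drive itself announces
FREQ_SETPOINT_REGISTER = 0x2001      # hypothetical holding register for motor speed

def handle_write_as_reported(register, value, drive_state):
    """Roughly what the reported drives do: any caller on the network can
    write any frequency -- no authentication, no sanity check."""
    if register == FREQ_SETPOINT_REGISTER:
        drive_state["setpoint_hz"] = value   # 90 Hz on a 60 Hz motor? Accepted.

def handle_write_guarded(register, value, drive_state, caller_authenticated):
    """What even a minimal design should do: refuse unauthenticated writes
    and reject setpoints beyond the announced rated maximum."""
    if not caller_authenticated:
        raise PermissionError("write rejected: caller not authenticated")
    if register == FREQ_SETPOINT_REGISTER:
        if not 0.0 <= value <= RATED_MAX_HZ:
            raise ValueError("setpoint %.1f Hz outside 0-%.1f Hz" % (value, RATED_MAX_HZ))
        drive_state["setpoint_hz"] = value

The guarded version is a dozen lines and hardly sophisticated -- which is the point.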

(I am reminded of a scene in my 2008 techno-thriller, Fools' Experiments. There, a (justly) enraged AI overrode the defensive measures in the computerized controls of a particle accelerator, causing a superconducting magnet to overheat and explode -- and setting off a chain reaction of explosions all around the accelerator ring. But that was a very clever AI, not some mere script kiddie.)

"Industrial" motors are ubiquitous: in factories, refineries, mines, power plants, the HVAC equipment of any large building or complex  ..., so this design flaw alone is more than adequate grounds for worry. But the never-ending list of vulnerabilities isn't even limited to "only" industrial settings. For example, here are "5 Major Hospital Hacks: Horror Stories from the Cybersecurity Frontlines."

One choice quote (from the opening):

In real-world war, combatants typically don’t attack hospitals. In the cyber realm, hackers have no such scruples. “We’re attacked about every 7 seconds, 24 hours a day,” says John Halamka, CIO of the Boston hospital Beth Israel Deaconess. And the strikes come from everywhere: “It’s hacktivists, organized crime, cyberterrorists, MIT students,” he says. 

And then there are increasingly computerized cars. No sooner had NHTSA tried to assure us that the problem wasn't as bad as it had been portrayed, and that the holes had been plugged ("Only Fiat Chrysler cars were vulnerable to hackers, says NHTSA"), than the FBI begged to differ ("Vehicles 'increasingly vulnerable' to hacking, FBI warns: Software vulnerabilities in networked vehicle systems could create new risks"). And just wait till self-driving cars are approved for general use -- a day for which, according to Reuters, Uber is already planning ("Uber seeking to buy self-driving cars: source") ...

(I used the hacking of a computerized car as a minor plot element in my first novel, Probe, published in 1991. To think that such holes still gape ...)

Some of the above-mentioned hacks involved social engineering, user gullibility, and plain old negligence. None of those should be acceptable as an excuse. In an Internet age, everyone from developers to maintenance staff to everyday users must be responsible -- and held accountable:
  • Developers and installers -- and most certainly their managers! -- must assume that user carelessness will happen, and make it harder for carelessness to defeat systems. It's not rocket science to firewall office systems and block gaming sites. And absolutely nothing should ever be connected to the Internet without an authentication subsystem (see the sketch after this list).
  • Purchasing authorities must insist upon decent defenses in every product and system they acquire -- and liability for suppliers when those suppliers fall short.
  • Governments and large-scale computer buyers must work with industry to phase out the utterly useless End User License Agreements that absolve suppliers of all responsibility for shoddy work. When you and I accept these terms -- and so, buggy software -- we put at risk not only ourselves but also everyone with whom we share a computer, a network, or information. Alas, as individuals, we often have no practical choice but to accept.
  • Governments and large-scale computer buyers must also begin the long, slow process of setting and enforcing minimal security/safety standards -- and please note that I am not, in general, one to advocate for more regulation -- for computerized products and systems. Cars, foods, children's clothes, and toys, to name but a few product categories, must meet mandatory safety standards. Why don't the computerized systems on which we daily rely?
  • Every organization that runs a computerized system or sells a computerized product must test those systems and products regularly, promptly fix whatever the tests find wrong, and compensate those whom its negligence or design flaws have injured or put at risk. Secure products and services must become less expensive for providers to offer than insecure ones.
  • And employees must be held to account -- censured, docked in pay, demoted, even fired -- for willfully ignoring or defeating security measures. Including to the highest levels of organizations. Including to the highest levels of government -- and those who aspire to such offices. Accountability and the risk of penalties are the only reasons some people will take computer security seriously.
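By an "authentication subsystem" I don't mean anything exotic. Here is a minimal sketch, again in Python -- the pre-shared key and the apply_command handler are made up for illustration, and a real design also needs key management and replay protection -- of a networked device that simply ignores any command it cannot verify:

import hashlib
import hmac

SHARED_KEY = b"provisioned-out-of-band"       # hypothetical pre-shared secret

def apply_command(message):
    print("applying:", message.decode())      # stand-in for the real control action

def verify_and_apply(message, tag):
    """Act on a command only if its HMAC-SHA256 tag verifies; drop it otherwise."""
    expected = hmac.new(SHARED_KEY, message, hashlib.sha256).digest()
    if not hmac.compare_digest(expected, tag):
        return False                          # unauthenticated traffic is ignored
    apply_command(message)
    return True

Even that much would have stopped the unauthenticated motor-speed writes described above.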
Why do I propose such radical change? Because to the extent industrial sites, medical records, hospitals, and modern cars -- and national-security secrets -- are vulnerable ... we're vulnerable. You, me, and everyone around us. A hack can harm, even kill, more of us than the biggest conventional terrorist attack ever.

But we can't afford such an intrusive regimen, you say? Wrong! We're already paying -- as often (if not more often) for negligence as for vigilance. From a recent report ("Cyber attacks cost global business $300bn+") by professional-services firm Grant Thornton:

New research from Grant Thornton reveals that cyber attacks are taking a serious toll on business, with the total cost of attacks globally estimated to be at least US$315bn* over the past 12 months. The Grant Thornton International Business Report (IBR), a global survey of 2,500 business leaders in 35 economies, reveals that more than one in six businesses surveyed faced a cyber attack in the past year. With high-profile security breaches and hacks becoming more prevalent, nearly half of firms are putting themselves in the firing line with no comprehensive strategy to prevent digital crime.

It's got to stop.
