Extracting vendor promises won’t fix cybersecurity • The Register

Opinion To say cybersecurity is mostly very good is like saying Boeing’s Starliner parts mostly work – true, but you’re still going to be sleeping in the office. Moreover, it’s questionable whether either is getting any better.

Jen Easterly should know. As boss of the US government’s Cybersecurity and Infrastructure Security Agency (CISA), she sees when things break, why they break, and what happens afterwards. She is as unhappy as an ISS astronaut feeling a stiff breeze. As she says, the quality of security in software has been bad, is bad, and will remain bad until vendors and customers start taking it seriously.

She also says that the industry is at fault for giving cybercriminals cool gang names, and should instead label them Evil Ferrets and the like. Who wouldn’t want to be an Evil Ferret? Give them boring long numbers instead, as we do cybersecurity incidents, and watch their egos wither.

Back at the good bit, Easterly says that plenty of software vendors have signed a pledge to deliver measurably better products by next year, which would be good if anything happened to those that failed to deliver. It would be even better if anything happened to companies that didn’t sign up at all. Here, Easterly suggests that corporate buying of software and services should be contingent on suppliers making good on such pledges.

It is not unduly cynical to note that, from time to time in this great industry of ours, software companies make promises they don’t keep. Sometimes IT departments choose to believe them, not because they think they’ll come true, but because doing so ticks a box now and offers plausible transfer of blame later. This is by no means universal, or even the norm, but where it happens it shields the agents of failure from consequence. Nobody wants to fix the thruster valves. Who feels the pain when an organization gets hit with ransomware? Nobody at the vendors, and nobody in upper management.

Pledges and pious intentions are fine, but you can tell how serious the industry really is by looking at how much hard work, hard sweat, and hard thinking it is investing in fixing the problem. CISA may be handing out the fig leaves, but where are the cross-industry research and development efforts? Who’s spending time and money analyzing exactly why cybersecurity is so flawed, and what methodological faults keep it that way?

It’s just not seen as a real problem. It could be made one if enough volts were applied to the right backsides: if insurers refused to cover business losses from wonky security, and declined to extend cover at all unless standards could be shown to be in place. Standards that included contractual liability for vendors. Less than full regulation, more than a tick-box exercise in blame deflection. If the consequences hurt like yanking teeth, the industry would respond – and not before.

That response would necessarily have the whole industry working together, sharing data, test methods, even design and verification tools, to make cybersecurity a proper engineering discipline. There’s even a template for how this might work, in a part of tech where consequences are impossible to avoid: semiconductors.

Selling millions of physical parts that don’t work is infinitely more painful than shipping sloppy software. A chipmaker is extremely lucky if it can push out a firmware patch that kisses things better. Otherwise, it has to deal with its customer companies for recall or repair, even assuming that’s possible. This is why every stage in chip production is designed, validated, and scrutinized to the limits of the possible, generating terabytes of data for intense analysis – not a scenario familiar to software houses.

Even that’s not enough. As more and more devices are built out of chiplets from different silicon foundries, validation of the final part has forced unprecedented cooperation between traditional competitors. This has gone far beyond pledges and piety, with new tools, protocols, and processes being adopted. Want to see LLMs making a real difference to an industry that needs it? Here you go.

Nobody’s saying that making software secure is at the same level of difficulty as shipping billion-transistor parts that absolutely must work, or that there’s even much overlap in any of the details of either endeavor. But it is proof that creating design rules and test regimes for massively complex systems is within our grasp when the consequences of not doing so are sufficiently scary.

Cybersecurity doesn’t feel that scary, which is even scarier. If cyber warfare, infrastructure attacks, millions of individuals having their data stolen, and billions extorted every year aren’t enough, then – horror of horrors – the C-suite bonus packages will have to be on the line. It’s that critical.

What a reformed software industry truly committed to cybersecurity would look like is unknown. It would have to be open to small startups and open source, committed to continual research and innovation, and have the sort of honesty that reality demands. A miracle for sure, just one we know we can do. Let’s find a big enough electrode to make it happen. ®