Several times over my career I’ve wondered why I was dealing with problems which seemed so trivially preventable that letting them happen should count as negligence. The question kept occurring to me: why do we accept this level of preventable risk and cost, and why doesn’t cyber security legislation play a more effective role? I’m not talking about legislation against the misuse of computer systems (we’ve had that for almost thirty years), but legislation that tries to prevent the other underlying causes of breaches:
- Errors in design and manufacture
- Errors in operation
Design and manufacture
I can’t help seeing striking similarities between the issues that lead to information security breaches today and the safety issues in the automotive industry in the 1960s and 1970s.
The criticism of the Chevrolet Corvair in Ralph Nader’s book “Unsafe at Any Speed” and the design flaws of the Ford Pinto, which led to the landmark case of Grimshaw v. Ford Motor Co., are historic examples of manufacturers prioritising cost over safety.
It’s not difficult to draw parallels with software development, where the cost of development and the pressure to ship new features quickly have arguably contributed to poor-quality development and ultimately to security vulnerabilities. I’m not talking about complex vulnerabilities involving chaining multiple bugs together either, but the kind of mistakes which feature in the OWASP Top 10 (like injection and XSS vulnerabilities).
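To illustrate just how trivially preventable that class of mistake is, here is a minimal sketch of an injection flaw and its standard fix. The table and the payload are hypothetical, invented for illustration; the pattern (string concatenation versus a parameterised query) is the point.

```python
import sqlite3

# In-memory database with a hypothetical users table for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

user_input = "alice' OR '1'='1"  # a classic injection payload

# Vulnerable: building the query by string concatenation lets the
# payload rewrite the query's logic and match every row.
vulnerable = f"SELECT role FROM users WHERE name = '{user_input}'"
rows_unsafe = conn.execute(vulnerable).fetchall()

# Safe: a parameterised query treats the payload as a literal value,
# so it simply matches no user with that (odd) name.
rows_safe = conn.execute(
    "SELECT role FROM users WHERE name = ?", (user_input,)
).fetchall()

print(rows_unsafe)
print(rows_safe)
```

The fix costs one character of punctuation and a tuple, which is rather the point: these are not exotic defences, they have been standard practice for decades.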
As legislation improved the standard of automotive safety, so the standard of software development should improve under similar pressure. The UK government seems to think so. Even as I was drafting this blog post the NCSC reported on legislation proposed by the Department for Digital, Culture, Media & Sport “aimed at improving the security of millions of internet connected devices in our homes”. It’s a start, at least.
Alongside the pressure on manufacturers, governments introduced legislation to encourage drivers to take more responsibility for automotive safety. The Road Traffic Regulation Act 1967, which introduced a maximum blood alcohol concentration for drivers, the Road Traffic Act 1988 and the subsequent seat belt instruments are examples of a government requiring the operator of the car, as well as the manufacturer, to take responsibility for safety.
In this scenario I see the end user or consumer of technology as the passenger, with limited responsibility. Cloud service providers, managed service providers and the IT teams of public and private institutions should be held accountable for the configuration and maintenance of the systems they manage.
Exposing personally identifiable information (PII) through careless configuration of cloud services, or exposing consumers to unnecessary risk by storing their passwords in plain text, should result in financial penalties. The good news is that the latter did result in a fine (€20,000) for the company involved; let’s hope that’s only the start.
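The remedy for plain-text password storage is similarly old and cheap. A minimal sketch using only Python's standard library, assuming scrypt as the key-derivation function (bcrypt or Argon2 would do just as well); the function names and parameters here are illustrative choices, not a prescription:

```python
import hashlib
import hmac
import os

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Derive a salted hash; store only the salt and digest, never the password."""
    salt = os.urandom(16)  # a fresh random salt per password
    digest = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    """Re-derive the digest and compare in constant time."""
    candidate = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, digest))  # True
print(verify_password("wrong guess", salt, digest))                   # False
```

A database breach then leaks salted, deliberately slow-to-compute digests rather than the passwords themselves, which is exactly the kind of baseline diligence a penalty regime ought to assume.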
The legislation in this area is still in its infancy; the Data Protection Act 2018 (the UK’s implementation of the EU GDPR) is nearly a year old, and the DCMS’s proposed legislation is at the consultation stage. Nevertheless, a shift of responsibility away from the consumer and onto the manufacturers and operators of technology seems inevitable.
Based on the arc of progress in the automotive industry, though, I wouldn’t expect to see significant improvement for at least another decade. But maybe I’m just being cynical.