I think that I speak for a number of my colleagues in the information security and data privacy communities when I say that "breach burn-out" is a recurring occupational hazard.
Here's how it goes. After some time spent working on projects to improve security and privacy, you hear about a rash of incidents, a string of security breaches that elicits weary groans. You find yourself asking, "Why do I bother?"
Sometimes the still, small voice of calm will answer, "For the money." You remind yourself of the payments that are due: the mortgage, the doctor bills, and the health insurance (which may well be bigger than the mortgage). And you decide to keep going.
Sometimes you find yourself in a position to ease back on the earnings and take some time to smell the roses, and you say to yourself "Them roses, they sure smell good." But then you hear about a rash of breaches that elicit groans of a different kind, groans of anger and frustration, tinged with regret. And sometimes you decide it's time to rejoin the fray.
Speaking for myself, I've been groaning a lot lately. There was Facebook, valued at billions, either failing to get a clue about privacy or arrogantly flouting privacy conventions to see if it could make a buck. There was the year-end count of private data exposures that topped 160 million records. There was Boeing and its hackable Dreamliner (after the FAA intimated the 787 could be hacked because "it allows new kinds of passenger connectivity to previously isolated data networks," Boeing said that "the plane's networks don't completely connect," as though a partial connection were somehow not a connection). Now we have CIA statements at SANS about hacking utilities and other SCADA systems, reminding everyone that folks in several sectors have continued to develop and deploy mission-critical systems under various false assumptions about security.
(Which part of War Games did these people sleep through? BTW, there is a good primer on SCADA on Wikipedia, and here is a well-balanced set of slides, in PDF, put together by D. Maynor and R. Graham at ISS. Their experience parallels what my colleagues found in the nineties: zero systems they could not penetrate, and many that could be hacked with skills rated 3 or less on a scale of 1 to 5.)
Even cool companies like Aptera seem to be forgetting simple things, like not letting other people sign you up for their email. Hardly on the same scale as diddling with the spent fuel rods at a nuclear power plant, but it's one more reminder that when it comes to security and privacy, most people just forget this stuff. Which is not so much a criticism of "most people" as a reminder that most people don't have an innate talent for "security-think."
Indeed, this truism is so well established that the folks in charge should have put mechanisms in place to compensate for it some time ago, like security input at the design stage and security review during development and deployment. With all the other problems the world faces, it would be nice to think that by now we had routed the insecurity dragon, or at least chained it up in its cave. Apparently we have not. Darn it!
[Exeunt. Alarum, and chambers go off]