We just need to go into any organisation to see this, particularly big ones that have been growing and maturing their Infosec practices for the last 10-20 years. Or, if you're a startup in the B2B space (as many of the clients I'm Fractional CISO for are) trying to sell to big corporates, you face the ever-growing "army of Compliance officers" thrown your way as part of the Procurement process.

I remember a time, about 20 years ago, when security wasn't so much about that. We focused on Security Engineering and building secure products, without even knowing there was such a thing as an Infosec policy. We knew our products and systems, we largely knew how they'd be taken advantage of, and we did what we could to mitigate those exposures. So the few companies that had Security resources had them largely focused on operational value, not on the bureaucratic accountability that characterises the modern enterprise.

Research from the 70s was already highlighting this in the field of Safety, and we didn’t pay attention.

“The production issues they had – the problems of balancing the line – those were inherited from administrative, bureaucratic, and planning levels and abetted by the design of the machinery. They were not created at the level of the line”

Dekker, discussing research done in Taylorist factories

“Failure piggybacks, nonrandomly, opportunistically, onto the very structures and processes we put in place to prevent it”

Pidgeon and O’Leary, 2000

I remember, a few years ago, the great Jim Manico (considered by many one of the best Application Security professionals out there) tweeting that Security teams spend more money and effort justifying why we shouldn't fix something than it would actually take to fix the issue we keep talking about, and I do believe he was on to something. Again, it's that move from operational value to bureaucratic accountability.

Being fair to our own industry, this isn't just about us and what we do. This is a pattern affecting all areas of business and governance, and we're caught in it like everyone else. In his book "The Safety Anarchist", Dekker called this out as "authoritarian high modernism":

Authoritarian high modernism believes that every aspect of our lives and work can be improved with rational planning, with better techniques and more science. Authoritarian high modernism has a sweeping vision for how standardization and control are keys to the success of modernism. If we are to apply the benefits of technical and scientific insight, then we need standardization and control. This in turn requires careful measurement, ordering, mapping, surveillance and tracking. The authoritarian high-modernist vision truly believes in its own ethic and its own good. It considers itself the superior model for getting things done.

Dekker, “The Safety Anarchist”

So all of these security bureaucracies we've created are self-perpetuating: ever more government regulations and standards/frameworks hold them in place and justify their need.

“Bureaucratic accountability is demanded because of bureaucratic accountability; paperwork begets paperwork; nonoperational positions grow more nonoperational positions”

Dekker in “Safety Differently”
[Image: an ouroboros. Source: https://symbolsage.com/ouroboros-meaning-and-origins/]

This "paperwork begets paperwork" is what Dekker termed "bureaucratic entrepreneurialism". Because we fear the consequences of curtailing security functions, and because of the promise of future useful work and reminders of past successes (pats on the back for work done prior), we perpetuate their existence and find justifications for more of it. When that is combined with Security leaders interested in empire-building, it's a surefire recipe for long-term disaster.

There are some (obvious and less-obvious) consequences of this bureaucratic entrepreneurialism. Among them:

  • Institutionalisation and legitimisation of counting negatives (non-compliances, deviations, incidents). Most of the language revolves around deficits and control, and the need for more of both.
  • They add incentives around the absence of all these negatives, so we inadvertently push people to under-report, to stay silent about near misses or about capabilities they're missing to do a better job, and to withhold bad news from decision makers. Worse, we end up watering down the definitions of what constitutes an incident, rationalising that "this bad thing happened, but it wasn't quite fully within that definition, so we don't need to talk about it", or just plain fudging the numbers.
  • We mostly organise around lagging indicators (things that have already happened), and we should know by now that lagging indicators have very little predictive value for what may happen in the future.
  • The systems we put in place to manage bureaucratic accountability have the effect of disregarding or devaluing technical expertise, and they disempower middle management. Operational staff internalise that they are not empowered to think for themselves, which stifles innovation and chokes off initiative and willing ownership by those the bureaucratic system holds accountable.

The more these processes and structures are developed or enforced bureaucratically by those who are at a distance from the operation, the greater the risk becomes that they produce ‘fantasy documents’ that bear little relation to actual work or operational expertise

Clarke and Perrow, 1996

So these are the dangers and the usual effects of bureaucratic entrepreneurialism. Now, it would be intellectually poor of me to let this post be perceived as complete disregard for GRC functions, because that's not actually what I think of them. There is work to be done there to satisfy the needs of compliance capitalism, which doesn't just affect security; it's an expectation of the context our businesses operate in. However, being wary and knowledgeable that these are the unintended effects excessive bureaucratisation has on both operations and operational staff, we SHOULD create and adopt practices which can act as counter-gradients to those effects.

This blog is already long so I won't make it even longer, but there are things we can do about this. In my last post, "The Causality Credo in Infosec and how to leave the club", I mentioned SCAD in relation to learning from incidents, and in my last talk, "Security Differently: Leveraging insights from Resilience Engineering and Safety Science" (link to slides), I referenced both Learning Teams and adaptations of Safety Decluttering practices as counter-gradients worth exploring.