
The faces of tech intolerance today

How to uncover toxic combinations.


What is your organization’s slop tolerance? I’ve asked this question quite a bit over the last six months. The answers I receive range from insightful to snarky. Just last month, an AWS front-end developer and a financial services full-stack developer both told me that they were willing to clean up after a bit of slop to save time. Why? They told me they have been asked to do twice as much work as in 2025; they wouldn’t know how to function without AI tools. But here’s where things get more complex: A cloud engineer – not a coder – I know from a global pharmaceutical provider had a rather terse observation: “If I have to clean up after someone’s ‘vibe code’ slop again, I’m going to go back to scrubbing dishes in my alma mater’s cafeteria.” As beautiful as our tech may appear from the outside, it’s very likely that it hides some toxic tech slop.

In 2025, the Oxford English Dictionary informed us that the phrase of the year was “rage bait.” Merriam-Webster gave “slop” the honor of Word of the Year. Collins Dictionary concluded its own competition naming “vibe coding” as the winner. I have to say that from a techie’s perspective, these words are closely related. But I think that I’ve been asking the wrong question of folks. The better question is: “What is your organization’s tech intolerance quotient?” Let me explain.

De-frictioning and the workplace – a word of caution

We live in a world that is increasingly interested in removing friction from processes of all types, technical and non-technical alike. Technologists and business leaders are bent on finding ways to eliminate inefficient elements of a process. When it comes to tech, “AI” has become shorthand for removing overhead and friction. At least in the United States, it’s all about “efficiency.” You could argue that the word “efficiency,” improperly deployed, can result in a lot of slop: Where techies see slop, business leaders see efficiency.

Let me put it this way: I’m highly tolerant of slop when it comes to removing friction from a process that only you care about. For example, in the picture below, I’m looking slightly smug because, like the women behind the gate, I had just ignored the gate at the Forty Foot swimming area near Dublin, Ireland. I “de-frictioned” it in the name of having fun.

We do much the same thing in tech; we look for efficiencies and remove barriers to entry – not to have fun, but to get the job done. Lately, I’ve noticed that instead of making a process more efficient, we’re shifting it left and simply removing processes. And there’s the rub: I’m entirely intolerant when you de-friction something in a process that involves my job role or something that I do, in fact, care about. As my mom used to say, “It’s all about whose ox gets gored.”

Tech intolerance: The hidden root of today’s IT and cyber problems

Most IT professionals can spot bad code, poor configurations and sloppy architectures from a mile away. But the root cause of many of today’s technology and cybersecurity issues isn’t a single line of code or a missing patch. It’s something deeper that stems from a more uncomfortable place. I’m going to use the phrase “tech intolerance” as a catch-all term for what I mean.

Tech intolerance is what happens when an organization’s behavior creates conditions where technologists can’t do their best work. Under tech intolerance, people are rushed, under-informed and pressured into delivering “something that works” instead of “the right thing that works.” Over time, this produces AI slop, technical debt, shadow IT and brittle systems that fail in embarrassing, sometimes catastrophic, ways.

We’ve seen this play out in different forms in events like the UK Post Office scandal, the Colonial Pipeline incident and countless quieter failures in cloud migrations, identity systems and “digital transformation” projects.

How tech intolerance manifests itself

The symptoms of tech intolerance are often obvious:

  • Hurried solution deployment (AI or otherwise): Tech that is deployed without guardrails or data testing. According to CompTIA’s AI’s Impact on Productivity and the Workforce report, 8 in 10 organizations have backtracked from using AI to using a human-centered solution. According to the CompTIA IT Industry Outlook 2026 report, most organizations aren’t highly capable in data practices. In one case, I know of an education project in a leading government that backtracked on its use of AI. Instead of using AI to automatically create mappings, it went back to a human-based approach. Why? They underestimated how long it would take AI to get past its learning curve.
  • Shadow IT: Sometimes, departments and workers feel there is a communication gap between their job roles and the IT department. As a result, those departments start creating their own tech solutions; that’s shadow IT. As more tech implementations live and work in the shadows, organizations lose the ability to prove compliance with regulations. Even worse, shadow IT creates an undocumented attack surface that invites successful attacks.
  • Technical debt: It’s a “stealth practice” that results when “temporary” workarounds become permanent production systems, creating cybersecurity cruft. Teams bypass change control, logging or policy because “there’s no other way to get this done.” You could call it “cybersecurity slop.” The debt is unseen because it is buried in brittle integrations, undocumented processes and half-documented automations; in some cases, institutions treat it as an accounting inconvenience instead of a real risk. Tech debt also represents a steep learning curve for many teams; they just don’t quite know how to prioritize relieving it. This is especially true of cloud-based tech and legacy systems that lie far beyond the team’s horizon of expectations.
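The shadow IT symptom above boils down to a gap between what IT has documented and what is actually in use. As a minimal sketch (with hypothetical service names and sources – real inventories would come from SSO audit trails, DNS logs or expense reports), surfacing that undocumented attack surface is essentially a set difference:

```python
# Minimal sketch: shadow IT as the gap between the services IT has
# sanctioned and the services actually observed in use. Service names
# and data sources here are hypothetical examples.

def find_shadow_it(sanctioned: set[str], observed: set[str]) -> set[str]:
    """Return services in use that IT has never documented."""
    return observed - sanctioned

# Sanctioned inventory (from the CMDB, say) vs. services observed
# in SSO logs, DNS queries or expense reports:
sanctioned = {"m365", "salesforce", "github"}
observed = {"m365", "salesforce", "github", "personal-dropbox", "unvetted-ai-chatbot"}

for service in sorted(find_shadow_it(sanctioned, observed)):
    # Each hit is undocumented attack surface and an open compliance question.
    print(f"undocumented service: {service}")
```

The hard part in practice isn’t the set difference; it’s building the “observed” side honestly, which is exactly the communication gap the bullet describes.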

How intolerance can get worse

Four characteristics of today’s technology landscape amplify this problem:

  • Ubiquity: Technology is everywhere. There is no “non-tech” side of the business anymore. Every function depends on IT, but many decision-makers still treat IT as a back-office cost center.
  • Velocity: The pace of change today is brutal. Cloud services, SaaS platforms and AI capabilities update faster than governance, training or risk management can keep up.
  • Mutability: Architectures aren’t static. Infrastructure, identities and data paths evolve constantly. Assumptions that were safe last year are dangerous this year.
  • Reactivity: Under pressure, organizations react instead of designing. They bolt on controls, buy tools to satisfy auditors and copy what competitors say they’re doing — creating toxic combinations of tech, process and people.

Failing to manage these realities leads organizations worldwide to backtrack on their AI implementations. Why? Because they unwittingly foster toxic combinations.

Today’s toxic combinations

You can see these toxic combinations in several recurring patterns:

  • Cloud storage misconfigurations that expose sensitive data, because ownership and responsibility are unclear.
  • Credential sprawl in backups, serverless functions and continuous integration/continuous delivery (CI/CD) pipelines, where secrets are loosely managed and rarely rotated.
  • FUD-driven cybersecurity communication, where practitioners use fear, uncertainty and doubt to get funding — eroding trust instead of building it. For all the talk of post-quantum cryptography and the dangers of “Harvest now, decrypt later” attacks, I’m much more interested in taking a balanced, more mature, risk-management approach.
  • Naïve tech adoption, especially in AI, where tools are deployed because they’re exciting, not because they’re aligned to a clearly understood problem.
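The credential-sprawl pattern above is one of the few toxic combinations you can partially detect mechanically. Below is a rough sketch of the idea behind secret scanning in a CI/CD pipeline – a lightweight stand-in for purpose-built tools like gitleaks or truffleHog, with illustrative (not exhaustive) patterns and an invented sample config:

```python
import re

# Illustrative secret patterns -- real scanners ship hundreds of rules.
SECRET_PATTERNS = {
    "aws_access_key": re.compile(r"AKIA[0-9A-Z]{16}"),
    "generic_assignment": re.compile(
        r"(?i)(password|secret|api[_-]?key)\s*[=:]\s*['\"][^'\"]{8,}['\"]"
    ),
}

def scan_text(text: str) -> list[tuple[int, str]]:
    """Return (line_number, pattern_name) for every suspected secret."""
    findings = []
    for lineno, line in enumerate(text.splitlines(), start=1):
        for name, pattern in SECRET_PATTERNS.items():
            if pattern.search(line):
                findings.append((lineno, name))
    return findings

# Hypothetical pipeline config with a hardcoded key on line 3:
sample = 'image: deploy:latest\nenv:\n  API_KEY = "sk-live-0123456789abcdef"\n'
for lineno, rule in scan_text(sample):
    print(f"line {lineno}: possible secret ({rule})")
```

Scanning only finds the sprawl; the bullet’s real complaint – secrets that are loosely managed and rarely rotated – still needs a vault and a rotation policy behind it.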

These failures are usually blamed on tools, vendors or “bad actors,” but they’re actually symptoms of organizational behavior. And there is a much larger problem.

Substitution: A critical, chronic condition that needs further investigation

The most insidious failure pattern is what I call substitution: It’s a situation where you think you’re improving or replacing a problematic process, but you’re listening to the wrong people and measuring the wrong things. You don’t fix the process – you simply swap one broken process for another. That might be in the name of “shifting left,” or making a particular process more efficient. But, like a tree petrifying into stone, a subtle, slow and (in the case of our tech) unpleasant substitution takes place.

Recently, I worked with a company that wanted to use AI to improve its security analytics processes. Trust me, there is a huge amount of repetitive work that can be “de-frictioned.” Yet this company made a mistake: Instead of working with trained security operations center (SOC) analysts who understand the tedious, repetitive tasks that block real threat hunting, the organization hired a consulting firm. That’s not really a mistake in and of itself. But the problem in this case was that the consultant created “new” processes based on generic documentation and a few web searches, rather than deep engagement with frontline analysts. That’s substitution.

I call it substitution slop – in this case, a shared failure by both the contractor and the organization that hired them. That’s one of the faces of tech intolerance. It’s an age-old problem that, frankly, is a bit faceless; I’ve noticed folks really don’t quite realize the complexion of the problem. In my recent conversations with CIOs in the governments of Europe, the US and Australia, I’ve noticed they’re using the terms “modernization” and “digital transformation” more than usual. They often talk about moving their tech from the Stone Age into the modern age. I can’t help but think that if they’re not careful in their work, they could end up committing acts of substitution that, in essence, turn their work to stone.

Skills to break the toxic tech cycle

The way out is easy to talk about, but it’s not particularly easy to put into action, especially in situations where budgets are limited, training is deferred and time pressure is wantonly applied. We need to involve practitioners early, reward honest risk communication and treat technologists as partners in business design, not just implementers of pre-baked decisions. Until we investigate the faces of tech intolerance, we’ll keep blaming tools for problems that actually start in the mirror.

Solving this isn’t just about buying better platforms; it requires new skillsets and mindsets:

  • Improved communication: We all know that the pace of tech change shifted gears from rapid to chaotic. Recently, I spoke with a deeply experienced tech professional who had just changed jobs from the tech sector (Google) to the health care sector. I asked her about the criteria she used to gauge a company’s quality. Her answer was as terse as it was insightful: “Communication.” She said that once she feels that an organization actively manages information siloing, it is likely to use its technology wisely. In my mind, she’s basically telling me that she likes working for organizations that actively avoid tech intolerance.
  • Foundational tech training for all employees: To improve communication, ensure greater compliance and drive better tech adoption, organizations need to educate their employees in the essential nuances of AI, data and cybersecurity. These are the tech trifecta today, but today’s workers still aren’t literate in them. In fact, you could argue that the worst form of technical debt in any organization is skipping employee training in favor of other priorities.
  • Improved identity, credential and access management (ICAM) practices: I’ve seen a lot of talk over the last 18 months about the need for better ICAM. I couldn’t agree more, but I don’t think the root cause of the problem is that we haven’t been doing ICAM; it’s that chronic conditions, such as lack of upskilling and time pressure, have contributed to the problem.
  • Balanced expectations: Leaders need to understand that automation changes the shape of work, not just the headcount. Offloading repetitive tasks should free experts to handle complex analysis, not simply generate more noise.
  • Workflows and frameworks informed by subject matter experts: Service design, threat modeling and value-stream mapping need to become standard practice, not special projects. Cross-functional teams must actually own shared outcomes.
  • Paradigm shifts in cybersecurity: Move from perimeter defense and product thinking to identity, data and workflow thinking. Security must be embedded into the design, not stapled on in review.
  • Realistic use of AI: AI can accelerate triage, enrich alerts and automate repetitive tasks, but it can’t define risk appetite, interpret business context, or replace experienced analysts’ judgment.

So, as I tour the world this year talking with IT workers, I hope to learn more about how organizations are facing, surfacing and managing their hidden technical intolerance challenges.