Upward Certainty Bias

Upward Certainty Bias describes a cultural pathology in organisations where information becomes progressively more confident, simplified, and caveat‑free as it moves up the hierarchy. At each managerial layer, analysts feel implicit pressure to remove uncertainty; not because they’re being dishonest, but because organisational norms reward confidence, clarity, and decisiveness over nuance. The result is a kind of “lossy compression” of uncertainty: by the time analysis reaches the C‑suite, the ambiguity and risk that genuinely matter for strategic decisions may have been stripped away. Upward Certainty Bias doesn’t arise from malice or incompetence, but from a structural mismatch between the messy reality of data and the polished narratives organisations prefer. Naming it helps analysts and educators highlight that some uncertainties aren’t noise to be filtered out, but essential signals that leadership must see to make good decisions.
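The "lossy compression" idea can be made concrete with a toy simulation. This is an illustration of the concept only, not a model of any real organisation: a finding starts life with a point estimate, an honest uncertainty range, and a list of caveats, and each managerial hop drops a caveat and narrows the stated range. All names and numbers below are invented.

```python
# Toy illustration of Upward Certainty Bias: at each hop up the hierarchy,
# one caveat is dropped and the stated uncertainty range shrinks, so the
# summary that reaches the top is more confident than the analysis supports.

def pass_upward(report, shrink=0.5):
    """Simulate one hop up the hierarchy: drop a caveat, narrow the range."""
    estimate, half_width, caveats = report
    return (estimate, half_width * shrink, caveats[:-1])

analyst_view = (
    4.2,                      # point estimate, e.g. projected margin (%)
    3.0,                      # honest half-width of the uncertainty range
    ["sampling bias", "missing data", "contested definitions"],
)

report = analyst_view
for layer in ["team lead", "director", "VP"]:
    report = pass_upward(report)
    print(f"{layer}: {report[0]} +/- {report[1]:.2f}, caveats={report[2]}")
```

By the third hop the estimate is unchanged but the caveat list is empty and the stated range is a fraction of the honest one, which is exactly the failure mode described above.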

Carillion Collapse

Let’s illustrate this with a real example. I could talk about the Space Shuttle Challenger and those O-rings again, or the Post Office Horizon software, or even the Royal Bank of Scotland’s problems in 2008. But I thought I’d cover Carillion, because this was a business failure that really should never have happened. When Carillion collapsed in January 2018, it did not do so overnight. The failure followed years of expanding debt, increasingly aggressive accounting on long-term contracts, and growing exposure to underperforming projects. What makes Carillion relevant is not simply that it failed (companies fail all the time) but how uncertainty about its financial position was handled in the years leading up to the collapse.

Carillion operated on complex, multi-year construction and outsourcing contracts. These inherently involve estimates: projected costs, timelines, penalty clauses, and revenue recognition based on anticipated completion stages. In such environments, financial reporting depends heavily on management judgement. As later Parliamentary inquiries and regulatory findings showed, the company continued to recognise revenue and profits on major contracts even as risks were mounting. Warnings emerged late and partially. Dividends continued to be paid. Public statements remained broadly optimistic. The technical question was not whether executives had perfect foresight; in reality, no one does. It was whether deteriorating contract realities were being fully recognised and escalated in financial reporting, or whether they were being smoothed through optimistic assumptions. In other words: how much uncertainty was filtered out before the numbers reached investors, government clients, and the market?
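To see why the cost-to-complete estimate matters so much, here is a simplified sketch of percentage-of-completion accounting, the broad approach used on long-term contracts. All figures are invented for illustration; they are not Carillion’s actual numbers, and real contract accounting is considerably more involved.

```python
# Illustrative only: under a simple percentage-of-completion rule, an
# optimistic estimate of the cost remaining pulls recognised profit forward.
# Invented figures; not Carillion's actual numbers.

def recognised_profit(contract_price, cost_to_date, est_cost_to_complete):
    """Profit recognised so far, based on estimated stage of completion."""
    est_total_cost = cost_to_date + est_cost_to_complete
    stage = cost_to_date / est_total_cost           # fraction complete
    est_total_profit = contract_price - est_total_cost
    return stage * est_total_profit

price, spent = 100.0, 40.0

# Same contract, same money spent, different management judgement:
optimistic = recognised_profit(price, spent, est_cost_to_complete=40.0)
cautious = recognised_profit(price, spent, est_cost_to_complete=55.0)

print(f"optimistic assumption: {optimistic:.1f}")
print(f"cautious assumption:   {cautious:.1f}")
```

Nothing about the physical project differs between the two lines; only the assumption about remaining cost does, yet the optimistic view reports several times the profit of the cautious one. That is the lever through which smoothed assumptions become smoothed accounts.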

By the time a major profit warning was finally issued in mid-2017, the gap between the confident narrative and the underlying fragility was too wide to bridge. Six months later, Carillion entered compulsory liquidation, affecting thousands of employees and hundreds of public sector projects. The collapse illustrates a broader governance issue: in complex organisations, senior leaders are uniquely positioned to see cross-contract and balance sheet exposure. That makes them uniquely responsible for deciding how much uncertainty is acknowledged publicly. When optimism consistently overrides caution, uncertainty does not disappear. It accumulates.

Upward Certainty Bias

(Yes, I know I need a snappier phrase here). But essentially, I’ve been thinking about “kiss up, kick down” management for a while. At first, I understood it the obvious way: a manager who flatters their superiors and mistreats their team. But lately I’ve started to think that’s just the extreme version. There’s a quieter, more normalised version that doesn’t look like bullying. It looks like alignment. It looks like decisiveness. It even looks like leadership.

It looks like this:

  • Upward, certainty.
  • Downward, pressure.

I work in data. My job is ambiguity. Every real dataset contains caveats: sampling bias, incomplete tracking, conflicting definitions, missing data, assumptions layered on assumptions. My instinct, professionally and ethically, is to surface those uncertainties. To say, “Here’s what we know, here’s what we don’t, and here’s what could change the conclusion”. But if I’ve learned one thing in my working life, it’s that ambiguity does not travel well up a hierarchy.

When I pass uncertainty upward, I’m rarely thanked for it. Instead, I’m often asked to simplify. Tighten it. Make it cleaner. Make it usable. “So what’s the number?” And so there’s a quiet expectation: translate complexity into clarity. But here’s the part that’s been bothering me. If I compress ambiguity into a clean answer, and that answer later proves incomplete or wrong, the same executive layer can say, “We didn’t know there were doubts.” The uncertainty disappears in transmission; and so does accountability.
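The gap between “the number” and the honest answer is easy to demonstrate. Here is a minimal sketch, standard library only, using invented weekly conversion figures: the same data yields a single clean point estimate and, via a bootstrap, an interval showing how much that number could move.

```python
# "So what's the number?" versus what the data actually supports:
# a point estimate next to a bootstrap interval on the same (invented) data.

import random
import statistics

random.seed(7)  # reproducible illustration

data = [0.031, 0.045, 0.022, 0.058, 0.039, 0.027, 0.064, 0.033]  # invented

point_estimate = statistics.mean(data)

# Bootstrap: resample with replacement, recompute the mean many times,
# then read off the middle 95% of the resampled means.
boot_means = sorted(
    statistics.mean(random.choices(data, k=len(data))) for _ in range(5000)
)
lo, hi = boot_means[int(0.025 * 5000)], boot_means[int(0.975 * 5000)]

print(f'"The number":          {point_estimate:.3f}')
print(f"What we actually know: somewhere around {lo:.3f} to {hi:.3f}")
```

Reporting only the first line is the compression step; the second line is what gets lost in transmission.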

I don’t think most executives wake up intending to behave this way. But I do think many systems reward it. Senior leaders are expected to project confidence. Markets reward confidence. Boards reward confidence. Teams are told they need decisive direction. The trouble is that the ambiguity I see doesn’t go away. If it doesn’t travel up to those with wider business knowledge, it travels down. It sits with the analyst. The product manager. The operations lead. The people closest to the messy reality of implementation. In that sense, “kiss up, kick down” isn’t just interpersonal cruelty. It’s a structural habit: send certainty upward, send doubt downward. And the more senior you are, the more you’re insulated from the consequences of uncertainty, because you no longer see the granular trade-offs. You see dashboards.

As a data analyst, I can’t see the second- and third-order impacts executives can see. I don’t have access to the strategic constraints, the political realities, the investor pressures. Which is exactly why I believe the executive tier should be managing ambiguity; not the analyst alone. They are the only ones positioned to weigh uncertain data against broader consequences. When ambiguity is forced downward, the burden shifts to the person with the least visibility into its implications.

The Performance of Certainty

I’ve started to notice how common this performance is. The manager who rephrases your careful explanation into a confident headline in the leadership meeting. The VP who says, “The data shows…” when the data actually suggests. The post-mortem where uncertainty is retroactively denied: “No one raised concerns.” Well, maybe someone did. Maybe it just wasn’t convenient to amplify them.

In extreme cases, this is toxic leadership. Scholars like Robert I. Sutton, in The No Asshole Rule, describe how organisations suffer when leaders bully or silence dissent. But what I’m describing is softer. It’s cultural. It’s the quiet understanding that ambiguity is friction, and friction is unwelcome in executive storytelling. So the story gets cleaned. And someone else carries the risk.

Part of why this bothers me is professional identity. As a data analyst, my integrity is tied to accuracy; or at least intellectual honesty about limits. When I feel pressure to compress uncertainty beyond what’s responsible, I feel complicit. But I also feel small. Because if I insist on preserving nuance, I risk being labelled:

  • Overly cautious
  • Not business-minded
  • Unable to “see the bigger picture”

And maybe that’s the heart of it. The bigger picture belongs to the executives. The smaller, messier details belong to me. But the consequences of oversimplification don’t scale neatly with title. If anything, the higher the decision, the more dangerous false certainty becomes.

This is partly why I want to write publicly about “kiss up, kick down.” Not to accuse specific people. Not to litigate past decisions. But to name a pattern. And to ask a harder question: what would it look like for leaders to manage ambiguity openly? To say, “The data suggests X, but here’s what we don’t know”? To carry uncertainty upward instead of compressing it into someone else’s risk? If “kiss up, kick down” is the extreme version of hierarchical self-protection, then maybe the antidote is shared accountability for doubt.

I don’t know yet how to do that perfectly. But I know I don’t want to normalise the alternative.