
Then AI Arrived

The Polished Lie

A retailer deploys AI to answer leadership questions about metrics. For three months, nobody checks. Some numbers are wrong. Some are fabricated. The failure is not the AI. The failure is skipping the preconditions.

In September, a mid-sized European retailer deploys an AI agent to answer leadership questions about operational metrics. Revenue by territory, retention by cohort, conversion by channel. The agent has access to the data warehouse, the dashboards, and the historical reports. Leadership asks questions. The agent answers: specific percentages, trend comparisons, quarter-over-quarter movements. The answers arrive with explanations, sound analytical, and are consumed as data.

For three months, nobody checks.

The numbers are sometimes from the wrong time periods. They sometimes mix product categories. A small number are fabricated entirely: the agent fills gaps with statistically likely values and presents them with the same confidence as the verified numbers. The VP of sales makes territory decisions based on the output. The CFO includes the numbers in a board deck. Anyone who raises concerns about validation is told they are “slowing down innovation.”

The retailer believed it was doing the right thing: using AI to answer analytical questions about its own operations, not to generate narrative. The AI had access to the organisation’s own data. This looked like reconciliation.

The failure is that the organisation skipped the structural preconditions. The data warehouse was not structured, versioned, or owned by anyone accountable for its accuracy. The AI read the data the way a human analyst reads a poorly documented spreadsheet: by making assumptions that seem reasonable and producing output that looks right.

Reconciliation only works if the artefacts are binding. When the artefacts are structured, versioned, owned, and tested against the system, the output is reconciliation. When the artefacts are unstructured and unowned, the output is confident fiction drawn from a different source. The fiction sounds more authoritative because it appears to come from the organisation’s own systems, which makes it harder to challenge and more dangerous to consume.
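What a "binding" artefact means in practice can be made concrete. Below is a minimal sketch, not a description of the retailer's actual systems: every name, number, and the `MetricDefinition` structure are hypothetical. The idea is that a metric only counts as binding if it carries a version, an accountable owner, and a computation that can be re-run against the source system, so that any claimed figure can be mechanically reconciled instead of taken on trust.

```python
from dataclasses import dataclass
from typing import Callable


@dataclass(frozen=True)
class MetricDefinition:
    """A metric that is structured, versioned, and owned (hypothetical shape)."""
    name: str                       # e.g. "q3_revenue_north"
    version: str                    # bumped whenever the computation changes
    owner: str                      # the team accountable for its accuracy
    compute: Callable[[], float]    # re-runs against the source of truth


def reconcile(metric: MetricDefinition, claimed: float,
              tolerance: float = 0.01) -> bool:
    """Return True only if a claimed figure matches the recomputed value."""
    actual = metric.compute()
    return abs(claimed - actual) <= tolerance * max(abs(actual), 1.0)


# Toy in-memory data standing in for the warehouse (invented numbers).
q3_revenue_by_territory = {"north": 1_200_000.0, "south": 950_000.0}

north_revenue = MetricDefinition(
    name="q3_revenue_north",
    version="1.0.0",
    owner="finance-data",
    compute=lambda: q3_revenue_by_territory["north"],
)

print(reconcile(north_revenue, claimed=1_200_000.0))  # True: matches the source
print(reconcile(north_revenue, claimed=1_450_000.0))  # False: a fabricated figure
```

Without the `compute` hook tied to an owned source, the check is impossible, and a statistically likely but fabricated value passes through with the same confidence as a verified one.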

Producing organisational narrative used to require effort. A strategy deck took weeks. A board pack consumed days. The effort forced at least some engagement with reality, because the person assembling the narrative had to talk to the people who understood the system. AI removes that constraint. Narrative becomes abundant, and each additional claim about reality that is not reconciled against the system is an additional contradiction the organisation carries silently. The organisation has not become more dishonest. It has become more efficient at being wrong.


See also: All articles · Illusions in the Boardroom · Illusions of Work