Illusions in the Boardroom

Structural Correction

Chapter 13: What to Demand Before Anything Else

Six questions a board should ask before any AI investment. Structured as a dramatised interrogation in which an audit committee chair asks each question and the CTO’s inability to answer constitutes a finding.

The audit committee met quarterly, but this session had an additional agenda item. The chair, Elena, a non-executive director who had spent thirty years in operational finance and joined the board twelve months earlier, had read the AI readiness report that management had circulated the previous week. Fourteen pages. She had six questions written on a folded sheet of paper in front of her.

The CTO was present, which was unusual, and the CFO sat beside him, while the chief risk officer attended by video from Singapore. Although the committee secretary had reserved ninety minutes, the chair suspected they would need considerably less.

Elena had worked with the CTO for two years. She respected his technical judgement and suspected the gap she was about to expose was structural rather than personal. That would not make the next twelve minutes easier for him.

She had approved the previous two quarterly board packs herself. She had signed off the capability map presentation six months earlier without asking the architect's question. At her previous firm, she had accepted “integration testing” as an answer to a timeline question she should have pressed. The folded sheet in front of her would expose a gap she had helped maintain.

Elena opened her folded sheet and read the first question.

Six demands define the minimum preconditions for any AI system that reads the organisation's own artefacts to produce trustworthy output. Without them, analytical synthesis is indistinguishable from hallucination. An organisation that points AI at its own data without process ownership, versioned definitions, and contract inventories has not adopted the reader path. It has adopted the writer path under a different name. The six demands that follow are the structural difference between reconciliation and confabulation.

The preconditions for trustworthy AI output are also becoming the minimum standard for explaining automated decisions to people who do not report to you. Boards sometimes assume that outsourcing a model, a platform, or an “AI capability” outsources the accountability, but the organisation still owns the business process, which means it owns the evidence trail when something goes wrong or when a supervisor asks how the decision was made. If the organisation cannot name the process, the owner, the interfaces, and the observed behaviour, the only remaining control is narrative, which is efficient right up to the moment scrutiny arrives with logs.

...
