Auditors told they cannot blame AI for failures
The accountancy regulator has issued what it describes as the world’s first guidance on the use of artificial intelligence in auditing, making clear that firms cannot hold the technology responsible when things go wrong.
Mark Babington, executive director of regulatory standards at the Financial Reporting Council (FRC), was unequivocal on the matter of responsibility.
“You can’t blame it on the box,” he said. “If you use this technology, you are still accountable for it.”
Accountability, he added, rests firmly with audit partners rather than with the developers of any AI system.
The FRC’s guidance warns that AI tools “pose risks to audit quality”, citing the misuse of AI-generated outputs as well as the danger of deficient results, including hallucinations and data distortions, that could lead auditors to reach inappropriate conclusions.
The Big Four firms have collectively invested billions in AI on the expectation that it will transform the audit process, accelerating work and reducing costs. KPMG’s US business recently credited its proprietary AI platform with winning an audit tender from a competitor, whilst in February the firm pressed its own auditor, Grant Thornton, to accept lower fees on the grounds that AI would make the work cheaper to perform.
Despite widespread anxiety about AI displacing audit professionals, Mr Babington cautioned against excessive alarm, saying there had been too much “hand-wringing and panic” over job losses. Auditing, he argued, was “still reliant on really good, professionally sceptical judgement”. Nevertheless, both PwC and KPMG have announced plans to cut hundreds of audit roles, attributing the reductions to fewer mid-level staff departing for positions elsewhere in a cooling labour market.
Mr Babington was clear that investment in AI is not optional. “If audit can’t keep up with AI adoption elsewhere in the corporate world, we’re in quite a low place,” he said.
Firms are already deploying the technology across a range of functions, from automated checks of financial statements to AI agents carrying out elements of audit procedures. Even so, the FRC stressed that human oversight, professional judgement, and investment in staff training and system safety must be maintained. Firms must also remain alert to the moment an AI agent’s behaviour shifts in ways that would warrant further human intervention before its use could continue.
The regulator is itself increasing its use of AI, deploying it to triage corporate reporting evidence and to process large volumes of documentation. Budget constraints, with the FRC’s planning allocation for the next two years rising by less than inflation, are limiting its ability to recruit specialist staff, adding further impetus to its own AI adoption.
There are also concerns that the technology could entrench existing inequalities in the profession. The Big Four absorbed 90% of audit fees paid by FTSE 350 companies in 2024, and Mr Babington acknowledged that there are “massive differences” in the resources available to different firms.
He said he was in discussions with US regulators about the potential for private equity investment in smaller firms to help address the imbalance, an avenue that smaller firms are themselves actively pursuing. “Private equity is going to help,” he added.