AI isn’t coming; it’s already here. In insurance, AI is influencing underwriting, pricing, claims, customer interactions, and even vendor and contractor decisions. No business can credibly rely on the “we don’t use AI” defence anymore, and as an NED or SMCR function holder you cannot assume that others at the table are more “tech savvy”. As a senior leader, you arguably don’t need to be a tech expert to oversee AI; what you do need is to understand how it is being used, where the risks lie, and what it could mean for the business and its customers. That is the essence of being fit and proper under SMCR: you don’t need to know how the model works, but you must know what it can and cannot do. Oversight and governance should be practical, broad, and led at senior management and board level.
Much of the confusion starts with the term itself. “AI” is all too often spoken about as if it were a single tool, when in reality it covers a broad range of capabilities: predictive analytics, machine learning, natural language processing, computer vision, and generative systems that create text, code or models.
For those sitting around the board table, that breadth matters. Different forms of AI influence different parts of the business — pricing models, claims triage, fraud detection, customer interaction, even strategic forecasting. Before debating risk or regulation, boards need clarity on what type of AI is being used and where.
AI can scale capability and open new markets and product offerings, but only if experience remains in the room. It can flag issues, highlight trends, and even speed decisions, but it cannot replace human judgment. It has no market intuition or crisis memory. The real risk for senior leaders is therefore not the technology itself, but the hidden assumptions, unfair outcomes, and unchallenged decisions that slip past oversight.
Ultimately, boards are accountable for any process, product or outcome, whether the technology is embedded formally or used informally. Being pragmatic means accepting that AI is part of the business landscape and designing governance that balances opportunity with responsibility. Done well, that governance enables innovation and market advantage; done poorly, it exposes the firm, its directors and SMFs to significant risk.
Currently there is no AI-specific legislation in the UK, and there appears to be little appetite to legislate. The FCA and PRA are clear that existing frameworks, particularly SMCR, already place senior managers on the hook for AI-related harm; the closest thing to a dedicated regime is initiatives such as the FCA’s AI Sandbox Live Testing service. For senior managers, the takeaway is straightforward: AI is a rapidly evolving and increasingly scrutinised risk. It may compete with cyber, operational resilience, and conduct on crowded board agendas, but it cannot sit behind them. The regulatory approach may be principles-based, but expectations are rising. Understanding AI risk is no longer optional; it is a core part of discharging your responsibilities under SMCR and safeguarding both your firm and its customers.
For SMCR function holders, AI risk should not sit in a separate silo; it belongs squarely within the existing governance framework. That means embedding AI into model governance, risk oversight, audit programmes, and vendor management processes, and tracking clear indicators such as bias, error rates, and operational failures. Escalation and incident response must be defined in advance, with case studies and reporting of remediation and any regulatory engagement. And alignment with PRA and FCA expectations, and with roles and responsibilities under SMCR and Consumer Duty, should be explicit rather than assumed.
Done properly, this is not about creating new bureaucracy. It is about ensuring transparency in decision-making, protecting customers from unintended harm, and reducing regulatory and reputational exposure before it crystallises.
Talk to us about how we can address your challenges
The Lloyd’s Building
Gallery 7 – unit 787
One Lime Street
London
EC3M 7HA