UK proposes “light touch” approach for regulating artificial intelligence
AI is like an elephant: its essence is difficult to put into words, but you know it when you see it.* This poses a challenge for policymakers looking to use regulation to support the safe development of artificial intelligence. In a new policy paper, the UK Government chooses not to pin down what AI means and emphasises the flexibility of its approach. But the EU’s first-mover proposal for an AI-specific Act may end up setting the global standard.
Pro-innovation regulation
In its policy paper, the Department for Digital, Culture, Media and Sport (DCMS) has put forward a “pro-innovation” vision for the future regulation of artificial intelligence. The paper sets out the building blocks of a cross-sectoral regulatory framework.
As the paper points out, the UK does not have laws written explicitly to regulate AI. This means that businesses rolling out AI systems must make sure those systems fit within existing legal and regulatory regimes. For example, the Information Commissioner’s Office (ICO) has taken action against Clearview for its facial recognition tech and has promised to investigate concerns over the use of algorithms to sift recruitment applications. See also our note on how AI in financial services is regulated in the UK.
The lack of AI-specific regulation may, however, lead to confusion and hinder innovation. Respondents to a 2019 survey of financial institutions suggested that additional guidance on how AI fits within existing rules could encourage more firms to adopt AI.
To promote consistency across different industries, the DCMS intends to set cross-sectoral principles tailored to AI and to ask regulators to contextualise these for the sectors they oversee.
Principles and guidance, not rules
The DCMS has produced six guiding principles for regulators to consider when overseeing the use of AI in their sector. These are:
- Ensure that AI is used safely
- Ensure that AI is technically secure and functions as designed
- Make sure that AI is appropriately transparent and explainable
- Embed considerations of fairness into AI
- Define legal persons’ responsibility for AI governance
- Clarify routes to redress or contestability
The DCMS does not necessarily expect these principles to translate into new obligations. Instead, it plans to encourage regulators to consider lighter-touch options in the first instance, such as guidance or voluntary measures. Regulators are asked to adopt a proportionate, risk-based approach that focuses on high-risk concerns.
This flexible approach is likely to be applauded, but by choosing not to legislate in this area the UK risks letting the EU’s stricter rules become the de facto standard for AI regulation.
EU divergence
Unlike the EU, the UK is not preparing to introduce AI-specific legislation. Instead, the DCMS suggests that responsibility for designing and implementing proportionate regulatory responses should be delegated to regulators.
The European Commission’s bold proposal for an AI Act aims to regulate AI systems across the EU according to the level of risk they present. The draft legislation seeks to ban AI systems that present unacceptable risks, impose strict requirements on those considered to be high risk (such as systems used to evaluate credit risk or provide credit scores), and potentially subject lower risk systems to transparency requirements.
The EU’s regime could bring about sweeping changes, requiring businesses to assess the riskiness of their AI systems and comply with the relevant obligations. Failing to meet the requirements for high-risk AI systems could lead to fines of up to EUR 30 million or 6% of global turnover, whichever is greater. Read more in our blogpost on what the EU is doing to foster human-centric AI.
Another distinction between the EU and UK is the approach to defining AI. Whereas the EU AI Act includes a very broad definition, the DCMS policy paper chooses not to define AI. Instead, it notes core characteristics of AI technology which existing regulation may not be fully suited to address.
These characteristics are:
- Adaptiveness, ie the logic behind an output can be hard to explain
- Autonomy, ie the ability to make decisions without express intent or human involvement
It is the combination of these characteristics that demands a bespoke regulatory response for AI. By focusing on these core characteristics, the DCMS argues that a detailed, universally applicable definition of AI is not needed.
The DCMS acknowledges that its proposals diverge from the vision of AI regulation set out by the EU, but argues that the EU’s approach of setting a “relatively fixed definition” in legislation would not be right for the UK because it does not capture the full range of AI applications and their regulatory implications.
Next steps for AI in financial services
The DCMS emphasises the importance of ongoing collaboration between UK regulators in the digital space, including via the Digital Regulation Cooperation Forum (DRCF), which includes the Financial Conduct Authority (FCA).
As well as contributing to the DRCF, the FCA has been working closely with the Bank of England on AI, for example via the AI Public-Private Forum. The results of a follow-up to the 2019 FCA-Bank of England survey on how machine learning is used in the financial services sector are expected later this year. The regulators also plan to publish a discussion paper in 2022 which will aim to clarify the current regulatory framework and how it applies to AI.
For its part, the DCMS says that it is still at the early stages of considering how best to put its approach into practice but will set out further details in a white paper and consultation later this year. Its current thinking is to put the cross-sectoral principles on a non-statutory footing, but the DCMS does not rule out the need for legislation as part of the delivery and implementation of the principles, for example to update regulators’ powers.
The DCMS invites views on its policy paper by 26 September 2022.
For more on the outlook for AI regulation, read our Tech Legal Outlook mid-year update and our 2021 report on AI in financial services.
*“There are some words or expressions which are like an elephant; its essence is difficult to put into words, but you know it when you see it.”
Blackbushe Airport Ltd v Hampshire County Council, R (On the Application of) & Ors [2021] EWCA Civ 398