AI in life sciences: regulating AI technologies and the product liability implications

Recent regulatory developments in the EU and UK suggest a proactive stance on governing AI technologies, with the EU introducing comprehensive regulations and the UK proposing reforms to align with emerging digital technologies. These changes present challenges and opportunities for life sciences companies, necessitating adept navigation of regulatory compliance and product liability risks.


In this article, Marsh’s Life Sciences industry practice leader Jenny Yu and experts from law firm Kennedys, Paula Margolis (Corporate Affairs Lawyer) and Samantha Silver (Partner), examine AI in the life sciences sector, covering associated risks, regulation, potential liability issues, and strategies for companies to 'future-proof' themselves.



Regulatory developments

There has been some form of regulation of artificial intelligence (AI) technologies in many life sciences uses in the EU and UK for several years. The EU’s lead on AI regulation began within the medical devices sphere with the introduction of the Medical Device Regulation (“MDR”) (EU) 2017/745 and the In Vitro Diagnostic Medical Devices Regulation (“IVDR”) (EU) 2017/746; the MDR, after a delay, became applicable on 26 May 2021.

More recently, it has led the charge in proposing the first-ever comprehensive regulatory framework to govern the risks posed by emerging digital technologies, including AI. Following the publication of the European Commission’s (“the Commission”) White Paper on AI and a series of consultations and expert group discussions, on 21 April 2021 the Commission published its long-awaited proposal for a regulation laying down harmonised rules on AI, also referred to as the ‘Artificial Intelligence Act’. It is designed to complement existing EU legislation, such as the General Data Protection Regulation. It also aims to extend the applicability of existing sectoral product safety legislation to certain high-risk AI systems to ensure consistency.

The proposed regulation adopts a risk-based approach and imposes strict controls and extensive risk management for the highest-risk forms of AI, including the requirement to undergo conformity assessments; the drawing up and maintenance of technical documentation; the implementation of quality management systems; and the affixing of CE markings to indicate conformity with the Commission’s proposed regulation before products are released to market. It has wide-ranging applicability and will affect AI providers and users inside and outside of the EU. Although this is familiar territory for life sciences companies, it is important that resources are put in place to respond to and deal with this additional regulatory burden, if and when it comes into force.

If the proposed regulation does come into force, it will not be implemented in the UK owing to Brexit. Nevertheless, UK businesses offering AI technologies to the EU will be directly affected when selling their products in the EU, and will be required to comply with the regulation.

The EU’s drive to implement global standards for new technologies has also had a domino effect in the UK:

  • On 16 September 2021, the Medicines & Healthcare products Regulatory Agency (“MHRA”) published a “Consultation on the future regulation of medical devices in the United Kingdom”, which ran until 25 November 2021. The Consultation set out proposed changes to the UK medical device regulatory framework with the aim to “develop a world-leading future regime for medical devices that prioritises patient safety while fostering innovation”.
  • In conjunction with the Consultation, the MHRA also published Guidance, “Software and AI as a Medical Device Change Programme”, which pledges to deliver bold change to provide a regulatory framework that gives a high degree of protection for patients and the public, while ensuring that the UK is the home of responsible innovation for medical device software.
  • On 22 September 2021, the UK launched its first National Artificial Intelligence (AI) Strategy to “help it strengthen its position as a global science superpower and seize the potential of modern technology to improve people’s lives and solve global challenges such as climate change and public health”. The Strategy includes plans for a white paper on AI governance and regulation.

Product liability risks

Although there is a human hand behind AI technologies, the intangible nature of many AI applications raises questions as to who or what will be accountable for the consequences of their use, particularly when the development of such applications involves a myriad of persons, including software developers and data analysts.

In the UK, and depending on the specific circumstances, claims relating to product liability may be brought in negligence, breach of contract or pursuant to the Consumer Protection Act 1987 (CPA), the implementing legislation which transposed the EU Product Liability Directive 85/374/EEC (PLD) into UK law. The CPA imposes liability on a producer for damage caused by a defective product, often referred to as “no fault liability”.

Section 3 of the CPA provides that a product is defective if the safety of the product is “not such as persons generally are entitled to expect”. In assessing the safety of a product, the court will take into account all of the circumstances it considers factually and legally relevant to the evaluation of safety, on a case by case basis. These factors may include safety marks, regulatory compliance and warnings. A claimant bringing a claim under the CPA must prove the existence of a defect and that the defect caused the damage.

The unique features and characteristics of AI technologies present challenges to the existing liability framework. For example, questions are raised as to whether AI-based software or data is considered a “product”, as defined by the CPA, or a service. This distinction is particularly relevant in the context of AI technologies that comprise physical hardware and cloud-based software, such as a smart medical device, where the software is often subject to automated modification. Similarly, questions may be asked as to which person(s) are considered a producer for the purposes of the CPA. Is it the software developer, the engineer, or the user responsible for updating the software?

The EU is seeking to address whether the PLD is fit for purpose and whether, and if so how and to what extent, it should be adapted to address “the challenges posed by emerging digital technologies, ensuring, thereby, a high level of effective consumer protection, as well as legal certainty for consumers and businesses”. Draft legislative changes could be available by Q3 2022.

The UK is taking similar measures to address whether its existing product safety and liability regimes meet the challenges of AI technologies. The UK Government opened a consultation via the UK Product Safety Review to explore possible changes to existing product safety laws to ensure the framework is fit for the future, acknowledging that the provisions of the CPA do not reflect emerging technologies such as AI. Furthermore, potential reform of the CPA is being mooted by the Law Commission as part of its 14th Programme of Law Reform, having invited views as to whether the CPA should be extended to cover technological developments.


This article was originally posted by Marsh.
