The push for AI oversight in the NHS
The NHS has always been cautious -- rightly so -- about adopting new technology in clinical settings. From electronic prescribing to telehealth, each wave of digital health innovation has been accompanied by frameworks designed to protect patient safety and data integrity. AI-powered clinical documentation tools are no exception.
In 2025, NHS England signalled its intention to establish a more formal process for evaluating AI tools used in clinical workflows. The AI and Digital Regulations Service (AIDRS), building on earlier work by the NHS AI Lab, worked with integrated care boards, royal colleges, and vendor organisations to define what a trustworthy AI scribe looks like. In January 2026, NHS England published the Ambient Voice Technology (AVT) Supplier Registry, listing 19 self-certified suppliers -- a significant step toward formal oversight of AI scribes in the NHS.
What the AVT Registry means
The NHS AVT Supplier Registry requires AI scribe vendors to self-certify that they meet a defined set of standards before their product is listed. These are the key areas suppliers must address:
- Clinical safety case. A documented analysis of how the tool could cause harm and what mitigations are in place. This aligns with the existing DCB0129 standard for clinical risk management of health IT systems.
- Data protection compliance. Evidence that the tool meets the requirements of the Data Security and Protection Toolkit (DSPT), including data minimisation, encryption standards, and breach notification procedures.
- Performance transparency. Published accuracy metrics for transcription and note generation, ideally validated against real-world NHS consultations rather than curated demo recordings.
- Ongoing monitoring. A requirement to report adverse incidents, maintain audit logs, and provide clinicians with clear mechanisms for flagging errors.
For clinicians, the registry provides a useful signal of trustworthiness. Rather than having to evaluate vendor marketing claims from scratch, a GP or hospital trust can check whether a product has been through the self-certification process. It is important to note, however, that registry listing is based on self-declaration: NHS England has not independently verified every claim.
Current compliance frameworks
The AVT Registry builds on several existing frameworks that already apply to AI documentation tools used in NHS and UK private practice.
Data Security and Protection Toolkit (DSPT)
The DSPT is the NHS's mechanism for ensuring that organisations handling patient data meet a baseline set of security standards. Any AI scribe vendor processing NHS patient data should have completed a DSPT assessment. This covers areas such as staff training, data access controls, incident management, and encryption. A vendor that has not completed the DSPT should be treated with caution.
DCB0129 and DCB0160
DCB0129 sets out the clinical risk management requirements for manufacturers of health IT systems. If an AI scribe generates clinical notes that could influence clinical decisions -- which, by definition, it does -- then the manufacturer should have a clinical safety case in place. DCB0160 is the corresponding standard for deploying organisations (i.e., the practice or trust using the tool). Together, these standards ensure that clinical risks are identified, assessed, and mitigated on both sides of the vendor-customer relationship.
MHRA classification
The Medicines and Healthcare products Regulatory Agency (MHRA) has been refining its approach to software as a medical device (SaMD). Whether an AI scribe falls within the scope of medical device regulation depends on its intended purpose. A tool that simply transcribes and formats is less likely to be classified as a medical device than one that actively interprets clinical content or provides decision support. However, the boundary is blurred, and vendors should be prepared to justify their classification with a clear regulatory rationale.
UK GDPR and the Data Protection Act 2018
Processing patient data requires a lawful basis — typically Article 6(1)(e) (public task) for NHS settings, or Article 6(1)(b) (contractual necessity) for private healthcare. Explicit consent is not always required, but transparency is. Patients should be informed that an AI tool is being used during their consultation, and they should have the right to object.
Implications for clinicians
Choosing a tool
If you are evaluating AI scribes for your practice, look beyond the feature list. Ask the vendor:
- Have you completed the DSPT?
- Do you have a DCB0129 clinical safety case?
- Where is patient data processed and stored?
- What is your MHRA classification rationale?
- Can you provide accuracy metrics from UK clinical settings?
If the vendor cannot answer these questions clearly, that is a red flag -- regardless of how polished the product demo looks.
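The due-diligence questions above lend themselves to a structured checklist that can be reused across vendor evaluations. A minimal sketch in Python (the field names are illustrative, not an official NHS schema):

```python
# Minimal vendor due-diligence checklist for an AI scribe evaluation.
# Field names are illustrative, not an official NHS schema.
from dataclasses import dataclass, fields

@dataclass
class VendorChecklist:
    dspt_completed: bool                 # Data Security and Protection Toolkit
    dcb0129_safety_case: bool            # clinical safety case from the manufacturer
    uk_data_residency: bool              # where patient data is processed and stored
    mhra_classification_rationale: bool  # documented SaMD classification rationale
    uk_accuracy_metrics: bool            # accuracy figures from UK clinical settings

def red_flags(checklist: VendorChecklist) -> list[str]:
    """Return the names of any unmet criteria."""
    return [f.name for f in fields(checklist) if not getattr(checklist, f.name)]

vendor = VendorChecklist(True, True, True, False, False)
print(red_flags(vendor))  # → ['mhra_classification_rationale', 'uk_accuracy_metrics']
```

Any non-empty result is the "red flag" described above, regardless of how polished the demo looks.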
Institutional procurement
For practices operating within NHS trusts or ICBs, procurement decisions will increasingly be guided by centrally maintained approved-vendor lists or frameworks. Engaging your trust's digital team and clinical safety officer early in the evaluation process will save time and avoid compliance surprises later.
Medicolegal responsibility
It is worth emphasising that the clinician remains responsible for the content of the clinical record, regardless of whether a note was written by hand, dictated, or generated by AI. An AI scribe is a drafting tool. The clinician must review, correct if necessary, and approve every note before it is finalised. This responsibility does not shift to the vendor.
How to prepare your practice
With the AVT Registry now published, the practical steps for practices evaluating AI scribes are clear:
- Audit your current documentation workflow. Understand how much time clinicians spend on notes, where the bottlenecks are, and what a realistic improvement target looks like.
- Engage your Caldicott Guardian and IG lead. They need to be involved before any patient data is processed by a third-party tool.
- Request a Data Protection Impact Assessment (DPIA). This is a legal requirement under UK GDPR for processing that is likely to result in high risk to individuals. Clinical AI tools almost certainly meet that threshold.
- Run a pilot. Trial the tool with a small number of willing clinicians and a defined patient cohort. Evaluate accuracy, usability, time savings, and patient feedback.
- Document everything. Keep records of your evaluation process, risk assessments, and the rationale for your decision. This is both good governance and medicolegal protection.
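The pilot step above is easiest to act on if you decide up front which metrics you will compare. A minimal sketch of two common ones, time saved per note and the rate at which clinicians flag errors in generated drafts (all numbers below are illustrative, not real trial data):

```python
# Sketch of pilot metrics for an AI scribe trial.
# All numbers are illustrative, not real trial data.
from statistics import median

baseline_minutes = [9.0, 11.5, 8.0, 10.0, 12.0]  # manual note-writing times
pilot_minutes = [4.0, 5.5, 3.5, 6.0, 4.5]        # review-and-approve times with the scribe
notes_reviewed, notes_flagged = 120, 9           # clinician error flags during the pilot

time_saved = median(baseline_minutes) - median(pilot_minutes)
flag_rate = notes_flagged / notes_reviewed

print(f"median time saved per note: {time_saved:.1f} min")  # 5.5 min
print(f"error-flag rate: {flag_rate:.1%}")                  # 7.5%
```

Medians are used rather than means so a few unusually long consultations do not skew the comparison; the same figures belong in the evaluation records you keep for governance purposes.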
WhiteFieldHealth's compliance posture
We have built WhiteFieldHealth with the UK regulatory landscape in mind from day one. Patient data is processed on UK-hosted infrastructure. We maintain a clinical safety case aligned with DCB0129 principles. Our RAG layer cross-references generated notes against NICE guidelines and the BNF, adding a safety check that goes beyond simple transcription and formatting. We are transparent about our accuracy metrics and welcome independent evaluation.
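To illustrate the general idea of cross-referencing a generated note against a reference source, here is a deliberately simplified sketch: a toy three-entry "formulary" stands in for a real BNF-backed knowledge base, and a naive suffix heuristic stands in for proper clinical entity recognition. This is not WhiteFieldHealth's actual implementation, only the shape of the check.

```python
# Simplified illustration of flagging drug mentions in a generated note
# that cannot be verified against a reference list. The three-entry
# formulary and the suffix heuristic are toy stand-ins, not production code.
formulary = {"amoxicillin", "ramipril", "metformin"}

def unverified_drug_mentions(note: str) -> set[str]:
    """Return tokens that look like drug names but are not in the formulary."""
    suffixes = ("cillin", "pril", "formin", "statin")  # naive heuristic for the sketch
    words = {w.strip(".,").lower() for w in note.split()}
    candidates = {w for w in words if w.endswith(suffixes)}
    return candidates - formulary

note = "Plan: continue ramipril 5mg, start amoxycillin 500mg TDS."
print(unverified_drug_mentions(note))  # the misspelling is flagged for clinician review
```

The point of the pattern is that flagged mentions are surfaced to the clinician for review, not silently corrected: the clinician remains responsible for the final note.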
As the regulatory landscape evolves — with the AVT Registry now establishing a formal baseline and ongoing refinement of existing frameworks expected — we are committed to meeting and exceeding the standards that NHS and private clinicians rightly expect.