Doctors are using unapproved AI software to record patient meetings, investigation reveals



Doctors are using AI software that does not meet minimum standards to record and transcribe patient meetings, according to a Sky News investigation.

NHS bosses have demanded GPs and hospitals stop using artificial intelligence software that could breach data protection rules and put patients at risk.

A warning sent out by NHS England this month came just weeks after the same body wrote to doctors about the benefits of using AI for notetaking - to allow them more time to concentrate on patients - using software known as Ambient Voice Technology, or "AVT".

Health Secretary Wes Streeting will next week put AI at the heart of the 10-year plan to reform the health service in England.

But there is growing controversy around software that records, transcribes and summarises patient conversations using AI.

In April, NHS England wrote to doctors to sell the benefits of AVT and set out minimum national standards.

However, in a letter seen by Sky News, NHS bosses wrote to doctors to warn that unapproved software that breached minimum standards could harm patients.

Image: NHS warning letter with highlighted sections

The 9 June letter, from the national chief clinical information officer of NHS England, said: "We are now aware of a number of AVT solutions which, despite being non-compliant ... are still being widely used in clinical practice.

"Several AVT suppliers are approaching NHS organisations ... many of these vendors have not complied with basic NHS governance standards.

"Proceeding with non-compliant solutions risks clinical safety, data protection breaches, financial exposure, and fragmentation of broader NHS digital strategy."

Sky News has previously revealed the danger of AI "hallucinations", where the technology invents answers and presents them as fact - something that could prove dangerous in a healthcare setting.

Video: Is ChatGPT reliable despite its 'hallucinations'?

NHS England sets minimum standards but does not tell NHS trusts and healthcare providers which software providers to use.

Sky News can now reveal there is growing pressure on NHS England and similar bodies to be more proactive.

Dr David Wrigley, deputy chair of the British Medical Association's GP committee, said: "Undoubtedly, as a GP myself and my 35,000 colleagues, we've got responsibilities here - but in such a rapidly developing market when we haven't got the technical knowledge to look into this.

"We need that help and support from those who can check that the products are safe, check they're secure, that they're suitable for use in the consulting room, and NHS England should do that and help and support us."

Dr Wrigley continued: "We're absolutely in favour of tech and in favour of taking that forward to help NHS patients, help my colleagues in their surgeries.

"But it's got to be done in a safe and secure way because otherwise we could have a free for all - and then data could be lost, it could be leaking out, and that just isn't acceptable.

"So we are not dinosaurs, we're very pro-AI, but it has to be a safe, secure way."

Image: The head of the NHS Confederation says the letter is 'a really significant moment'

The spectre of dozens of little-known but ambitious AI companies lobbying hospitals and surgeries to get their listening products installed worries some healthcare professionals.

There are huge profits to be made in this technological arms race, but the question being asked is whether hundreds of different NHS organisations can really be expected to sift out the sharks.

Matthew Taylor, chief executive of the NHS Confederation, said the letter was "a really significant moment".

He said it was right for the NHS to experiment, but that it needed to be clearer what technology does and does not work safely.

"My own view is that the government should help in terms of the procurement decisions that trusts make and should advise on which AI systems - as we do with other forms of technology that we use in medicine - which ones are safe," Mr Taylor said.

"We'll need [government] to do a bit more to guide the NHS in the best way to use this."

When pressed on whether, in the short term, that could be quite dangerous, Mr Taylor replied: "What you've seen with ambient voice technology is that kind of 'let a thousand flowers bloom' approach has got its limits."

Video: Godfather of AI warns of its dangers

Earlier this year, the health secretary appeared to suggest unapproved technology was being used - but celebrated it as a sign doctors were enthusiastic for change.

Mr Streeting said: "I've heard anecdotally down the pub, genuinely down the pub, that some clinicians are getting ahead of the game and are already using ambient AI to kind of record notes and things, even where their practice or their trust haven't yet caught up with them.

"Now, lots of issues there - not encouraging it - but it does tell me that contrary to this, 'Oh, people don't want to change, staff are very happy and they are really resistant to change', it's the opposite. People are crying out for this stuff."

Read more from Sky News:
National investigation launched into maternity services
Every baby in the UK to receive DNA testing

Image: GP Anil Mehta says the AI software helps cut paperwork and patients are 'extremely reassured'

Doctors who use AI that complies with national standards already say there are big benefits.

Anil Mehta, a doctor in the health secretary's Ilford constituency, told Sky News he backed his MP's drive for more AI technology in healthcare.

"I spend 30% of my week doing paperwork," he said. "So I think once I've explained all of those features of what we're doing, patients are extremely reassured. And I haven't faced anybody that's not wanted to have me do this.

He added: "(I) think that consultation with your doctor is extremely confidential, so that's not changed at all.

"That remains confidential - so whether it's a vulnerable adult, a vulnerable child, teenager, young child with a parent, I think the concept of that confidentiality remains."

An NHS spokesperson said: "Ambient Voice Technology has the potential to transform care and improve efficiency and in April, the NHS issued guidance to support its use in a safe and secure way.

"We are working with NHS organisations and suppliers to ensure that all Ambient Voice Technology products used across the health service continue to be compliant with NHS standards on clinical safety and data security."
