Google DeepMind received 1.6 million identifiable personal medical records on an “inappropriate legal basis”, according to a letter written by Fiona Caldicott, the UK’s National Data Guardian, a government watchdog role charged with safeguarding patient confidentiality. The letter, obtained by Sky News, was sent to the medical director of the Royal Free NHS Trust in London on 20 February. The data-sharing agreement between the trust and DeepMind was first revealed by a New Scientist investigation last year.
Google’s AI firm originally obtained the NHS patient records to test a smartphone app called Streams, designed to help monitor people with kidney disease. Around a quarter of deaths from acute kidney injury are preventable if the condition is caught early, so DeepMind wanted to use its algorithms to spot the early warning signs. If successful, the app could save lives.
However, UK regulations specify that unless patients give explicit consent, their data can be shared only for the purposes of “direct care”. Caldicott concluded that DeepMind instead used the data “for the testing of the Streams application, and not for the provision of direct care to patients”.
Caldicott’s assessment is damning, as the National Data Guardian is the leading authority on protecting patient data. She has recently submitted evidence to the Information Commissioner’s Office (ICO), which is investigating whether the data transfer between the Royal Free NHS Trust and DeepMind was appropriate under the Data Protection Act.
The report from the ICO is expected soon and, if it sticks to Caldicott’s assessment, will side with New Scientist’s original conclusions that despite the huge potential benefits of DeepMind’s technology to patient care, consent was not obtained when it should have been.
“DeepMind obtained 1.6 million patients’ records, the vast majority of whom have no kidney condition,” says Phil Booth at MedConfidential, an organisation that campaigns for health data privacy. “Why do they have data for the whole hospital population, and how could that ever have been considered direct care? These are questions that must be asked.”
The ICO has the power to issue a fine of up to £500,000 for breaches of data protection law, which in this case would work out at just over 30p per patient. It could also issue an enforcement notice requiring DeepMind to delete the data or stop using it. “That’s not to say the things that DeepMind want to do couldn’t be ethical, lawful, consensual and transparent,” says Booth. “But at the moment they are failing on every condition.”
In a statement, a spokesperson for DeepMind says that the company recognises the need for more public engagement and discussion about new technology in the NHS. “We want to become one of the most transparent companies working in NHS IT.”