When the tie-up between Google’s DeepMind and London’s Royal Free NHS Trust was announced in 2016, it was praised as the sort of forward-looking innovation the NHS badly needed.
The collaboration across three big London hospitals would see DeepMind build a smartphone app called Streams that could detect sudden deterioration in kidney patients.
But within weeks a wrinkle emerged – DeepMind had been given access to 1.6m patient records stretching back up to five years and taking in other medical conditions too.
This week a leaked letter from the National Data Guardian (NDG) health watchdog described this transfer of data as having been carried out on an “inappropriate legal basis” – a formal way of saying it shouldn’t have happened in the way it did.
The letter lays bare thorny issues, starting with the basis on which an NHS Trust can transfer data. If it’s for direct medical care – keeping patients alive or in good health – consent is implied.
Except that the data was transferred so DeepMind could test its technology on live data, which technically is not care provision. Presumably the company would respond that providing care requires testing its software on real data first, making the transfer a necessary stage in the process.
Fiona Caldicott of the NDG told Professor Stephen Powis, the Royal Free’s medical director:
My considered opinion therefore remains that it would not have been within the reasonable expectation of patients that their records would have been shared for this purpose.
A second potential issue not mentioned by Caldicott is the possibility of overreach. At what point did a project to build a kidney-monitoring app turn into one covering many other conditions, taking in HIV patients, people experiencing drug overdoses and those having abortions?
When the issue was queried last year, DeepMind’s answer was to point out that kidney data is not easily separated from other data, which, if true, underlines the complexity of untangling datasets for specific projects.
Royal Free responded that the transferred data was encrypted and handled securely. It also said:
As with all information-sharing agreements with non-NHS organisations, patients can opt out of any data-sharing system by contacting the trust’s data protection officer.
This assumes that patients are aware their data has been transferred in the first place. For the DeepMind project it is hard to see how that could be the case, since the project’s scope wasn’t clearly described in advance.
It’s tempting to blame the clumsy design of a challenging project breaking new ground, but there might be a simpler anxiety: Google itself.
Although DeepMind is UK-based and is said to act independently since its acquisition in 2014, its parent Google hovers in the background. That brings concerns over the company’s ambitions in working with an NHS already struggling with a morass of organisational, funding and security problems.
At heart, it’s a trust issue. Patients need to trust the NHS and the companies it works with, especially when they’re massive US organisations with businesses to build.
Britain’s Information Commissioner’s Office (ICO) will soon publish its report on whether the data transfer to DeepMind was legal under the Data Protection Act (DPA). When it does, people on all sides of this tangled story will be paying close attention.