UK watchdog close to verdict in DeepMind Health data consent probe

The UK's data protection watchdog has said it's close to concluding a 10-month+ investigation into consent complaints pertaining to a patient data-sharing agreement inked between Google-owned DeepMind and the Royal Free NHS Trust, which operates three hospitals in London.

The ICO began its probe in May last year after details emerged, via a FOI request made by New Scientist, of the large quantity and scope of patient identifiable data being shared with DeepMind by the Trust. The arrangement, inked in fall 2015, had been publicly announced in February 2016, but details of which and how much patient data was involved were not shared.

Contacted for an update on the investigation today, an ICO spokesperson told TechCrunch: "Our investigation into the sharing of patient information between the Royal Free NHS Trust and DeepMind is close to conclusion."

Under the DeepMind-Royal Free arrangement, the Google-owned AI company agreed to build an app wrapper for an NHS algorithm designed to alert to the risk of a person developing acute kidney injury.

Patient data for the Streams app was obtained without patient consent, with DeepMind and the Trust arguing it is unnecessary as the app is used for direct patient care, a position that has been challenged by critics and is being reviewed by regulators.

The ICO spokesperson added: "We continue to work with the National Data Guardian and have been in regular contact with the Royal Free and DeepMind who have provided information about the development of the Streams app. This has been subject to detailed review as part of our investigation. It's the responsibility of businesses and organisations to comply with data protection law."

The medical records of some 1.6 million individuals are thought to have been passed to DeepMind under the arrangement, although the data sharing is dynamic so there's no static figure. Data shared under the arrangement includes real-time inpatient data from the Trust's three hospitals across multiple departments, as well as historical inpatient data going back five years.

As TechCrunch reported last August, the UK's National Data Guardian (NDG) has also been reviewing how patient data was shared by the Trust. A spokeswoman confirmed today it is still liaising with the ICO in its investigation.

Legal experts continue to dispute DeepMind and the Royal Free's interpretation of NHS information governance guidelines, including in a new academic paper, published today in the journal Health and Technology and entitled "Google DeepMind and healthcare in an age of algorithms," which calls for more to be done to regulate data transfers from public bodies to private firms.

The study argues that inexcusable mistakes were made by the Royal Free and DeepMind, questioning the legal and ethical basis of Trust-wide data transfers, and criticizing the lack of transparency around the arrangement. The paper is authored by Dr Julia Powles, a research associate in law and computer science at the University of Cambridge, and Hal Hodson, a journalist with The Economist who obtained and published the original data-sharing agreement when working at New Scientist.

Neither DeepMind nor the Royal Free NHS Trust responded to our requests for comment on the study. But a spokesperson for the NDG told TechCrunch: "Our consideration of this matter has required a thorough approach in which the NDG and her panel have kept patients' rightful expectations of both good care and confidentiality at the forefront of discussions."

"The NDG has provided a view on this matter to assist the ICO's investigation and looks forward to this being concluded as soon as practicable," the spokesperson added.

The original data-sharing agreement between the Royal Free and DeepMind was superseded by a second deal, signed in November 2016, which continued the sharing of broadly similar data types but included a commitment by the pair to publish key agreements, and introduced what they described as an unprecedented level of data security and audit in a bid to win trust in the wake of controversy over the arrangement.

The pair have always said patient data used for Streams is not being used by DeepMind to train AI models, but a separate memorandum of understanding between them, dated January 2016, sets out broader ambitions for their partnership to start applying artificial intelligence to Trust-held medical data to seek to accelerate and enhance clinical outcomes. (Which, as we have noted before, introduces another set of consent considerations for accessing sensitive and valuable publicly funded data.)

The scope of the partnership between DeepMind and the Royal Free has also expanded since the first data-sharing arrangement, with the pair detailing plans last fall for the company to also build a data-sharing access infrastructure for the Trust, which will position DeepMind to facilitate/broker app developers' access to NHS patient data via an API in the future.

DeepMind has also said it intends to build a technical audit infrastructure to try to offer verifiable data access audits of how patient data is being used. Earlier this month, though, the company confirmed this infrastructure will not be in place in the near term, saying only that it hopes to have the first pieces of the centralized digital ledger implemented this year, and emphasizing the difficulties and challenges involved in building it.

Meanwhile, the Streams app has been rolled out to Royal Free hospitals, and patient identifiable data continues to flow to the Google-owned company, which is also now being paid by the Royal Free for its services. Commercial terms of the arrangement between DeepMind and the Trust have never been disclosed. Attempts to obtain the charging and invoicing details of the arrangement via FOI have been declined on commercial sensitivity grounds.

Read more: https://techcrunch.com/2017/03/16/uk-watchdog-close-to-verdict-in-deepmind-health-data-consent-probe/