Project Nightingale: Google denies using private health data for its AI research


Google has denied using private health data for its own AI research following The Wall Street Journal’s exposé of Project Nightingale.

Project Nightingale is Google’s codename for its partnership with Ascension, the second-largest health system in the US. The project gives Google access to the health information of around 50 million American patients.

Ascension reportedly did not tell doctors and patients that it was sharing data – including the names, histories, dates of birth, diagnoses, lab results, and hospitalisation records of patients – with Google.

For training AI systems, such a large amount of real patient data is a goldmine. However, if the reports are true, the arrangement also raises serious ethical questions, especially given that none of the patients gave explicit permission for their data to be used in this way.

The story has caught the attention of lawmakers and a federal inquiry into how the data is being used has been launched. According to WSJ, the Office for Civil Rights in the Department of Health and Human Services “will seek to learn more information about this mass collection of individuals’ medical records to ensure that HIPAA protections were fully implemented.”

Google is developing a system for Ascension to predict the outcomes and risks of certain procedures and medications. Images such as MRIs could be uploaded to a network accessible to both Ascension and Google staff.

The potential benefits of such a predictive system are clear and potentially lifesaving, but privacy must be respected. Health data is deeply personal, and it has been used to discriminate against some communities in the past.

Eric Silverberg, the founder of Perry Street Software, the largest LGBTQ-owned-and-operated software firm, said:

“As a leader of a community that has faced health discrimination in the past, we are deeply concerned by reports that Google is using its platform monopoly to surreptitiously aggregate health information on users without explicit consent.

It is easy to imagine how HIV and STD status could be used to deny health coverage to LGBTQ Americans, as has been the case in years past.

Google should halt this program immediately until it has been fully explained to regulators, policymakers and users. It is because of reckless data use decisions like this that Perry Street Software, publishers of two of the largest gay dating apps in the world, severed its advertising business relationship with Google in 2018.”

Google maintains Project Nightingale is above board and complies with all regulations. The company says it is using the data to help build an AI-powered system for Ascension, not to train its own systems. Furthermore, Google says it is not combining patient data for use across its other healthcare partners.

In an FAQ, Google wrote: “We are building tools that a single customer (e.g., a hospital or primary care group) can use with their own patients’ data. The data is siloed, access controlled, and auditable. We do not combine data across partners, and we would not be allowed to under our agreements or the law.”

A Google spokesperson said the company is “happy to cooperate” with the federal investigation and believes its “work with Ascension adheres to industry-wide regulations (including HIPAA) regarding patient data and comes with strict guidance on data privacy, security, and usage.”

