AI in medicine, particularly in pediatric medicine, holds great promise for taking scarce human expertise and making it available throughout rural America and the rest of the world. Rwanda, for example, has only one pediatric cardiologist in the entire country.
In 2015, neural networks produced algorithms that were better than humans at image recognition, an achievement that signaled the beginning of this renaissance in AI. But as the chart above, courtesy of Jeff Dean, head of Google Brain, shows, the only way to get increasing degrees of accuracy is to have more and more data. Any of you in major metro areas will see Waymo vans driving around collecting more and more data to feed autonomous-driving software development.
While advances in image recognition technology are happening on the consumer side, we haven't seen the same sea change in medicine. One of the most often-quoted barriers is PRIVACY.
This post is largely based on a talk Mirena Taskova gave at the 2021 HCCA Research Compliance Conference. Mirena was one of my cloud computing students at Stanford. She's a European privacy lawyer with over 14 years of experience in both Europe and the United States, who took the class while completing her US law degree at Stanford Law School.
There are many misconceptions that people in tech, medical research, and clinical practice have about privacy, and I asked her to help educate us. We took the approach of identifying seven myths. Go on and take the quiz. Don't worry; I missed a few the first time.
1. HIPAA and GDPR privacy requirements will impede the progress of AI (T or F)
First, if the personal data used for AI are anonymized, the GDPR does NOT apply.
Second, the GDPR applies to the processing (e.g. access, use, disclosure, erasure) of personal data. It provides the framework for the development of AI applications that are respectful of individuals. The GDPR allows automated decision-making where there is a justification for it (e.g. contract, explicit consent, by law) and safeguards for the individuals concerned are taken into consideration.
2. An application company has an agreement with a hospital to allow patients to download and store health records. The application company now owns these health records (T or F)
The application company is using the health records on behalf of the hospital and under its instructions. The application company does NOT own the health records, and it cannot freely use or disclose them to third parties. Instead, the application company should follow the instructions of the hospital and the personal data processing agreement / business associate agreement concluded with the hospital (if applicable).
3. The application company is based in the United States so the GDPR does not apply (T or F)
Non-EU companies must comply with the GDPR too if they process personal data of people located in the EU and offer them goods/services in the EU market or monitor their behavior in the EU. The GDPR also applies if the application company has an establishment in the EU and processes personal data in the context of this establishment, regardless of whether the processing takes place in the EU or not.
4. The application company receives and stores MRI images electronically from the hospital. The MRI images are not PHI / Personal Data. (T or F)
The MRI images contain information that can identify an individual. Therefore, the MRI images likely fall within the scope of the definition of "personal data" as envisaged by the GDPR (provided the GDPR applies). The MRI images contain personal data related to health. Therefore (if the GDPR applies), they will be classified as sensitive personal data and will receive special protection under the GDPR. Considering the MRI images are received by a healthcare provider (i.e. a hospital), relate to the individual's present physical health, and can identify the individual, they will likely be considered PHI under HIPAA (provided HIPAA applies).
5. The application company replaced the name of patients with identification codes so they have de-identified/anonymized the data. (T or F)
Often, data that appears de-identified can be re-identified, or it has merely been pseudonymized rather than anonymized. Make sure that the data is indeed de-identified/anonymized. For example, disclosure of a code or other means of record identification designed to enable coded or otherwise de-identified information to be re-identified is itself considered a disclosure of PHI / Personal Data.
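The distinction matters in practice: replacing names with codes while retaining a lookup table is pseudonymization, not anonymization, because the mapping still allows re-identification. A minimal Python sketch (all records, field names, and code formats here are hypothetical, for illustration only):

```python
# Hypothetical patient records; every name and value is made up.
records = [
    {"name": "Alice Smith", "zip": "94305", "diagnosis": "asthma"},
    {"name": "Bob Jones", "zip": "94110", "diagnosis": "diabetes"},
]

# Pseudonymization: names are replaced with codes, but a key table
# is retained, so re-identification remains possible.
key_table = {}
pseudonymized = []
for i, rec in enumerate(records):
    code = f"P{i:04d}"
    key_table[code] = rec["name"]  # the re-identification key
    pseudonymized.append(
        {"id": code, "zip": rec["zip"], "diagnosis": rec["diagnosis"]}
    )

# Anyone holding the key table can trivially re-identify a record,
# which is why pseudonymized data is still personal data under the GDPR.
assert key_table["P0000"] == "Alice Smith"

# Toward anonymization: drop the codes and the key table entirely, and
# generalize quasi-identifiers (here, truncating ZIP codes) so that
# individuals cannot be singled out from the remaining fields.
anonymized = [
    {"zip": rec["zip"][:3] + "**", "diagnosis": rec["diagnosis"]}
    for rec in pseudonymized
]
print(anonymized[0])  # {'zip': '943**', 'diagnosis': 'asthma'}
```

Note that even the last step is only a sketch: true anonymization requires assessing whether the remaining fields, in combination, could still single out an individual.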
6. Consent. The application company wishes to collaborate with a hospital to introduce a new innovative infrastructure that will lead to significant research progress based on MRI image analysis. The management states that HIPAA/GDPR prohibits the hospital from sharing PHI / Personal Data without the consent of the patients. (T or F)
GDPR: Under the GDPR, "consent" can be used as a legal basis for the processing of personal data, but it is NOT the only one. Other legal bases are available, such as "legitimate interests" and, in the case of health-related data, "if necessary for archiving purposes in the public interest, scientific or historical research purposes or statistical purposes in accordance with safeguards" or "provision of health care or treatment or management of health care systems and services on the basis of Member State law or pursuant to a contract with a health professional".
HIPAA: Under HIPAA, "consent" can be used as a legal basis for processing PHI, but it is NOT the only one. Other legal bases are available, such as "Documented Institutional Review Board (IRB) or Privacy Board Approval" and "Limited Data Sets with a Data Use Agreement".
7. A patient refused the Notice of Privacy Practices; therefore the hospital cannot provide healthcare services (T or F)
GDPR: The hospital must provide the patient with a Privacy Notice. However, the Notice has an informative purpose, and its receipt does not require consent. The hospital can provide the healthcare service on the basis of a relevant legal ground, which may differ from consent.
HIPAA: Healthcare providers must provide patients with a Notice of Privacy Practices, but patients do not have to accept the document, read it, or sign it. A healthcare provider must provide the Notice and make a good-faith effort to obtain a signature acknowledging that the patient has received it. Treatment can be provided even if a signature is not obtained.
So how did you do?
If you're fortunate and live in a major metro area, you have access to the best and brightest minds in medicine. But if you're in rural America, or on the continent of Africa, AI holds the promise of bringing equity to healthcare across the planet. Don't let privacy stop progress.