Tangentially related, but a number of telehealth operations with hospitals/therapists/etc... use Zoom -- I suspect because their clients can connect without an app or an account over a browser.
When you join a Zoom session over the browser, you don't sign a TOS. And I assume that actual licensed medical establishments are under their own TOS provisions that are compatible with HIPAA requirements. Training on voice-to-text transcription, etc... would be a pretty huge privacy violation, particularly in the scope of services like therapy: both because there are demonstrable attacks on AIs to get training data out of them, and because presumably that data would then be accessible to employees/contractors who were validating that it was fit for training.
Out of curiosity, has anyone using telehealth checked with their doctor/therapist to see what Zoom's privacy policies are for them?
The law doesn't protect it. HIPAA doesn't apply in that setting.
Attorney client privilege is an interesting case.
"Privacy issues" is a meaningless phrase to me when divorced from the law. Do you mean, like, ethically concerning? This term in the contract is neither uncommon nor illegal.
I know that many smaller therapists use Zoom for exactly the reasons you mentioned above - ease of use. They often don't have the technical know-how to assess the technology they're using.
The UK, for example, has hundreds of private mental health practitioners (therapists, psychologists, etc.) who provide their services directly to clients. They almost universally use off-the-shelf technology for video calling, messaging, and reporting.
IANAL, but I did health tech for 10 years and had my fair share of interactions with lawyers asking questions about stuff I built.
HIPAA applies to the provider. Patients have no responsibility to ensure the tech used by their care provider is secure or that their medical records don't wind up on Twitter. HIPAA puts that burden on the care provider, with both civil and sometimes criminal liability for providers who don't go to great lengths here.
In practice, this means lawyers working with the care providers have companies sign legal contracts ensuring the business associate is in compliance with HIPAA and follows all of the same rules (search: HIPAA BAA).
Additionally, you can be in compliance with HIPAA and still fax someone's medical records.
Healthcare professionals still use fax precisely because of this.
Analog line fax is HIPAA compliant because the transmission is not "stored".
Using a cloud fax provider will immediately put you out of compliance for this reason, unless you use a HIPAA-compliant cloud fax service, which is rare.
I don’t think the question is about Zoom’s safeguards which are audited, and as you say almost certainly stronger than HIPAA requirements, but rather whether they can use the stored PHI for product development where the law appears ambiguous.
Imo the law basically says you can do this with PHI:
- De-identify it, then do whatever you want with it (see the sketch after this list)
- Use it to provide some service for the covered entity, but not for anyone else
- Enter a special research contract if you want to use it slightly de-identified for some other specific purpose
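To make the first option concrete: de-identification under HIPAA's Safe Harbor method means removing 18 enumerated identifier categories (names, dates except year, phone numbers, SSNs, MRNs, and so on). Purely as an illustration of the idea, and not anything from Zoom's pipeline or an HHS-approved tool, here's a toy Python scrubber over a transcript; the PATTERNS table and scrub helper are entirely my own made-up examples:

    import re

    # Toy patterns for a few structured identifiers. Real Safe Harbor
    # de-identification covers 18 categories, and regexes alone can't
    # reliably catch free-text names or addresses.
    PATTERNS = {
        "PHONE": re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b"),
        "SSN":   re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
        "DATE":  re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"),
        "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    }

    def scrub(transcript: str) -> str:
        """Replace obviously structured identifiers with placeholder tags."""
        for tag, pattern in PATTERNS.items():
            transcript = pattern.sub("[" + tag + "]", transcript)
        return transcript

    print(scrub("Call me at 650-555-0199 about the visit on 3/14/2023."))
    # -> Call me at [PHONE] about the visit on [DATE].

Note that something like this still leaves free-text names and quasi-identifiers behind, which is why reidentification risk comes up further down the thread.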
One note: the act of deidentification itself requires accessing PHI when done retroactively. This may be institutional policy or specific to covered entities, but per the privacy office lawyers, such access (apart from a limited data set) requires a permitted use before the data can be de-identified and then used freely.
As with all things HIPAA, this only becomes a problem when HHS starts looking and I’m sure in practice many people ignore this tidbit (if in fact this is the law and not Stanford policy).
Related to this, anyone know if Zoom has a separate offering for education (universities, schools, etc)? I teach at a university, and not only do we use Zoom for lectures etc, but also for office hours, meetings, etc, where potentially sensitive student information may be discussed. I'm probably not searching for the right thing; all I found was this: https://explore.zoom.us/docs/doc/FERPA%20Guide.pdf
(FERPA is to higher ed in the US what HIPAA is to healthcare.)
"Vanity-URLs" is just a feature, usually a requirement for SSO. I cannot see that that would cause any different treatment of data related to your use.
IANAL but “Zoom for Healthcare” is a business associate under HIPAA and treated as an extension of the provider with some added restrictions.
Covered entities (including the EMR and hospital itself) can use protected health information for quality improvement without patient consent and deidentified data freely.
Where this gets messy is that deidentification isn’t always perfect even if you think you’re doing it right (especially if via software) and reidentification risk is a real problem.
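To make the reidentification risk concrete: the classic failure mode is quasi-identifiers surviving de-identification and then being joined against a public dataset (in the spirit of Sweeney's ZIP + birth date + sex result). A toy sketch with entirely hypothetical, made-up records:

    # Hypothetical: a "de-identified" clinical export that kept quasi-identifiers.
    deid_rows = [
        {"zip": "94305", "dob": "1980-01-02", "sex": "F", "dx": "F41.1"},
    ]
    # Hypothetical public dataset with names attached, e.g. a voter roll.
    voter_roll = [
        {"name": "Jane Doe", "zip": "94305", "dob": "1980-01-02", "sex": "F"},
    ]

    QUASI = ("zip", "dob", "sex")
    for row in deid_rows:
        hits = [v for v in voter_roll
                if all(v[k] == row[k] for k in QUASI)]
        if len(hits) == 1:  # a unique join reidentifies the "anonymous" record
            print(hits[0]["name"], "->", row["dx"])

Safe Harbor addresses some of this (ZIP truncation, year-only dates), but a software pipeline that misses one field reintroduces exactly this risk.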
To my understanding business associates can train on deidentified transcripts all they want as the contracts generally limit use to what a covered entity would be allowed to do (I haven’t seen Zoom’s). I know that most health AI companies from chatbots to image analysis do this. Now if their model leaks data that’s subsequently reidentified this is a big problem.
Most institutions therefore have policies more stringent than HIPAA and treat software-deidentified data as PHI. Stanford, for example, won't allow disclosure of models trained on deidentified patient data, including on credentialed-access sources like PhysioNet, unless each sample was manually verified, which isn't feasible at the scale required for DL.
“Limitations on Use and Disclosure. Zoom shall not Use and/or Disclose the Protected Health Information except as otherwise limited in this Agreement or by application of 42 C.F.R. Part 2 with respect to Part 2 Patient Identifying Information, for the proper management and administration of Zoom…”
“Management, Administration, and Legal Responsibilities. Except as otherwise limited in this BAA, Zoom may Use and Disclose Protected Health Information for the proper management and administration of Zoom…”
Not sure if “proper management and administration” has a specific legal definition or would include product development.
“But how should a business associate interpret these rules when effective management of its business requires data mining? What if data mining of customer data is necessary in order to develop the next iteration of the business associate’s product or service? … These uses of big data are not strictly necessary in order for the business associate to provide the contracted service to a HIPAA-covered entity, but they may very well be critical to management and administration of the business associate’s enterprise and providing value to customers through improved products and services.
In the absence of interpretive guidance from the OCR on the meaning of ‘management and administration’, a business associate must rely almost entirely on the plain meaning of those terms, which are open to interpretation.”
Haha wow this is a great post. I am a lawyer and you may have solved a problem I recently encountered. So you think this is saying that generic language in the Zoom BAA constitutes permission to de-identify?
Are there examples of healthcare ai chatbots trained on de-id data btw? If you're familiar would love to see.
> Haha wow this is a great post. I am a lawyer and you may have solved a problem I recently encountered. So you think this is saying that generic language in the Zoom BAA constitutes permission to de-identify?
Not that I’m an expert on the nuance here, but I think it gives them permission to use PHI, especially if spun in the correct way, which then gives them permission to de-identify it and do whatever they want with it.
My experience has been that it’s pretty easy to spin something into QI.
> Are there examples of healthcare ai chatbots trained on de-id data btw? If you're familiar would love to see.
https://loyalhealth.com/ is one I’ve recently heard of that trains on de-id’d PHI from customers.
> What's your line of work out of curiosity?
Previously founded a health tech startup and now working primarily as a clinician and researcher (NLP) with some side work advising startups and VCs.
Happy to help. Let me know where to send the invoice for my non-legal legal expertise, if your rate is anything like my startup's lawyer you'll find me a bargain! Haha.
Forgive me for being pedantic but this is like nails on a chalkboard to me.
HIPAA is the correct abbreviation of the Health Insurance Portability and Accountability Act, which, as an aside, doesn't necessarily preclude someone from training on patient data.
HIPPA is the unnecessarily capitalized spelling of a (quite adorable) crustacean found in the Indo-Pacific and consumed in an Indonesian delicacy known as yutuk.