
Data Privacy and Security: FosterHealth AI

FosterHealth’s HIPAA compliant AI-powered scribe generates fact-checked clinical notes based on conversations between patients and physicians.


In many instances, physicians do not have sufficient time to review clinical notes and transfer them to their electronic health records at the end of each patient encounter; many prefer to handle EHR-related tasks in one go. To facilitate this workflow, we need a technological infrastructure that lets them store the essential information from each session and retrieve it at their convenience, on any device of their choosing. Because this data involves sensitive patient health information, that infrastructure must ensure data privacy and implement the right security measures to protect patient health data.



Robust technical safeguards: our triple-layered security features protect patient health information by ensuring that no one except the user with the correct log-in credentials can retrieve the data.


Our approach

Every healthcare professional must find our application trustworthy. They should feel comfortable and confident every time they use it to get help with documentation-related tasks. To achieve this design goal, we made three design choices:

  1. Store what’s needed, delete what’s not needed immediately

  2. Implement robust technical safeguards to protect patient health information

  3. Be a responsible AI partner and not use patient data to train our models

In this blog, we describe how our cloud storage infrastructure applies these principles and helps establish user trust.


Store what’s needed, delete what’s not needed immediately

To transfer clinical notes to EHRs with minimal cognitive burden, physicians need to retrieve the AI-generated clinical notes, the transcript and the ID. They do not need the audio data. Hence, our technology stack never writes audio data to disk: we delete it immediately after our AI models complete the transcription task, and we store only the transcripts, clinical notes and IDs specified by the physician in the cloud.
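A minimal sketch of this flow, with hypothetical helper names (`transcribe`, `generate_clinical_note`, `store_encrypted`) standing in for our pipeline: the audio lives only in memory and is discarded as soon as transcription completes.

```python
# Illustrative only: audio is processed in memory and never persisted;
# only the transcript, clinical note and physician-specified ID are stored.
def process_encounter(audio_bytes: bytes, encounter_id: str) -> None:
    transcript = transcribe(audio_bytes)       # hypothetical ASR call; audio stays in RAM
    del audio_bytes                            # drop the local reference to the audio immediately
    note = generate_clinical_note(transcript)  # hypothetical note-generation model
    store_encrypted({                          # hypothetical encrypted cloud write
        "id": encounter_id,
        "transcript": transcript,
        "clinical_note": note,
    })
```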


Triple-layered security: client-side encryption, encryption in transit, server-side encryption

Protected health data (transcripts, clinical notes) is encrypted before it leaves the physician’s device (laptop, mobile phone), so any protected health data that leaves the device is not human readable. The data is then transmitted to the cloud over a secure connection and encrypted in the transmission channel, which mitigates the risk of man-in-the-middle attacks. Finally, the data is encrypted again on the server side before it is written to disk. Hence, the data stays encrypted both in transit and at rest.
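As a rough illustration (not our exact implementation), the client-side layer can be sketched with the Python `cryptography` package’s Fernet cipher; the key handling and the note below are placeholders.

```python
from cryptography.fernet import Fernet

def encrypt_on_device(plaintext: str, device_key: bytes) -> bytes:
    """Encrypt protected health data before it leaves the physician's device."""
    return Fernet(device_key).encrypt(plaintext.encode("utf-8"))

device_key = Fernet.generate_key()  # placeholder; e.g. derived from the user's credentials
ciphertext = encrypt_on_device("example clinical note", device_key)
# Only the ciphertext travels over the network; HTTPS adds encryption in transit,
# and the server encrypts the payload again before writing it to disk.
```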


When the physician logs in to a device and retrieves their data, the data is transmitted back over a secure connection and stays encrypted. It is decrypted only on the physician’s device, ensuring that the user with the correct log-in credentials, and no one else, can retrieve the data.
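On the retrieval side, one way to tie decryption to the log-in credentials is to derive the key from the password with a key-derivation function (PBKDF2 here, purely for illustration), so only the physician’s device can turn the ciphertext back into readable text.

```python
import base64
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

def derive_device_key(password: str, salt: bytes) -> bytes:
    """Derive a symmetric key from the log-in credentials; the server never sees it."""
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32, salt=salt, iterations=600_000)
    return base64.urlsafe_b64encode(kdf.derive(password.encode("utf-8")))

def decrypt_on_device(ciphertext: bytes, password: str, salt: bytes) -> str:
    """Decrypt retrieved data locally, after it arrives over the secure connection."""
    return Fernet(derive_device_key(password, salt)).decrypt(ciphertext).decode("utf-8")
```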


A secure cloud platform: Google Cloud

We use Google Cloud infrastructure to store the encrypted data; it employs multiple technical and organisational controls to protect stored data. Google Cloud is audited annually against multiple standards, including SSAE 16 / ISAE 3402 Type II, ISO 27001, ISO 27017 and ISO 27018.


Additionally, we allow our enterprise customers to choose the physical cloud storage location where their encrypted data is kept.
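For example (bucket name and region are placeholders), the official google-cloud-storage client lets a bucket’s location be pinned at creation time, which keeps the encrypted data at rest inside the chosen region:

```python
from google.cloud import storage

client = storage.Client()
# Placeholder names: pin the bucket to a specific region at creation time.
bucket = client.create_bucket("fosterhealth-enterprise-notes", location="europe-west2")
print(bucket.location)  # encrypted data at rest stays in this region
```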


Responsible AI Partner: We do not use your data to train our models

At FosterHealth AI, we believe that user data should belong to the user; we should not use user data for model training without clear consent from the user. We made a conscious choice to train our models only on publicly available datasets. The technical, administrative and organisational controls we enforce ensure that no one at FosterHealth AI can access user data.


Our goal is to deliver state-of-the-art technology in a reliable and trustworthy manner. We are constantly talking to our users, collaborating with leading research institutes and healthcare experts, and continually improving our service. If you have any additional questions or want to partner with us, please contact us here.
