Apparently, that answer is yes. According to Amazon, its virtual personal assistant, Alexa, can now transmit and handle protected health information ("PHI") in accordance with HIPAA. Amazon expects Alexa to handle various health care-related tasks, including scheduling urgent care appointments, checking health insurance benefits, and reading blood-sugar test results. To create these new services, Amazon collaborated with various companies, including Cigna and major hospitals. On the privacy front, Amazon and its partners embedded various safeguards into the new services, such as voice codes and requirements that users log in with passwords to existing health care-specific accounts.
As technology and health care continue to become more intertwined, I would not be surprised if Apple and Google follow Amazon's lead, rolling out similar products for Siri and Google Home. Additionally, the types of health care services offered through these virtual personal assistants, as well as our smartphones, will likely only grow in breadth. It no longer seems far-fetched that you might communicate and transmit data to your health care provider or pharmacy by talking into a speaker in the comfort of your home. The question becomes: what happens to all the information you are saying aloud? The answer will be service-dependent, but it is clear that Amazon, among other tech companies, will now be maintaining your electronic PHI.
This represents a seismic shift from the information maintained by your everyday fitness tracker, and it comes with an entirely new set of compliance responsibilities. Thus, while HIPAA is both scalable, depending on the scope of the covered entity, and flexible enough to adapt to new technologies, these tech companies may soon realize that HIPAA carries real compliance costs as well. In today's age of big data breaches, even a minor slip-up, for example leaking the usernames of a specific health care service through the tech provider's platform, could ultimately prove very costly.