These days, big news stories in healthcare are just as likely to come from large technology companies as from major healthcare institutions. Case in point: In early 2018, Amazon created a huge stir in the industry by announcing a venture with JPMorgan and Berkshire Hathaway that promised to go after the “hungry tapeworm” that is the cost of healthcare in the U.S.
And now Amazon has a new headline…
Amazon has announced that a version of its virtual assistant technology, Alexa, is now HIPAA-eligible. This means it’s available for applications subject to the Health Insurance Portability and Accountability Act of 1996 (HIPAA), the U.S. law that establishes requirements for protecting the privacy and security of healthcare information. The new HIPAA-eligible offering – more specifically, the Alexa Skills Kit – is available to a limited number of developers by invitation only.
While Amazon’s latest announcement may not seem quite as dramatic or sweeping as the aforementioned tapeworm remedy, it is, in fact, very big news for both Amazon and the U.S. healthcare industry.
Since it was first rolled out more than four years ago, Alexa has generated significant interest for its potential as a virtual healthcare assistant.
While other digital healthcare technologies have long promised to address key problems in healthcare, many of these have been plagued with problems and bottlenecks when it comes to patient engagement – particularly among elderly individuals and others who, for one reason or another, can’t or won’t use traditional digital devices like PCs, tablets, or smartphones.
The idea of a smart, always-available, hands-free, voice-powered virtual assistant that can answer health-related questions, deliver medication reminders, facilitate communication with one’s doctor, provide health coaching, and more, has more than piqued the interest of the healthcare community – and Amazon has responded.
Up until now, use of Alexa for healthcare has been limited to question-answering services – voice apps, or “skills” in Alexa parlance – that can answer general questions about health conditions, treatments, symptoms, and so on. If you own an Amazon Echo, you may have tried one of these skills, like Answers by Cigna or one of the many symptom-checker skills available in the Alexa marketplace.
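For context, a skill is essentially a cloud-hosted handler that receives the user’s spoken request (resolved by Alexa into an intent and slots) and returns a spoken response. Below is a minimal sketch of a question-answering handler using the Alexa Skills Kit SDK for Python; the intent name, slot name, and canned answer are hypothetical placeholders, not taken from any real skill.

```python
# Minimal sketch of a question-answering Alexa skill handler (ASK SDK for Python).
# "GeneralHealthQuestionIntent", the "topic" slot, and the canned answer are
# hypothetical placeholders for illustration only.
from ask_sdk_core.skill_builder import SkillBuilder
from ask_sdk_core.dispatch_components import AbstractRequestHandler
from ask_sdk_core.handler_input import HandlerInput
from ask_sdk_core.utils import is_intent_name
from ask_sdk_model import Response

sb = SkillBuilder()


class GeneralHealthQuestionHandler(AbstractRequestHandler):
    """Answers a general, non-personalized health question (no PHI involved)."""

    def can_handle(self, handler_input: HandlerInput) -> bool:
        return is_intent_name("GeneralHealthQuestionIntent")(handler_input)

    def handle(self, handler_input: HandlerInput) -> Response:
        # The "topic" slot would carry the condition or symptom the user asked about.
        slots = handler_input.request_envelope.request.intent.slots
        topic = slots["topic"].value if slots and "topic" in slots else "that topic"
        speech = (
            f"Here is some general information about {topic}. "
            "For advice specific to you, please talk to your doctor."
        )
        return handler_input.response_builder.speak(speech).response


sb.add_request_handler(GeneralHealthQuestionHandler())

# Entry point when the skill's back end is hosted on AWS Lambda.
lambda_handler = sb.lambda_handler()
```

Because nothing in this exchange identifies the user or touches their health record, a skill like this could already be published without the new HIPAA-eligible program.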
The big change is that Alexa can now be used in certain applications that collect and transmit Protected Health Information, or PHI.
PHI is any health information about an individual related to things like treatments, diagnoses, test results, or prescriptions. ePHI is PHI that is created, stored, or transmitted electronically. Personally Identifiable Information, or PII, is any information that can be used to identify an individual. HIPAA identifies 18 specific identifiers, which include obvious information like names, Social Security numbers, and phone numbers, as well as biometric identifiers like fingerprints and, relevant to voice assistants, voiceprints.
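To make the distinction concrete, here is a toy Python sketch that strips a few identifier and health fields from a hypothetical member record before it is logged. The field names and the short identifier list are illustrative only; this is not the full set of 18 HIPAA identifiers and not a compliance mechanism.

```python
# Illustrative only: a toy redaction pass over a hypothetical member record.
# The field names are examples; real de-identification is far more involved.
PII_FIELDS = {"name", "phone", "ssn", "voice_print_id"}   # direct identifiers
PHI_FIELDS = {"diagnosis", "prescription", "lab_result"}  # health information


def redact_for_logging(record: dict) -> dict:
    """Return a copy of the record that is safer to write to application logs."""
    redacted = {}
    for key, value in record.items():
        if key in PII_FIELDS or key in PHI_FIELDS:
            redacted[key] = "[REDACTED]"
        else:
            redacted[key] = value
    return redacted


print(redact_for_logging({
    "member_id": "A-1001",   # account identifiers can also be identifying in practice
    "name": "Jane Example",
    "phone": "555-0100",
    "diagnosis": "hypertension",
}))
```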
To support HIPAA, Amazon likely made a few changes to the way it handles data collected by Alexa. Importantly, Amazon, as a “business associate” in HIPAA parlance, is now also able to execute Business Associate Agreements (BAAs) with HIPAA covered entities, like hospitals, as well as with other business associates, to guarantee protection of data in compliance with HIPAA guidelines. A BAA is the contract required of any organization (a “business associate”) providing a HIPAA-eligible service to a covered entity. Amazon already offers BAAs for other services, like Amazon Web Services, but Alexa had been a holdout.
To application developers, Amazon Alexa and other virtual assistant platforms are frameworks for creating human-like, voice-driven conversational interfaces. Theoretically, any healthcare application that has a digital user interface (web, mobile, kiosk, etc.) – whether for a patient, clinician, or caregiver – could benefit from a voice-powered virtual assistant. Amazon’s announcement is limited to the Alexa Skills Kit, however, so it applies to consumer-facing Alexa skills used at home on Alexa-powered devices like the Amazon Echo. The early adopters described in Amazon’s announcement, including Cigna, Express Scripts, and Boston Children’s Hospital, demonstrate this.
Beyond these early examples, there are other applications of Alexa for healthcare that bridge the gap between consumer and clinical settings.
Amazon’s announcement is very promising for the U.S. healthcare industry. If you work for a provider, payer, pharmaceutical firm, or healthcare solution vendor and you carry any responsibility for patient experience, this news from Amazon affects you. Clinicians and patients alike have been using hands-free voice interfaces in their homes and demanding the same usability for clinical care applications, so it has seemed inevitable that voice assistants like Alexa and smart speakers like the Amazon Echo would find their way into clinical applications.
Amazon’s announcement confirms this, but it’s important that you understand everything you can about the what, why, and how of making voice-first healthcare applications secure.
Healthcare will see continued deployment of consumer-facing skills that do not require HIPAA’s rigor, as well as a morphing of these customer-service-centric offerings into more clinical use cases where HIPAA will be mandated. Consider a basic Q&A member-benefits voice skill offered by a health plan: a much richer and more personalized experience can be created by adding PHI to the mix, as the sketch below illustrates. The lines will blur in this regard as value-based care brings new shared-risk revenue models to the various sectors within healthcare.
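As a sketch of that progression, the handler below (again using the ASK SDK for Python) answers generically when no account-linking token is present and personalizes the answer when one is. The intent name and the lookup_member_benefits helper are hypothetical, and the account-linking flow itself is simplified; this only illustrates how PHI changes the experience.

```python
# Hedged sketch: the same "check my benefits" intent, with and without PHI.
# "CheckBenefitsIntent" and lookup_member_benefits() are hypothetical names.
from ask_sdk_core.dispatch_components import AbstractRequestHandler
from ask_sdk_core.handler_input import HandlerInput
from ask_sdk_core.utils import is_intent_name
from ask_sdk_model import Response


def lookup_member_benefits(access_token: str) -> str:
    """Hypothetical call into the health plan's (HIPAA-covered) member API."""
    return "Your plan covers 20 physical therapy visits per year."


class CheckBenefitsHandler(AbstractRequestHandler):
    def can_handle(self, handler_input: HandlerInput) -> bool:
        return is_intent_name("CheckBenefitsIntent")(handler_input)

    def handle(self, handler_input: HandlerInput) -> Response:
        # An account-linking token arrives with the request once the user has
        # linked their member account; without it, stay generic (no PHI).
        user = handler_input.request_envelope.context.system.user
        if user.access_token:
            speech = lookup_member_benefits(user.access_token)
        else:
            speech = (
                "Most plans cover an annual wellness visit. "
                "Link your member account in the Alexa app to hear details about your plan."
            )
        return handler_input.response_builder.speak(speech).response
```

The generic branch is an ordinary consumer skill; the personalized branch pulls the skill into HIPAA territory and requires the protections described above, including a BAA.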
HIPAA is just the start. There is no formal certification process for HIPAA, and it applies only in the U.S. healthcare market. Also, many IT departments at healthcare organizations use standards from other industries or maintain their own standards for data privacy and security. In their eyes, fully securing a voice application may go well beyond ensuring that a service provider will sign a BAA. Issues like user authentication, data privacy in shared spaces, eavesdropping, network and device hacking, and secure system integration (e.g., with an EHR) all remain concerns that must be addressed. These enterprise-grade considerations are critical from a planning perspective, not only for HIPAA compliance but well beyond it.
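As one illustration of the user-authentication concern, the sketch below gates spoken PHI behind a PIN confirmation tracked in session state. This is a simplified pattern for discussion only, not a recommended or complete authentication scheme, and the function names are hypothetical.

```python
# Simplified pattern only: require a verified spoken PIN (tracked in session
# attributes) before any handler reads PHI aloud. Real deployments would layer
# on stronger authentication, voice profiles, and audit logging.
def is_session_verified(handler_input) -> bool:
    """True once the user has confirmed a PIN earlier in this session."""
    attrs = handler_input.attributes_manager.session_attributes
    return attrs.get("pin_verified", False)


def mark_session_verified(handler_input) -> None:
    """Called by a (not shown) PIN-confirmation handler after a correct PIN."""
    handler_input.attributes_manager.session_attributes["pin_verified"] = True


def speak_phi_or_challenge(handler_input, phi_speech: str):
    """Return PHI only to a verified session; otherwise ask for the PIN."""
    builder = handler_input.response_builder
    if is_session_verified(handler_input):
        return builder.speak(phi_speech).response
    return builder.speak(
        "Before I share that, please say your four digit voice PIN."
    ).ask("What is your voice PIN?").response
```

Even with a gate like this, concerns such as eavesdropping in shared spaces and secure EHR integration still need to be designed for separately.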
If you’re considering a secure, voice-powered virtual assistant for a clinical application, contact Orbita. We are the healthcare leader in conversational AI for enterprise voice- and chatbot-powered virtual assistants and can help you navigate these new waters.
Click here to request a complimentary “Amazon Alexa, HIPAA, and You” consultation with Orbita.
During this one-hour call, an Orbita voice security expert will help you understand the implications of Amazon’s announcement for voice solutions in your organization and help you identify opportunities to tap the potential of transformative, secure voice-first applications.