Cell phone pocket calls occur every day, with the first “butt dial” likely happening shortly after the very first person thought it was a good idea to stuff his or her phone into a back pocket and sit.
Contrast that all-too-common scenario with what recently happened with an Echo device. A husband and wife reported that their Echo, unbeknownst to them, had recorded one of their conversations and sent it to one of the husband's employees.
There are an estimated fifty million of these Amazon voice assistants in use today, and this is the first case of what can be described as the voice assistant equivalent of a butt dial. Why has it taken so many years and so many devices for this to happen?
A device like an Echo is, in theory, less likely to initiate an unintended communication than a mobile phone, which requires little more than the act of sitting or stuffing the phone into your pocket. For an errant communication to occur on an Echo device, a series of voice commands would be necessary. For the scenario above to unfold, for example, the following would need to transpire:

1. The Echo would have to hear a word in the background conversation that sounded like its wake word, “Alexa.”
2. The ensuing conversation would have to be interpreted as a “send message” request.
3. Alexa would ask aloud, “To whom?”, and the conversation would have to be interpreted as a name in the couple's contact list.
4. Alexa would then ask aloud, “[contact name], right?”
5. Finally, the background conversation would have to be interpreted as “right.”
That’s five steps requiring precise responses. Though I can’t say for certain, it is unlikely that you will hear another “voice assistant butt dial” story anytime soon.
Voice assistants are designed to interact only when spoken to
The question of whether or not voice assistants are listening to everything we say is not new. Nor is it wrong to wonder, given privacy concerns.
A familiar scenario offers a good analogy for how a voice assistant becomes aware that you are speaking to it. Imagine you are at a crowded party, conversing with someone in one room. In another room, a group of people is talking. You can hear the chatter but aren't paying attention. Suddenly, your name comes up in their conversation, and your ears perk up. Naturally, you start listening to what's being said (possibly to the chagrin of the person with whom you were speaking).
That is how voice assistants work. They only “perk up” when their wake word—in Alexa’s case, her name—is uttered.
You might ask: if the device is listening for the wake word, doesn’t that mean it is listening all the time? Yes and no. These devices use two different modes to interact with a user. The first mode relies on a chip that is always “on” but listens only for the wake word. This chip contains no memory. Its only purpose is to detect the word and wake up the second mode.
The second mode is what listens to, engages with, and responds to the user. It is designed to filter out any background noise and listen for the human voice.
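To make this two-mode design concrete, here is a minimal sketch in Python. It is illustrative only and not Amazon's actual implementation: the class and function names (WakeWordDetector, VoiceSession, run_device) are invented for the example, and the simple substring check stands in for the small on-device model a real wake-word chip would run.

```python
from dataclasses import dataclass
from typing import Iterable, Iterator, List


@dataclass
class AudioFrame:
    """A short chunk of microphone audio (e.g., ~20 ms of samples)."""
    samples: bytes


class WakeWordDetector:
    """Mode one: always on, but stateless.

    Each frame is checked for the wake word and then discarded; nothing
    is stored or transmitted.
    """

    def __init__(self, wake_word: str = "alexa") -> None:
        self.wake_word = wake_word.encode()

    def heard_wake_word(self, frame: AudioFrame) -> bool:
        # A real device runs a tiny on-device model here; a substring
        # check is just a stand-in for this sketch.
        return self.wake_word in frame.samples.lower()


class VoiceSession:
    """Mode two: activated only after the wake word is detected.

    It listens to the request, filters background noise, responds,
    and then goes back to sleep.
    """

    def handle_request(self, frames: List[AudioFrame]) -> str:
        # A real assistant streams this audio to a cloud service for
        # speech recognition and intent handling.
        return f"Handled a request spanning {len(frames)} frame(s) of audio."


def collect_request(frames: Iterator[AudioFrame], max_frames: int = 50) -> List[AudioFrame]:
    """Gather the frames that make up the request; a real device would
    stop when it detects silence rather than after a fixed count."""
    return [frame for _, frame in zip(range(max_frames), frames)]


def run_device(microphone: Iterable[AudioFrame]) -> None:
    detector = WakeWordDetector()
    frames = iter(microphone)
    for frame in frames:
        if detector.heard_wake_word(frame):
            # Only now does the device start capturing and responding.
            session = VoiceSession()
            print(session.handle_request(collect_request(frames)))


# Example: only the frame containing the wake word triggers a session.
run_device([
    AudioFrame(b"background party chatter"),
    AudioFrame(b"alexa what's the weather"),
    AudioFrame(b"rest of the request"),
])
```

The key point the sketch captures is that nothing beyond the wake-word check happens until the wake word is actually heard.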
Understanding this design should help you feel assured that voice assistants are not listening all the time. With that concern out of the way, let’s look at the benefits of voice assistants, especially in the sphere of healthcare.
Improving engagement in healthcare through conversational experiences
Devices like the Echo Dot and Google Home, as well as chatbots and other AI-driven conversational agents, provide a valuable opportunity to engage consumers. Voice enables natural, hands-free interaction, making the technology, and access to the internet, available to people with dexterity and vision impairments.
From work Orbita has done for pharmaceutical companies and medical device manufacturers, as well as from interactions at conferences and webinars, we know that the healthcare industry—or at least segments of it—recognizes the benefit of voice assistants. They understand that these devices can improve engagement in clinical trials, medication adherence, and data collection.
In truth, voice assistants could benefit all players in the healthcare system. For example, payers could use voice assistants to reduce the costs of care by improving access to member services. Care management services, such as telehealth solution providers, could use voice assistants to enhance their existing offerings. Finally, providers could use voice assistants to improve pre- and post-acute care, possibly reducing readmission rates.
Orbita has validated this by successfully selling solutions to major organizations in each of these categories, including the American Red Cross, Amgen, Commonwealth Care Alliance, Mayo Clinic, and Merck.
Becoming HIPAA compliant
Data about a person's medical conditions is among the most sensitive information there is. Safeguarding this information is why the Health Insurance Portability and Accountability Act, or HIPAA, was passed. In the 20+ years since HIPAA became law, the healthcare industry, from doctors to pharmaceutical companies to payers, has adapted its practices to comply with it. In that same time, technology has changed, making things like electronic health records and apps on cell phones a reality. It goes without saying that when these new facets of healthcare were implemented, HIPAA compliance was worked into their design.
The same goes for voice assistants. In fact, Pillo Health has engaged Orbita to produce a healthcare companion robot that will serve as an “in-home extension of the care team.” Knowing the importance of privacy and data security, Pillo Health designed their systems so that data storage and transport are HIPAA compliant.
Sure, protecting a patient’s medical information may not be a sexy new interface or provide a "wow moment," but HIPAA compliance is the unsung hero that provides the assurances that will make all the benefits of voice-assisted health care—or any new technologies used in healthcare—a reality.
Developers like Pillo have worked out HIPAA compliance on their own. Meanwhile, Amazon, perhaps recognizing the economic potential of entering the healthcare market, has teams working on making the Echo and/or other devices HIPAA compliant.
As more players in healthcare see the value of voice as an interface and build HIPAA-compliant devices and assistants, fears about data privacy should be allayed, and the potential benefits of voice in healthcare will be realized by all.