A light tingling sensation draws Alice’s attention to the DermalAbyss smart tattoos on her forearm: one of the biosensors has turned orange, signaling high blood sugar. Her holographic lenses display a blinking notification that she is running short of medicines. With a swipe gesture, her smart glasses open a digital overlay, and a virtual assistant pops up to help. She taps one of the biomarkers imprinted on her SkinMarks tattoo to activate her EEG device and initiate brain typing; as she thinks of the search terms, the text gets typed into the search box. When she is unable to find the right medicine, the brainwave signals from her EEG device are promptly read by the virtual assistant as those of a person in distress, and the AI assistant initiates a voice call. Her skin-hearing bionics let her converse in Spanish with an English-speaking virtual assistant. Alice heaves a sigh of relief as the order gets placed.
This seems to have come straight out of a sci-fi movie; after all, a utopian world manageable via our thoughts or brainwaves sounds too good to be true. A decade ago, brain-computer interface (BCI) applications were mostly targeted at users with mobility, speech, or hearing disabilities, aiming to provide them an alternative communication channel. Gradually, BCI is entering the world of healthy people as well, enhancing and augmenting lives in ways previously unseen or unimaginable:

- A simple pair of sunglasses that projects holographic icons to display app notifications, navigation directives, reminders, and more.
- A temporary tattoo that, when applied to your skin, transforms your body into a living touchpad.
- Bionics embedded in the skin that monitor health via the body’s interstitial fluids and proactively alert you when vitals are abnormal.
- An EEG device that lets you type with your thoughts, without having to use your fingers or voice (a sketch of this signal chain follows below).
- Skin-hearing bionics that provide voice notifications, instant translation over voice calls, and alerts, or let you listen to an audiobook without bothering anyone else in the vicinity.
- A physiological measuring tool that retrieves and uses information about an individual’s emotional, cognitive, or affective state.
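How might “brain typing” work under the hood? Here is a minimal, hedged sketch: it treats typing as a sequence of binary classifications over short EEG epochs, in the spirit of a P300 speller. The sampling rate, channel count, and synthetic data are illustrative assumptions, not a description of any particular device.

```python
# Sketch of a P300-style "brain typing" signal chain: classify short EEG
# epochs as target vs. non-target flashes. All parameters and the synthetic
# data below are illustrative assumptions.
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

FS = 256            # sampling rate in Hz (assumed)
N_CHANNELS = 8      # number of EEG channels (assumed)
EPOCH_S = 0.8       # epoch length after each stimulus flash, in seconds

def bandpass(epochs, lo=1.0, hi=12.0, fs=FS):
    """Band-pass filter each epoch; P300 energy sits at low frequencies."""
    b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, epochs, axis=-1)

# Synthetic stand-in for recorded epochs: (n_epochs, n_channels, n_samples).
rng = np.random.default_rng(0)
n_samples = int(EPOCH_S * FS)
X = rng.normal(size=(200, N_CHANNELS, n_samples))
y = rng.integers(0, 2, size=200)     # 1 = the flashed row/column was the target
X[y == 1, :, 60:90] += 0.5           # crude evoked-response bump for targets

X = bandpass(X).reshape(len(X), -1)  # filter, then flatten for the classifier
clf = LinearDiscriminantAnalysis().fit(X[:150], y[:150])
print("held-out accuracy:", clf.score(X[150:], y[150:]))
```

A real speller repeats such flashes over a grid of letters and picks the row/column pair with the strongest target response, so even a modest per-flash classifier can yield usable typing rates.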
BCI’s utility appears evident in neuro-marketing, advertising, medical monitoring, immersive gaming & entertainment, and security & authentication for voice-enabled applications. But what excites us more is its potential application for virtual assistants. There is already a lot of buzz in the industry about the convergence of AR, VR, and virtual assistants, and how it will revolutionize the retail, healthcare, and education industries. While a virtual assistant helps the consumer find the right product to purchase, AR makes it easier to try it before buying it. Telepathic typing and skin hearing take this a step further by letting the user do all of it discreetly, even in a public place, without disturbing bystanders or giving up privacy.
Beyond immersive virtual interactions, conversations can be taken to a whole new level through a phenomenon called neural synchrony. A study published in Scientific Reports concludes that “the rhythms of brainwaves between two people taking part in a conversation begin to match each other. The neuronal activity of two people involved in an act of communication ‘synchronize’ in order to allow for a ‘connection’ between both subjects. It involves interbrain communion that goes beyond language itself and may constitute a key factor in interpersonal relations and the understanding of language.” If synchronization plays a key role in an effective conversation between two human beings, could it also be harnessed when a virtual assistant interacts with a human wearing a non-invasive EEG device, with the streamed brainwave signals letting the assistant gauge whether the conversation is in sync?
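One common way such synchrony is quantified in hyperscanning research is the phase-locking value (PLV) between band-limited signals. The sketch below uses synthetic data; in practice one stream would come from the user’s EEG headset, and the second would be a hypothetical reference signal on the assistant’s side.

```python
# Hedged sketch: quantify "neural synchrony" between two signals via the
# phase-locking value (PLV) in a frequency band. The signals are synthetic.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def phase_locking_value(x, y, fs=256, band=(8.0, 12.0)):
    """PLV in a band: 1.0 = perfectly phase-locked, near 0 = unrelated."""
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    phx = np.angle(hilbert(filtfilt(b, a, x)))
    phy = np.angle(hilbert(filtfilt(b, a, y)))
    return np.abs(np.mean(np.exp(1j * (phx - phy))))

rng = np.random.default_rng(1)
t = np.arange(0, 10, 1 / 256)
alpha = np.sin(2 * np.pi * 10 * t)              # shared 10 Hz rhythm
speaker = alpha + 0.5 * rng.normal(size=t.size)
listener_sync = alpha + 0.5 * rng.normal(size=t.size)  # synchronized listener
listener_rand = rng.normal(size=t.size)                 # unrelated listener

print("synchronized PLV:", phase_locking_value(speaker, listener_sync))
print("unrelated PLV:   ", phase_locking_value(speaker, listener_rand))
```

A falling PLV over a rolling window could, speculatively, signal the assistant that the conversation is drifting out of sync and needs a change of tack.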
I had an interesting experience with a popular concierge service recently: a request was at first flatly denied, citing the pandemic situation. Disgruntled and upset, I could only blurt out a heavy “Okay”. Whether it was the tone, the intensity, or the sheer disappointment in my voice, the agent reconsidered and asked for a few hours to confirm. In my mind, I replayed the exact same conversation over a text chat channel or with a voice assistant, and I realized the final outcome would have been very different.

How, then, do we ensure empathy grounding and better contextual understanding of a user query when no non-verbal cues are available to a text-based virtual assistant? Dr. Albert Mehrabian concluded in one of his studies that the interpretation of a message is 7 percent verbal, 38 percent vocal, and 55 percent visual, which implies that 70-93% of all communication is nonverbal. While language understanding, voice recognition, ASR, and voice-to-intent detection technologies will grow by leaps and bounds in the coming years, we believe BCI bionics can be truly game-changing in making conversations feel far more natural and effective. In fact, for AI to get even remotely close to true artificial general intelligence, brain-computer interface bionics might play a pivotal role.
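As a thought experiment, here is a speculative sketch of what such empathy grounding could look like in a dialogue system: the reply strategy is conditioned on both the intent classifier’s output and a hypothetical EEG-derived frustration score. Every name and threshold below is an illustrative assumption, not an existing API.

```python
# Speculative sketch: condition the assistant's reply strategy on a
# hypothetical non-verbal affect channel alongside the usual NLU output.
from dataclasses import dataclass

@dataclass
class Turn:
    intent: str          # output of the NLU/intent model
    confidence: float    # intent-classifier confidence, 0..1
    frustration: float   # hypothetical EEG-derived affect score, 0..1

def choose_response_strategy(turn: Turn) -> str:
    """Pick a dialogue strategy using both verbal and non-verbal signals."""
    if turn.frustration > 0.7:
        return "escalate_to_human"        # distress overrides the normal flow
    if turn.confidence < 0.5:
        return "clarify"                  # intent unclear: ask a question
    if turn.frustration > 0.4:
        return "acknowledge_then_answer"  # soften the reply before answering
    return "answer"

print(choose_response_strategy(Turn("refund_request", 0.9, 0.8)))  # escalate_to_human
print(choose_response_strategy(Turn("refund_request", 0.9, 0.1)))  # answer
```

In the concierge anecdote above, it was exactly this kind of non-verbal override, a heavy “Okay”, that changed the human agent’s course; a text-only assistant with no affect channel would simply have answered and moved on.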