{"id":21306,"date":"2023-08-18T19:32:32","date_gmt":"2023-08-18T19:32:32","guid":{"rendered":"https:\/\/nftandcrypto-news.com\/nft\/unbabel-debuts-wearable-ai-for-thought-based-communication\/"},"modified":"2023-08-18T19:32:34","modified_gmt":"2023-08-18T19:32:34","slug":"unbabel-debuts-wearable-ai-for-thought-based-communication","status":"publish","type":"post","link":"https:\/\/nftandcrypto-news.com\/nft\/unbabel-debuts-wearable-ai-for-thought-based-communication\/","title":{"rendered":"Unbabel Debuts Wearable AI for Thought-Based Communication"},"content":{"rendered":"
Four years ago, language translation startup Unbabel began working toward a "brain-to-communications interface" that would allow businesses to understand (and be understood by) their customers and clients in multiple languages. Its Language Operations platform blends artificial intelligence (AI) with humans, improving through high-quality translations and conversations over time.
A startup with $90 million in VC funding and annual revenues of around $50 million, having also weathered the COVID-19 pandemic, Unbabel has concentrated its efforts on better understanding the ways in which our brains have evolved.
TechCrunch's Mike Butcher spent an afternoon at Unbabel's Lisbon offices alongside its founder and CEO Vasco Pedro and got a first-hand, in-person demonstration of how the technology works.

"Sitting in a meeting room in a startup office in Lisbon, I silently typed the answer to a question only the person opposite would know the answer to. What kind of coffee had I asked for when I'd arrived at the office? A short moment later, without even moving or opening his mouth, the reply came back via a text message: 'You had an Americano.'"

– Mike Butcher

Unbabel's innovation team, led by Paulo Dimas, VP of Product Innovation, told TechCrunch that in time we will start to see "the creation of the 'uber cortex,'" which the company believes "will be AI-powered…and is going to be existing outside of your biological brain."

"You have your limbic system, you have your neocortex. But they've actually evolved over millions of years. They're actually separate systems," Pedro told TechCrunch.

As part of this research, Unbabel initially began looking into electroencephalogram (EEG) systems, as well as more invasive options similar to some of the devices Elon Musk's Neuralink is actively exploring, yet chose instead to focus on an EMG system, or electromyography, which measures muscle response and electrical activity in response to a nerve's stimulation of the muscle.

These types of systems are readily available on the market, with Amazon hosting a number of them.

Pedro told TechCrunch that the team wanted a "non-invasive" mechanism, something that could "more reliably capture some of the signals." In other words, they wanted to position EMG systems as a "gateway to brain interaction directly."

To do this, Pedro and his team paired an EMG system with generative AI to create a personalized large language model (LLM), trained on a wide array of specific words and phrases and on how the EMG wearer's signals respond when thinking of a particular word or phrase.

Dubbed "Halo" (named after "halogram"), the system uses a mobile app running on the wearer's phone as a central hub for both inbound and outbound communications with the personalized LLM. It currently leverages OpenAI's GPT-3.5.

Putting this to the test, Butcher asked Vasco (in an unseen text message) what kind of coffee he had asked for that morning. Those words were read to Vasco through his earbuds via Halo's AI voice while he thought of words such as "black coffee." Halo matched Vasco's physical response to the word, weighing the likelihood that "black coffee" equated to an "Americano" – all through the audio Vasco was receiving in his earbuds. Halo's AI then sent the answer "Americano" to Butcher via a Telegram text message.
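To give a rough sense of how a step like this could hang together, here is a minimal, hypothetical Python sketch – not Unbabel's code – in which an incoming EMG window is reduced to simple features and matched against per-wearer templates for a small, calibrated vocabulary. The feature choices, channel count, and vocabulary are illustrative assumptions only.

```python
import numpy as np

def features(emg_window: np.ndarray) -> np.ndarray:
    """Collapse a raw multi-channel EMG window into a simple feature vector:
    per-channel mean absolute value plus per-channel standard deviation."""
    return np.concatenate([np.abs(emg_window).mean(axis=0), emg_window.std(axis=0)])

def best_match(emg_window: np.ndarray, templates: dict) -> tuple:
    """Return the calibrated phrase whose stored template is most similar
    (cosine similarity) to the features of the incoming window."""
    x = features(emg_window)
    scores = {
        phrase: float(np.dot(x, t) / (np.linalg.norm(x) * np.linalg.norm(t) + 1e-9))
        for phrase, t in templates.items()
    }
    phrase = max(scores, key=scores.get)
    return phrase, scores[phrase]

# Toy per-wearer calibration: templates "learned" earlier for this specific wearer.
rng = np.random.default_rng(0)
vocabulary = ["black coffee", "americano", "yes", "no"]
calibration = {p: features(rng.normal(size=(200, 8))) for p in vocabulary}

incoming = rng.normal(size=(200, 8))      # a new 200-sample, 8-channel EMG window
print(best_match(incoming, calibration))  # e.g. ('black coffee', 0.93)
```

A production system would obviously use a far richer model than nearest-template lookup, but the shape of the step – calibrate per wearer, then match an incoming signal to a known word or phrase – is the part the article describes.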
"The LLM expands what you're saying. And then I confirm before sending it back. So there's an interaction with the LLM where I build what I want it to say, and then I get to approve the final message," explained Pedro.

Pedro clarified that at all times, Butcher – as with any wearer of the EMG device – has "absolute control" over what is output: "It's not recording what I'm thinking. It's recording what I want to say," he explained. He distinguished this approach from Musk's Neuralink, which he says attempts to measure "subconscious interactions" – an invasive method performed without the permission or choice of the wearer.
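To illustrate the confirm-before-send loop Pedro describes, the hypothetical sketch below (an assumption about the shape of the flow, not Unbabel's implementation) stubs out the LLM expansion step and only releases a message once the wearer approves the draft; per the article, the real system hands the expansion to OpenAI's GPT-3.5 and relays the reply over Telegram.

```python
def expand_with_llm(matched_word: str, question: str) -> str:
    """Hypothetical stand-in for the LLM expansion step; per the article,
    Halo currently uses OpenAI's GPT-3.5 for this."""
    return f"You had an {matched_word.title()}."  # canned expansion for the demo question

def outbound_message(matched_word: str, question: str, approve) -> str | None:
    """Expand the matched word into a full reply, then release it only if the
    wearer explicitly approves the draft (e.g. after hearing it via earbuds)."""
    draft = expand_with_llm(matched_word, question)
    return draft if approve(draft) else None  # None: nothing leaves the device

# Demo: auto-approve the draft; a real flow would wait for the wearer's confirmation
# before sending the reply on, e.g. as a Telegram text message.
reply = outbound_message("americano", "What kind of coffee did I ask for?", approve=lambda d: True)
print(reply)  # You had an Americano.
```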