Thought Experiment: The Telepathic AI Wearable
I wanted to write this blog to connect a few dots, think about the future, and lay out the concept for a future product.
It all starts with a recent Sam Altman interview with Mostly Human, in which he ends by imagining a world where an AI-powered device gets things done for him and only prompts him at the times he wants. Essentially, it reads his mind.
Now, mind reading is a really hard problem to tackle. In confusing social situations, we attempt to read minds by gauging intent: detecting mood through facial expressions and outward behavior. But we're not yet able to literally detect someone's thoughts.
Yet, within the past few years, we have a few examples of people successfully controlling things with their minds. A couple of years ago, I came across a stream clip on Instagram of a gamer (@perrikaryal) who plays mechanically demanding games wearing an EEG (electroencephalography) headset that lets her map brain activity to various key binds. In the YouTube video below by Great Big Story, you can see that anyone can pick up this same device and play video games with it. We now live in a world in which you can think your way through Dark Souls.
Now how do we make this EEG tech more powerful?
Step 1: Convert Brain Activity to Words
What if we could use an EEG to detect thought? Translate the brain activity into actual human language. What I mean is mapping brain activity into words–telepathic communication. Instead of mapping activity to key binds for games, let's say we take the brain activity and have it type onto a screen. This would verify that we can indeed capture intent.
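To make this concrete, here's a toy sketch of what "brain activity to words" could look like once the EEG gives you feature vectors. Everything here is invented for illustration–the tiny vocabulary, the "calibration" centroids, the made-up band-power features–real thought-to-text decoding is an open research problem:

```python
# Hypothetical sketch: decode a word from a simulated EEG feature vector
# using nearest-centroid matching. The vocabulary and numbers are invented.

import math

# Invented "calibration" data: average band-power features recorded while
# the wearer rehearsed each word (alpha, beta, gamma power, all made up).
CENTROIDS = {
    "lights": (0.8, 0.3, 0.1),
    "timer":  (0.2, 0.9, 0.4),
    "music":  (0.5, 0.2, 0.8),
}

def decode_word(features):
    """Return the vocabulary word whose centroid is closest to `features`."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(CENTROIDS, key=lambda w: dist(CENTROIDS[w], features))

print(decode_word((0.75, 0.35, 0.15)))  # closest centroid: "lights"
```

In a real system the classifier would be trained per person (like Perri's per-game calibration), and the vocabulary would need to be far larger than three words.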
Step 2: Feed that intent into an LLM agent
Let's take this a little further. Typing onto a screen is only so useful–it's good for anything that requires typing, but it doesn't actually get something done. Let's feed this intent into an on-device LLM agent that can understand it and translate it into automations.
We'd in essence need a trigger and a way to prompt. It's almost like having an Alexa or Siri, except it's silent, worn on your head, and listening to your thoughts. Ideally, we'd have models so efficient that they can operate without needing cloud compute, living on-device or locally on your home network.
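The trigger-then-prompt flow could look something like this sketch: decoded thought words are buffered, and only the ones following an explicit wake intent get handed to a (stubbed) on-device LLM, so the agent isn't acting on every stray thought. The wake phrase, the end-of-command marker, and the `local_llm` stub are all invented names:

```python
# Sketch of the trigger-then-prompt flow. Decoded thought text only reaches
# the (stubbed) on-device model after a hypothetical silent wake intent.

WAKE_PHRASE = "hey house"  # invented wake intent

def local_llm(prompt):
    """Stand-in for an on-device model; real inference would run locally."""
    return f"[agent plans action for: {prompt!r}]"

def handle_thought_stream(decoded_words):
    """Yield agent responses only for thoughts following the wake phrase."""
    buffer, armed = [], False
    for word in decoded_words:
        buffer.append(word)
        if not armed and " ".join(buffer[-2:]) == WAKE_PHRASE:
            armed, buffer = True, []
        elif armed and word == "done":  # invented end-of-command marker
            yield local_llm(" ".join(buffer[:-1]))
            armed, buffer = False, []

# Stray thoughts ("ugh") are ignored; only the armed command gets through.
stream = ["ugh", "hey", "house", "turn", "off", "lights", "done"]
for reply in handle_thought_stream(stream):
    print(reply)
```

The important design point is the gate itself: without some equivalent of a wake intent, the agent would be reacting to every passing thought.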
Step 3: Connect that intent to your smart devices
Assume we've solved for capturing conscious intent and separating it from the noise of passive thought. Now we need to connect this LLM agent, which can interpret your thoughts, to the smart devices around the house so it can send commands. Imagine a world where you can think your way to turning off the lights before bed, or to setting a timer. A lot of things around the house would be so much more convenient done by thought than by yelling at your Alexa, Siri, or someone on the other side of the room.
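The last hop, from parsed intent to smart devices, could be a simple registry of handlers. The intent names and device functions below are invented; a real setup would bridge to something like a smart-home hub's local API or an MQTT broker:

```python
# Sketch of dispatching a parsed intent to smart-home devices.
# The intent schema and device handlers are invented for illustration.

def set_lights(state):
    return f"lights -> {state}"

def set_timer(minutes):
    return f"timer set for {minutes} min"

# Registry mapping intent names to device handlers.
HANDLERS = {
    "lights.off": lambda: set_lights("off"),
    "lights.on":  lambda: set_lights("on"),
    "timer.set":  lambda minutes=5: set_timer(minutes),
}

def dispatch(intent, **kwargs):
    """Run the handler registered for `intent`, if any."""
    handler = HANDLERS.get(intent)
    if handler is None:
        return f"unknown intent: {intent}"
    return handler(**kwargs)

print(dispatch("lights.off"))             # lights -> off
print(dispatch("timer.set", minutes=10))  # timer set for 10 min
```

Keeping the dispatch table explicit also doubles as a safety boundary: the agent can only trigger actions someone deliberately registered.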
Why tho?
Now you may be wondering, "Why?" Why must this exist? Well, I think this product could be one of the most accessible devices for almost anyone living in a house with smart devices and home internet. Voice commands aren't everyone's preferred interaction method, and sometimes we're too tired to vocalize. Capturing thought remains the unsolved problem, but I think the tech exists to start solving it very soon. There might already be startups or labs working on this that I don't even know about.
How do we make this a widespread wearable?
Well, this is something you'd usually just use in your own house, but you could imagine the hardware being redesigned from how it looks today into a hat–and the hats could be any design. Maybe there's a world where you connect it to your car and think your way through music controls.
There are many use cases for telepathic communication once the ability to capture intent through brain activity is solved. I'm not sure how much this device would cost, but it is very important that it becomes affordable so that it is accessible.
Proactive When I Want It
Now I want to think about how to solve for this. Let's say I've solved intent capture and can consciously command things to happen through telepathic communication. How do I get something that prompts me?
Part of this requires capturing passive thought and pattern matching it against things worth acting on. Say you're thinking something like "I feel cold". That isn't an intent-driven command; it's more of an observation. What could happen is that this EEG -> LLM pipeline captures the sentiment and asks you, "Would you like the heating turned on?" I haven't thought through the form factor for how it would proactively ask for your input, but it would have to come from some sort of speaker in your house (like an Alexa).
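As a toy sketch of that proactive path: passively decoded observations (not commands) get matched against simple patterns, and a match produces a suggestion the house could speak aloud. The keywords and wording are invented; a real version would need far subtler matching than substring checks:

```python
# Sketch of the proactive path: passive observations are matched against
# invented keyword patterns; a hit yields a suggestion for the home speaker.

SUGGESTIONS = {
    "cold":  "Would you like the heating turned on?",
    "dark":  "Would you like the lights turned on?",
    "tired": "Would you like me to dim the lights and set an alarm?",
}

def suggest(observation):
    """Return a prompt for the first matching keyword, else None."""
    for keyword, prompt in SUGGESTIONS.items():
        if keyword in observation.lower():
            return prompt
    return None

print(suggest("I feel cold"))   # Would you like the heating turned on?
print(suggest("nice sunset"))   # None: nothing to act on
```

The `None` case matters as much as the hits: most passive thought should produce no prompt at all, or the device becomes an interruption machine.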
So, to wrap this all up: it's a device you wear like a hat that captures your brain activity with an EEG, translates that activity into human language, and feeds it to an LLM that detects conscious intent and turns it into actions on smart devices. It will live within the privacy of your home network, and all of the EEG -> intent compute will happen on-device, either in the EEG on your head or in a separate intermediary device with more storage and compute power. We can't have people passively listening to your thoughts; that would be a massive invasion of privacy. Talking near a voice assistant was already flagged as a privacy risk–having someone listening to your thoughts is next level.