Speech-to-text capability is now baked into all modern computers. But what if you didn’t have to dictate to your computer? What if you could type just by thinking?
Silicon Valley startup Sabi is emerging from stealth with that goal. The company is developing a brain wearable that decodes a person’s internal speech into words on a computer screen. CEO Rahul Chhabra says its first product, a brain-reading beanie, will be available by the end of the year. The company is also designing a baseball cap version.
The technology is known as a brain-computer interface, or BCI, a device that provides a direct communication pathway between the brain and an external device. While many companies such as Elon Musk’s Neuralink are developing surgically implanted BCIs for people with severe motor disabilities, Sabi’s device could allow anyone to become a cyborg.
It’s not exactly Musk’s vision of the future, which involves implanted brain chips to allow humans to merge with AI. But venture capitalist Vinod Khosla, who was an early investor in OpenAI, says a noninvasive, wearable device is the only path to getting lots of people to use BCI technology.
“The biggest and baddest application of BCI is if you can talk to your computer by thinking about it,” says Khosla, founder of Khosla Ventures, one of Sabi’s investors. “If you’re going to have a billion people use BCI for access to their computers every day, it can’t be invasive.”
Sabi’s brain-reading hat relies on EEG, or electroencephalography, which uses metal disks placed on the scalp to record the brain’s electrical activity. Decoding imagined speech from EEG is already possible, but it’s currently limited to small sets of words or commands rather than continuous, natural speech.
The drawback of a wearable system is that the sensors have to listen to the brain through a layer of skin and bone, which dampens neural signals. Surgically implanted devices pick up much stronger signals because they sit so close to neurons. Sabi thinks the way to boost accuracy with a wearable is by massively scaling up the number of sensors in its device. Most EEG devices have a dozen to a few hundred sensors. Sabi’s cap will have anywhere from 70,000 to 100,000 miniature sensors.
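The intuition behind scaling up sensor count can be shown with a toy simulation (purely illustrative, and not Sabi’s actual signal chain): if many sensors each record the same underlying signal buried in independent noise, averaging them improves the signal-to-noise ratio roughly with the square root of the sensor count.

```python
import numpy as np

# Toy illustration (not Sabi's pipeline): averaging many noisy channels
# that share one underlying signal boosts SNR roughly as sqrt(N).
rng = np.random.default_rng(0)

t = np.linspace(0, 1, 1000)
signal = np.sin(2 * np.pi * 10 * t)  # a 10 Hz "neural" oscillation

def snr_after_averaging(n_sensors, noise_std=5.0):
    # Each sensor sees the same signal plus its own independent noise.
    sensors = signal + rng.normal(0, noise_std, (n_sensors, t.size))
    averaged = sensors.mean(axis=0)    # average across the sensor array
    noise = averaged - signal          # residual noise after averaging
    return signal.std() / noise.std()  # simple SNR estimate

low = snr_after_averaging(16)     # a conventional EEG montage
high = snr_after_averaging(4096)  # a much denser array

print(f"16 sensors:   SNR ~ {low:.2f}")
print(f"4096 sensors: SNR ~ {high:.2f}")  # roughly sqrt(4096/16) = 16x higher
```

This sketch ignores everything that makes real EEG hard (correlated noise, skull attenuation, source localization), but it captures the basic statistical argument for very dense sensing.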
“Given that high-density sensing, it pinpoints exactly what and where neural activity is happening. We use that information to get much more reliable data to decode what a person is thinking,” Chhabra says.
The company is aiming for an initial typing speed of 30 or so words per minute. That’s slower than most people type, but Chhabra says the speed will improve as users spend more time with the cap.
One big problem with decoding imagined speech is the huge variability in natural thought patterns. Even if two people imagined saying the same phrase, their brains would fire slightly differently.
BCIs rely on AI to decode neural activity into actionable, real-time commands. For implanted devices, AI models are trained on neural data from one individual. But a wearable will need to be able to decode intended speech from many users.
To address that, Sabi is building a type of large-scale AI model called a brain foundation model, which is trained on extensive neural data from many people to learn fundamental patterns of activity that correlate with inner speech. Chhabra says the company has so far amassed 100,000 hours’ worth of brain data from 100 volunteers.
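The appeal of pooling data across many people can be sketched with a toy regression (illustrative only; Sabi’s actual model is not public): if every user’s brain-to-text mapping shares a common structure plus a small personal quirk, a decoder fit on pooled data from many users generalizes to a brand-new user better than one fit on any single user’s small dataset.

```python
import numpy as np

# Toy sketch of why pooling neural data across users helps (illustrative;
# Sabi's foundation model is not public). Each simulated user's mapping is
# a shared linear map plus a small personal "quirk."
rng = np.random.default_rng(42)
DIM = 20                         # features per simulated neural sample
w_shared = rng.normal(size=DIM)  # structure common to everyone

def user_data(n_samples, quirk_std=0.1, noise_std=1.0):
    w_user = w_shared + rng.normal(0, quirk_std, DIM)  # personal variation
    X = rng.normal(size=(n_samples, DIM))
    y = X @ w_user + rng.normal(0, noise_std, n_samples)
    return X, y

# One user's small calibration set vs. data pooled from 100 users.
X1, y1 = user_data(50)
pooled = [user_data(50) for _ in range(100)]
Xp = np.vstack([X for X, _ in pooled])
yp = np.concatenate([y for _, y in pooled])

w_single = np.linalg.lstsq(X1, y1, rcond=None)[0]
w_pooled = np.linalg.lstsq(Xp, yp, rcond=None)[0]

# Evaluate both decoders on a brand-new, never-seen user.
Xt, yt = user_data(2000)
err_single = np.mean((Xt @ w_single - yt) ** 2)
err_pooled = np.mean((Xt @ w_pooled - yt) ** 2)
print(f"new-user MSE, single-user decoder: {err_single:.2f}")
print(f"new-user MSE, pooled decoder:      {err_pooled:.2f}")
```

The pooled fit averages out individual quirks and nails down the shared structure, which is the same bet a brain foundation model makes at far larger scale, before any per-user fine-tuning.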
JoJo Platt, an independent neurotech consultant based in San Francisco, says consumer brain-sensing devices will need to be universally easy to use if developers hope to have a viable product. Most BCIs need to be calibrated before each use, because brain signals can change from day to day based on a user’s level of fatigue and focus. Consumer devices will need to work right away—and work consistently—for people to use them regularly.
“These devices are going to have to be ready to go out of the box,” Platt says. “They’re going to have to conform to me rather than me conforming to it.”
Comfort and camouflage will also be key to wearable devices. Even for medical or assistive applications, Platt says, patients prefer an inconspicuous device. It’s why Neuralink, Paradromics, and Synchron are all developing implantable devices that are cosmetically invisible.
The same goes for consumer wearables. Smart rings and watches are designed to be small, compact, and comfortable. One brain wearable company, Neurable, has installed EEG sensors into a pair of headphones that look identical to something you’d use to listen to music.
Typing from your brain sounds cool and all, but it also raises questions about the privacy and security of your neural data. Chhabra says when data leaves the device and is uploaded to the cloud, it’s encrypted from end to end. Sabi’s AI models are able to train on the encrypted data rather than the raw neural data. The company is consulting with neurosecurity experts from Stanford University and elsewhere to audit its entire technology stack.
“We need to recognize that neural data is the most private kind of data that a person could possibly have,” Chhabra says. “Not treating it with care would just be unfair.”