Translating brain waves into words
Scientists began recording brain waves in 1875, when Englishman Richard Caton planted the first cerebral-squiggle-flag: he recorded electrical activity from the exposed brains of rabbits and monkeys using a galvanometer. Several decades later, Hans Berger one-upped him in Teutonic fashion. In 1924, Berger, a German psychiatrist, recorded the first human brain waves from his son Klaus using scalp electrodes, a galvanometer, and later a Siemens double-coil recorder. In 1929, Berger published his findings, introducing the term “Elektroenkephalogramm” (EEG) and identifying alpha and beta brain waves. Since then, EEG has been used medically for a variety of purposes, from diagnosing epilepsy to categorizing sleep disorders. And by 2025, the 150th anniversary of Caton’s first recordings… mind reading!?!?
C’mon, dude. Are you really saying EEG tech can turn my brain into an open book?
Well, not exactly.
What about all my secret plans to conquer the world?
They’re safe.
Recent advances in AI and machine learning have pushed EEG toward decoding brain activity, often sensationalized as mind reading. TBH, this isn’t literal telepathy but rather pattern recognition: AI analyzes EEG signals to infer what someone is seeing, thinking, or intending.
Translating brain waves into a novel’s worth of words lingers just outside our grasp. True mind reading would require accessing arbitrary thoughts with high fidelity, which EEG can’t do yet: its spatial resolution is low, and its signals are noisy and filtered by the scalp and skull. It’s tempting, though, to believe EEG is close, real close. Progress has accelerated since 2018, blending EEG with deep learning to reconstruct visuals, translate imagined speech, and control devices.
Hey, dude, if your aim is mind reading, don’t you think it makes more sense to, like, snap a photo of the brain, rather than decipher a bunch of squiggles that waft from neurons?
Nope. In 2018, researchers from Toronto put that notion to rest. The best pictures of the brain, you see, are taken with fMRI (functional magnetic resonance imaging), yet the Toronto team made the case that EEG can hold its own. How? It’s all about the speed, baby. EEG excels in temporal precision, resolving changes in mere milliseconds, while fMRI tracks a sluggish blood-flow signal over seconds. Besides being slow (the subject has to lie still in a noisy tube for a long, long time), MRI is also bulky and costly.
The Canadian scientists were onto something, to be sure. They enlisted thirteen volunteers and reconstructed what the volunteers saw from brain waves alone! Actual face-like images from brain activity. Each volunteer was shown photographs of people’s faces for half a second apiece. This was repeated multiple times with various faces, mimicking signal averaging while preserving identity-specific variance. EEG was recorded as the subject viewed each image. A computer program evaluated the EEG recordings and produced images that reliably correlated with the particular face seen by the volunteer. In other words, human raters could match a reconstructed image to the original face (one amongst many) that the subject had seen.
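To see why the repeated presentations matter, here’s a minimal sketch of the trial-averaging idea (my own illustration with fake data, not the Toronto team’s code): the same face response is buried under different noise on every trial, so averaging across trials cancels the noise and leaves the face-specific signal standing.

```python
# Sketch of EEG trial averaging: repeated presentations of the same stimulus
# are epoched and averaged, boosting the evoked signal against random noise.
import numpy as np

rng = np.random.default_rng(0)
n_trials, n_channels, n_samples = 40, 64, 256  # hypothetical epoch shape

# Fake data: a fixed "face response" buried in heavy per-trial noise
face_response = rng.standard_normal((n_channels, n_samples))
trials = face_response + 5 * rng.standard_normal((n_trials, n_channels, n_samples))

# Correlation with the true response rises as more trials are averaged
for k in (1, 10, 40):
    avg = trials[:k].mean(axis=0)  # average over the first k trials
    r = np.corrcoef(avg.ravel(), face_response.ravel())[0, 1]
    print(f"{k:2d} trials averaged -> correlation with true response: {r:.2f}")
```

With one trial the correlation is near zero; with forty it climbs steeply. That, in essence, is what the repeated half-second presentations buy you.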
In 2023, researchers at the University of Technology Sydney (UTS) debuted a breakthrough in non-invasive brain-to-text translation: converting raw EEG signals from silent reading into coherent sentences, no eye-tracking or implants needed. Dubbed DeWave (Discrete Encoding of EEG Waves), it’s built on an open-source AI framework.
DeWave starts with signal capture. A portable EEG cap records brain waves during silent reading, which elicits activity over language areas (Broca’s and Wernicke’s). UTS’s DeWave AI learns to map waves to words, outputting coherent sentences like “The cat jumped over the fence” from brain data alone.
How? A quantized variational encoder segments the waves into codex units (discrete tokens, akin to language-model embeddings), filtering artifacts via self-supervised learning. Contrastive training then aligns the EEG codex with vocabulary, and BART (a pre-trained language model) generates the sentences. A rough sketch of the quantization step appears below.
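Here’s a toy PyTorch sketch of that discrete-encoding step (my illustration; the shapes, names, and sizes are hypothetical, and the real DeWave architecture differs): an EEG window is encoded into a latent vector, snapped to the nearest entry in a learned codebook, and the resulting token IDs are what a seq2seq model like BART could consume.

```python
# Minimal vector-quantization sketch for EEG windows (illustrative only).
import torch
import torch.nn as nn

class EEGQuantizer(nn.Module):
    def __init__(self, n_channels=64, window=128, dim=256, codebook_size=512):
        super().__init__()
        # Encoder: flatten an EEG window and project it to a latent vector
        self.encoder = nn.Sequential(
            nn.Flatten(),
            nn.Linear(n_channels * window, dim),
            nn.GELU(),
            nn.Linear(dim, dim),
        )
        # Learned codebook of discrete "codex" entries
        self.codebook = nn.Embedding(codebook_size, dim)

    def forward(self, eeg):  # eeg: (batch, n_channels, window)
        z = self.encoder(eeg)                         # (batch, dim)
        dists = torch.cdist(z, self.codebook.weight)  # distance to each code
        tokens = dists.argmin(dim=-1)                 # discrete token IDs
        quantized = self.codebook(tokens)             # snapped latents
        # Straight-through estimator: gradients flow back to the encoder
        quantized = z + (quantized - z).detach()
        return tokens, quantized

# Usage: a batch of fake EEG windows -> discrete tokens
eeg = torch.randn(8, 64, 128)
tokens, latents = EEGQuantizer()(eeg)
print(tokens.shape, latents.shape)  # torch.Size([8]) torch.Size([8, 256])
```

The point of the discretization is that downstream language models already know how to work with sequences of tokens; forcing noisy continuous EEG into a finite codebook is what lets the two worlds meet.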
The scientists believe this neuro tech may prove useful for those suffering from Locked-in syndrome (such as certain ALS or stroke patients). What’s more, DeWave, in various demos, allowed robot control via thoughts.
DeWave is not alone.
Neurable, a Boston-based neurotech startup founded in 2015, is developing non-invasive brain-computer interfaces (BCIs) that blend seamlessly into consumer devices. Backed by over $30M in funding (including a $13M Series B in 2024 from Ultratech Capital, TRAC, Pace Ventures, and Metaplanet), they aspire to become a “Fitbit for your brain.” They’re pursuing a licensing ecosystem for earbuds, glasses, helmets, and more, with Apple reportedly eyeing similar integrations for AirPods (such as thought-controlled playback and mental-health snapshots).
Neurable’s flagship product is the MW75 Neuro LT. Launched in partnership with Master & Dynamic, these noise-canceling headphones hide EEG sensors in the headband for unobtrusive wear. Neurable’s core technology uses patented signal processing and AI-powered EEG sensing to filter noisy EEG data from the scalp electrodes. Trained on hundreds of hours of brain data, it detects gamma waves (high-focus states) and theta waves (fatigue/drift) in real time, claiming more than 80% reliability for burnout prediction. The MW75 pairs with apps like Apple Health for fitness-brain insights and provides nudges like “Take a break” via audio cues. What’s more, the MW75 tracks cognitive metrics such as “brain age” (neural efficiency vs. chronological age), cognitive strain (mental load), anxiety resilience, and recovery scores. Early users report 20-30% productivity gains from avoiding context switches, and Neurable touts applications in ADHD management and early detection of cognitive issues (such as depression or Alzheimer’s).
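For a feel of what “detecting gamma and theta” means in practice, here’s a minimal band-power sketch (my own illustration with synthetic data, not Neurable’s pipeline): estimate power in the theta (4-8 Hz) and gamma (30-50 Hz) bands from one EEG channel and form a simple focus index.

```python
# Band-power features of the kind a focus/fatigue tracker might compute.
import numpy as np
from scipy.signal import welch

def band_power(signal, fs, lo, hi):
    # Welch PSD with 2-second windows, then sum power within the band
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    mask = (freqs >= lo) & (freqs <= hi)
    return np.sum(psd[mask]) * (freqs[1] - freqs[0])

fs = 256  # sampling rate in Hz
t = np.arange(0, 10, 1 / fs)
# Fake EEG: theta-ish and gamma-ish oscillations plus broadband noise
eeg = (2.0 * np.sin(2 * np.pi * 6 * t)      # 6 Hz theta component
       + 0.5 * np.sin(2 * np.pi * 40 * t)   # 40 Hz gamma component
       + np.random.randn(t.size))

theta = band_power(eeg, fs, 4, 8)
gamma = band_power(eeg, fs, 30, 50)
print(f"focus index (gamma/theta): {gamma / theta:.3f}")
```

A rising gamma-to-theta ratio would, under this toy model, suggest focus; a falling one, drift. The hard part in a real product is the filtering, since scalp EEG is swamped by muscle and motion artifacts.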
OK, dude, I’ll admit it’s impressive. So where do we stand now?
EEG isn’t reading minds yet. It’s eavesdropping on brain chatter, with AI as the translator.
What’s next?
Progress will continue; that’s the way these things go. The tech has already evolved from proof-of-concept to a robust tool for BCIs. With ongoing refinements, your dreams may become mind movies that make other people’s eyes rain. But I wonder: will we build a machine that can alter a person’s EEG waves and perhaps even change, for good or ill, what they’re thinking?
Want to learn more about brain waves? Check out https://brain2mind.substack.com/p/gamma-is-the-greatest-of-brain-waves?utm_source=publication-search


