
AuraCoding fuses EEG brain-wave decoding, gaze-controlled navigation, and voice commands into a single wearable headband. Think text. Look to navigate. Speak to confirm. No keyboard required.

AuraCoding combines three breakthrough technologies — each proven independently — into a single wearable that turns your thoughts and gaze into computer commands.
Powered by ZUNA (Open Source)
8 dry electrodes capture your brain's electrical signals at 256 Hz. Zyphra's ZUNA model — a 380M-parameter foundation model trained on 2 million channel-hours of EEG data — denoises and upsamples the signal to research-grade quality. Downstream classifiers then decode your thoughts into text.
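The signal path above (raw electrodes, then ZUNA cleanup, then a classifier) implies a buffering stage that turns a continuous 256 Hz, 8-channel stream into fixed windows. The sketch below shows that stage only; `EEGWindower`, the window length, and the hand-off comments are illustrative assumptions, not the actual AuraCoding SDK:

```python
from collections import deque

SAMPLE_RATE_HZ = 256   # headband sampling rate from the spec above
N_CHANNELS = 8         # dry electrodes
WINDOW_SECONDS = 1.0   # hypothetical classifier window size

class EEGWindower:
    """Buffers per-sample 8-channel readings into fixed-length windows."""

    def __init__(self):
        self.window_len = int(SAMPLE_RATE_HZ * WINDOW_SECONDS)
        self.buffer = deque(maxlen=self.window_len)

    def push(self, sample):
        """sample: list of N_CHANNELS microvolt readings.
        Returns a full window (list of samples) once enough data
        has arrived, otherwise None."""
        if len(sample) != N_CHANNELS:
            raise ValueError(f"expected {N_CHANNELS} channels, got {len(sample)}")
        self.buffer.append(sample)
        if len(self.buffer) == self.window_len:
            window = list(self.buffer)
            self.buffer.clear()
            return window  # hand off to ZUNA denoising, then a classifier
        return None
```

One second of data (256 samples) yields one window; in a real pipeline that window would be sent through the ZUNA model before classification.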

Inspired by eyeTerm / eyede
Dual IR cameras track your gaze at 120 Hz across up to 4 monitors. Look at a terminal to select it. Look at a button to highlight it. Wink or think to confirm. The same paradigm Brian Harms demonstrated with a Mac webcam — but with dedicated hardware for sub-degree precision.
A MEMS microphone array with local Whisper-based speech-to-text provides a confirmation and dictation layer. Whisper "go" to execute, or dictate longer passages when thought-to-text needs a boost.
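The select-then-confirm flow described above (gaze picks a target, a wink or a spoken "go" fires it) can be sketched as a tiny state machine. The event names `"gaze"`, `"wink"`, and `"voice"` and the confirmation word are illustrative assumptions, not the real AuraOS event schema:

```python
class ConfirmGate:
    """Minimal select-then-confirm state machine: a gaze event sets
    the current target, and a wink or the spoken word "go" fires it."""

    def __init__(self):
        self.target = None   # UI element the user is currently looking at
        self.fired = []      # targets that have been confirmed

    def handle(self, event, payload=None):
        if event == "gaze":
            # Gaze landed on a new UI element; it becomes the target.
            self.target = payload
        elif event in ("wink", "voice") and self.target is not None:
            if event == "voice" and payload != "go":
                return       # only the confirmation word fires
            self.fired.append(self.target)
            self.target = None
```

A stray wink with no target does nothing, which is one way to avoid the classic "Midas touch" problem of gaze interfaces.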
Raw EEG streams, gaze coordinates, classified mental states, and decoded text — all available via WebSocket API. Build custom apps, integrations, and experiments on top of AuraOS.
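A client consuming those WebSocket streams needs a dispatch step that routes each message to the right handler. The sketch below shows that routing for a message shaped like `{"type": ..., "data": ...}`; that schema and the stream names are assumptions for illustration, so check the real AuraOS API docs for the actual field names:

```python
import json

def route_message(raw, handlers):
    """Dispatch one JSON message from the (hypothetical) AuraOS
    WebSocket stream to a handler keyed by its "type" field."""
    msg = json.loads(raw)
    handler = handlers.get(msg.get("type"))
    if handler is None:
        return None          # unknown stream type: ignore it
    return handler(msg.get("data"))

# Example handlers for two of the four advertised streams.
handlers = {
    "gaze": lambda d: ("gaze", d["x"], d["y"]),
    "text": lambda d: ("text", d["decoded"]),
}
```

In a live client the same `handlers` table would be called from the WebSocket library's on-message callback.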
See how eye tracking and brain-wave decoding combine to let you control your entire desktop — without touching a keyboard.
Concept visualization — Inspired by @therituallab eyeTerm + Zyphra ZUNA EEG decoding
Watch how AuraCoding tracks your gaze across terminals and decodes your thoughts into code — no hands required.
A lightweight headband that integrates three sensor modalities into a single comfortable form factor. No gel, no wires, no setup friction.

From developers seeking hands-free coding to individuals who need assistive communication, AuraCoding serves distinct markets with a unified platform.

Look at a terminal to select it. Think a command. Confirm with a wink. Silent coding in open offices, libraries, or late at night. The eyeTerm workflow — but powered by thought instead of voice.
For the 30,000 Americans living with ALS and millions more with limited hand or speech function. Current assistive devices cost $5,000-$15,000. AuraCoding delivers richer interaction at a fraction of the cost.
Mental state detection adapts game difficulty in real-time. Eye tracking enables hands-free camera control in VR/AR. Biofeedback during creative work helps you stay in flow state.
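Real-time difficulty adaptation boils down to a small feedback controller driven by a mental-state score. In this sketch, `focus` is a hypothetical 0-1 score from the headband's classifier, and the thresholds and step size are illustrative tuning knobs rather than shipped defaults:

```python
def adjust_difficulty(difficulty, focus, low=0.3, high=0.7, step=0.1):
    """Nudge game difficulty (0-1) toward the player's measured focus.
    High focus pushes harder; low focus eases off; mid-range holds."""
    if focus > high:
        difficulty = min(1.0, difficulty + step)   # player is locked in
    elif focus < low:
        difficulty = max(0.0, difficulty - step)   # player is drifting
    return round(difficulty, 3)
```

Calling this once per classification window gives a gradual ramp instead of jarring jumps in difficulty.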
| Device | Price | EEG Sensing | Eye Tracking | Thought-to-Text | Open Source | Consumer Grade |
|---|---|---|---|---|---|---|
| AuraCoding | $449-$699 | ✓ | ✓ | ✓ | ✓ | ✓ |
| Muse S | $250-$400 | ✓ | — | — | — | ✓ |
| Neurosity Crown | ~$999 | ✓ | — | — | — | ✓ |
| Emotiv EPOC X | $999 | ✓ | — | — | — | ✓ |
| OpenBCI Galea | $42,980 | ✓ | ✓ | — | ✓ | — |
Early backers get the best prices and exclusive access. All tiers include free shipping within the US.
Show your support for the future of brain-computer interfaces.
The full AuraCoding headset with core apps for everyday use.
Everything in Consumer plus full SDK access and raw data streaming.
Base goal — Consumer & Developer Kit production
Native macOS integration with Spotlight, Terminal, and Finder gaze control
Cross-platform support with native OS integrations
Community app store for AuraOS plugins and integrations
FDA 510(k) clearance pathway for assistive communication use

Join the waitlist to be notified when the Kickstarter campaign goes live. Early subscribers get exclusive early-bird access.
No spam. Unsubscribe anytime. We respect your privacy.