axonnic
Bioelectric Computing

Axonnic reads the electrical signals your muscles already produce and classifies your exact hand position in under 20ms — on-device, no cloud. Turn your forearm into a keyboard, game controller, or prosthetic command interface.

<20ms latency · 6+ gestures · 32–100+ channels

What Axonnic sees

Every time a muscle fires it sends an electrical signal. Axonnic reads dozens simultaneously — building a precise, real-time picture of exactly what your hand is doing.

LIVE · Muscle signals · 32-channel HD-sEMG · 2,048 Hz
Channel groups: Flexors · Extensors · Deep · Lateral

Illustrative signal. Real data streams from a wearable electrode array on your forearm.

It knows what your hand is doing.

Axonnic identifies your exact hand position in under 20 milliseconds — faster than a keypress, more expressive than any button.

INFERENCE ENGINE · ONLINE
17ms · Live

PREDICTED CLASS

Fist · 91.0% confidence

Window: 256 samples · 17ms

CLASS PROBABILITIES

Fist · 91%
Wrist Flexion · 3%
Side Pinch · 2%
3-Finger Flexion · 2%
3-Finger Extension · 1%
Wrist Extension · 1%

Probabilities are illustrative.
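The class probabilities shown in the demo can be understood as softmax outputs over per-window signal features. The sketch below is a minimal, self-contained illustration in Python — synthetic signals and a randomly weighted stand-in classifier, not Axonnic's actual model — showing how a 256-sample, 32-channel window (about 125 ms of signal at 2,048 Hz) becomes a gesture probability:

```python
import math
import random

GESTURES = ["Fist", "Wrist Flexion", "Side Pinch",
            "3-Finger Flexion", "3-Finger Extension", "Wrist Extension"]

FS = 2_048     # sampling rate in Hz, as in the demo above
WINDOW = 256   # samples per inference window: 256 / 2048 Hz = 125 ms of context

def rms(channel):
    """Root-mean-square amplitude of one channel's window — a classic EMG feature."""
    return math.sqrt(sum(x * x for x in channel) / len(channel))

def softmax(scores):
    """Turn raw class scores into probabilities that sum to 1."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Synthetic stand-in for one 32-channel, 256-sample EMG window.
random.seed(0)
window = [[random.gauss(0, 1) for _ in range(WINDOW)] for _ in range(32)]

features = [rms(ch) for ch in window]  # one RMS feature per channel

# Hypothetical linear classifier: one random weight vector per gesture.
weights = [[random.gauss(0, 0.1) for _ in range(32)] for _ in GESTURES]
scores = [sum(w * f for w, f in zip(wv, features)) for wv in weights]
probs = softmax(scores)

best = max(range(len(GESTURES)), key=lambda i: probs[i])
print(f"{GESTURES[best]}: {probs[best]:.1%}")
```

A real system would replace the random weights with a trained model, but the shape of the computation — window in, probability distribution over gestures out — is the same.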

Applications

Built for real human impact.

From clinical prosthetics to next-generation computing — the same platform, infinite configurations.

Give back control

For people with upper limb differences, ALS, or spinal injuries — EMG-based control requires no button presses, no rigid positioning. Intent is the only input.

Move naturally

Translate forearm EMG into multi-DoF prosthetic commands. Individual finger-level control from a wearable sensor array — no surgery, no calibration lab.

Beyond the keyboard

Muscle signals as keyboard, mouse, and gamepad. The next interaction paradigm starts at the neuromuscular interface — not the desk.

<20ms LATENCY · signal to action
6+ GESTURES · built-in, unlimited custom
32+ CHANNELS · electrode inputs
360° COVERAGE · forearm surface
$10B+ MARKET · wearable input by 2030

One interface. Infinite applications.

When you can read the exact position of every finger in real time, almost every interaction with a machine becomes possible.

Replace the keyboard

Small finger movements become keystrokes, commands, shortcuts, and gestures. No physical keys. No perceptible lag. No hardware in the way of your ideas.

Use it anywhere

No desk, no mouse, no screen required. Axonnic works in the field, on the move, in the operating room — wherever your arm goes, the interface goes.

Build anything

An open platform for developers. Map any gesture to any action. Create controllers that don't exist yet. Test interaction paradigms at the speed of thought.

Tailored to your body

Axonnic learns your specific muscle patterns. The more it knows you, the more precise it gets — personalised accuracy that off-the-shelf hardware can never match.
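One way such personalisation can work in principle is a short calibration session: record a few examples of each gesture, then match new signals against the user's own averages. The nearest-centroid sketch below uses synthetic three-dimensional features; the gesture names, numbers, and method are illustrative, not Axonnic's actual training pipeline:

```python
import math
import random

random.seed(2)

def centroid(vectors):
    """Mean feature vector of one user's examples of a gesture."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def dist(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Synthetic "calibration session": each gesture produces feature vectors
# clustered around user-specific means (stand-ins for real EMG features).
user_means = {"Fist": [1.0, 0.2, 0.1], "Pinch": [0.1, 1.0, 0.3]}
examples = {g: [[random.gauss(m, 0.05) for m in mean] for _ in range(20)]
            for g, mean in user_means.items()}

# The personalised model is just each gesture's centroid for this user.
model = {g: centroid(vs) for g, vs in examples.items()}

# A new sample from the same user is matched to the nearest centroid.
sample = [random.gauss(m, 0.05) for m in user_means["Fist"]]
pred = min(model, key=lambda g: dist(sample, model[g]))
print(pred)
```

The key property this toy captures: the model is fit to one person's signal statistics, so accuracy improves with every calibration example — which generic, population-trained hardware cannot do.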

Accessibility & prosthetics

Give people control that rigid input devices deny them. Read intent directly from muscle signals — not button presses, not sensor positions.

More electrodes, more power

More electrode channels means finer spatial resolution — tracking individual fingers, joints, and micro-movements. The platform scales as the hardware scales.

Privacy by default

On-device ONNX inference. No cloud dependency, no data transmission. Your muscle signals never leave your hardware — clinical-grade privacy without compromise.

How it works

Three steps. Zero middlemen.

A wearable array reads the signals your muscles already produce. On-device AI interprets them instantly. Your device responds — no cloud, no lag.

01

Muscle fires

Electrode array reads HD-sEMG across 32–100+ channels at the forearm surface. Every signal, every fibre.

02

Sensor streams

A wearable sensor array streams raw muscle signals wirelessly to the Axonnic runtime — no cloud hop, no proprietary lock-in.

03

AI predicts

On-device ONNX inference classifies movement intent in under 20ms with personalised accuracy trained to your exact muscle patterns.
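The three steps above can be sketched as a minimal streaming pipeline. Everything here is illustrative — synthetic samples stand in for the electrode array, and a simple energy threshold stands in for the on-device model:

```python
import random

random.seed(1)

FS, WINDOW, STEP = 2_048, 256, 64  # 2,048 Hz; 256-sample windows, hopped every 64 samples

def electrode_stream(n_samples, n_channels=32):
    """Step 1 — muscle fires: stand-in for the wearable array's raw multichannel samples."""
    for _ in range(n_samples):
        yield [random.gauss(0, 1) for _ in range(n_channels)]

def windows(stream):
    """Step 2 — sensor streams: buffer incoming samples into overlapping windows."""
    buf = []
    for sample in stream:
        buf.append(sample)
        if len(buf) == WINDOW:
            yield list(buf)
            del buf[:STEP]  # slide forward by STEP samples

def classify(window):
    """Step 3 — AI predicts: placeholder decision rule in place of real model inference."""
    energy = sum(x * x for row in window for x in row)
    return "Fist" if energy > WINDOW * len(window[0]) else "Rest"

# One simulated second of signal yields a steady stream of predictions.
events = [classify(w) for w in windows(electrode_stream(FS))]
print(len(events), "predictions in one simulated second")
```

With a 64-sample hop at 2,048 Hz, a new prediction is available every ~31 ms — the overlapping windows are what keep the output responsive while each decision still sees 125 ms of context.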

Outputs

Keyboard & typing
Computer control
Prosthetics
Any application

The Opportunity

Bioelectric computing is a platform shift.

The wearable input market is a $10B+ category. The interface layer — translating biological intent to digital action — is still wide open. Axonnic is building the infrastructure.

Three curves crossed simultaneously. Dry-electrode arrays that once cost £10,000 per channel now cost under £50. Meta's $1B CTRL-Labs acquisition validated the category but placed the bet on AR glasses, on a five-year horizon, leaving the consumer and gaming window open. And transformer-based signal processing turned what required a 2019 PhD thesis into a fine-tune. Unlike existing products limited to wrist-snap gestures, Axonnic reads the whole forearm — 32 to 100+ channels — resolving individual fingers, not just gross movements.

$10B+

Wearable input market by 2030

$1B

Meta's CTRL-Labs acquisition — market validated

<20ms

Signal-to-inference — faster than human perception

32-ch HD-sEMG → Axonnic Runtime → Keyboard · Prosthetics · Computer · Custom

Our Story

A mission older than computing.

This is not about input devices. It is about where information systems go next — and why we believe the next inflection belongs to the neuromuscular interface.

For most of life on Earth, information was stored and transmitted biologically. DNA and RNA formed a base-4 system capable of preserving complexity across generations. Nervous systems emerged on top — a faster electrical layer capable of learning within a single lifetime.

Humanity then externalised this process: computers, the internet, and now artificial intelligence. External computation is beginning to evolve faster than human interaction with it can keep pace.

We still communicate with machines through interfaces designed decades ago. The bottleneck is no longer computation. It is the translation layer between biological intent and digital action. Axonnic is an intermediate step toward closing that gap — non-invasive, immediate, and grounded in the body's own electrical language.

"The bottleneck is no longer what computers can do — it's how slowly humans can tell them to do it."

That's what Axonnic fixes.

We are not waiting for neural implants. The body already speaks electrically. We just need to listen.

Credentials: Oxford Medicine · Oxford Signal Processing · Published Research in Neuroscience & Neurosurgery · Hardware & Embedded Systems

The Team

Science meets tech.


Max Williams

CEO & Co-Founder

MBiol Signal Processing, Oxford

Hardware company experience · Skyscanner ecosystem


Arsh Patankar

CTO & Co-Founder

Oxford Medic · AI & Neuroscience Researcher

Published papers in neuroscience and neurosurgery · Fullstack developer

Advisors: Voxblock & Skyscanner Founders · Oxford, Harvard & UC Berkeley Researchers · Top Neurosurgeons · Top-3 EMEA Overwatch Pro Gamer


Early Access

Be first.

Join researchers, developers, and early builders shaping the next input paradigm.

Hardware dev kit · ONNX SDK · Priority support

No spam. Early access invites and product updates only.

The interface of the future
is already part of you.

Every computer interaction you have ever had was constrained by the hardware in front of you. Axonnic removes that constraint.

Talk to the team

We're building the infrastructure layer for bioelectric computing. If you're investing in consumer hardware, HCI, gaming peripherals, or next-generation input paradigms — we'd like to talk, whether you're writing first checks or leading rounds.

Pre-seed stage · Hardware + software platform · Raising Q3 2026
info@axonnic.com