AlphaHandOpen repo

About

What is AlphaHand?

AlphaHand is a research project focused on identifying individual finger movement intent from Muse 2 EEG, with an emphasis on reproducible methods and deployable inference.

Why it matters

Reliable finger-level intent decoding can improve assistive interfaces and broaden HCI options for users who need alternatives to conventional input devices.

How it works

The workflow captures Muse 2 EEG, applies preprocessing and temporal feature extraction, then classifies finger intent with a lightweight model for low-latency inference.
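To make that pipeline concrete, here is a minimal sketch in Python under assumed details: the filter band, window length, band-power features, and the scikit-learn logistic-regression classifier are illustrative stand-ins rather than the project's actual implementation; only the 256 Hz sample rate and 4 EEG channels come from the Muse 2 hardware itself.

```python
"""Minimal sketch of the capture -> preprocess -> classify pipeline described above.
All function names, filter bands, and the classifier choice are illustrative assumptions."""
import numpy as np
from scipy.signal import butter, sosfiltfilt, welch
from sklearn.linear_model import LogisticRegression

FS = 256  # Muse 2 EEG sample rate (Hz)

def bandpass(eeg, low=1.0, high=40.0, fs=FS):
    """Zero-phase band-pass filter applied per channel (channels x samples)."""
    sos = butter(4, [low, high], btype="bandpass", fs=fs, output="sos")
    return sosfiltfilt(sos, eeg, axis=-1)

def band_power_features(window, fs=FS):
    """Average power in theta, alpha, and beta bands for each channel, flattened."""
    bands = [(4, 8), (8, 13), (13, 30)]
    freqs, psd = welch(window, fs=fs, nperseg=min(window.shape[-1], fs))
    return np.concatenate(
        [psd[:, (freqs >= lo) & (freqs < hi)].mean(axis=-1) for lo, hi in bands]
    )

def extract_epochs(eeg, win_s=1.0, step_s=0.25, fs=FS):
    """Slide a window over the filtered recording and emit one feature vector per window."""
    win, step = int(win_s * fs), int(step_s * fs)
    filtered = bandpass(eeg, fs=fs)
    starts = range(0, filtered.shape[-1] - win + 1, step)
    return np.stack([band_power_features(filtered[:, s:s + win], fs) for s in starts])

if __name__ == "__main__":
    # Hypothetical usage: real X/y would come from labeled finger-intent recordings.
    rng = np.random.default_rng(0)
    fake_eeg = rng.standard_normal((4, 10 * FS))       # 4 Muse 2 channels, 10 s of signal
    X = extract_epochs(fake_eeg)
    y = rng.integers(0, 5, size=len(X))                # 5 stand-in finger-intent labels
    clf = LogisticRegression(max_iter=1000).fit(X, y)  # lightweight, low-latency classifier
    print(clf.predict(X[:3]))
```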

What's next

Results are updated regularly as validation and the manuscript advance. Placeholder metrics and schema fields are intentionally kept visible so that updates remain transparent.
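The schema itself is not reproduced on this page; purely as an illustration, a results record with placeholder fields might look like the following, where every field name is assumed rather than taken from the repo.

```python
# Hypothetical results record; every field name and value below is an
# assumption used for illustration, not the project's actual schema.
RESULTS_PLACEHOLDER = {
    "dataset": "TODO",            # recording-session or dataset identifier
    "n_subjects": None,           # to be filled once validation is complete
    "per_finger_accuracy": None,  # e.g. a mapping from finger label to accuracy
    "mean_latency_ms": None,      # end-to-end inference latency
    "last_updated": "TODO",       # date of the most recent update
}
```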

Research team

Primary researcher: Jonathan Davanzo

Collaborators: TODO collaborator names

Start with the results page for current performance placeholders and methods notes.

For manuscript status, check the paper page.