Why it matters
Reliable finger-level intent decoding can improve assistive interfaces and broaden HCI options for users who need alternatives to conventional input devices.
About
AlphaHand is a research project focused on identifying individual finger movement intent from Muse 2 EEG, with an emphasis on reproducible methods and deployable inference.
The workflow captures Muse 2 EEG, applies preprocessing and temporal feature extraction, then classifies finger intent with a lightweight model for low-latency inference.
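To make the pipeline concrete, here is a minimal sketch of that capture-to-classification flow. It assumes a band-pass filter, sliding-window band-power features, and a logistic-regression classifier; the filter bands, window length, and model choice are illustrative assumptions, not the project's confirmed configuration (the Muse 2's four channels and 256 Hz sample rate are published device specs).

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, welch
from sklearn.linear_model import LogisticRegression

FS = 256          # Muse 2 EEG sample rate (Hz)
N_CHANNELS = 4    # Muse 2 electrodes: TP9, AF7, AF8, TP10

def bandpass(eeg, low=1.0, high=40.0, fs=FS):
    """Zero-phase band-pass filter; band edges are illustrative."""
    sos = butter(4, [low, high], btype="bandpass", fs=fs, output="sos")
    return sosfiltfilt(sos, eeg, axis=-1)

def band_powers(window, fs=FS):
    """Mean power per channel in canonical EEG bands (temporal features)."""
    bands = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}
    freqs, psd = welch(window, fs=fs, nperseg=min(fs, window.shape[-1]))
    feats = [psd[:, (freqs >= lo) & (freqs < hi)].mean(axis=1)
             for lo, hi in bands.values()]
    return np.concatenate(feats)  # shape: (n_channels * n_bands,)

def extract_features(eeg, win_s=1.0, step_s=0.25, fs=FS):
    """Slide 1 s windows over a (channels, samples) recording."""
    win, step = int(win_s * fs), int(step_s * fs)
    filtered = bandpass(eeg)
    return np.stack([band_powers(filtered[:, i:i + win])
                     for i in range(0, filtered.shape[-1] - win + 1, step)])

# A lightweight linear model keeps per-window inference latency low.
clf = LogisticRegression(max_iter=1000)
# clf.fit(extract_features(train_eeg), train_labels)  # labels: finger IDs
```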
Results are updated as validation and the manuscript advance. Placeholder metrics and schema fields are intentionally exposed so updates remain transparent.
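As a hypothetical illustration of what that exposed schema might look like, the sketch below uses placeholder field names; none of these fields or values come from the project's actual results files.

```python
# Hypothetical results schema; every field name and value is a placeholder
# until validated metrics replace them.
RESULTS_PLACEHOLDER = {
    "model": "TBD",
    "n_subjects": None,           # filled in once validation completes
    "per_finger_accuracy": {f: None for f in
                            ["thumb", "index", "middle", "ring", "pinky"]},
    "macro_f1": None,
    "inference_latency_ms": None,
    "last_updated": "TBD",
}
```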
Primary researcher: Jonathan Davanzo
Collaborators: TODO collaborator names
Start with the results page for current performance placeholders and methods notes.
For manuscript status, check the paper page.