Instead of jumping straight into prediction, AlphaHand starts with a reproducible capture layer. Each session stores lossless raw EEG shards, event logs, and metadata so every figure, score, and demo remains tied to the original signal.
During recording, the operator marks finger and action cues in real time. That gives the system clean examples of thumb-through-pinky movements plus REST, OPEN, and CLOSE cues, creating the labeled foundation needed for reliable decoding.
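To make the real-time cue marking concrete, here is a minimal sketch of an append-only event logger. The label set, field names, and `log_event` helper are illustrative assumptions, not the confirmed AlphaHand schema; the source only names `events.jsonl` and the REST/OPEN/CLOSE and per-finger cues.

```python
import json
import time

# Assumed label set: per-finger cues plus the three action cues named in the text.
LABELS = {"REST", "OPEN", "CLOSE", "THUMB", "INDEX", "MIDDLE", "RING", "PINKY"}

def log_event(path, label, t0):
    """Append one labeled cue, stamped relative to session start, as one JSON line.

    An append-only JSONL log keeps every cue recoverable even if the
    session ends mid-recording.
    """
    if label not in LABELS:
        raise ValueError(f"unknown label: {label}")
    event = {"t": round(time.monotonic() - t0, 6), "label": label}
    with open(path, "a") as f:
        f.write(json.dumps(event) + "\n")

t0 = time.monotonic()
log_event("events.jsonl", "OPEN", t0)
log_event("events.jsonl", "REST", t0)
```

One JSON object per line means the log can be tailed live during recording and replayed later against the raw shards using the shared timebase.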
Verified handoff
Technical note: Step 1 is built around a 4-channel recording setup and writes raw shards plus an authoritative `events.jsonl` log.
What comes out: A reproducible session directory containing raw EEG, event labels, metadata, and timebase information.
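The session directory described above might be initialized as follows. This is a hedged sketch: the directory names, `metadata.json` fields, and the `init_session` helper are assumptions for illustration, with only the 4-channel setup and `events.jsonl` taken from the text.

```python
import json
from pathlib import Path

def init_session(root, session_id, sample_rate_hz=250, n_channels=4):
    """Create a session directory holding raw EEG, events, metadata, and timebase info."""
    session = Path(root) / session_id
    (session / "raw").mkdir(parents=True, exist_ok=True)  # lossless raw EEG shards
    (session / "events.jsonl").touch()                    # authoritative event log
    meta = {
        "session_id": session_id,
        "n_channels": n_channels,          # Step 1 uses a 4-channel setup
        "sample_rate_hz": sample_rate_hz,  # illustrative default, not from the source
        "timebase": "monotonic",           # clock the event timestamps refer to
    }
    (session / "metadata.json").write_text(json.dumps(meta, indent=2))
    return session

session = init_session("sessions", "2024-01-01_000")
```

Keeping metadata and timebase information alongside the shards is what lets any later figure or score be traced back to the original signal.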
