james0248/EMGTrackpad

EMGTrackpad

[Demo]

Decode motor intention from forearm surface EMG signals to control a macOS trackpad in real time.


Overview

EMGTrackpad captures surface EMG signals from a MindRove armband, trains neural networks to decode cursor movement and discrete actions (click, scroll), and drives the macOS trackpad in real time.

The repo has two components:

  • Model (src/emg/) — data collection, signal processing, model training, and real-time inference
  • Platform (apps/platform/) — web app that presents structured tasks (e.g., click targets, drag paths) during data collection

Setup

Prerequisites

  • macOS (Apple Silicon)
  • Python 3.12+
  • uv
  • MindRove EMG armband
  • Bun (for the Platform only)

Installation

git clone https://github.com/james0248/EMGTrackpad.git
cd EMGTrackpad && uv sync && uv pip install .
cd apps/platform && bun i

Data Collection

Collect synchronized EMG and trackpad events into an HDF5 session file:

uv run python src/emg/track.py

Sessions are saved to data/session_YYYYMMDD_HHMMSS.h5.
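The timestamped path above can be reproduced with the standard library. A minimal sketch, assuming the timestamp follows the usual `%Y%m%d_%H%M%S` format (the `session_path` helper is illustrative, not part of the repo):

```python
from datetime import datetime
from pathlib import Path

def session_path(start: datetime, root: str = "data") -> Path:
    """Build a session file path like data/session_YYYYMMDD_HHMMSS.h5."""
    stamp = start.strftime("%Y%m%d_%H%M%S")  # e.g. 20240131_154500
    return Path(root) / f"session_{stamp}.h5"

print(session_path(datetime(2024, 1, 31, 15, 45, 0)))
```

Listing the `data/` directory sorted by name therefore also sorts sessions chronologically.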

Training

Train a continuous controller (cursor movement + click/scroll actions):

uv run python -m emg.train_controller --config-name channel_attention

Available configs: rms, freq_rms, channel_attention, freq_rms_lstm, channel_attention_lstm
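The `rms`-style configs presumably build on per-channel root-mean-square features computed over short windows of the raw EMG stream. A minimal sketch of windowed RMS for one channel (the `window` and `hop` values here are hypothetical, not taken from the repo's configs):

```python
import math
from typing import Sequence

def windowed_rms(samples: Sequence[float], window: int, hop: int) -> list[float]:
    """RMS of each length-`window` frame, advancing by `hop` samples."""
    out = []
    for start in range(0, len(samples) - window + 1, hop):
        frame = samples[start:start + window]
        out.append(math.sqrt(sum(x * x for x in frame) / window))
    return out

# Two non-overlapping frames of length 2; each frame is [3, 4].
print(windowed_rms([3.0, 4.0, 3.0, 4.0], window=2, hop=2))
```

RMS tracks contraction intensity per electrode, which is why it is a common baseline feature for EMG decoding.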

Inference

Run the trained model to control the macOS trackpad:

uv run python -m emg.inference.controller checkpoint=path/to/checkpoint.pt
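Real-time cursor control typically smooths the model's per-frame predictions to reduce jitter. A hypothetical exponential-moving-average smoother over predicted `(dx, dy)` deltas (this class and its `alpha` value are illustrative assumptions, not the repo's implementation):

```python
class DeltaSmoother:
    """Exponential moving average over predicted (dx, dy) cursor deltas."""

    def __init__(self, alpha: float = 0.3):
        self.alpha = alpha  # higher alpha = less smoothing, lower latency
        self.state: tuple[float, float] | None = None

    def update(self, dx: float, dy: float) -> tuple[float, float]:
        if self.state is None:
            self.state = (dx, dy)  # first prediction passes through unchanged
        else:
            sx, sy = self.state
            self.state = (sx + self.alpha * (dx - sx),
                          sy + self.alpha * (dy - sy))
        return self.state

s = DeltaSmoother(alpha=0.5)
print(s.update(10.0, 0.0))  # first sample: (10.0, 0.0)
print(s.update(0.0, 0.0))   # halfway back toward zero: (5.0, 0.0)
```

`alpha` trades responsiveness for stability; a value near 1 follows the model's raw output, while a small value damps noise at the cost of cursor lag.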

Platform

Run the web app for presenting structured tasks while recording EMG data:

cd apps/platform
bun dev

Then open http://localhost:3000.

About

Control your cursor with your bare hands (and EMG signals) 🖱️
