Research

We work directly with participants with disabilities to develop personalized human-machine interfaces that support people’s rehabilitation goals and improve technology accessibility. These interfaces: 1) leverage wearable sensors to support health and rehabilitation during everyday technology use; 2) use biosignals to improve technology accessibility for people with upper-body movement disabilities; and 3) are equitable for people with a diverse range of physical characteristics, with a focus on people with motor disabilities such as stroke, Parkinson’s disease, spinal cord injury, and muscular dystrophy.

Personalized Gestures for Health and Accessibility

Standardized gesture sets used in mobile and VR/AR interactions are often inaccessible to people with diverse motor abilities, limiting their physical interactions with technology. We develop algorithms that support personalized gesture recognition for people with upper-body motor disabilities and/or chronic conditions using biosignal (EMG, IMU) interfaces. These personalized interfaces could encourage movement during technology use, supporting ubiquitous rehabilitation.
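To make the idea of personalization concrete, one simple way a recognizer can adapt to an individual user is to train only on that user's own gesture examples, rather than on a standardized gesture set. The sketch below is a minimal, hypothetical illustration (not our published method): it summarizes each IMU window with basic statistics and classifies new gestures by nearest centroid among the user's own recorded gestures. All names and the feature choice are assumptions for illustration.

```python
import numpy as np

def extract_features(window):
    """Summarize one IMU window (samples x channels) with simple
    per-channel statistics: mean and standard deviation."""
    return np.concatenate([window.mean(axis=0), window.std(axis=0)])

class PersonalizedGestureClassifier:
    """Nearest-centroid classifier fit to a single user's own examples,
    so the gesture vocabulary reflects that user's movement abilities."""

    def fit(self, windows, labels):
        feats = np.array([extract_features(w) for w in windows])
        labels = np.array(labels)
        self.labels_ = sorted(set(labels))
        # One centroid per gesture, computed from this user's data only.
        self.centroids_ = np.array(
            [feats[labels == g].mean(axis=0) for g in self.labels_])
        return self

    def predict(self, window):
        f = extract_features(window)
        dists = np.linalg.norm(self.centroids_ - f, axis=1)
        return self.labels_[int(np.argmin(dists))]
```

A real biosignal pipeline would add filtering, segmentation, and a stronger model, but the key property is the same: the classifier's reference gestures come from the individual user.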

Recent Publications

Yamagami M, Portnova A, Kong J, Wobbrock JO, Mankoff J. How Do People with Limited Movement Personalize Upper-Body Gestures? Considerations for the Design of Personalized and Accessible Gesture Interfaces. In the ACM SIGACCESS Conference on Computers and Accessibility (ASSETS 2023). 2023. (30% acceptance rate) [ DOI | PDF ]

Privacy-Preserving Co-Adaptive Algorithms

Biosignal interfaces using EMG or IMU sensors are highly personalized and linkable to a specific user. As wearable sensors become more ubiquitous in daily life, this linkability becomes a privacy risk: shared models or data could inadvertently expose a person’s disability or other physical characteristics. We are investigating how model parameters can be shared and updated between users in a privacy-preserving manner.
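One common pattern for sharing model parameters without sharing raw biosignal data is federated averaging: each user updates a model locally, and only the parameters (optionally with added noise) are aggregated. The sketch below is a minimal illustration of that general idea under stated assumptions (a linear model, Gaussian noise as a stand-in for a formal privacy mechanism); it is not a description of our specific approach, and all names are hypothetical.

```python
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=20):
    """One user's local gradient steps on a linear model.
    Raw data (X, y) never leaves the user's device."""
    w = weights.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

def private_average(user_weights, noise_scale=0.01, rng=None):
    """Aggregate users' parameters and add Gaussian noise so the
    shared model is not trivially linkable to any one user.
    (A real system would use a calibrated differential-privacy
    mechanism; the noise here is only a placeholder.)"""
    rng = rng or np.random.default_rng()
    avg = np.mean(user_weights, axis=0)
    return avg + rng.normal(0, noise_scale, size=avg.shape)
```

In use, a server would repeatedly broadcast the averaged parameters, collect each user's local update, and aggregate again; only parameters, never biosignals, cross the network.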