Research

We work directly with participants with disabilities to develop personalized human-machine interfaces that support people’s rehabilitation goals and improve technology accessibility. These interfaces: (1) leverage wearable sensors to support health and rehabilitation during everyday technology use; (2) use biosignals to improve technology accessibility for people with upper-body movement disabilities; and (3) are equitable for people with a diverse range of physical characteristics, with a focus on motor disabilities resulting from conditions such as stroke, Parkinson’s disease, spinal cord injury, and muscular dystrophy.

Personalized Gestures for Health and Accessibility

Standardized gesture sets used in mobile and VR/AR interactions are not accessible to people with diverse motor abilities, limiting their physical interactions with technology. We develop algorithms that support personalized gesture recognition for people with upper-body motor disabilities and/or chronic conditions using biosignal (EMG, IMU) interfaces. These personalized interfaces could encourage movement during technology use, supporting ubiquitous rehabilitation.
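As a generic illustration of how such a pipeline can work (a minimal sketch, not the lab's published method; the feature set and classifier choice here are assumptions), the snippet below summarizes fixed-length biosignal windows with standard time-domain features (mean absolute value, variance, waveform length) and trains a nearest-centroid recognizer on one user's own gesture examples:

```python
import numpy as np

def extract_features(window):
    """Summarize a (samples x channels) biosignal window with simple
    time-domain statistics: mean absolute value, variance, waveform length."""
    mav = np.mean(np.abs(window), axis=0)
    var = np.var(window, axis=0)
    wl = np.sum(np.abs(np.diff(window, axis=0)), axis=0)
    return np.concatenate([mav, var, wl])

class PersonalizedGestureClassifier:
    """Nearest-centroid classifier trained only on one user's examples,
    so the recognized gesture set is defined entirely by that user."""

    def fit(self, windows, labels):
        feats = np.array([extract_features(w) for w in windows])
        labels = np.array(labels)
        self.labels_ = sorted(set(labels))
        self.centroids_ = np.array(
            [feats[labels == g].mean(axis=0) for g in self.labels_])
        return self

    def predict(self, window):
        f = extract_features(window)
        dists = np.linalg.norm(self.centroids_ - f, axis=1)
        return self.labels_[int(np.argmin(dists))]
```

Because the model is fit per user, a gesture only needs to be consistent for that individual, not standardized across users.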

Recent Publications and Presentations

Yamagami M, Portnova A, Kong J, Wobbrock JO, Mankoff J. How Do People with Limited Movement Personalize Upper-Body Gestures? Considerations for the Design of Personalized and Accessible Gesture Interfaces. In the ACM SIGACCESS Conference on Computers and Accessibility (ASSETS 2023). 2023. (30% acceptance rate) [ DOI | PDF ]

Wearable Sensors for At-Home Rehabilitation

Many chronic conditions, such as end-stage kidney disease, require long-term rehabilitation or physical therapy. It can be difficult, time-consuming, or even dangerous for individuals with these chronic conditions to attend in-person sessions. We are investigating whether off-the-shelf wearable technologies such as smartwatches can be used to predict clinically relevant features of rehabilitation sessions, such as the type of exercise performed and perceived exertion, to track adherence and offer real-time biofeedback during at-home rehabilitation sessions.
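As one simple example of an adherence signal a smartwatch could provide (an illustrative sketch, not the study's actual model; the function name, threshold, and sampling rate are assumptions), repetitions of an exercise can be estimated by peak-picking the demeaned magnitude of a 3-axis accelerometer trace:

```python
import numpy as np

def count_repetitions(accel, fs=50.0, min_interval_s=1.0, threshold=1.5):
    """Estimate exercise repetitions from a 3-axis accelerometer trace
    (N x 3 array, in g) sampled at fs Hz.

    The gravity/offset component is removed by subtracting the mean of the
    magnitude signal; a sample counts as a repetition if it is a local
    maximum above `threshold` and at least `min_interval_s` after the
    previous detected repetition."""
    mag = np.linalg.norm(accel, axis=1)
    mag = mag - np.mean(mag)  # crude gravity/offset removal
    min_gap = int(min_interval_s * fs)
    count, last = 0, -min_gap
    for i in range(1, len(mag) - 1):
        is_peak = mag[i] >= mag[i - 1] and mag[i] > mag[i + 1]
        if is_peak and mag[i] > threshold and i - last >= min_gap:
            count += 1
            last = i
    return count
```

A count like this could feed both adherence tracking ("did the prescribed reps happen?") and simple real-time biofeedback.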

Recent Publications and Presentations

Deehring M, Dhala A, Yamagami M. Data-Driven Modeling to Predict Rehabilitation Adherence from Wearable Sensors. Poster Presentation at 2024 IEEE BSN, Chicago, IL. 2024.

Privacy-Preserving Co-Adaptive Algorithms

Biosignal interfaces using EMG or IMU sensors are highly personalized and can be linked back to a specific user. As wearable sensors become more ubiquitous in our daily lives, these interfaces could inadvertently expose a person’s disability status or other physical characteristics. We are investigating how model parameters can be shared and updated between users in a privacy-preserving manner.
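One standard pattern for sharing model parameters without sharing raw data is federated averaging (FedAvg). The snippet below is a minimal sketch of that pattern (not the lab's implementation, and the logistic-regression model is an assumption): each client takes gradient steps on its own biosignal data locally, and the server aggregates only the resulting weight vectors.

```python
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=20):
    """One client's local gradient steps on a logistic-regression model.
    The raw data (X, y) never leaves the client's device."""
    w = weights.copy()
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-X @ w))       # predicted probabilities
        w -= lr * X.T @ (p - y) / len(y)       # gradient of logistic loss
    return w

def federated_average(client_weights, client_sizes):
    """Server-side FedAvg step: aggregate only model parameters,
    weighted by each client's dataset size."""
    return np.average(np.array(client_weights), axis=0,
                      weights=np.array(client_sizes, dtype=float))
```

Only the weight vectors cross the network, which is the starting point for the privacy question above; fully privacy-preserving variants would add further protections (e.g., secure aggregation or differential privacy) on top of this exchange.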

Recent Publications and Presentations

Malcolm K, Uribe C, Yamagami M. A Federated Learning Framework For Personalized and Privacy-Preserving Biosignal Interfaces. In preparation.

Synthesizing Inclusive Gestures

Avatar customization is important for users to feel connected to their avatars and welcome in the virtual environment. However, avatars currently deployed in virtual environments are primarily made for individuals without disabilities, and are therefore not inclusive for the 12.2% of individuals in the US living with movement disorders. We are investigating the feasibility of using generative statistical models to synthesize gestures representative of those made by people with motor disabilities.
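As a minimal illustration of one such generative statistical model (a sketch under simplifying assumptions, not the presented method; the class name and Gaussian model choice are hypothetical), the snippet below fits a multivariate Gaussian to flattened gesture trajectories and samples new, statistically similar ones:

```python
import numpy as np

class GestureSynthesizer:
    """Fits a multivariate Gaussian over flattened gesture trajectories
    (each a T x D array, e.g. joint positions over T time steps) and
    samples new trajectories from the fitted distribution."""

    def fit(self, gestures):
        X = np.array([g.ravel() for g in gestures])
        self.shape_ = gestures[0].shape
        self.mean_ = X.mean(axis=0)
        X0 = X - self.mean_
        # small ridge keeps the covariance positive semi-definite
        self.cov_ = X0.T @ X0 / len(X) + 1e-6 * np.eye(X.shape[1])
        return self

    def sample(self, rng=None):
        rng = np.random.default_rng(rng)
        flat = rng.multivariate_normal(self.mean_, self.cov_)
        return flat.reshape(self.shape_)
```

Modeling whole trajectories jointly (rather than each time step independently) is what lets sampled gestures preserve the temporal correlations of the training examples.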

Recent Publications and Presentations

Peterson LN, Yamagami M. Generating Novel Representative Gestures from Kinematics Data. Poster Presentation at 2024 TEROS, College Station, TX. 2024.