Dr. Yamagami presents at the IEEE-BSN Conference in Chicago
Dr. Yamagami presented her work on “Optimizing human-machine interfaces for health and accessibility” at the IEEE-BSN Conference in Chicago. Mikayla Deehring was also present at the conference. Great job!
Mikayla Deehring presents at the 2024 AI in Health Conference at Rice University
Graduate student Mikayla Deehring gave a poster presentation on “Predicting Adherence to At-Home Rehabilitation Using Biosignals”. Great job Mikayla!
2024 Summer REU students present at Rice Undergraduate Research Symposium
Congratulations to our 2024 summer REU students, Tony Martinez (Rice University, Computer Science) and Steven Chen (National Taiwan University, Biomechatronics Engineering), on completing their REU programs. They presented their projects at the Rice Summer Undergraduate Research Symposium:
- Tony Martinez: Developing a Virtual Reality Platform to Assess Visual and Motor Attention
- Steven Chen: Development of a Biosignal Gesture Recognizer Web Demo
Stay tuned for the web demo, which will be available this fall on the website!
Yamagami lab participates in first Health Equity Workshop
The Yamagami lab participated in the first Health Equity Workshop hosted by the Digital Health Initiative at Rice University. Momona Yamagami presented on “Accessible and Inclusive Digital Health Technologies for Ubiquitous Rehabilitation”. We also had poster presentations from Kai Malcolm and Mikayla Deehring:
- Kai Malcolm: Towards Health Equity: Model Personalization for Fairer Outcomes and Privacy Protection
- Mikayla Deehring: Predicting Adherence to At-Home Rehabilitation Using Biosignals
Dr. Yamagami presents at ASSETS 2023 on “How do people with limited movement personalize upper-body gestures? Considerations for the design of personalized and accessible gesture interfaces”
“Personalized upper-body gestures that enable input from various body parts, according to the abilities of each user, could be useful for ensuring that gesture systems are accessible. However, we do not know what types of gestures (or gesture sets) people with upper-body motor impairments would want to use, or whether wearable sensors can differentiate between an individual’s chosen gestures.”
Dr. Yamagami presented her postdoctoral work, “How do people with limited movement personalize upper-body gestures? Considerations for the design of personalized and accessible gesture interfaces” at ASSETS 2023 [ DOI ] (the 25th International ACM SIGACCESS Conference on Computers and Accessibility) in New York City.
Goal: To understand what types of gestures people with upper-body motor impairments would want to use, and whether wearable sensors can differentiate between an individual’s chosen gestures.
Method: We characterize the personalized gesture sets designed by 25 participants with upper-body motor impairments and develop design recommendations for upper-body personalized gesture interfaces.
Result: We found that the personalized gesture sets participants designed were highly ability-specific. Even within a specific type of disability, there were significant differences in which muscles participants used to perform upper-body gestures, with some predominantly using shoulder and upper-arm muscles and others solely using their finger muscles.
Implications: Personalized upper-body gesture interfaces that take advantage of each person’s abilities are critical for enabling accessible upper-body gestures for people with upper-body motor impairments.
She also presented her 2022 and 2023 papers from TACCESS (the ACM Transactions on Accessible Computing journal).
Kai Malcolm presents posters at the AI in Health Conference and the Texas Colloquium on Distributed Learning
The Yamagami lab participated in the October 2023 AI in Health Conference [https://events.rice.edu/event/347482-2023-ai-in-health-conference] as well as the September 2023 Texas Colloquium on Distributed Learning [https://sites.google.com/view/tldr2023], both hosted by the Ken Kennedy Institute at Rice University. Kai Malcolm presented his recent work, “Protecting Sensitive Biosignal Data in Model Training: Federated Learning for Healthcare Applications”, as a poster at both venues. Great job Kai!