I want to highlight some really amazing work led by Akhil Padmanabha and Jessie Yuan at Carnegie Mellon University that I was tangentially involved in. WAFFLE (Wearable Approach For Feeding with LEarned Bite Timing) won Best Paper at the 21st ACM/IEEE International Conference on Human-Robot Interaction (HRI '26) in Edinburgh, Scotland. The work, in collaboration with Cornell University, addresses one of the core challenges in robot-assisted feeding: knowing when to offer the next bite.
The full paper and project videos are available at the WAFFLE project page and via ACM (Padmanabha et al., 2026).
The problem
Around 3.5 million people in the United States need assistance with feeding. Robotic feeding systems can reduce caregiver workload and restore independence, but their widespread adoption has been limited in part because they struggle with bite timing — deciding the right moment to move food toward the user’s mouth.
Most systems either rely on fixed intervals (too inflexible) or require the user to explicitly signal readiness by opening their mouth (disruptive, especially in conversation). What caregivers actually do is watch for natural cues: chewing winding down, a pause in conversation, a small head movement toward the spoon.
The approach
WAFFLE captures exactly those cues using two lightweight wearables:
- a Posey IMU mounted on a glasses frame, tracking head movements and chewing vibrations; and
- a throat contact microphone worn around the neck, detecting swallowing and vocal activity without recording ambient sound.
A supervised regression model takes 1-second windows of these sensor signals and predicts the time until the user is ready for the next bite. A user-adjustable assertiveness threshold converts the continuous prediction into a simple stop/proceed command for the robot, letting users tune how proactively the robot feeds them.
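To make the decision step concrete, here is a minimal sketch of that pipeline. Everything below is an assumption for illustration, not the authors' code: the feature extraction (per-channel mean and standard deviation), the function and variable names, the 16 kHz microphone rate, and the direction of the threshold mapping (a larger threshold making the robot proceed earlier) are all hypothetical.

```python
import numpy as np

WINDOW_S = 1.0   # 1-second analysis window (from the paper)
IMU_HZ = 200     # Posey IMU sample rate (from the project description)
MIC_HZ = 16000   # assumed throat-mic sample rate

def window_features(imu_window: np.ndarray, mic_window: np.ndarray) -> np.ndarray:
    """Collapse a 1 s multichannel window into a simple feature vector
    (per-channel mean and std); the real model may use learned features."""
    feats = []
    for sig in (imu_window, mic_window):
        feats.append(sig.mean(axis=0))
        feats.append(sig.std(axis=0))
    return np.concatenate(feats)

def bite_command(predicted_wait_s: float, assertiveness_s: float) -> str:
    """Threshold the regressor's 'seconds until ready' output into a
    stop/proceed command. Assumed mapping: a larger assertiveness_s
    makes the robot proceed while a longer wait is still predicted."""
    return "proceed" if predicted_wait_s <= assertiveness_s else "stop"

# Example: one window of 9-DoF IMU data and mono throat-mic audio,
# stubbed with random samples in place of a trained regressor's input.
imu = np.random.randn(int(WINDOW_S * IMU_HZ), 9)
mic = np.random.randn(int(WINDOW_S * MIC_HZ), 1)
x = window_features(imu, mic)          # 20-dim feature vector
print(bite_command(predicted_wait_s=0.4, assertiveness_s=1.0))  # → proceed
```

The appeal of the threshold design is that the learned model stays fixed while a single user-facing knob changes the robot's behavior from conservative to proactive.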
Key results
The system was evaluated across three phases involving 31 participants total, including two people with motor impairments using the system in their own homes.
- 93% of participants found WAFFLE’s bite timing appropriate overall, versus 87% for Mouth Open and 70% for Fixed Interval.
- In individual dining, WAFFLE matched the Mouth Open baseline on feeling of control while requiring significantly less physical workload.
- In social dining, WAFFLE statistically outperformed Mouth Open on not disrupting conversation and natural conversation flow.
- 87% of participants ranked WAFFLE first for individual dining; 67% ranked it first for social dining.
- When asked which method most resembled a human caregiver, 9 out of 15 participants chose WAFFLE.
- One participant with motor impairments said it was “way better than they [caregivers] would do it,” noting the system naturally paused when she was still chewing without needing to be told.
The system generalized across two different robot platforms (Obi and a Kinova 7-DoF arm), varied robot positioning and feeding trajectories, multiple foods, and both individual and social dining contexts.
How Posey helped
The IMU boards at the heart of WAFFLE use the Posey sensor hardware and firmware platform that I designed and fabricated while in the Soft Machines Lab at CMU. Each sensor integrates a microcontroller, IMU, lithium-polymer battery, and Bluetooth antenna into a compact, research-ready package small enough to mount on glasses frames or inside 3D-printed earbuds.
For WAFFLE's data collection study, four Posey boards were worn simultaneously — two on the glasses frame and two embedded in earbuds — streaming 9-DoF inertial data at 200 Hz over BLE. In the end, a single board proved sufficient to capture the necessary cues, but the extra sensors let the team explore which body locations and modalities were most informative for bite timing. I also built the BLE hub that aggregates all sensor streams reliably, a prerequisite for running multi-sensor studies across many participants without the latency and connectivity issues that plague commodity laptop BLE stacks.
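One core job of such a hub is to turn raw BLE notifications back into ordered sample streams and to notice when packets are lost. The sketch below shows that idea with a hypothetical wire format; the byte layout, field names, and use of a per-packet sample counter are assumptions for illustration, not Posey's actual firmware protocol.

```python
import struct

# Hypothetical packet layout for a Posey-style IMU stream (an assumption;
# the real firmware's wire format may differ): a little-endian frame with
# a 1-byte sensor id, a 4-byte sample counter, and nine float32 channels
# (accel xyz, gyro xyz, mag xyz).
PACKET_FMT = "<BI9f"
PACKET_SIZE = struct.calcsize(PACKET_FMT)  # 41 bytes

def parse_packet(raw: bytes) -> dict:
    """Decode one notification payload into sensor id, counter, and samples."""
    sensor_id, counter, *channels = struct.unpack(PACKET_FMT, raw)
    return {"sensor": sensor_id, "counter": counter, "imu": channels}

def detect_drops(counters: list) -> list:
    """Flag lost notifications by finding gaps in a sensor's sample counter;
    at 200 Hz across four boards, silent drops would otherwise skew timing."""
    return [nxt for cur, nxt in zip(counters, counters[1:]) if nxt != cur + 1]

# Round-trip example: pack a frame as the firmware might, then parse it.
raw = struct.pack(PACKET_FMT, 2, 1000, *range(9))
pkt = parse_packet(raw)
print(pkt["sensor"], pkt["counter"])  # → 2 1000
```

Keeping a monotonically increasing counter in each packet (rather than trusting arrival order or host timestamps) is a common way to make multi-sensor BLE logging auditable after the fact.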
The Posey hardware, firmware, and command-line tools are fully open-source and available on GitHub.