Cornell’s New Robot Feeds People with Severe Disabilities


Cornell researchers have developed a robotic feeding system that uses computer vision, machine learning, and multimodal sensing to assist individuals with severe mobility limitations. Discover how the technology is improving quality of life.


Researchers at Cornell University have introduced a robotic feeding system to help people with severe mobility challenges, such as those with spinal cord injuries, cerebral palsy, and multiple sclerosis. The system leverages advanced computer vision, machine learning, and multimodal sensing to safely and effectively feed individuals who cannot eat without assistance.
“Feeding individuals with severe mobility limitations with a robot is difficult, as many cannot lean forward and require food to be placed directly inside their mouths,” explained Tapomayukh “Tapo” Bhattacharjee, an assistant professor of computer science at Cornell Ann S. Bowers College of Computing and Information Science and the senior author of the work. “The challenge intensifies when feeding individuals with additional complex medical conditions.”
Bhattacharjee and his team at the EmPRISE Lab have dedicated years to perfecting the intricate process by which humans feed themselves, transforming it into a task that a robot can perform. The challenge spans everything from identifying food items on a plate and picking them up to transferring them directly into the mouth of the care recipient. “This last 5 centimeters, from the utensil to inside the mouth, is extremely challenging,” Bhattacharjee noted.
One of the primary difficulties in developing this system is accommodating the varied needs and conditions of different care recipients. Some people have very limited mouth openings, less than 2 centimeters, while others might have involuntary muscle spasms even with a utensil in their mouth. Additionally, some individuals can only bite food at specific locations within their mouth, which they indicate by using their tongue to push the utensil. Current technologies often assume that a person’s face will remain still, which is not always the case, limiting the effectiveness of these systems for many care recipients.
To address these challenges, the Cornell researchers developed a robot equipped with two essential features: real-time mouth tracking and a dynamic response mechanism. The real-time mouth tracking system adjusts to the user’s movements, while the dynamic response mechanism allows the robot to detect and respond to various physical interactions as they occur. This allows the system to tell the difference between sudden spasms, intentional bites, and the user’s attempts to move the utensil inside their mouth.
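The distinction the article describes can be pictured with a toy classifier. The function name, inputs, and thresholds below are invented for illustration only; the team's actual physical interaction-aware controller is far more sophisticated than this sketch.

```python
def classify_interaction(force_n, force_rate_n_per_s, mouth_moving):
    """Label a physical interaction with the feeding utensil.

    force_n:            current contact force on the utensil (newtons)
    force_rate_n_per_s: how quickly that force is changing
    mouth_moving:       True if the tracked mouth pose is changing

    All thresholds are made up for illustration.
    """
    if force_rate_n_per_s > 5.0:
        return "spasm"            # abrupt force spike suggests a spasm
    if force_n > 1.0 and not mouth_moving:
        return "bite"             # steady force while the mouth holds still
    if force_n > 0.2 and mouth_moving:
        return "tongue_push"      # gentle force while the user repositions
    return "no_contact"
```

A robot controller built on such a signal could, for example, hold position during a spasm, release the food item on an intentional bite, and comply with the utensil's motion during a tongue push.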
The robotic feeding system was successfully tested with 13 individuals who have diverse medical conditions. These tests took place at three different locations: the EmPRISE Lab on Cornell’s Ithaca campus, a medical center in New York City, and a care recipient’s home in Connecticut. Users reported that the robot was both safe and comfortable to use.
“This is one of the most detailed real-world tests of an autonomous robot-assisted feeding system with actual users,” Bhattacharjee said, highlighting the importance of their accomplishment.
The robotic system features a multi-jointed arm that holds a custom-built utensil capable of sensing the forces being applied to it. The mouth tracking method, which is trained on thousands of images of participants with various head poses and facial expressions, utilizes data from two cameras positioned above and below the utensil. This setup allows for precise mouth detection and helps overcome visual obstructions caused by the utensil itself. The physical interaction-aware response mechanism combines visual and force sensing to accurately perceive how users interact with the robot.
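One motivation for using two cameras is that when the view from one camera is blocked by the utensil, the other can still see the mouth. A minimal sketch of this idea, assuming each camera produces a mouth-center estimate with a confidence score (the function and data layout are hypothetical, not the paper's method):

```python
def fuse_mouth_estimates(est_above, est_below):
    """Combine mouth-center estimates from the cameras above and
    below the utensil via a confidence-weighted average.

    Each estimate is (x, y, z, confidence); confidence is 0 when
    that camera's view of the mouth is occluded.
    Returns a fused (x, y, z), or None if both views are occluded.
    """
    xa, ya, za, ca = est_above
    xb, yb, zb, cb = est_below
    total = ca + cb
    if total == 0:
        return None  # no usable observation this frame
    return (
        (xa * ca + xb * cb) / total,
        (ya * ca + yb * cb) / total,
        (za * ca + zb * cb) / total,
    )
```

When one camera reports zero confidence, the fused estimate simply falls back to the unoccluded view, which is the property the dual-camera placement is meant to provide.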
“We’re empowering individuals to control a 20-pound robot with just their tongue,” said Rajat Kumar Jenamani, the paper’s lead author and a doctoral student in computer science. Jenamani emphasized the profound emotional impact the robotic system has had on care recipients and their families. In one notable session, the parents of a daughter with schizencephaly quadriplegia, a rare birth defect, were overcome with emotion when they saw their daughter successfully feed herself using the system. “It was a moment of real emotion; her father raised his cap in celebration, and her mother was almost in tears,” Jenamani recounted.
While further research is necessary to explore the system’s long-term usability, the initial results are promising and highlight the potential to significantly enhance the independence and quality of life for care recipients.
“It’s amazing,” Bhattacharjee said, reflecting on the impact of their work. “And very, very fulfilling.”
The team’s research paper, “Feel the Bite: Robot-Assisted Inside-Mouth Bite Transfer using Robust Mouth Perception and Physical Interaction-Aware Control,” was presented at the Human-Robot Interaction conference held March 11-14 in Boulder, Colorado. The paper received a Best Paper Honorable Mention, and a demo of the broader robotic feeding system won the Best Demo Award.
This innovative research was funded primarily by the National Science Foundation and involved contributions from several co-authors, including Daniel Stabile, M.S. ’23; Ziang Liu, a doctoral student in computer science; Abrar Anwar of the University of Southern California; and Katherine Dimitropoulou of Columbia University.

FAQs

1: What is the new robotic feeding system developed by Cornell researchers?
The system is an advanced robotic feeding assistant that uses computer vision, machine learning, and multimodal sensing to safely feed individuals with severe mobility limitations.
2: Who can benefit from this robotic feeding system?
The system is designed to help individuals with severe mobility limitations, including those with spinal cord injuries, cerebral palsy, and multiple sclerosis.
3: What are the key features of the robotic feeding system?
The robot has real-time mouth tracking and a dynamic response mechanism that adjusts to user movements and detects various physical interactions, distinguishing between spasms, intentional bites, and utensil manipulation.
4: Where was the system tested?
The system was tested in three locations: the EmPRISE Lab at Cornell, a medical center in New York City, and a care recipient’s home in Connecticut.
5: What were the outcomes of the user tests?
Users found the robotic system to be safe and comfortable, and it successfully fed 13 individuals with diverse medical conditions.
6: What recognition did the research paper receive?
The paper received a Best Paper Honorable Mention at the Human Robot Interaction conference, and a demo of the system won the Best Demo Award.
7: Who funded this research?
The research was primarily funded by the National Science Foundation.
8: What is the future potential of this robotic feeding system?
The system has the potential to greatly improve the independence and quality of life for care recipients, although further research is needed to explore its long-term usability.
