Role for robots: helping elderly remain at home
Milos Zefran, Jezekiel Ben-Arie and Barbara Di Eugenio are working on software that would let the elderly communicate with robotic caregivers.
Photo: Roberta Dupuis-Devlin
Robots that lend a human-like helping hand to healthy elderly people with limited mobility may be on the horizon, thanks to three UIC engineers and a Rush University nursing specialist.
“We want to help elderly people communicate with robots, to tell them what they need, and to perform physical activities,” says Milos Zefran, associate professor of electrical and computer engineering.
Zefran is lead investigator in a three-year, $989,000 National Science Foundation grant to develop software that will allow the elderly to communicate with robots that can respond to a wide range of verbal language, nonverbal gestures and touch.
“If we can help the elderly remain independent and continue living in their own homes, that will improve their health outlook while relieving the burden on family members and health care providers,” he says.
Partnering in the project are Jezekiel Ben-Arie, professor of electrical and computer engineering, Barbara Di Eugenio, associate professor of computer science, and Marquis Foreman, professor emeritus of behavioral health science at UIC and professor and chair of adult health and gerontological nursing at Rush University.
Zefran’s expertise is in robotics and computerized sense of touch, called haptics. Ben-Arie specializes in computer vision and pattern recognition, Di Eugenio in natural language processing and Foreman in nursing care for the elderly.
The communication interface software will have at its core a novel adaptive and reliable recognition methodology called RISq (Recognition by Indexing and Sequencing), invented and patented by Ben-Arie.
He says RISq will allow the robot to learn and adapt in comprehending speech altered by impairments and noise.
“One of the main obstacles in communicating with elderly patients is the need to use a personal vocabulary with almost everyone,” Ben-Arie says.
RISq will also be used for recognition of hand gestures and objects.
“Language would still, in many cases, be the primary means of communication, but haptics and vision may help the robot interpret a command,” says Di Eugenio.
“My role is taking individual words and making sense of the sentence in which they appear, using disambiguating information coming from haptics and vision.”
By combining techniques from natural language processing and haptics, the robot will understand and correctly respond to various forms of human touch. It will know how to respond to the user when performing everyday chores such as cooking or making a bed.
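The article does not describe how the project will actually combine these signals; the following is only an illustrative sketch of the general idea of multimodal disambiguation, using a simple naive-Bayes-style fusion. All command names and probability values here are invented examples, not part of the research.

```python
# Illustrative sketch only: the project's real framework is not public.
# Evidence from three modalities (speech, vision, touch) is fused by
# multiplying per-command likelihoods, so an ambiguous utterance can be
# disambiguated by a gesture or a touch cue.

def fuse_modalities(speech_scores, vision_scores, touch_scores):
    """Combine per-modality likelihoods for each candidate command
    and return the best-scoring command."""
    candidates = set(speech_scores) & set(vision_scores) & set(touch_scores)
    fused = {
        cmd: speech_scores[cmd] * vision_scores[cmd] * touch_scores[cmd]
        for cmd in candidates
    }
    return max(fused, key=fused.get)

# "Hand me that" is ambiguous between two commands in speech alone,
# but a pointing gesture and a tug on the robot's arm resolve it.
speech = {"give_cup": 0.5, "give_plate": 0.5}
vision = {"give_cup": 0.8, "give_plate": 0.2}   # gesture points at the cup
touch  = {"give_cup": 0.7, "give_plate": 0.3}   # direction of the tug

print(fuse_modalities(speech, vision, touch))   # → give_cup
```

In this toy setup, the speech channel alone cannot decide between the two commands, but multiplying in the vision and touch evidence makes "give_cup" the clear winner, which mirrors the role Di Eugenio describes for disambiguating information from haptics and vision.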
“We’ll start by observing interaction between human helpers and the elderly,” Zefran says.
“We’ll identify what kind of language, physical interactions and nonverbal interactions are used. Then we’ll develop a mathematical framework to model this interaction so it can be treated by the robot as a single way of communicating.”
The research team will program and test a robot and devise refinements as the project progresses.
“The human-robot interface is really a long-standing, open problem that won’t be solved in three years,” Zefran says.
“But we’ll have a working prototype by then, and we’ll know what additional research needs to be done.”
This research may also find widespread use in institutional health care, where routine tasks now done by nurses could be handled by robots, he adds.
“If robots can alleviate some of the burden nurses face, they could then spend more time where they’re really needed, providing the human contact that a robot can’t replace.”
The project will also include seminars and a new graduate- or upper-level undergraduate course examining the factors that allow robots to perform more sophisticated tasks.
The NSF award is funded under the federal government’s economic stimulus plan, the American Recovery and Reinvestment Act of 2009.