Smart foam and artificial intelligence could help robots know if they're injured
This foam creation can figure out what's happening to it.
By Rob Verger - Popular Science online.
If you fall hard and break your arm, your body will let you know with crackling hot speed that something is wrong. Robots, though, don’t have neurons, so they need another way to know what’s going on with their artificial bodies.
Consider a future where a robot operates autonomously, but an appendage becomes injured, says Robert Shepherd, an associate professor of mechanical engineering at Cornell University. “It’s going to continue moving its limb and thinking its hand or foot is going to be in one position, when it’s actually going to be in a different position,” he says. “We need skins, or internal neural-like sensors, to communicate this information three-dimensionally and continuously, to the robot’s controller.”
Shepherd’s lab has developed a system combining foam, light, and artificial intelligence that lets the foam sense what’s happening to it—whether it is bending up or down, or twisting, or both. The results were published today in the journal Science Robotics.
Here’s how it works: the key sensor is a layer of 30 optical fibers in the foam, which is made of silicone. The fibers stick out of one end of the foam, and connect to other equipment. The intensity of the light coming out of the end of those optical fibers lets the system know what’s happening to the foam. When the foam is at rest, the light looks a certain way. If the foam bends or twists, the light changes.
“So you can detect changes in shape by looking at the change in the overall pattern of light intensity,” says Ilse Van Meerbeek, a PhD candidate in mechanical engineering at Cornell, and the first author of the paper describing the foam.
Humans obviously have brains to interpret what’s going on with their bodies, but this foam has no noggin. For that job, the researchers turned to artificial intelligence. To build the AI, the researchers first gathered information about how the light from the fibers changed when the foam was bent or twisted in a known way. That data let them train machine learning models that they could use going forward to interpret what’s happening with the foam.
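The idea can be sketched in a few lines of Python. Everything below is an illustrative stand-in, not the researchers' actual model or data: the deformation labels, the simulated intensity patterns, and the nearest-pattern classifier are all assumptions made for the sake of a minimal example of learning to map 30 fiber intensities to a known deformation.

```python
import numpy as np

rng = np.random.default_rng(0)
N_FIBERS = 30  # one light-intensity reading per optical fiber

# Simulate labeled training data: for each known deformation, record many
# noisy snapshots of the 30 fiber intensities. (These patterns are invented
# stand-ins for real measurements.)
labels = ["rest", "bend_up", "bend_down", "twist"]
snapshots = {
    label: np.full(N_FIBERS, 0.4 + 0.1 * i) + rng.normal(0, 0.01, (50, N_FIBERS))
    for i, label in enumerate(labels)
}

# "Train" a minimal model: the mean intensity pattern for each deformation.
centroids = {label: data.mean(axis=0) for label, data in snapshots.items()}

def interpret(reading):
    """Classify a new intensity pattern by its nearest learned pattern."""
    return min(centroids, key=lambda lbl: np.linalg.norm(reading - centroids[lbl]))

# Interpret a new, unlabeled reading from the fibers
new_reading = np.full(N_FIBERS, 0.7) + rng.normal(0, 0.01, N_FIBERS)
print(interpret(new_reading))
```

A real pipeline would replace the nearest-mean lookup with the trained machine-learning models the paper describes, but the shape of the problem is the same: known deformations in, intensity patterns out, then invert that mapping for new readings.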
This isn’t the only sensing strategy out there that researchers can use to see how a soft robotic creation is stretching: flexible electronic sensors use a change in current to notice how they’re stretching, while previous work in Shepherd’s lab has used stretchable light fibers to measure whether something has become deformed.
Of course, sensors like these are crucial for the robot to know what’s going on with it and around it. “Your robot needs to have a sense of itself in the world,” Shepherd, who is senior author on the new paper, says.
Right now, the foam and AI experimental set-up at Cornell involves gear that’s external to the foam, but Van Meerbeek says that it would be possible to miniaturize everything with the goal of having a self-contained, self-sensing foam setup. One possible application she sees for this kind of sensor system? “Robots learning how to walk for themselves,” she says, referring to soft ‘bots. “It has to be able to sense its shape.”
See this article online: https://www.popsci.com/self-sensing-foam#page-3