Minoru Asada presented the research team's work on February 15 at the annual meeting of the American Association for the Advancement of Science in Seattle. Scientists at Osaka University have created a synthetic skin embedded with sensors that can detect subtle changes in pressure, whether a light touch or a hard prick. This artificial "pain nervous system" was then embedded in a lifelike android robot child that responds to sensations with a variety of facial expressions.
The robotic child, named Affetto, was first unveiled by Osaka University in 2011. At the time, it was just a realistic head capable of producing a range of expressions, such as smiling and frowning, by actuating 116 different facial points beneath soft, skin-like material. The latest project has given Affetto a body: a frame covered in artificial skin fitted with the new touch-sensitive sensors.
Japan has already deployed robots in nursing homes, offices, and schools as a way to cope with its aging population and shrinking workforce. Some states in the United States are also experimenting with robots to patrol the streets, often with mixed results. The goal is to create more realistic "social" robots that can communicate more deeply with humans. It may sound like a pipe dream, but it is not as far-fetched as it seems.
However, speaking to Science News, Antonio Damasio, a neuroscientist at the University of Southern California, was quick to point out that this is not "the same thing" as a robot actually feeling pain and somehow having an inner experience. The theory goes that these robots will be able to communicate with humans more authentically and effectively if they can give the impression that they are like us. So, if little Affetto looks up at you with puppy-dog eyes and a sad frown, try not to feel too bad.