Researchers have created what may be the world’s creepiest robot: a machine able to make decisions entirely of its own accord, without any human intervention.
Rather than following pre-programmed motions, the Alter robot uses its own neural network, which allows it to respond to its environment through an array of sensors that detect sound, temperature, humidity and proximity.
Developed by researchers at the University of Tokyo and Osaka University, Alter is currently on display at Japan’s National Museum of Emerging Science and Innovation.
Alter relies on 42 pneumatic actuators and a “central pattern generator” (CPG), which allows it to make its own decisions. The CPG is built on an artificial neuron model called the Izhikevich neuron, which gives the robot its human-like movements.
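The Izhikevich model itself is compact: two coupled equations for a membrane potential and a recovery variable, plus a reset rule that produces spikes. The sketch below is a minimal illustration of that standard model in plain Python; it is not the team’s code, and the parameter values are the textbook defaults rather than anything taken from Alter.

```python
import numpy as np

def izhikevich(a=0.02, b=0.2, c=-65.0, d=8.0, current=10.0,
               steps=1000, dt=0.5):
    """Simulate one Izhikevich neuron with simple Euler integration.

    a, b, c, d are the standard model parameters; `current` is a
    constant input drive. Returns the membrane potential trace (mV).
    """
    v, u = c, b * c          # membrane potential and recovery variable
    trace = []
    for _ in range(steps):
        # Izhikevich (2003) dynamics
        v += dt * (0.04 * v * v + 5 * v + 140 - u + current)
        u += dt * a * (b * v - u)
        if v >= 30.0:        # spike: record it, then reset v and bump u
            trace.append(30.0)
            v, u = c, u + d
        else:
            trace.append(v)
    return np.array(trace)

spikes = izhikevich()
print(f"peak membrane potential: {spikes.max():.1f} mV")
```

Networks of such neurons can settle into rhythmic firing patterns, which is what makes the model a natural building block for a central pattern generator.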
The project is an attempt to bridge the gap between programming a robot to move and allowing it to move for itself. With the neural network in place, movement gains a loose degree of flexibility, which the researchers call “chaos.” Alter’s arms, head and posture adjust and change of the system’s own volition. The neural network ticking behind the scenes offers multiple movement modes, switching between a longer, patterned movement mode and a more random “chaos” mode.
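To make the two-mode idea concrete, here is a toy Python sketch of a controller that flips between a patterned mode and a “chaos” mode depending on a single excitation signal. The mode names, the switching thresholds and the actuator targets are all invented for illustration; Alter’s real controller is the CPG-driven network described above.

```python
import random

class MovementController:
    """Toy two-mode controller: a slow patterned mode vs a random
    'chaos' mode. Thresholds and outputs are illustrative only."""

    def __init__(self):
        self.mode = "pattern"

    def step(self, excitation):
        # `excitation` stands in for whatever internal or external
        # signal pushes the network toward more irregular behaviour.
        if self.mode == "pattern" and excitation > 0.7:
            self.mode = "chaos"
        elif self.mode == "chaos" and excitation < 0.3:
            self.mode = "pattern"
        return self.command()

    def command(self):
        if self.mode == "pattern":
            return [0.5] * 42                               # steady posture targets
        return [random.uniform(-1, 1) for _ in range(42)]   # jittery 'chaos' targets

controller = MovementController()
for excitation in (0.2, 0.8, 0.9, 0.1):
    targets = controller.step(excitation)
    print(controller.mode, len(targets), "actuator targets")
```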
The decision to switch is influenced by sensors dotted around the base, which take in what’s happening around Alter: proximity, humidity, noise and temperature. These sensors act as the robot’s version of skin, mimicking our own senses, even if the system is far, far simpler. If the proximity sensors detect a lot of people nearby, for example, the torso shudders as the robot’s body reacts to its environment.
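As a rough picture of how readings like these might feed such a switch, the snippet below folds the four sensor channels into a single excitation value (the kind of signal driving the toy controller above) and a shudder flag. Every weight and threshold here is made up; the article only says the sensors influence the decision, not how they are combined.

```python
def sensor_excitation(proximity_hits, noise_db, temperature_c, humidity_pct):
    """Collapse four sensor channels into one 0..1 'excitation' value.

    Weights and reference points are illustrative assumptions, not
    values from the Alter project.
    """
    crowd = min(proximity_hits / 10.0, 1.0)               # many nearby people -> high
    loud = min(max(noise_db - 40.0, 0.0) / 40.0, 1.0)     # above quiet-room level
    discomfort = min(abs(temperature_c - 22.0) / 15.0, 1.0)
    damp = humidity_pct / 100.0
    return 0.5 * crowd + 0.3 * loud + 0.1 * discomfort + 0.1 * damp

def react(proximity_hits, **readings):
    excitation = sensor_excitation(proximity_hits, **readings)
    shudder = proximity_hits >= 5                          # crowded room -> torso shudder
    return excitation, shudder

print(react(8, noise_db=65, temperature_c=24, humidity_pct=55))
```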
Agencies/Canadajournal