Japan's National Science Museum unveiled a new exhibit last month: a creepy-looking robot powered by 42 pneumatic actuators and its own neural network. According to a report from Engadget, this neural net (which is modeled, in a very broad sense, after the brain's own system of neurons) is programmed to respond to its environment, with the robot's movements and strange vocalizations driven by sensors that detect movement, temperature, and humidity.
The idea was to create a robot that can guide its own actions. You could argue that, just like a human being, Alter is responding to external stimuli rather than following a strict, preprogrammed pattern of behavior. The neural network controlling the android has also been given a "loose degree of flexibility," writes Engadget's Mat Smith, allowing its movements to adjust and change "on the system's own volition."
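Neither Engadget nor the museum has published Alter's actual control code, but the basic loop the report describes, sensor readings feeding a small neural network whose outputs (plus a bit of randomness) set actuator targets, can be sketched in a few lines of Python. Everything below is illustrative: the layer sizes, the noise term, and the function names are assumptions, not details of the real system.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: a few sensor channels in,
# 42 actuator commands out (matching the actuator count in the article).
N_SENSORS, N_HIDDEN, N_ACTUATORS = 3, 16, 42

# Randomly initialized weights stand in for whatever network Alter actually uses.
w1 = rng.normal(scale=0.5, size=(N_SENSORS, N_HIDDEN))
w2 = rng.normal(scale=0.5, size=(N_HIDDEN, N_ACTUATORS))

def read_sensors():
    """Stand-in for real movement, temperature, and humidity sensors."""
    return rng.uniform(0.0, 1.0, size=N_SENSORS)

def actuator_commands(sensors, jitter=0.1):
    """Map sensor readings to actuator targets through a tiny neural net.

    The added noise is a crude stand-in for the "loose degree of
    flexibility" described in the article: the same stimulus does not
    always produce exactly the same motion.
    """
    hidden = np.tanh(sensors @ w1)
    targets = np.tanh(hidden @ w2)
    return targets + rng.normal(scale=jitter, size=N_ACTUATORS)

# One step of the control loop: sense, compute, and (in a real robot) actuate.
if __name__ == "__main__":
    commands = actuator_commands(read_sensors())
    print(commands.round(2))
```

Run repeatedly, the same sensor readings produce slightly different poses each time, which is the point: the behavior is reactive and a little unpredictable, not a fixed choreography.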
While this model of control is certainly rudimentary, one of Alter's creators, Osaka University's Kouhei Ogawa, told Engadget that it makes it easy to build a robot that operates for long periods of time without human supervision. And while Alter is, technically, moving on its own, it's not particularly self-aware. In a manner of speaking, it's just swaying in the breeze: arms flapping, mouth tunelessly singing, as it reacts to its surroundings. On second thought, that sounds pretty human after all.