Human brains and machine brains are good at different things. Human brains excel at critical analysis; machine brains excel at churning through enormous amounts of data. Human brains are also remarkably power efficient, while machine brains need roughly 100 million times more power to perform similar cognitive tasks.
That's why researchers and companies have been interested in chips with structures that mimic the human brain, also known as "neuromorphic computing." IBM's TrueNorth is one of those chips. It has 4,096 computer cores that support about a million digital brain cells and 256 million connections. Information travels over those connections like it does across human synapses.
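To make the "spiking" style of communication concrete, here's a minimal Python sketch of a leaky integrate-and-fire neuron, a common simplified model of how neuromorphic chips pass information as discrete spikes. It's an illustration of the general idea only, not TrueNorth's actual circuitry; the threshold, leak, and synapse weight values are arbitrary.

```python
# A toy sketch of spike-based ("neuromorphic") communication, for illustration
# only -- it does not reflect TrueNorth's real design. Each neuron integrates
# incoming charge, leaks some of it each time step, and fires a spike of its
# own once a threshold is crossed, loosely like a biological neuron.

import random

class LIFNeuron:
    """Leaky integrate-and-fire neuron (a standard simplified spiking model)."""

    def __init__(self, threshold=1.0, leak=0.9):
        self.potential = 0.0      # membrane potential
        self.threshold = threshold
        self.leak = leak          # fraction of potential kept each step

    def step(self, weighted_input):
        """Integrate input for one time step; return True if the neuron spikes."""
        self.potential = self.potential * self.leak + weighted_input
        if self.potential >= self.threshold:
            self.potential = 0.0  # reset after firing
            return True
        return False

# Two neurons joined by a weighted "synapse": spikes from the first neuron
# are delivered as input current to the second.
pre, post = LIFNeuron(), LIFNeuron()
synapse_weight = 0.4

for t in range(50):
    pre_spike = pre.step(random.uniform(0.0, 0.5))        # noisy external drive
    post_spike = post.step(synapse_weight if pre_spike else 0.0)
    if post_spike:
        print(f"t={t}: downstream neuron fired")
```

The appeal of this design is that neurons only consume energy when they spike, which is a big part of why brain-inspired chips can run on so little power.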
On Thursday, Eric Ryu, a vice president of research at the Samsung Advanced Institute of Technology, demonstrated how TrueNorth can make a computer better at recognizing hand gestures while drawing one-tenth of the power a typical phone uses. Samsung isn't the first to use the chip, although the company does manufacture it. The Lawrence Livermore National Lab has been using it for cybersecurity research, and the US Air Force has been using it to detect unusual events in videos and to build smarter autonomous drones.
Samsung has built TrueNorth into its Dynamic Vision Sensor, which uses the chip to recognize images at 2,000 frames per second. That kind of speed is well suited to generating 3D maps, autonomous driving, and controlling computers with gestures.
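Here's a rough sketch of how an event-based sensor's output is typically turned into frames for something like gesture recognition. The event format and the 500-microsecond accumulation window are assumptions chosen so the arithmetic lines up with the 2,000 frames-per-second figure; none of this is Samsung's or IBM's actual pipeline.

```python
# Hedged sketch: dynamic vision sensors report sparse "events" (pixel, time,
# brightness change) rather than full frames. Accumulating events over short
# windows yields very high effective frame rates. The Event format and window
# length below are illustrative assumptions, not a real sensor API.

from collections import namedtuple

Event = namedtuple("Event", ["x", "y", "t_us", "polarity"])

def events_to_frame(events, width, height):
    """Accumulate a batch of events into a 2D count image (one 'frame')."""
    frame = [[0] * width for _ in range(height)]
    for e in events:
        frame[e.y][e.x] += 1 if e.polarity else -1
    return frame

def effective_fps(window_us=500):
    """A 500-microsecond accumulation window gives 1,000,000 / 500 = 2,000 fps."""
    return 1_000_000 // window_us

# Toy event stream: a few brightness changes near the top-left corner.
events = [Event(1, 1, 0, True), Event(2, 1, 120, True), Event(1, 2, 300, False)]
frame = events_to_frame(events, width=4, height=4)
print("effective fps:", effective_fps())  # -> 2000
print(frame)
```

In a real system, each of these rapid-fire frames (or the raw event stream itself) would be fed to a classifier that labels the gesture, which is the kind of workload a spiking chip like TrueNorth handles efficiently.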
VentureBeat shot a quick video of Ryu on stage at IBM's Almaden research laboratory in California, demonstrating how these hand gestures could be used to control a TV.
"It recognized hand waves, finger waves, closed fists and finger pinches from about 10 feet away," wrote CNET.
Seems like machine learning-optimized hardware is here to stay. Let's just hope the tech industry stops thinking we want to use it to make ridiculous hand gestures.