(ZDNet) It has been an open question for many years why deep learning forms of neural networks achieve such success on so many tasks; the discipline has a paucity of theory to explain its empirical successes. Facebook’s Yann LeCun has likened deep learning to the steam engine, which preceded the underlying theory of thermodynamics by many years.

Researchers recently presented a proof of deep learning’s superior ability to simulate the computations involved in quantum computing. According to these thinkers, the redundancy of information that arises in two of the most successful types of neural network, convolutional neural networks (CNNs) and recurrent neural networks (RNNs), makes all the difference. The findings were presented by Amnon Shashua, professor of computer science at the Hebrew University of Jerusalem and president and chief executive of Mobileye, the autonomous-driving technology company bought by chip giant Intel last year for $14.1 billion.
The work amounts both to a proof that there are certain problems at which deep learning can excel and to a proposal for a promising way forward in quantum computing. What Shashua and his team found, and what they say they have proven, is that CNNs and RNNs outperform traditional machine learning approaches such as the “Restricted Boltzmann Machine,” a neural network approach developed in the 1980s that has been a mainstay of physics research, especially quantum theory simulation.
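To make the comparison concrete, the following is a minimal sketch of how a Restricted Boltzmann Machine is typically used in quantum physics research: as a parameterized wavefunction ansatz (a "neural-network quantum state"), in the general style popularized by Carleo and Troyer. This is an illustration assumed for context, not code from Shashua's work; the system size, hidden-unit count, and function names are all hypothetical choices.

```python
import numpy as np

# Hypothetical sizes for illustration: 4 spins, 8 hidden units.
n_visible = 4   # number of spins in the simulated quantum system
n_hidden = 8    # hidden units; more units give a more expressive ansatz

# RBM parameters: visible biases a, hidden biases b, couplings W,
# initialized to small random values.
rng = np.random.default_rng(0)
a = 0.01 * rng.standard_normal(n_visible)
b = 0.01 * rng.standard_normal(n_hidden)
W = 0.01 * rng.standard_normal((n_hidden, n_visible))

def amplitude(s):
    """Unnormalized wavefunction amplitude psi(s) for a spin
    configuration s in {-1, +1}^n_visible. Because the RBM has no
    intra-layer connections, the hidden units can be summed out
    analytically, leaving a product of cosh terms."""
    s = np.asarray(s, dtype=float)
    return np.exp(a @ s) * np.prod(2.0 * np.cosh(b + W @ s))

# The (unnormalized) probability of observing a configuration is
# |psi(s)|^2; in practice the parameters would be trained, e.g. by
# variational Monte Carlo, to minimize the energy of a Hamiltonian.
spins = np.array([1, -1, 1, 1])
p_unnorm = amplitude(spins) ** 2
```

The appeal of this ansatz is that its memory cost grows with the number of network parameters rather than exponentially with the number of spins; the argument attributed to Shashua's team is that CNNs and RNNs, by reusing information across the network, can represent a strictly richer class of such quantum states.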