A new model of learning centers on bursts of neural activity that act as teaching signals — approximating backpropagation, the algorithm behind learning in AI.
To help them explain the shocking success of deep neural networks, researchers are turning to older but better-understood models of machine learning.
Melanie Mitchell has worked on digital minds for decades. She says they’ll never truly be like ours until they can make analogies.
For all their triumphs, AI systems can’t seem to generalize the concepts of “same” and “different.” Without that, researchers worry, the quest to create truly intelligent machines may be hopeless.
Two new approaches allow deep neural networks to solve entire families of partial differential equations, making it easier to model complicated systems and to do so orders of magnitude faster.
To the surprise of experts in the field, a postdoctoral statistician has solved one of the most important problems in high-dimensional convex geometry.