A new model of learning centers on bursts of neural activity that act as teaching signals — approximating backpropagation, the algorithm behind learning in AI.
To help explain the shocking success of deep neural networks, researchers are turning to older but better-understood models of machine learning.
Researchers are turning to the mathematics of higher-order interactions to better model the complex connections within their data.
Melanie Mitchell has worked on digital minds for decades. She says they’ll never truly be like ours until they can make analogies.
A temporal pattern of activity observed in human brains may explain how we can learn so quickly.
For all their triumphs, AI systems can’t seem to generalize the concepts of “same” and “different.” Without that, researchers worry, the quest to create truly intelligent machines may be hopeless.
Glycans, the complex sugars that stud cellular surfaces, are like a language that life uses to mediate vital interactions. Researchers are learning how to read their meaning.
Two new approaches allow deep neural networks to solve entire families of partial differential equations, making it easier to model complicated systems and to do so orders of magnitude faster.