What's up in
Tiny amounts of artificial noise can fool neural networks, but not humans. Some researchers are looking to neuroscience for a fix.
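(A hedged sketch of how such noise is often crafted: the fast gradient sign method nudges every input pixel a tiny step in whichever direction increases the model's loss. The toy linear classifier and the numbers below are illustrative assumptions, not the models or results from the article.)

```python
# Minimal illustrative FGSM-style perturbation on an assumed toy linear classifier.
# With a real trained network, the perturbed input usually looks identical to a
# human but flips the model's prediction; with this random toy model the flip is
# not guaranteed -- the point is only the mechanics of the gradient-sign step.
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(10, 784))           # toy 10-class linear "network" on 28x28 inputs
x = rng.uniform(0, 1, size=784)          # a clean input
y = 3                                    # its assumed true label

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def loss_grad_wrt_input(x, y):
    """Gradient of the cross-entropy loss with respect to the input pixels."""
    p = softmax(W @ x)
    p[y] -= 1.0                          # d(loss)/d(logits) for cross-entropy
    return W.T @ p                       # chain rule back to the input

eps = 0.05                               # small per-pixel perturbation budget
x_adv = np.clip(x + eps * np.sign(loss_grad_wrt_input(x, y)), 0, 1)

print("clean prediction:      ", softmax(W @ x).argmax())
print("perturbed prediction:  ", softmax(W @ x_adv).argmax())
```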
Results from neural networks support the idea that brains are “prediction machines” — and that they work that way to conserve energy.
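(A rough illustration of the prediction-machine idea: in predictive-coding models, a higher level keeps an estimate of the incoming signal, only the mismatch (the prediction error) is passed along, and the estimate is updated until little error remains to transmit. The one-level loop below is a generic sketch under those assumptions, not the specific networks covered in the article.)

```python
# Assumed one-level predictive-coding loop: the internal estimate `mu` is updated
# to cancel the prediction error, so only small errors need to be transmitted.
import numpy as np

signal = np.array([0.9, 0.1, 0.4])   # incoming sensory input (toy values)
mu = np.zeros(3)                     # internal estimate / top-down prediction
lr = 0.2                             # update rate

for step in range(30):
    error = signal - mu              # prediction error: the only signal "sent up"
    mu += lr * error                 # revise the prediction to shrink the error

print("final prediction:", mu.round(3))
print("residual error:  ", np.abs(signal - mu).sum().round(4))
```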
The computational biologist Anne Carpenter creates software that brings the power of machine learning to researchers seeking answers in mountains of cell images.
A new model of learning centers on bursts of neural activity that act as teaching signals — approximating backpropagation, the algorithm behind learning in AI.
To explain the shocking success of deep neural networks, researchers are turning to older but better-understood models of machine learning.
Computational neuroscientists taught an artificial neural network to imitate a biological neuron. The result offers a new way to think about the complexity of single brain cells.
Researchers are turning to the mathematics of higher-order interactions to better model the complex connections within their data.
Melanie Mitchell has worked on digital minds for decades. She says they’ll never truly be like ours until they can make analogies.
A temporal pattern of activity observed in human brains may explain how we can learn so quickly.