Dreams are subjective, but there are ways to peer into the minds of people while they are dreaming. Steven Strogatz speaks with sleep researcher Antonio Zadra about how new experimental methods have changed our understanding of dreams.
Self-supervised learning allows a neural network to figure out for itself what matters. The process might be what makes our own brains so successful.
Christopher Kanan is building algorithms that can continuously learn over time — the way we do.
Intelligent beings learn by interacting with the world. Artificial intelligence researchers have adopted a similar strategy to teach their virtual agents new tricks.
Results from neural networks support the idea that brains are “prediction machines” — and that they work that way to conserve energy.
Scientists thought that the brain’s hearing centers might just process speech along with other sounds. But new work suggests that speech gets some special treatment very early on.
When animals move through 3D spaces, the neat pattern of grid cell activity they use for navigating on flat surfaces becomes more disorderly. That has implications for some theories of memory and other cognitive processes.
Familiar categories of mental functions such as perception, memory and attention reflect our experience of ourselves, but they are misleading about how the brain works. More revealing approaches are emerging.