A simple algorithm that revolutionizes how neural networks approach language is now taking on image classification as well. It may not stop there.
Algorithms that use the brain’s communication signal can now work on analog neuromorphic chips, which closely mimic our energy-efficient brains.
Two researchers show that for neural networks to remember well, they need far more parameters than previously thought.
A new model of learning centers on bursts of neural activity that act as teaching signals — approximating backpropagation, the algorithm behind learning in AI.
To help them explain the shocking success of deep neural networks, researchers are turning to older but better-understood models of machine learning.