Two new approaches allow deep neural networks to solve entire families of partial differential equations, making it easier, and orders of magnitude faster, to model complicated systems.
Rediet Abebe uses the tools of theoretical computer science to understand pressing social problems — and try to fix them.
A recent paper set a new speed record for multiplying two matrices. But it also marks the end of the line for the method researchers have relied on for decades to make improvements.
Avi Wigderson and László Lovász won the Abel Prize for their work developing complexity theory and graph theory, respectively, and for connecting the two fields.
By harnessing randomness, a new algorithm achieves a fundamentally novel — and faster — way of performing one of the most basic computations in math and computer science.
In a second season of enlightened conversations, Steven Strogatz and leading researchers nourish our pandemic-starved minds.
To the surprise of experts in the field, a postdoctoral statistician has solved one of the most important problems in high-dimensional convex geometry.
The learning algorithm that enables the runaway success of deep neural networks doesn’t work in biological brains, but researchers are finding alternatives that could.
Two teams found different ways for quantum computers to process nonlinear systems by first disguising them as linear ones.