What's Up in Computer Science


Best-Ever Algorithm Found for Huge Streams of Data

To efficiently analyze a firehose of data, scientists first have to break big numbers into bits.

Abstractions blog

Artificial Intelligence Learns to Learn Entirely on Its Own

A new version of AlphaGo needed no human instruction to figure out how to clobber the best Go player in the world — itself.

Abstractions blog

One-Way Salesman Finds Fast Path Home

The real-world version of the famous “traveling salesman problem” finally gets a good-enough solution.

Wired to Learn: The Next AI

New Theory Cracks Open the Black Box of Deep Learning

A new idea is helping to explain the puzzling success of today’s artificial-intelligence algorithms — and might also explain how human brains learn.

Wired to Learn: The Next AI

A Brain Built From Atomic Switches Can Learn

A tiny self-organized mesh full of artificial synapses recalls its experiences and can solve simple problems. Its inventors hope it points the way to devices that match the brain’s energy-efficient computing prowess.

Wired to Learn: The Next AI

Clever Machines Learn How to Be Curious

Computer scientists are finding ways to code curiosity into intelligent machines.

Game Theory

In Game Theory, No Clear Path to Equilibrium

John Nash’s notion of equilibrium is ubiquitous in economic theory, but a new study shows that it is often impossible to reach efficiently.

Abstractions blog

Why Quantum Computers Might Not Break Cryptography

A new paper claims that a common digital security system could be tweaked to withstand attacks even from a powerful quantum computer.


How to Force Our Machines to Play Fair

The computer scientist Cynthia Dwork takes abstract concepts like privacy and fairness and adapts them into machine code for the algorithmic age.