A journey that began nearly a century ago, when scientists invented the first electron microscope, has taken yet another step.
A group of physicists has leapt even closer to the ultimate limit of how finely scientists think objects can be resolved. The group previously held the world record for the highest resolution achieved with a microscope. Their latest work, published in Science, shrinks that record even further.
“This is the highest-resolution imaging in human history,” says David Muller, a physicist at Cornell University and one of the paper’s authors.
You won’t get resolution anywhere near this high with the sort of microscopes that you might have used in school. Those microscopes—the kind Robert Hooke used over 300 years ago to glimpse a hidden world of cells—see with light. That means they’re incapable of resolving anything smaller than the wavelength of that light. It’s a hard limit that’s a thousand times too large for seeing atoms.
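That hard limit can be put in rough numbers with Abbe’s diffraction formula, d = λ / (2 × NA). The wavelength and numerical aperture below are typical textbook values for a good light microscope, not figures from the article:

```python
# Back-of-the-envelope Abbe diffraction limit for a light microscope,
# compared with the rough size of an atom. Illustrative values only.
GREEN_LIGHT_NM = 550.0   # mid-visible wavelength, in nanometers
NA = 1.4                 # numerical aperture of a good oil-immersion objective
ATOM_NM = 0.1            # rough diameter of a typical atom (about 1 angstrom)

d_limit_nm = GREEN_LIGHT_NM / (2 * NA)  # smallest resolvable detail, ~196 nm
print(f"Light-microscope limit: {d_limit_nm:.0f} nm")
print(f"That is ~{d_limit_nm / ATOM_NM:.0f}x larger than an atom")
```

Even with the best visible-light optics, the resolvable detail comes out on the order of a thousand atoms wide, which is the wall the article describes.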
Scientists were already hitting that roadblock in the early 20th century. If you want to go smaller—to enter the world of viruses, for instance, to develop a polio vaccine—you’ll need to image with something that has a shorter wavelength than light.
You might turn to electrons, the tiny charged particles that orbit the nucleus of an atom. In the 1930s, scientists like Ernst Ruska began building the first electron microscopes, which can reveal minuscule objects in vivid detail by probing them with electron beams.
Electrons have wavelengths some 100,000 times shorter than light. In theory, you can use them to peer into atoms—those fundamental building blocks of all normal matter. But there’s a problem, and it’s not the electrons’ fault. “The lens quality of electron lenses is terrible,” Muller says.
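That wavelength ratio follows from the de Broglie relation. The sketch below computes the relativistically corrected wavelength of a beam electron, λ = h / √(2meV(1 + eV/2mc²)); the 100 kV accelerating voltage is an assumed, typical microscope value, not one taken from the paper:

```python
import math

# Relativistically corrected de Broglie wavelength of an electron
# accelerated through a voltage V. Constants are standard CODATA-style
# values; the 100 kV beam voltage is an illustrative assumption.
h = 6.626e-34      # Planck constant (J*s)
m = 9.109e-31      # electron rest mass (kg)
e = 1.602e-19      # elementary charge (C)
c = 2.998e8        # speed of light (m/s)

def electron_wavelength_m(volts: float) -> float:
    """Wavelength of a beam electron, including the relativistic correction."""
    return h / math.sqrt(2 * m * e * volts * (1 + e * volts / (2 * m * c**2)))

lam = electron_wavelength_m(100e3)   # a common microscope voltage
print(f"Electron wavelength: {lam * 1e12:.2f} pm")
print(f"Green light (550 nm) is ~{550e-9 / lam:,.0f}x longer")
```

At 100 kV the electron wavelength comes out to a few picometers, over a hundred thousand times shorter than visible light and far smaller than an atom.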
No imaging system is perfect, as many astronomers know well. But the electromagnetic lenses inside electron microscopes are particularly blurry. Looking through a typical electron microscope, according to Muller, is like looking at light through a beer bottle.
One way around this is to attach bits of hardware called “aberration correctors,” which are like prescribing your electron microscope a pair of glasses. But to look at atoms, you’ll need to conduct a symphony of aberration correctors. Imagine a hundred pairs of constantly shifting glasses.
By the 1990s and 2000s, computers had made this possible, pushing microscope resolution to new limits. For a time, aberration correctors held the resolution throne. But by the 2010s, the technology was starting to run out of steam.
To keep pushing the limit of microscope resolution, physicists at Cornell took a road less travelled: They did away with lenses completely. Instead, they shot electrons at an object and watched how they scattered.
As those electrons fly, the object’s atoms will throw the bombarding electrons off course, bending them into a pattern on the object’s far side. By shining electrons at an object from multiple positions, you can snap a whole album of patterns. With today’s computers, you can stitch those patterns together to reconstruct a microscopic image of the original object.
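The scatter-and-stitch scheme above can be sketched numerically. The following toy is a minimal 1D "PIE"-style reconstruction, assuming a known probe, noise-free diffraction intensities, and invented sizes and scan positions; it illustrates the stitching idea and is not the Cornell group’s actual algorithm:

```python
import numpy as np

# Toy 1D ptychography sketch: a known probe is scanned across an unknown
# complex object; at each position we record only the far-field intensity
# |FFT(probe * object_patch)|^2, then iteratively stitch the patterns back
# into an image of the object. All sizes and positions are invented.
N, W, STEP = 64, 16, 4                      # object size, probe width, scan step

x = np.arange(N)
true_obj = (1 + 0.5 * np.cos(2 * np.pi * x / 9)) * np.exp(0.3j * np.sin(2 * np.pi * x / 7))
probe = np.exp(-0.5 * ((np.arange(W) - W / 2) / (W / 4)) ** 2)  # Gaussian probe

positions = range(0, N - W + 1, STEP)
# "Measured" data: one diffraction-amplitude pattern per scan position.
data = [np.abs(np.fft.fft(probe * true_obj[p:p + W])) for p in positions]

obj = np.ones(N, dtype=complex)             # start from a featureless guess
for _ in range(200):
    for p, amp in zip(positions, data):
        exit_wave = probe * obj[p:p + W]
        psi = np.fft.fft(exit_wave)
        # Keep our current phases, but impose the measured amplitudes.
        psi_new = amp * np.exp(1j * np.angle(psi))
        exit_new = np.fft.ifft(psi_new)
        # PIE-style object update using the known probe.
        obj[p:p + W] += np.conj(probe) / np.max(np.abs(probe)) ** 2 * (exit_new - exit_wave)

residual = sum(np.mean((np.abs(np.fft.fft(probe * obj[p:p + W])) - amp) ** 2)
               for p, amp in zip(positions, data))
print(f"final data residual: {residual:.2e}")
```

Because neighboring scan positions overlap, each patch of the object is constrained by several diffraction patterns at once, which is what lets the iteration converge without any lens at all.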
It’s called ptychography (tai-KAW-graf-ee). X-ray scientists today commonly use their own version of ptychography, but to electron-watchers, it was long a dead end. Scientists had talked about electron ptychography in theory for half a century, but only in the last half-decade has it really become feasible, according to Yi Jiang, a physicist at Argonne National Laboratory and a co-author on the paper.
For one, detectors in the past weren’t capable of pinpointing where enough of the electrons landed. For another, electrons are especially prone to being thrown off in all sorts of wild directions, even by a single atom. That isn’t easy to account for, even with modern computers. As a result, aberration correctors held an order-of-magnitude lead over ptychography when it came to the resolution record.
But the Cornell group believed ptychography held promise. By the mid-2010s, they’d developed state-of-the-art electron detectors, and they borrowed reconstruction algorithms from X-ray scientists. They also simplified the problem by dialling down their electron beam and filing their object down to the smallest thickness possible.
And in 2018, it worked. The Cornell group beat aberration correctors to achieve the highest microscope resolution ever, earning a Guinness World Record for good measure.
Of course, it wasn’t a foolproof method. “All we could do was work with these materials that were just one atom or two atoms thick,” says Muller.
But the group wondered if they could go even smaller. They had the equipment to do it, but they needed their computers to account for electrons’ pesky scattering. Essentially, they needed to force their way through a physics problem that had gone unsolved for 80 years.
It took the Cornell group three years of tinkering with algorithms—three years of work that Muller says often felt fruitless. But thanks to the work of Cornell postdoc Zhen Chen, they found a way that worked.
The result? They’ve beaten their own world record by a factor of two.
“The paper is a landmark study,” says Matthew Joseph Cherukara, a computational scientist at Argonne National Laboratory who wasn’t involved with the paper. “It is a demonstration of the power of advanced algorithms and computation in breaking and surpassing physical limitations of microscopes.”
Can scientists go even further?
The answer to that is, quite literally, hazy.
Look at the Cornell group’s pictures, and you’ll see that the atoms appear blurry. That isn’t an aberration from the detector or interference from the air. It’s the trembling of the atoms themselves, jiggling with heat. You could cool the atoms down to make them stay in place, but by probing them with electrons, you’re only heating them up again.
So that blur, as far as scientists know, isn’t something they can overcome—unless they find another way of looking at atoms entirely.
“We’re kind of almost at an ultimate limit,” says Muller.