Artificial intelligence creates better, faster MRI scans

You may someday be able to spend much less time in an imaging machine. Just let AI make the picture.

When a patient climbs into an MRI scanner, the machine peers inside their body to reveal the complex anatomy within, like the ligaments and tendons in a knee. In January, before COVID struck, some patients who needed a knee scan at NYU Langone Health started getting intentionally scanned twice. A scan of a typical human knee takes around 10 minutes, and these subjects, who had consented to take part in a study, had their joint scanned at the normal speed as well as about twice as fast, with the help of AI. After the coronavirus interruption, the work has resumed using one scanner at the hospital.

That initiative is part of an ongoing effort at the medical center, in partnership with Facebook Artificial Intelligence Research, to see if running an MRI machine faster, and grabbing less data in the process, can produce images that are just as good as those made the normal way. Reducing an approximately 10-minute knee scan to about 5 minutes, or shortening the scan time for other body areas, has obvious benefits: A patient could spend less time in a clanging tube (a procedure that demands they hold as still as possible), and hospitals could do more with the expensive, limited hardware they have.

To make this possible, radiologists and computer scientists need to employ artificial intelligence. If they were to run an MRI machine twice as fast as usual and then try to spin the data they collected into an image with the normal method, the result would be unusably bad. Enter AI: Using machine learning to analyze that comparatively scant data and then create a picture produces something that is indeed usable, and that in fact looks even better, to some radiologists' eyes, than the alternative.

MRI scan data
This is what the complete raw data from an MRI machine looks like before it's transformed into a usable image. The traditional way of doing so is called an inverse Fourier transform. Facebook AI & NYU Langone Health
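To get a feel for what that transformation involves, here is a minimal sketch in Python with NumPy. The tiny synthetic image is a stand-in for real anatomy, and the whole thing is illustrative only, not the project's actual pipeline:

```python
import numpy as np

# A toy synthetic "anatomy": a bright square on a dark background.
image = np.zeros((64, 64))
image[24:40, 24:40] = 1.0

# An MRI machine effectively records the image in the frequency domain
# (the raw data known as "k-space"); simulate that raw data with a
# forward Fourier transform.
k_space = np.fft.fftshift(np.fft.fft2(image))

# The traditional reconstruction: an inverse Fourier transform of the raw data.
reconstructed = np.abs(np.fft.ifft2(np.fft.ifftshift(k_space)))

# With the complete raw data, the inverse transform recovers the image.
print(np.allclose(reconstructed, image))  # True
```

The point is simply that, given all of the raw data, the inverse Fourier transform reproduces the picture faithfully; the interesting part of the research is what happens when some of that data is missing.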

The project reported good news last month, when the researchers published the results of another study in the American Journal of Roentgenology. That study aimed to determine whether radiologists could tell the difference between typical MRI images and those that used AI, and whether the scans were diagnostically interchangeable. Last year, Popular Science took a deep, exclusive dive into that process, shadowing a physician who took part in the experiment.

What the study showed was encouraging. Dr. Michael Recht, the first author on the published study and the chair of the radiology department at NYU Langone Health, says that the images created by artificial intelligence (from a slimmer amount of data than is usually gathered) held up well compared to images made via the normal process. “There is no difference in how people read the scans, whether they’re reading the accelerated or the clinical [traditional] sequences,” Recht says. “They’re able to make the diagnosis equally well on either of the scans.”

In fact, he says he would rely on an AI-generated image of a patient’s knee to arrive at a diagnosis—a conclusion that a surgeon may then use when deciding whether or not to operate. “The sequences really are interchangeable, and I’m very, very comfortable using those sequences to make a diagnosis,” he says. Of the six radiologists in the study, only one of them was able to discern whether the scans were made the normal way or with AI.

In this recently published study, patients were not actually scanned twice. Instead, the team took MRI scans of patients' knees and simulated what a faster scan would have captured by stripping out some of the raw data, then used AI to knit what remained into a complete picture.

under-sampled MRI data
This raw MRI data is missing information. If it’s interpreted the traditional way, the results are poor. But AI can create good images from a reduced amount of data. Facebook AI & NYU Langone Health
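The problem the caption describes can be demonstrated with a toy extension of the same sketch, again with NumPy standing in for real scanner data. Here, a roughly twice-as-fast scan is simulated by discarding every other line of raw data, and the traditional inverse Fourier transform then produces a far worse image (this illustrates the failure mode only; the AI reconstruction itself is not shown):

```python
import numpy as np

# Toy synthetic image and its simulated raw MRI data (k-space).
image = np.zeros((64, 64))
image[16:48, 16:48] = 1.0
k_space = np.fft.fftshift(np.fft.fft2(image))

# Simulate a roughly twice-as-fast scan: skip every other line of raw data.
undersampled = k_space.copy()
undersampled[1::2, :] = 0

# Reconstruct both datasets the traditional way (inverse Fourier transform).
full_recon = np.abs(np.fft.ifft2(np.fft.ifftshift(k_space)))
fast_recon = np.abs(np.fft.ifft2(np.fft.ifftshift(undersampled)))

# The undersampled reconstruction suffers aliasing ("ghost" copies of the
# anatomy), so its error against the true image is dramatically larger.
mse_full = np.mean((full_recon - image) ** 2)
mse_fast = np.mean((fast_recon - image) ** 2)
print(mse_fast > 1000 * mse_full)  # True
```

This is the gap machine learning fills in the research: instead of the inverse transform alone, a trained model infers a clean image from the reduced data.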

But the current work does scan patients twice, and Recht hopes to use what the team learns from patients who go on to have arthroscopic knee surgery as a "gold standard." That way, they can look at the two different scans, one created the normal way and the other through a faster, five-minute AI scan, and then ideally compare both with what a surgeon ultimately sees on the table.

Eventually, the process could help MRI machines take the place of X-rays or CT scanners in some cases—meaning that someone who needs brain imaging, for example, from a CT scanner could instead skip the ionizing radiation that machine produces and opt instead for a speedy MRI.

Rob Verger

Rob Verger is an associate editor at PopSci, where he covers aviation, the military, transportation, outdoor gear and gadgets, and other tech topics. A graduate of Columbia Journalism School, he's also written for The Boston Globe, Newsweek, The Daily Beast, CJR, VICE News, and other publications.