The Brilliant 10: The top up-and-coming minds in science

These scientists and engineers are taking on some of the biggest challenges in medicine, chemistry, and society—and succeeding.
Popular Science

There’s a phrase that rings loudly in the heads of Popular Science editors any time we bring together a new Brilliant 10 class: “They’ve only just begun.” Our annual list of early-career scientists and engineers is as much a celebration of what our honorees have already accomplished as it is a forecast for what they’ll do next. To find the brightest innovators of today, we embarked on a nationwide search, vetting hundreds of researchers across a range of institutions and disciplines. The collective work of this year’s class sets the stage for a healthier, safer, more efficient, and more equitable future—one that’s already taking shape today. 

Turning food waste into filters

Kandis Leslie Abdul-Aziz: Assistant Professor, Chemical and Environmental Engineering; University of California, Riverside

After earning a bachelor’s in chemistry in 2007, Kandis Leslie Abdul-Aziz took a position at an oil refinery along the Schuylkill River in South Philadelphia. Part of her job was to analyze refined petroleum products, like acetone and phenol, that other industrial manufacturers might buy. She was also tasked with testing the refinery’s wastewater—which, she couldn’t help but notice, flowed out right next to a residential neighborhood. “Literally, if you looked out past the plant,” she says, “you could see houses close by.”

That was more than a decade before an explosive fire forced the refinery to close and spurred an unprecedented cleanup effort. But the experience got Abdul-Aziz thinking about the life cycle of chemical byproducts and their potential impacts on human health. She went back to school for a PhD in chemistry, and her lab at the University of California, Riverside, now focuses on giving problematic waste streams—from plastic trash to greenhouse gases—a second life.

To start, Abdul-Aziz decided to investigate whether she could convert corn stover into something with economic value. The stalks, leaves, tassels, and husks left over from harvest add up to America’s most copious agricultural waste product. Much of it is left to rot on the ground, releasing methane and other greenhouse gases. A small percentage does get salvaged and converted into biofuels, but the payoff usually isn’t worth the effort.

Abdul-Aziz and her colleagues set out to test multiple processes for turning the refuse into activated carbon, the charcoal-like substance that’s used as a filter everywhere from smokestacks to your home Brita pitcher. Her analysis, published in 2021, looks at the activated carbon produced by various methods—from charring stover in an industrial furnace to dousing it in caustic substances—and the molecular properties that affect which contaminants it can soak up. The ultimate aim: Tell her what kind of chemicals you want to clean up, and she’ll create a carbon filter that can do the trick.

Abdul-Aziz has since applied to patent her customizable process, and is looking into other sources of detritus and use cases. Wastewater treatment companies have expressed interest, she says, in using her tools on environmental toxins such as PFAS—the stubborn, hormone-disrupting “forever chemicals” ubiquitous in household products and prone to contaminating drinking water. At the same time, she has also demonstrated that she can derive activated carbon from citrus peels, and is now investigating whether she can do the same with plastic trash.

She’s also exploring an even bigger swing. Earlier this year, the National Science Foundation awarded her half a million dollars to develop absorbent materials to capture carbon dioxide emissions and help convert them back into useful materials such as polymers and fuels. Abdul-Aziz wants to identify practical recycling processes that don’t require overhauling existing infrastructure. “For us it’s about trying to develop realistic solutions for these sustainability problems so they can actually be implemented,” she explains. It’s these small steps that she believes will move us toward a truly circular economy—one where materials can be reused many times. And with any luck, her innovations will help buffer the worst impacts of the very petrochemicals that inspired her quest.—Mara Grunbaum

Harnessing the power of immunotherapy for breast cancer

Sangeetha Reddy: Assistant Professor, Internal Medicine; University of Texas Southwestern Medical Center Courtesy Sangeetha Reddy

In recent decades, immunotherapy has been a game-changer in cancer treatment. Drugs that augment the body’s natural immune response against malignant tumors have dramatically improved survival rates for patients with diseases like lymphoma, lung cancer, and metastatic melanoma. But the method has been far less successful in breast cancers—particularly the most aggressive ones. Sangeetha Reddy, a physician-scientist at The University of Texas Southwestern Medical Center, is trying to change that. “We could do better,” she says.  

Reddy works with patients with triple-negative breast cancers, so-called because the malignancies don’t have any of the three markers scientists have historically targeted with anti-cancer drugs. Even with aggressive chemotherapy and surgery, the prognosis for these patients—who account for about 15 percent of breast cancer diagnoses worldwide—is relatively poor. Immunotherapies, in particular, often fail because breast cancers tend to hobble the body’s dendritic cells, the roving molecular spies that sweep up pieces of suspicious material and carry them back to immune system headquarters to introduce as the new enemy. When the body doesn’t know what it’s supposed to be attacking, boosting its power is of little use.

Reddy is therefore trying to figure out how to restore dendritic cell function. As a physician-scientist, she uses a relatively new approach that she describes as “bedside to bench and back.” She treats patients in her clinic, conducts in vitro and mouse experiments in her lab, and designs and manages her own clinical trials. The approach creates a feedback loop: Reddy can analyze tumors excised from her own patients to assess whether treatments are working. Then she can test out new drugs on those same cancer cells. When she identifies a promising tactic, she can design clinical trials to test things like safety, dosage, and timing. At every step, she can find something in what she learns to incorporate back into her research or her patients’ care.

This cyclical strategy has led Reddy to the combination of three drugs that she’s currently testing against triple-negative breast cancer: Flt3-ligand, a protein that stimulates the proliferation of dendritic cells; a chemical that helps activate these cells and others; and anthracycline, a standard chemotherapy agent. In mice, this triad kept breast cancer tumors at least 50 percent smaller than chemotherapy alone did. “A couple of our mice, we actually cured them,” says Reddy. A Phase 1 clinical trial investigating the safety and efficacy of the regimen in people began enrolling patients earlier this year.

Though it can take years to work out all the kinks in a new cancer treatment and clear the hurdles on the way to FDA approval, Reddy’s multi-pronged strategy should streamline this process as much as possible. Doing so will allow her to pursue a transformation she’s been eyeing since she started to specialize in cancer treatment more than eight years ago. As a fellow at the MD Anderson Cancer Center, Reddy worked with melanoma patients in clinical trials of immunotherapy, which gave her a firsthand look at the treatment’s emerging potential. “We were taking patients who would have passed away within months and giving them ten years,” she says. “Just that hope that we can get there with [triple-negative breast cancer] led me to this path.”—M.G.

Decarbonizing the internet

Mohammad Hajiesmaili: Assistant Professor, Manning College of Information and Computer Sciences; University of Massachusetts Amherst Zinj Guo

The internet as we know it is inextricable from the cloud—the ethereal space through which all e-mails, Zooms, and Instagram posts pass. As many of us well know, however, this nebulous concept is anchored to the Earth by sprawling warehouses that crunch and store data in remote places. Their energy demands are enormous and increasing exponentially: One model predicts they will use up to 13 percent of the world’s power by 2030, compared to just 3 percent in 2010. Gains in computing efficiency have helped matters, says University of Massachusetts Amherst assistant professor of informatics and computer science Mohammad Hajiesmaili, but those improvements do little to reduce the centers’ impact on the environment.

“If the power supply is coming from fuel sources, it’s not carbon optimized,” explains Hajiesmaili. But renewable power is sporadic, given its reliance on sun and wind, and geographically constrained, since it’s only harvested in certain places. This is the puzzle Hajiesmaili is working to solve: How can data centers run on carbon-free energy 24/7?

The answer involves designing systems that organize their energy use around a zero-carbon goal. Several approaches are in the works. The simplest relies on schemes that schedule computing tasks to coincide with the availability of renewable energy. But that fix can’t work on its own given the unpredictability of bright sunlight and gusts of wind—and the fact that the cloud doesn’t sleep. Another strategy is “geographical load balancing,” which involves moving tasks from one data center to another based on local access to clean power. It, too, has drawbacks: Transferring data from one place to another still requires energy, Hajiesmaili notes, and, “if you’re not careful, this overhead might be substantial.”

An ideal solution, and the focal point of much of his work these days, involves equipping data centers with batteries that store renewable energy as a reserve to tap, say, at night. “Whenever the carbon intensity of the grid is high,” he says, “you can just discharge from the battery instead of consuming local high-carbon energy sources.” Even though batteries that are big enough, or cheap enough, to fully power data centers don’t exist yet, Hajiesmaili is already developing algorithms to control when future devices will charge and discharge—using carbon optimization as their guiding principle. This “carbon-aware” battery use is just one of many ways in which Hajiesmaili thinks cloud design should be overhauled; ultimately, the entire system must shift to put carbon use front and center. 
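The battery logic he describes can be sketched in a few lines of code. This is an illustrative threshold policy with invented numbers, not Hajiesmaili’s published algorithm: serve the load from the battery whenever the grid’s carbon intensity is high, and recharge (while serving the load from the grid) whenever it is low.

```python
# Illustrative sketch (not Hajiesmaili's actual algorithm): a simple
# threshold policy for a data-center battery. Each hour we look at the
# grid's carbon intensity (gCO2/kWh): when it is high, we discharge the
# battery to serve the load; when it is low, we recharge from the grid.

def schedule_battery(intensity, load_kwh, capacity_kwh, threshold):
    """Return (grid_draw, emissions) per hour for a threshold policy."""
    charge = capacity_kwh  # start with a full battery
    grid_draw, emissions = [], 0.0
    for carbon, load in zip(intensity, load_kwh):
        if carbon > threshold and charge > 0:
            from_battery = min(charge, load)
            charge -= from_battery
            draw = load - from_battery  # remainder comes from the grid
        else:
            room = capacity_kwh - charge
            draw = load + room          # serve the load and top up the battery
            charge = capacity_kwh
        grid_draw.append(draw)
        emissions += draw * carbon
    return grid_draw, emissions

# A toy evening: carbon intensity spikes after sunset, then falls.
intensity = [100, 120, 400, 450, 150]  # gCO2 per kWh
load = [10, 10, 10, 10, 10]            # kWh per hour
draws, total = schedule_battery(intensity, load, capacity_kwh=20, threshold=300)
```

In this toy run the battery covers both high-carbon hours entirely, cutting total emissions well below what drawing every hour from the grid would produce. Real systems must also forecast intensity and respect battery-degradation costs, which is where the optimization gets hard.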

Most big technology companies have pledged to become carbon-neutral—or negative, in Microsoft’s case—in the coming decades. Historically, they have pursued those goals by buying controversial offset credits, but interest in carbon-intelligent computing is mounting. Google, for one, already uses geographical load balancing and is continuing to fine-tune it with Hajiesmaili’s input, and cloud-computing company VMware has its own carbon-cutting projects in the works. In his view, though, the emerging field of computational decarbonization has applications far beyond the internet. All aspects of society—agriculture, transportation, housing—could someday optimize their usage through the same approach. “It’s just the beginning,” he says. “It’s going to be huge.”—Yasmin Tayag

Predicting how wildlife will adapt to climate change

Rachael Bay: Assistant Professor, Evolution and Ecology; University of California, Davis David Slipher/UC Davis

Evolutionary biologists typically think about changes that took place in the past, and on the scale of thousands and millions of years. Meanwhile, conservation biologists tend to focus on the needs of present wildlife populations. In a warming world, where more than 10,000 species already face increased risk of extinction, those disciplines leave a crucial gap. We don’t know which animals will be able to adjust, how quickly they can do it, and how people can best support them.

Answers to these questions are often based on crude generalizations rather than solid data. Rachael Bay, an evolutionary biologist at the University of California, Davis, has developed an approach that could help make specific predictions about how at-risk species might evolve over the coming decades. “Injecting evolution into conservation questions is really quite novel,” she says.

The central premise of Bay’s work addresses a common blind spot. Conjectures about how climate change will affect a particular creature often assume that every individual will respond similarly to its changing habitat. In fact, she points out, it’s exactly the variation between individuals that determines if and how a species will be able to survive.

Take the reef-building corals she looked at for her PhD research: Though considered among the organisms most vulnerable to extinction as a result of warming oceans, some corals already live in hotter waters than others. Bay identified genes associated with heat tolerance in the coral Acropora hyacinthus and measured the prevalence of that DNA in populations in cooler waters; from there, she was able to model how natural selection would change the gene pool under various climate-change scenarios. Her findings, published in 2017 in Science Advances, made a splash. The data indicated that the cooler-water corals can, in fact, adapt to warming if global carbon emissions start declining by 2050; if emissions don’t decline, or keep accelerating as they have been, the outlook becomes grim.
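The modeling step can be illustrated with a textbook one-locus selection model (a deliberate simplification; Bay’s actual analysis is far richer): start with the measured frequency of a heat-tolerance variant, assume warming gives its carriers a per-generation fitness edge s, and project how common the variant becomes.

```python
# Toy illustration of the modeling idea (a standard haploid selection
# model, not Bay's analysis). Each generation, the frequency p of a
# heat-tolerance allele with fitness advantage s updates as
#   p' = p * (1 + s) / (1 + p * s)
# where the denominator is the population's mean fitness.

def allele_frequency(p0, s, generations):
    """Project an allele's frequency after some generations of selection."""
    p = p0
    for _ in range(generations):
        p = p * (1 + s) / (1 + p * s)
    return p

# Starting from 5% prevalence of the tolerant variant (invented numbers):
mild = allele_frequency(0.05, s=0.02, generations=100)   # weak selection
harsh = allele_frequency(0.05, s=0.10, generations=100)  # strong selection
```

Even this cartoon shows the qualitative point: under stronger selection the tolerant variant sweeps toward fixation, but whether the sweep outpaces the warming depends on the emissions scenario, which is exactly the comparison Bay’s models formalize.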

Bay has continued her work on corals and other marine organisms, but she has also applied her method to terrestrial animals. In 2017, work she conducted with UCLA colleague Kristen Ruegg bolstered the case for keeping a Southwestern subspecies of the willow flycatcher on the US endangered list. Though the species as a whole is abundant, with a breeding range that spans most of the US and southwestern Canada, the subgroup that occupies southern California, Arizona, and New Mexico has struggled with habitat loss. The scientists demonstrated not only that the desert-dwelling birds were genetically distinct enough to merit their own listing, but also that individuals in that population have unique genes that are likely associated with their ability to survive temperatures that regularly top 100°F. Protecting this small subgroup—less than one-tenth of a percent of the total population—could help the entire species persist.

That kind of specific, forward-looking decision is exactly what Bay hopes to enable for other wildlife facing an uncertain future. Other recent work has focused on how yellow warblers, Anna’s hummingbirds, and a coastal Pacific snail called the owl limpet might shift their ranges in response to climate change. “The pie-in-the-sky goal is to make evolutionary predictions that can be used in management,” she says.—M.G.

Building an immune system from scratch

John Blazeck: Assistant Professor, School of Chemical and Biomolecular Engineering; Georgia Institute of Technology Courtesy John Blazeck

When a new pathogen invades, the immune system unleashes a suite of antibodies into the bloodstream—the bodily equivalent of throwing spaghetti at the wall to see what sticks. While most of those proteins will do an okay job of neutralizing the trespasser, a valuable few will zero in with deadly accuracy. The faster scientists can identify and replicate those killers, the better we’ll get at beating disease. Case in point: Antibody therapy helped many at-risk patients sick with COVID-19. The big challenge in studying the body’s natural response, however, is that in order to do so, people have to get sick.

John Blazeck, of Georgia Tech’s School of Chemical and Biomolecular Engineering, is developing a workaround. Instead of using the human body as a “bioreactor” for antibodies, he wants to use microbes. That way, the repertoire that fires off in response to a pathogen can be studied in, say, a flask or a chip. The dream of a “synthetic immune system” has kicked around biotech circles for the last two decades, but Blazeck’s work is ushering it into reality. “We can have a million different microbes, making a million different antibodies that would mimic what a person would be doing,” he says.

His career began in synthetic biology, a field that involves sticking genes into microbes to make them do new things. Specifically, he tried to get them to pump out biofuels. His interest in advancing health, however, led him to use his expertise to fight disease in 2013, when he injected microbes with the human genes known to produce antibodies. Recreating the immune system in this way is a colossal undertaking. “The catch is that the process has been optimized for millions of years, so it’s very hard to make it happen,” he explains.

Nevertheless, his team has made foundational progress that could underpin the future of this research. Recently, they figured out how to efficiently mutate antibody DNA after it’s been inserted into microbes, which will help them select antibodies that bind more tightly to a given pathogen. The process is meant to mimic how the immune system uses its B cells—the body’s antibody factories—to self-select the proteins that generate the strongest defenses.

Building a synthetic immune system is only half of what Blazeck is doing to supercharge immunity. The rest builds on his postdoctoral research on engineering a means to thwart cancer cells’ defenses. Tumors secrete molecules that shut down immune cells trying to get in their way. Blazeck—with his former advisor George Georgiou, of the University of Texas, Austin—found an enzyme that can render those molecules harmless, allowing the immune system to do its thing. Ikena Oncology, a company specializing in precision cancer treatment, licensed the enzyme, one of the first of its kind, in 2015. Both aspects of Blazeck’s work are at the forefront of burgeoning new fields, and he’s been heartened by the early response. “I hope that people continue to appreciate the value of trying to engineer immunity, and how it can contribute to understanding how to fight disease—and also directly fight disease,” he says.—Y.T.

Spying our future in near-asteroid flybys

Daniella Mendoza DellaGiustina: Assistant Professor; Principal Investigator, OSIRIS-APEX; University of Arizona Courtesy Daniella Mendoza DellaGiustina

The whole world will be watching when a 1,000-foot-wide asteroid called Apophis swoops by Earth in mid-April 2029. But Daniella Mendoza DellaGiustina, a planetary scientist at the University of Arizona, will be looking more closely than anyone else. Her gaze will be trained on what the space rock reveals about our past—and what it means for our future. “It’s going to captivate the world,” she says. In 2022, NASA named her principal investigator of the OSIRIS-APEX mission, which will send the OSIRIS-REx spacecraft that sampled the asteroid Bennu in 2020 chasing after Apophis.

DellaGiustina wasn’t always interested in space, but as a “cerebral young person” gazing into the famously clear skies of the desert Southwest, she had a lot of big questions: Why are we here? How did we get here? A community college class in astronomy piqued her interest. Then, a university course on meteorites led to an undergraduate research position with Dante Lauretta, who later became the principal investigator of OSIRIS-REx. DellaGiustina knew “very early on” that the research environment was right for her: “You’re actively pushing the boundary of human knowledge.” A master’s degree in computational physics led her to field work on the ice sheets of Alaska, which resemble those on other planets. Eventually, she returned to the University of Arizona, where she completed a PhD in geosciences (seismology) while working on image processing for OSIRIS-REx.

A belief that asteroids hold answers to the big questions of her youth drives her to understand them from the inside out. “They really represent the leftovers of solar system formation,” she says. “It’s kind of like finding an ancient relic.” So-called carbonaceous asteroids like Ryugu and Bennu—rich in volatile substances, including ice—may explain how water and the amino acids that jumpstarted life once made their way to Earth. They may also offer a glimpse of the future: “Near-Earth asteroids, especially, hold tremendous potential for resource utilization,” DellaGiustina says, “but one might also take us out someday.”

Apophis is not considered dangerous, but it will swing by at roughly one-tenth the distance between Earth and the Moon. “If we ever have an incoming threat to our own planet, we need to understand ‘what’s the structure of this thing?’ so that we can properly mitigate against it,” she says. With DellaGiustina at the helm, the OSIRIS-APEX project will use this once-in-7,500-years chance to study how close encounters with planets can change an asteroid. Earth’s tidal pull, for example, is expected to “squeeze” Apophis—a tug DellaGiustina hopes to measure via a seismometer dropped on the surface.

Lauretta, who has worked with DellaGiustina since she was an undergraduate, jumped at the chance to nominate her to lead the next phase of the OSIRIS mission. She had always been keen on designing experiments—Lauretta seriously considered her proposal to equip OSIRIS-REx with a dosimeter to measure the radiation risk for future asteroid-hopping astronauts. Her “decisive leadership is rare and critical for a program of this size,” he adds. On the off chance that an errant space rock ever threatens Earth, it’ll be a comfort to know she’s at work behind the scenes.—Y.T.

Making transit sustainable and equitable

Samitha Samaranayake: Assistant Professor, School of Civil and Environmental Engineering; Cornell University Charissa King-O’Brien

Picture this: It’s Tuesday morning, and you’re planning to ride the train to work. Walking to the station takes 25 minutes, so you hop on the local bus. Today, though, the bus is delayed, and doesn’t reach the station in time to catch the train. You wait for the next one. You’re late for work.

If your boss is a stickler and you rely on public transit, a missed connection can be make or break. These are the kinds of problems that Samitha Samaranayake, a computer-scientist-turned-civil-engineer at Cornell University, has made it his mission to solve. He designs algorithms to help varied modes of mass transit work more seamlessly together—and help city planners make changes that benefit those who need them most.

Before Cornell, Samaranayake spent several years studying app-based ridesharing, including the potential of on-demand autonomous car fleets. In 2017, he co-authored an influential paper showing that companies like Uber and Lyft could reduce their contribution to urban congestion if cars were dispatched and shared efficiently. But he quickly became disillusioned with entirely car-centric solutions. “It’s convenient for people who can afford it,” he says, but when it comes to moving city-dwellers efficiently and accessibly, mass transit can’t be beat.

So Samaranayake began investigating how new technology can best be incorporated into city transit systems—and possibly solve some of their most common pitfalls. Take the “last-mile problem”: the challenge of transporting people from transit hubs in dense urban areas to the less-centralized places that they need to go—like their homes in far-out neighborhoods. If these connections aren’t quick and reliable, people may not use them. And if people aren’t using a neighborhood bus line or other last-mile service, says Samaranayake, a transit agency might cut it rather than run more buses, making the problem worse.

That’s where the technology developed by ride-sharing companies becomes useful, says Samaranayake. In recent years, he’s designed algorithms to integrate real-time data from public transit with the software used to dispatch on-demand vehicles. This could let transit authorities send cars to pick up groups of people, then deliver them to a commuter hub in time to make their connections.
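The scheduling problem behind that idea can be sketched with a toy greedy grouper. This is an illustration only, with invented travel times; Samaranayake’s algorithms work with real-time transit data and far richer constraints: pack waiting riders into shared vans, and only form a group if every member can still reach the rail hub before the train leaves, given the detour each extra pickup adds.

```python
# Minimal sketch of microtransit dispatch (illustrative, not
# Samaranayake's algorithms): greedily pack riders into shared vans so
# each van still reaches the rail hub before the train departs.

def group_riders(ride_to_hub, detour_per_stop, capacity, time_left):
    """Greedily group riders (sorted by direct ride time) into vans.

    ride_to_hub: direct minutes from each rider's pickup to the hub.
    A van's trip time ~ its longest direct ride + a detour per extra stop.
    Riders who can't make the train even alone are left out.
    Returns a list of vans, each a list of rider indices.
    """
    order = sorted(range(len(ride_to_hub)), key=lambda i: ride_to_hub[i])
    vans, current = [], []
    for i in order:
        trial = current + [i]
        trip = ride_to_hub[trial[-1]] + detour_per_stop * (len(trial) - 1)
        if len(trial) <= capacity and trip <= time_left:
            current = trial  # rider fits without missing the train
        else:
            if current:
                vans.append(current)  # dispatch the full van
            current = [i] if ride_to_hub[i] <= time_left else []
    if current:
        vans.append(current)
    return vans

# Four riders, 20 minutes until the train, vans seat 3, each extra
# pickup costs 4 minutes of detour. Rider 2 (25 min away) can't make it.
vans = group_riders([12, 8, 25, 10], detour_per_stop=4, capacity=3, time_left=20)
```

The real problem adds moving deadlines, live traffic, and fairness constraints, which is why it calls for the optimization machinery ride-sharing companies built, pointed at public transit instead.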

This approach is known as “microtransit,” and after pandemic-related delays, a test project with King County Metro in Seattle launched earlier this year. It uses app-based rideshare vans to shuttle shift workers and others who live in the outskirts of the city to and from the regional rail line. Although it’s too early to measure success, Samaranayake has seen enthusiastic uptake from some commuters without many good alternatives.

That points toward his other goal: finding better ways to quantify how equitably transit resources are apportioned, so that city planners can ultimately design new systems that reach more people more efficiently. This social-justice element helps motivate Samaranayake to keep working on mass transit, even though funding has typically been more abundant for flashier technology like self-driving cars.

That could be changing: In recent years, Samaranayake and his collaborators have received nearly $5 million from the US Department of Energy and the National Science Foundation to pursue their vision. “Transit is not ‘cool’ from a research perspective,” Samaranayake admits. “But it’s the only path forward to a transportation system that is environmentally sustainable and equitable, in my view.”—M.G.

Finding the roots of neurodegenerative disease

Chantell Evans: Assistant Professor, Cell Biology; Duke University Jeff Fusco / HHMI

Anyone who’s taken high school biology knows that mitochondria are the powerhouses of cells. While it’s true that these organelles are responsible for converting sugars into energy, they also have many less-appreciated jobs, including generating heat, storing and transporting calcium, and regulating cell growth and death. In recent decades, researchers have linked the breakdown of these functions to the development of certain cancers and heart disease.

When it comes to diseases like dementia, Parkinson’s, and ALS, however, Duke University cell biologist Chantell Evans thinks it’s time to look specifically at neurons. “Mitochondria are implicated in almost every neurodegenerative disease,” says Evans. By unraveling how neurons deal with malfunctioning mitochondria, her work could open up possibilities for treating many currently incurable conditions.

Evans’ work focuses on understanding a process called mitophagy—how cells deal with dead or malfunctioning mitochondria—in neurons. There are plenty of reasons to believe brain cells might manage their organelles in unique ways: For one, they don’t divide and replenish themselves, which means the 80 billion or so we’re issued at birth have to last a lifetime. Neurons are also extremely stretched out (the longest ones run from the bottom of the backbone to the tip of each big toe), which means each nucleus has to monitor and maintain its roughly two million mitochondria over a great distance.

Before Evans launched her investigation in 2016, research on epithelial cells—those that line the surface of the body and its organs—had identified two proteins, PINK1 and Parkin, that seem to be mutated in patients with Parkinson’s disease. But, confusingly, disabling those proteins in mice in the lab didn’t lead to the mouse equivalent of Parkinson’s. To Evans, that suggested that the story of neural mitophagy must be more complicated.

To find out how, she went back to basics. Her lab watched rodent brain cells in a dish as they processed dysfunctional mitochondria. Evans gradually cranked up the stress they experienced by removing essential nutrients from their growth medium. This, she argues, is more akin to what happens in an aging human body than the typical process, which uses potent chemicals to damage mitochondria.

Results she published in 2020 in the journal eLife found that disposing of damaged mitochondria takes significantly longer in neurons than it does in epithelial cells. “We think, because [this slowness] is specific to neurons, that it may put neurons in a more vulnerable state,” she explains. Evans has also helped identify additional proteins that are involved in the best-known repair pathway—and determined that this action takes place in the soma, or main body, of a neuron but not in its threadlike extensions, known as axons. That, she says, could mean there’s a separate pathway that’s maintaining the mitochondria in the axon. Now, she wants to identify and understand that one too.

Thoroughly documenting these mechanics will take time, but Evans says charting the system could lead to precision medicine. “If we understand what goes wrong,” she says, “we might be able to diagnose people earlier… and be more targeted in trying to develop better treatment options.”—M.G.

Mapping every human cell

Aaron Streets: Associate Professor, Bioengineering, Computational Biology, and Biophysics; University of California, Berkeley Michelle Tran/Berkeley Computing, Data Science, and Society

It took the Human Genome Project a decade to lay out our complete genetic code. Since then, advances in sequencing technology have vastly sped up the pace at which geneticists can parse As, Gs, Ts, and Cs, which has allowed biologists to think even bigger—by going smaller. Instead of spelling out all of a person’s DNA, they want to create a Human Cell Atlas that characterizes the genetic material of every single cell in the body. Doing so will create “a reference map of what a healthy human looks like,” explains bioengineer Aaron Streets.

Understanding what makes individual cells unique requires insight into the epigenome—the suite of chemical instructions that tell the body how to make many kinds of cells out of the same string of DNA. “This is where the notion of the epigenome comes into play,” says Streets, who runs a lab at the University of California, Berkeley. All cells may be reading from the same book, but each one’s epigenome highlights the most relevant passages—essentially how and which genes are expressed. Streets is inventing the tools scientists need to zero in on those specifics.

Reading the epigenome is important, says Streets, because, in addition to showing why healthy cells act the way they do, it can also reveal why an individual one goes haywire and causes illness—cancer, for example. Once the markers of a rogue actor are known, he explains, researchers can develop therapeutics that address the question: “How can we engineer the epigenome of cells to fix the disease?”

Characterizing cells is highly interdisciplinary work, which Streets is perfectly suited for. He majored in art and physics but “just wasn’t good at” organismal biology. It wasn’t until graduate school, where he worked with a physicist-turned-bioengineer, that he realized how much insight gleaned from math, physics, and engineering could benefit the study of living things.

As a start, this year Streets and his colleagues published a protocol in the journal Nature Methods for reading particularly mysterious parts of the genome. The tool identifies sections within hard-to-read DNA regions that bind proteins—and thus have epigenomic significance—by bookending the strings with chemical markers called methyl groups. To James Eberwine, a pharmacology professor at the University of Pennsylvania and a pioneer of single-cell biology, “it is going to be very useful” for building a cell atlas.

Now, Streets’s lab is building new software to piece together the millions of sequences that comprise a single cell’s genome. And, because mapping every single anatomical cell will require a fair bit of teamwork, the programs they create are shared freely with other scientists who can use the tools to make their own discoveries. “If you look at really huge leaps in progress in our understanding of how the human body works,” says Streets, “they correlate really strongly with advances in technology.”—Y.T.

Crunching the numbers to get ahead of outbreaks

Daniel Larremore: Assistant Professor; University of Colorado Boulder Glenn Asakawa, University of Colorado Boulder

Like everyone in early 2020, Daniel Larremore wondered whether this virus making its way around the globe was going to be a big deal. Would he have to cancel the exciting academic workshop he had planned for March? What about his ongoing research on the immune-evading genes of malaria parasites?

As the answers became clear, so did his next big task: predicting the trajectory of the disease so that scientists and policymakers could get ahead of it. “You have a background in infectious diseases and mathematical modeling,” thought the University of Colorado Boulder computer scientist. “If you’re not going to make a contribution when there’s a global pandemic, when are you going to step up?” He put his work on the epidemiology of malaria on hold as he emailed colleagues studying the emerging outbreak to ask how his lab could help. “I sent that mid-March,” he says, “and didn’t stop working until early to mid-2021.”

Before coming to Boulder, Larremore had been a postdoctoral fellow at the Harvard T.H. Chan School of Public Health, where he was first immersed in the world of infectious disease—how it was transmitted, how it evaded immunity, and how to model its spread. It prepared him well for the first wave of COVID-19 research questions, which were all about working around the shortcomings of antibody tests. At the time, they were the only tools available for counting infections, but their sensitivity and specificity varied widely. A paper he co-authored in those early months described how to estimate infection rate, a key metric in justifying public health measures like mask mandates and social distancing.
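The underlying statistical problem can be shown in miniature with the standard Rogan-Gladen correction. This is a textbook estimator used here for illustration, with invented numbers; the article doesn’t specify the paper’s exact method: a test’s imperfect sensitivity and specificity distort the raw positive rate, and the correction backs out the true infection rate.

```python
# The general problem, in miniature (the standard Rogan-Gladen
# correction, not necessarily the paper's method): an antibody test
# with imperfect sensitivity and specificity inflates or deflates the
# raw positive rate, so the true prevalence must be backed out.

def corrected_prevalence(raw_positive_rate, sensitivity, specificity):
    """Estimate true prevalence from an imperfect test's positive rate."""
    est = (raw_positive_rate + specificity - 1) / (sensitivity + specificity - 1)
    return min(max(est, 0.0), 1.0)  # clamp to a valid proportion

# Invented survey: 9% of samples test positive on a test that catches
# 80% of true infections (sensitivity) and correctly clears 95% of the
# uninfected (specificity). False positives inflate the raw count, so
# the estimated true rate comes out lower than 9%.
true_rate = corrected_prevalence(0.09, sensitivity=0.80, specificity=0.95)
```

Here the corrected estimate lands around 5 percent, well below the raw 9 percent, showing why uncorrected antibody surveys early in the pandemic could badly mislead policymakers.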

As the pandemic wore on, Larremore and his collaborators continued to think forward: “What’s the question we’re going to be asking six months from now that we’ll wish we had the answer to right away?” The research they conducted now underpins much of American COVID policy: Their modeling found that speed, not accuracy, in testing was more important for curbing viral spread; that the success of immunity passports depended on the prevalence and infectiousness of the virus; and that elderly and medically vulnerable people should be prioritized for vaccination. “Dan did a huge amount of work across a number of different disciplines, and I think the contributions he’s made have really been remarkable,” says Yonatan Grad, an associate professor at the Harvard T.H. Chan School of Public Health who frequently collaborates with Larremore.

While his work on COVID-19 winds down, Larremore is already helping develop a general theory of disease mitigation involving at-home testing. Through modeling, he’s hoping to find out how much testing might slow the spread of different infectious diseases—and how that changes from one disease or variant to the next. He’s excited about leveraging the jump in public science literacy induced by COVID-19: “If you tell people to self-collect a nasal swab, they’ll do a great job at it,” he says. He imagines a world where the public can reliably self-diagnose common illnesses like flu, and take the appropriate steps (wearing a mask, opening windows) to protect others. “That just seems really empowering,” says Larremore. “And, potentially, a cool future.” —Y.T.