YouTube science videos are riddled with scams, plagiarism, and misinformation

The site is a crucial tool for ‘edutainment,’ but it lacks quality control.
Upstart video makers have benefited young and old audiences alike. But how do these YouTube science communicators source their facts? Rachit Tank/Unsplash


Dan Garisto is a science journalist based in New York. He writes about physics and has been published in outlets including Scientific American, Symmetry, Science News, Hakai, and Nature News. This story originally featured on Undark.

Siraj Raval enthusiastically begins his YouTube videos by greeting his audience with the first two words any aspiring coder learns to produce: “Hello world!”
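
For readers who have never programmed: “Hello, world!” is, by long-standing convention, the first program a newcomer writes, a quick check that the tools are installed and working. In Python, for example, it is a single line:

    print("Hello, world!")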

A self-described technology activist, Raval has built a YouTube following of almost 700,000 over the past four years with popular videos like “TensorFlow in 5 Minutes,” which explains a popular software platform used in artificial intelligence research, and “How to Make Money as a Programmer in 2018.” To keep his audience entertained and engaged with topics like machine learning and Bitcoin, Raval uses flashy graphics, memes, and even raps. The schtick is often referred to as edutainment—a cross between education and entertainment.

Edutainment aims to teach, “but it employs some of the strategies of entertainment in order to do so,” says Gordon Carlson, an associate professor in the communications studies department at Fort Hays State University in Kansas. On YouTube, edutainment is both a popular and wildly diverse genre, encompassing videos about the mathematics of cake cutting and what tattooing looks like in slow motion. A 2018 Pew survey of nearly 5,000 adults in the US found that about nine in 10 users of the site value it as a learning resource, while in a similar study, 60 percent of Generation Z respondents said they preferred learning from YouTube over learning from books. Top YouTube channels like Raval’s receive the majority of these views—one 2018 estimate found that the top 3 percent of channels take in 85 percent of all views.

While Raval’s videos have been praised for their high production quality and accessibility, his work has recently been called into question. Last summer, for instance, his online course called “Make Money with Machine Learning” turned out to be effectively a scam, as The Register reported, and Raval was forced to refund hundreds of students. Around the same time, Raval published an academic paper that was later revealed to be plagiarized.

Backlash ensued. Critics found multiple instances where Raval failed to properly attribute code and questioned his lack of credentials. Though he claimed to be a data scientist, Raval never graduated from college and had little industry experience. Raval has since posted two video apologies admitting that he’d made mistakes, but saying that he would remain committed to inspiring people to learn computer science. (He did not respond to multiple requests from Undark for comment.)

Researchers who study platforms like YouTube have been concerned about falsehoods—as well as other worrying trends, such as toxic ideologies—for years. “As one colleague recently said, ‘Basically, YouTube is the Wild West,’ ” says Joachim Allgaier, a sociologist who studies science communication at RWTH Aachen University in Germany. Raval’s story, then, is unusual not for what he did, but for what he didn’t do. None of his videos were “fake news” or pseudoscientific. He didn’t spread conspiracy theories or hate. Instead, his hustle was built on entertaining, nominally educational videos.

There’s no reason to think that Raval’s specific misconduct is part of a pattern by YouTube edutainers, but the case raises questions about qualifications, substance, and accuracy on a platform that is the primary source of extra-scholastic science education for millions of people.

Drawing conclusions about the vast amount of scientific information, let alone edutainment specifically, on YouTube is difficult because it remains understudied, according to Asheley Landrum, an expert in science communication at Texas Tech University.

One exception is health information, which has been comparatively well-surveyed, with dozens of studies that examine the quality and accuracy of YouTube videos across topics such as anorexia, heart attacks, and smoking. There’s no consensus, but in general, there appears to be plenty of misinformation. A 2011 survey of anorexia videos on YouTube found that 41 out of 140 were pro-anorexia. In 2015, researchers examined 200 videos about asthma and found that 38 percent promoted unsupported alternative treatments, including acupuncture and ingestion of live fish.

YouTube has taken at least some action against bogus health information. In early 2019, for example, the company demonetized anti-vaccination videos by removing ads, and changed its recommendation algorithms to fight other conspiracy theories. According to a YouTube spokesperson, videos like these, which the company considers “borderline content,” are now watched 70 percent less often.

Allgaier remains unconvinced: “I mean, this is how YouTube works—to get as much traffic as possible.”

Two converging trends further complicate the picture. First, thanks in part to advances in video production, it is now easier than ever to make and upload videos. “The credentials for traditional edutainment programming matched more closely the credentials for being a traditional educator or scholar,” says Carlson. Today, anyone with the skills to be popular on YouTube can take that place. While the change isn’t necessarily bad, Carlson adds, it opens the door to abuse. Second, YouTube is increasingly mainstream, with more of its biggest stars being backed by professional teams and organizations. In theory, this should add a level of quality control. In reality, the results aren’t so clear.

Short, simple, and flashy is the recipe for success on YouTube, according to Coffee Break, a YouTuber who has produced videos critical of pop science. (In a phone interview with Undark, Coffee Break would only give his first name as Stephen, declining to provide more personally identifying information because he says his work can put him at legal risk.)

The recipe is also a problem for edutainment. “That isn’t really the way people really learn. You can’t learn TensorFlow in five minutes,” he says, referring to Raval’s videos. “Everyone would like to learn TensorFlow in five minutes. By promising that, you can gain an audience.” Successful channels spawn imitators, and the result can be a race to the bottom: TensorFlow in four minutes, then TensorFlow in three minutes. The aim, Stephen says, is often to make people “feel smart.”
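
For a sense of scale, consider the sort of toy demonstration a five-minute TensorFlow tutorial can realistically cover. The sketch below is illustrative, not taken from Raval’s videos, and assumes TensorFlow 2.x with its built-in Keras API: it fits a one-parameter line to six data points.

    # Illustrative sketch (assumes TensorFlow 2.x / Keras): learn y = 2x - 1 from six points.
    import numpy as np
    import tensorflow as tf

    xs = np.array([[-1.0], [0.0], [1.0], [2.0], [3.0], [4.0]])   # inputs
    ys = np.array([[-3.0], [-1.0], [1.0], [3.0], [5.0], [7.0]])  # targets: y = 2x - 1

    # One dense layer: a single weight and a single bias to learn.
    model = tf.keras.Sequential([
        tf.keras.Input(shape=(1,)),
        tf.keras.layers.Dense(1),
    ])
    model.compile(optimizer="sgd", loss="mean_squared_error")

    model.fit(xs, ys, epochs=500, verbose=0)    # plain gradient descent on the toy data
    print(model.predict(np.array([[10.0]])))    # prints a value close to 19

Typing that in takes a couple of minutes. Understanding what the optimizer is doing, why the loss function was chosen, or how to debug a real model that isn’t converging takes far longer.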

Many of these videos fall into what Carlson calls “infotainment,” where the goal isn’t to teach, but to pass on information, even if it doesn’t lead to understanding. “They’re not saying, ‘Here’s how you would do the problem yourself,’” Stephen agrees. “They’re just handing you a bunch of microwaved facts.”

Exceptions are channels like Khan Academy, which creates educational videos on topics from geopolitics to geoscience, with step-by-step instruction—and, critically, with the expectation that knowledge builds over time. Unlike many channels, Khan Academy doesn’t promise each video will be accessible to everyone. And in 2019, YouTube launched “learning playlists,” which cover topics like anatomy and physiology and move from beginner to advanced levels. Currently, the playlists also hide recommendations in an effort to keep viewers focused on learning.

But creating prerequisites and limitations for viewers may be a tough sell. Many popular videos promise to provide deep knowledge about complex topics like quantum mechanics and genetic engineering technology like CRISPR in a few minutes. “To be charitable, a lot of them probably wish they could go into more detail,” Stephen says. “It feels like if you want to really do your homework, and you really want to put a great video together, you’re on the wrong platform.”

YouTube isn’t the only medium to offer edutainment. Before the platform’s first video was uploaded in 2005—an 18-second clip of one of the site’s founders at the zoo, which now has more than 80 million views—television was responsible for the vast majority of the genre.

David Attenborough’s “Life on Earth,” Carl Sagan’s “Cosmos,” and kid-themed fare like “The Magic School Bus” captured generations of viewers with explorations of the universe and all the knowledge that science had to offer about it.

There were relatively few broadcasters, and in their educational programming they enforced an ethos that generally prized accuracy over entertainment, according to Carlson. But over the years, that ethos crumbled. From misleading depictions of sharks as lethal human-hunting machines during Discovery’s Shark Week to pseudoscientific documentaries like History’s Ancient Aliens, the scientific quality of the content declined.

Worries about a lack of gatekeeping fueled Andrew Keen’s 2007 book “The Cult of the Amateur,” one of the earliest broadsides leveled against user-generated content, from blogs to YouTube to Wikipedia. Keen blamed these upstarts for destroying professionally made media like the Encyclopedia Britannica and newspapers. Some aspects of the argument have aged better than others. While Wikipedia did mostly supplant Britannica and other professional encyclopedias, studies have long suggested the crowd-sourced version is just as accurate. And since YouTubing is no longer only an activity for amateurs, there have been some improvements in style and substance.

“It would be too easy to say, ‘YouTube is the key area where all the bad stuff happens and everything else is really nice and good,’ ” says Allgaier.

YouTube creators are experimenting with ways to improve educational videos. Last year, the YouTube channel Kurzgesagt, which explains scientific topics to its 10 million subscribers in short animated videos, published an unusual video titled “Can You Trust Kurzgesagt Videos?” This meta-video explains Kurzgesagt’s relatively new process of contacting multiple experts and soliciting feedback. In years past, its videos didn’t go through this filter, and in the meta-video the Munich-based team announced it was removing two older videos that it said no longer met its standards for quality.

It’s an approach that’s not so different from what traditional science publications are doing to build trust. For instance, in October 2019, Science News released a guide explaining its standards for journalism, answering similar questions about how it finds and evaluates expert sources.

Another recent development on YouTube has been the use of the description box, located directly below the video. Typically reserved for copyright disclaimers or links to other social media accounts, the box is increasingly treated by YouTubers as a bibliography for citations. The lists are frequently comprehensive. When the popular German YouTuber Rezo posted an hour-long takedown of the German government’s inaction on climate change, he prepared a 13-page document with hundreds of references to academic papers, videos, websites, and popular science articles. Even the absurdist “True Facts” nature documentary spoof has gotten serious about verifying its facts with scientists and providing sources in the description box.

It’s unclear whether user-led sourcing efforts will have a large effect. Before his recent misconduct, Raval presented himself as an expert in data science by using others’ code. In one of his apologies, he said, “All I did was reupload someone else’s code. It’s not my code. The reason I did that is just selfishness and ego.” Raval then went on to read a list of developers he’d failed to attribute code to.

Still, multiple experts Undark spoke with agreed that the efforts to clearly attribute credit were promising. “How the YouTubers deal with sources is actually more transparent,” says Allgaier. “Journalists could learn from this as well.”

 
