This story originally appeared on Task & Purpose.

There’s a war going on and, like it or not, you’ve already been drafted into it. But there’s no battlefield, no explosions, and no uniforms in this fight. Instead, it’s unfolding in Reddit threads, Facebook comments, and other social media forums where people swap information and, unwittingly or not, disinformation. Misinformation is false, incomplete, or misleading information shared without the intent to deceive; disinformation, by contrast, is deliberately designed to mislead targeted groups of people, and countries such as Russia and Iran are using it to advance their own interests in the United States.

That, at least, is the picture painted by a new RAND report on foreign disinformation spread through social media. The results are grim: nearly five years after Russian troll farms made headlines by spreading millions of misleading posts on Facebook, Instagram, and other social media platforms during the 2016 U.S. presidential election, the U.S. government’s response to the threat “remains fractured, uncoordinated, and—by many actors’ own admission—dubiously effective,” the report says.

While social media may be its latest delivery vehicle, disinformation is nothing new in warfare. During World War II, the U.S. Army’s famous Ghost Army used inflatable tanks, planes, and artillery pieces, along with fake radio messages, to convince Nazi Germany that the Allies had two more divisions than they actually did. More recently, disinformation has become much more high-tech and effective at sowing confusion in rival countries.

Russia used disinformation to try to discredit the government of Ukraine during its 2014 invasion of the country. Moscow’s army of shitposters sought to portray the government as a fascist, xenophobic, racist, anti-Semitic junta in an effort to sow unrest among the Ukrainian people, according to Foreign Policy. The trick still seems to be working: just two months ago, Sen. Ted Cruz (R-Texas) shared Russian propaganda in an attack on a U.S. Army corporal after she talked about how her two moms inspired her to join the Army in a recruiting commercial. Russia also used disinformation to try to undermine confidence in the Pfizer COVID-19 vaccine.

“These manipulations don’t create tendencies or traits in our societies,” said Molly McKew, an expert on information warfare, at a 2017 Senate hearing on Russian disinformation. “They elevate, exploit, and distort divides and grievances that already are present.”

But there might be a way to turn things around, and Air Force special operations might be near the center of it. The RAND study was commissioned by Air Force Special Operations Command, which, like Army Special Operations Command, hosts psychological warfare units. During the wars in Iraq and Afghanistan, such units became too focused on operational security (e.g., preventing data leaks and watching for aircraft spotters) and fell out of the practice of proactively detecting and countering disinformation campaigns, RAND found.

The Air Force seems to be aware of this problem. In 2016, the service created a new career field called 14F Information Operations. The first nine airmen to join the field graduated from skills training in December 2020, where they learned all about “military information support operations, operations security, and military deception,” according to a press release.

14F is a step in the right direction, RAND wrote, but the career field “lacks both the resources and the training to look for disinformation campaigns,” and it has no mechanism for passing what it finds to another element that can investigate further. The Air Force could learn from the Marine Corps, which has a Marine Information Group; training exercises in which information operations play a role in larger combat; and a three-star general who develops plans, policies, and strategies while serving as Deputy Commandant for Information, RAND wrote.

Read the rest of the article on the report over at Task & Purpose.