Last week, Instagram announced a new feature that allows users to curate the kind of content they see on their Explore page.
The feature, called Sensitive Content Control, lets users choose whether they want to see more or less “sensitive content” from people or accounts that they do not follow. “You can think of sensitive content as posts that don’t necessarily break our rules, but could potentially be upsetting to some people — such as posts that may be sexually suggestive or violent,” Instagram says in its press release.
“Because we show you posts from people you don’t follow on Explore, we try not to show content that some may find sensitive,” the company elaborated on Twitter following the announcement. That could include content like “posts about smoking, violent posts like people fighting, or about pharmaceutical drugs.”
The user’s regular Feed, Stories, and Reels will not be affected by this new control. Instagram has previously explained that the purpose of the Explore tab—find it by clicking the magnifying glass in the bottom panel—is to help users discover new things by recommending photos and videos that it thinks a certain user might like based on what posts they have historically liked, saved, and commented on.
Experts have mixed feelings about the impacts of the new feature. Corynne McSherry, the legal director of the Electronic Frontier Foundation, thinks that this move might have both positive and negative effects on user experience. “I think that any time that we see platforms giving users some control is a good thing,” she says. “One of the things that we see in content moderation spaces is that far too often, so much power is placed in the hands of the platforms and the decisions that they make have huge ramifications for online speech and online expression.”
Instagram, which Facebook purchased in 2012, has been repeatedly criticized for some of its content moderation decisions and censorship guidelines. Back in 2017, the company began blurring “sensitive” posts before users could view them. But even then, the bounds around what was considered “sensitive” were murky at best, and Instagram still struggles with this today. McSherry thinks that the new policies don’t fully address the issue. “The place where users still don’t have any power is in deciding what’s going to qualify as sensitive. You have a blunt instrument here, and you can dial it up or dial it down. But you can’t really influence very much what is going in in the first place,” she says. “And a lot of content that Instagram or Facebook might consider sensitive, other people wouldn’t.”
Because the filter functions like an imprecise catch-all, many people might not realize that some of the “sexually explicit” content that gets flagged is often artistic imagery or museum photos that just happen to depict nudity, McSherry explains. “Similarly, a thing that we’ve seen is that content that is actually talking about abuse—trying to raise awareness of abuse and violence—can be flagged itself as abusive and violent,” she adds. “It’s hard to really be nuanced when you have millions or billions of pieces of content to flag.”
Joshua Tucker, co-director of the NYU Center for Social Media and Politics, says that sensitivity filters usually rely on a way of coding content at scale, which will have to be done in part by automated technologies like machine learning or AI. “They get better and better over time, and violence strikes me as one of the things these types of deep learning methods are good at [spotting]. But, there’s going to be slippage with these types of things. None of these things are ever going to be perfect,” he says.
Additionally, Tucker says it is currently unclear how much this policy will affect most users. “One of the things we want to know is how much are people getting content off of the Explore tab versus content off of their followers,” he says. “If you’re announcing a policy that makes a place where you get 1 percent of your content safer, that’s really different than some place where you get 60 percent of your content.”
Instagram did not respond to a request for comment.
You can view and change your Sensitive Content Control by going to your profile and accessing the Settings menu through the three-line hamburger-like icon in the upper right corner. Tap on Account, then Sensitive Content Control. The default setting is to “Limit” some sensitive content, but you can choose to “Limit Even More” or “Allow” all sensitive content. However, the Allow option is not available to people under 18, a decision that is likely part of Instagram’s new initiative to give younger users a safer and more private experience.