Last year, Instagram added a way for users to filter some kinds of “sensitive” content out of the Explore tab. Now, Instagram is expanding that setting, letting users turn off that content in recommendations throughout the app.
Instagram doesn’t offer much transparency around how it defines sensitive content or what even counts as sensitive. When it introduced the sensitive content control last year, the company framed sensitive content as “posts that don’t necessarily break our rules but could potentially be upsetting to some people — such as posts that may be sexually suggestive or violent.”
The expanded content controls will soon apply to search, Reels, hashtag pages, “accounts you might follow” and in-feed suggested posts. Instagram says the changes will roll out to all users within the coming weeks.
Rather than letting users mute certain content topics, Instagram’s controls offer only three settings: one that shows you less of this bucket of content, the standard setting and an option to see more sensitive content. Instagram users under the age of 18 won’t be able to opt for that last setting.
In a Help Center post explaining the content controls in more depth, Instagram describes the category as content that “impedes our ability to foster a safe community.” Per Instagram, that includes:
“Content that may depict violence, such as people fighting. (We remove graphically violent content.)
Content that may be sexually explicit or suggestive, such as pictures of people in see-through clothing. (We remove content that contains adult nudity or sexual activity.)
Content that promotes the use of certain regulated products, such as tobacco or vaping products, adult products and services, or pharmaceutical drugs. (We remove content that attempts to sell or trade most regulated goods.)
Content that may promote or depict cosmetic procedures.
Content that may be attempting to sell products or services based on health-related claims, such as promoting a supplement to help a person lose weight.”
In the imagery accompanying its blog posts, Instagram notes that “some people don’t want to see content about topics like drugs or firearms.” As we noted when the option was introduced, Instagram’s lack of transparency on how it defines sensitive content and its decision not to offer users more granular content controls are troubling, particularly given its decision to lump sex and violence together as “sensitive.”
Instagram is a platform notorious for its hostility to sex workers, sex educators and even sexually suggestive emoji. The update is more bad news for accounts affected by Instagram’s aggressive parameters for sexual content, though those communities are already well accustomed to bending over backward to stay in the platform’s good graces.
From where we’re standing, it’s not at all intuitive that a user who doesn’t want to see posts pushing weight loss scams and diet culture would also be averse to pictures of people in see-through clothing, but Instagram is clearly painting in broad strokes here. The result is a tool that invites users to turn off an opaque blob of “adult” content rather than a meaningful way for users to easily avoid stuff they’d rather not see while surfing Instagram’s algorithms.