Instagram’s ‘sensitive content’ control will soon filter all suggested content – TechCrunch
Last year, Instagram added a way for users to filter certain types of “sensitive” content out of the Explore tab. Now, Instagram is expanding that setting, letting users apply it to recommendations across the app.
Instagram doesn’t offer much transparency about how it identifies sensitive content or what even counts as sensitive. When it introduced its sensitive content controls last year, the company framed sensitive content as “posts that don’t necessarily violate our rules but may be offensive to some people – such as posts that may be sexually explicit or violent.”
The expanded content controls will soon apply to search pages, Stories, hashtags, “accounts you can follow,” and suggested posts in the feed. Instagram says the changes will roll out to all users over the next few weeks.
Instead of allowing users to hide specific content topics, Instagram’s controls offer only three settings: one that shows you less of this content, a standard setting, and an option to see more sensitive content. Instagram users under the age of 18 will not be able to choose the latter setting.
A Help Center post further explains the content controls, describing the category as content that “hinders our ability to promote a safe community” on Instagram, including:
“Content that may depict violence, such as people fighting. (We remove graphic violence.)
Content that may be sexually explicit or suggestive, such as photos of people wearing see-through clothing. (We remove content with adult nudity or sexual activity.)
Content that promotes the use of certain regulated products, such as tobacco or tobacco products, adult products and services, or pharmaceutical drugs. (We remove content that attempts to sell or trade most controlled goods.)
Content that may promote or depict cosmetic procedures.
Content that may be trying to sell products or services based on health-related claims, such as promoting dietary supplements to help a person lose weight.”
In images accompanying its blog post, Instagram notes that “some people don’t want to see content about topics like drugs or guns.” As we noted when this option was introduced, Instagram’s lack of transparency about how sensitive content is identified, and its decision not to give users more granular content controls, invites confusion – especially its choice to lump sex and violence together as “sensitive.”
Instagram is a platform known for its hostility toward sex workers, sex educators, and even sexually suggestive emoji. The update is generally more bad news for accounts affected by Instagram’s aggressive stance on sexual content, though those communities are accustomed to adapting in order to stay in the platform’s good graces.
From where we’re standing, it’s not intuitive that a user who doesn’t want to see posts promoting weight loss scams and diet culture would also be averse to photos of people wearing see-through clothing, but Instagram is clearly painting in broad strokes here. The result is a tool that invites users to switch off a fuzzy blob of “adult” content rather than a meaningful way for users to easily avoid the things they don’t want to see while browsing Instagram’s algorithmic recommendations.