People are at the heart of effective digital defense
Therefore, content moderation—the monitoring of user-generated content (UGC)—is essential to the online experience. In his book Custodians of the Internet, sociologist Tarleton Gillespie writes that effective content moderation is necessary for digital platforms to function, despite the “utopian notion” of an open internet. “There is no platform that does not impose rules, to some degree—not to do so would simply be untenable,” he writes. “Platforms must, in one form or another, moderate: both to protect one user from another, or one group from its antagonists, and to remove the offensive, vile, or illegal—as well as to present their best face to new users, to their advertisers and partners, and to the public at large.”
Content moderation is used to address many types of content, across many industries. Done well, content moderation can help organizations keep their users safe, their platforms usable, and their reputations intact. A best-practice approach to content moderation relies on increasingly sophisticated and precise technical solutions while backstopping those efforts with human skill and judgment.
Content moderation is a rapidly growing industry, critical to all the organizations and individuals who gather in digital spaces (which is to say, more than 5 billion people). According to Abhijnan Dasgupta, practice director for trust and safety (T&S) at Everest Group, the industry was valued at approximately $7.5 billion in 2021—and experts predict that number will double by 2024. Gartner research suggests that nearly one-third (30%) of large companies will consider content moderation a top priority by 2024.
Content moderation: More than social media
Content moderators remove hundreds of thousands of pieces of problematic content every day. Facebook’s Community Standards Enforcement Report, for example, documents that in the third quarter of 2022 alone, the company removed 23.2 million incidences of violent and graphic content and 10.6 million incidences of hate speech—in addition to 1.4 billion spam posts and 1.5 billion fake accounts. But while social media may be the most widely reported example, a huge number of industries rely on UGC—everything from product reviews to customer service interactions—and consequently require content moderation.
Mary L. Gray, a senior principal researcher at Microsoft Research who also serves on the faculty of the Luddy School of Informatics, Computing, and Engineering at Indiana University, explains: “Any site that allows the intake of information that is not produced internally has need of content moderation.” Other sectors that rely heavily on content moderation include telemedicine, gaming, e-commerce and retail, and the public sector and government.
In addition to removing offensive content, content moderation can detect and eliminate bots, identify and remove fake user profiles, address fake reviews and ratings, delete spam, police deceptive advertising, reduce deceptive content (especially content that targets minors), and facilitate safe two-way communication in online messaging systems. One area of particular concern is fraud, especially on e-commerce platforms. “There are a lot of bad actors and scammers trying to sell fake products—and there’s also a big problem with fake reviews,” says Akash Pugalia, global president of trust and safety at Teleperformance, which provides content moderation support for global brands. “Content moderators help ensure products follow the platform’s guidelines, and they also remove prohibited goods.”
This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.