Andrew Tate has been banned from social media. But his misogynistic content still reaches young men

Last month, controversial influencer Andrew Tate was banned from several social media platforms for violating their policies.

But nearly two weeks after those bans, the platforms are still inundated with clips of Tate making derogatory comments about women – highlighting what some media experts describe as a dangerous system: recommendation algorithms that can be manipulated to radicalize young men into adopting harmful views about women and the LGBTQ community.

And as the Tate case shows, banning controversial figures can actually make matters worse.

Tate, a former kickboxer, rose to fame after appearing on the UK reality show Big Brother in 2016. He was dropped from the show when a video of him appearing to assault a woman with a belt was made public. Tate has said the acts in the video were consensual.

Recently, he has gone viral thanks to clips shared on platforms like TikTok. These clips feature Tate, often wearing a shirt and sunglasses, making comments that are offensive to women. One notable example is a clip of Tate saying that if a woman dates a man, she “belongs” to him. In another clip, Tate suggests that women who have social media accounts are cheating.

In a video posted to Vimeo on August 23, Tate responded to the bans, saying he was “unfairly defamed” and that his comments were taken out of context.

Tate did not respond to a request for comment from CBC News.

From harmless memes to harmful content

Joanna Schroeder, a writer with a focus on gender and media representation, says content like Tate’s often starts out in a relatively innocuous fashion, but then gradually becomes more nefarious.

For example, she said, boys often visit sites like YouTube to look for videos related to Minecraft, a hugely popular video game. But YouTube’s algorithm will usually guess their age and gender – and Schroeder says it can then push harmful content at them.

WATCH | How algorithms target young men:

How algorithms target young men

Joanna Schroeder, a writer focusing on gender and media, explains why social media algorithms target young men and how this can affect what they watch online.

“There are people who want to target this demographic, who start showing them content that becomes more and more bizarre.”

Schroeder says that Tate’s appeal is partly down to how his views are framed. The idea that what he’s saying is an “unpopular opinion that no one else will say out loud” may suggest to a young person that it has merit, she said.

And since “edgy” content often presents itself as something a younger demographic should consider normal – or even funny – it slowly becomes problematic.

An example of that is the Pepe the Frog meme, which started off as a harmless cartoon frog and turned into a symbol of hate.

Pepe the Frog started out as a non-political meme but was later adopted by the alt-right movement. (Wikipedia)

It started out as a popular, non-political meme on sites like Myspace and 4chan in the 2000s. But as its popularity grew, it was co-opted by the alt-right movement.

Schroeder said Pepe began to represent “anti-gay” and “anti-women” sentiments. Teenagers may initially see the meme as a joke, she said, but over time it can influence the way young people think.

Ellen Chloë Bateman is a documentary and podcast producer who has studied online radicalization among young men and internet subcultures.

Violence against women becomes normalized, she said, embedding itself in the psyche of young men through images and memes, in what she called “a culture of intense competition and one-upmanship.”

Schroeder says this can often be seen on TikTok. Videos featuring clips from creators like Tate are often paired with gameplay footage from games like Minecraft or Call of Duty to try to draw teenagers in.

This screenshot shows a TikTok video of controversial creator Sneako alongside Minecraft gameplay. Creators try to capture the attention of young men and teenagers by combining their clips with video game footage. (@hiddeno.talks/TikTok)

At this point, she said, some social media algorithms notice a high level of user engagement – and then start serving them more “clearly racist” content.

“The algorithms that push the content are often extreme. Extreme views, hateful views get a lot of traction on places like YouTube… and TikTok,” said Schroeder.

Enter the ‘manosphere’

The parts of the internet where these memes – and sometimes more overtly racist or misogynistic content – circulate are what Bateman calls the “manosphere”.

She describes it as a space where “men’s rights activists, male separatists, nihilists, sexual predators and trolls – who often share members with neo-fascist and far-right groups – gather.”

WATCH | The ‘manosphere’: where incels, trolls and neo-Nazis meet:

What is the ‘manosphere’?

Ellen Chloë Bateman, a documentary and podcast producer, breaks down what is known as the ‘manosphere’, an area of the internet where extremist groups often congregate and target young men.

“What unites them all is a radical anti-feminist worldview,” says Bateman.

And far-right groups often use this space to target young and impressionable men, she said.

Where social media bans come in

Social media companies say they are actively working to remove this type of content – as studies have found that hate speech online is correlated with an increase in physical violence and hate crimes.

In Tate’s case, TikTok, Facebook and Instagram removed his content.

A TikTok spokesperson said “misogyny is a hate ideology that is not tolerated on TikTok” and it continues to investigate other accounts and videos that violate its policies.

The spokesperson also said that TikTok is looking for ways to “strengthen enforcement” against this type of harmful content.

That includes a partnership with UN Women and another NGO working to stop violence against women and girls, launching a new in-app hub to educate users about gender-based violence.

Bateman says partnerships like these are essential for social media spaces to become safer and more educational, especially for young people.

Twitter has also taken action against controversial creators, issuing temporary bans on Jordan Peterson, Matt Walsh and Steven Crowder. (Each was later allowed back on the platform.)

But Schroeder says bans can sometimes be counterproductive. In Tate’s case, the ban may in some ways have actually helped him.

“The bans are just drawing more attention to him,” she said. “It gave him a very large microphone.”

Moving to other platforms

Bateman agrees, pointing out that these creators often move to other apps – like Reddit, Gab, Telegram and Discord – to post their content.

She said some of these platforms are harder to monitor because of their closed group structures or subscription requirements, which makes it more difficult to research and track content. One website she found devoted to the incel subculture, which promotes misogyny and violence, has 17,000 users.

“It’s a complex online world. It’s dynamic… it’s moving. It’s spreading all over the place and these groups are basically banded together into one big group of hate.”
