TikTok’s algorithm is highly sensitive – and could send you down a hate-filled rabbit hole before you know it

TikTok’s algorithm works in mysterious ways, but a Guardian Australia experiment on a blank account shows how quickly a breaking news event can funnel users down a conservative Christian, anti-immigration rabbit hole.

Last week we reported how Facebook and Instagram’s algorithms are luring young men into the Manosphere. This week, we explore what happens when TikTok’s algorithm is unleashed on a blank account in the absence of any interactions such as liking or commenting.

In April, Guardian Australia set up a new TikTok account on a completely blank smartphone linked to a new, unused email address. The profile was a John Doe: a generic 24-year-old male. We scrolled through the feed every couple of weeks.

Initially it was difficult to identify a clear theme in the videos being served through the app. Then the Wakeley church stabbing attack happened on 15 April.

For the first two days of the experiment, TikTok served up generic content about Melbourne, where the phone was located, along with videos about iPhone hacks – typical content one can expect on TikTok as an iPhone owner.

After the April attack on him, videos of Mar Mari Emmanuel’s conservative Christian sermons began appearing on the For You page of the blank account set up by Guardian Australia. Photograph: Supplied

On day three, TikTok news content began appearing, coinciding with the stabbing attack on bishop Mar Mari Emmanuel at the Assyrian Christ the Good Shepherd church in the Sydney suburb of Wakeley.

It wasn’t the video of the stabbing itself, but rather videos of Emmanuel’s evocative and conservative Christian sermons. Watching them appears to have triggered TikTok’s algorithm – more and more of his sermons were served up and conservative Christian videos began appearing one after the other.

Three months later, the algorithm is still serving up conservative Christian content, alongside videos that are pro-Pauline Hanson, pro-Donald Trump, anti-immigrant and anti-LGBTQ – including one video suggesting drag queens be fed into a woodchipper.

As with the experiment run in parallel on Instagram and Facebook accounts, no posts were liked or commented on. But unlike those platforms’ algorithms, TikTok’s appears to be much more sensitive to even the slightest interaction – including the time spent watching videos. It will keep pushing similar content to users unless they indicate they’re not interested.

“The more someone searches or engages with any type of content on TikTok, the more they will see,” a TikTok spokesperson said. “But at any time, you can totally refresh your feed, or let us know that you’re not interested in a particular video, by long pressing on the screen and selecting ‘not interested’.”

Jing Zeng, assistant professor of computational communication science at the University of Zurich, says there is a lot of randomness in TikTok’s “for you” algorithm, and early interactions can have strong implications for what you see.

“If their first pro-Trump video ‘made you look’, then the ‘for you’ algorithm may test more of such content.”


Jordan McSwiney, senior research fellow at the University of Canberra’s Centre for Deliberative Democracy and Global Governance, says TikTok’s approach differs from that of Facebook and Instagram because it has a more active recommendation system, designed to keep users engaging with videos one after the other. He says Meta is introducing this into its Reels product, which has a lot of the same features as TikTok.

An example of the content served up on TikTok’s For You page on the blank account set up by Guardian Australia. Photograph: Supplied

“We know that these platforms, they’re not operating with any kind of social licence. They’re not like a public broadcast or anything. They are beholden to one thing and one thing only, and that’s their bottom line,” he says.

“Their modus operandi is not to facilitate nuanced debate, to promote a healthy democratic public sphere. It is to create content that people will keep clicking, to keep eyeballs on the app, to keep people scrolling, because that’s advertising revenue.”

McSwiney says governments have a role in forcing tech platforms to be more transparent about how their algorithms work, as they currently exist in a “black box” that researchers have limited ability to scrutinise.

He says the platforms can’t shrug off concerns over what is being served up as merely a reflection of the society in which they operate.

“I just don’t think we should let multibillion-dollar companies off the hook like that. They have a social responsibility to ensure that their platforms are not causing harm – their platforms shouldn’t be promoting sexist content, [and] shouldn’t be promoting racist content.”
