Illyboy Posted July 17

Quote:
The researchers say that their findings prove no active collaboration between TikTok and far-right parties like the AfD, but that the platform's structure gives bad actors an opportunity to flourish.

Literally what @Communion said above. I guess a solution would be for other parties to start pushing more content on TikTok to out-content the AfD (if that makes any sense).
Gesamtkunstwerk Posted July 17

It's funny how we hear this again and again. People get radicalized through social media, especially by right-wing content, yet nothing happens. No one demands that these social media companies do better; we just let them do it.
Illyboy Posted July 17

Just now, Gesamtkunstwerk said:
It's funny how we hear this again and again. People get radicalized through social media, especially by right-wing content, yet nothing happens. No one demands that these social media companies do better; we just let them do it.

I was about to say something, and this gives me a good opening for it: we should take content bubbles into consideration. One thing recommendation algorithms do is "personalize your experience", i.e. if you like something, they'll show you more of it. What this ends up doing is that after some time they stop showing you new perspectives and just regurgitate the same things. In other words, if a user starts liking some of the alt-right content recommended to them from the beginning (just because of how much of it there is), then the algorithm will push that content harder because the user reacted positively to it before. This is really only a problem with political content: if I like dog videos and TikTok recommends me so many dog videos that I stop getting cat or bird videos, that's not much of a "problem", but it is one when the same mechanism radicalizes people's political views.
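To make the "content bubble" mechanism above concrete, here is a minimal sketch of that kind of like-driven feedback loop, assuming a toy recommender that just reweights topics by past likes. The topics, weights, and numbers are all hypothetical and are not meant to describe TikTok's actual system:

```python
import random
from collections import Counter

# Hypothetical topic pool; weights start out equal.
TOPICS = ["dogs", "cats", "birds", "far_right_politics", "other_politics"]

def recommend(weights, k=10):
    """Sample a feed of k videos, favoring topics the user engaged with before."""
    return random.choices(TOPICS, weights=[weights[t] for t in TOPICS], k=k)

def simulate(liked_topic, rounds=25):
    """Run a few feed/like cycles in which only one topic ever gets liked."""
    weights = Counter({t: 1.0 for t in TOPICS})
    for _ in range(rounds):
        for video in recommend(weights):
            if video == liked_topic:       # the user only "likes" this topic
                weights[video] += 1.0      # each like boosts its future weight
    return weights

if __name__ == "__main__":
    final = simulate("far_right_politics")
    total = sum(final.values())
    for topic, weight in final.most_common():
        print(f"{topic:>20}: {weight / total:.0%} of future recommendations")
```

After a couple of dozen rounds the liked topic crowds out everything else: the "no more cat videos" effect, only with politics instead of pets.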
Communion Posted July 17

7 minutes ago, Illyboy said:
Literally what @Communion said above. I guess a solution would be for other parties to start pushing more content on TikTok to out-content the AfD (if that makes any sense).

Sadly, it's Germany, so the liberals in the country seem more concerned with banning pro-Palestine content from the platform than with generating content that actually engages their young voters, or with finding a way to target actual hate speech rather than kids chanting "from the river to the sea".
Illyboy Posted July 17

Just now, The7thStranger said:
[quoted image]

Now that's the most HD Pikachu surprised face image I've seen.
on the line Posted July 17

1 hour ago, Communion said:
This. It's not "TikTok is made to serve propaganda". It's called a feedback loop: you search for content and then you're delivered more of that content. It amplifies general sentiments that already exist. So in a conservative country like Germany, conservative users will end up populating and perpetuating more of that content, feeding it back into the algorithm as consumption and keeping it high in your recommendations. Most apps do this. I send my mom a cat video on Instagram and now my feed is largely cat videos. TikTok just does this extremely well, plus the swipe-up-for-new-content mechanism is particularly pleasing and lets you consume more content quickly. In countries like America, where the user base is largely pro-Palestine, you'll end up getting recommended more pro-Palestine videos. In contrast, from what I've seen of TikTok in Israel, young Israelis have basically locked themselves in a hasbara, pro-genocide feedback loop that will take decades of forced re-education to fix. But TikTok hasn't created that out of thin air as part of some Chinese scheme; that's just part of Israeli culture, like how far-right sympathies are part of German youth culture. People blame conspiracy when they don't like a piece of technology accidentally acting as a mirror and reflecting their society's values back at them.

Agreed. See the Official 2024 Presidential Thread for another example.
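The quoted post describes amplification at the level of a whole user base rather than a single user. A minimal sketch of that dynamic, assuming engagement simply mirrors sentiment that already exists offline and the platform re-allocates the feed in proportion to engagement (the 55/45 split and every other number here is made up for illustration):

```python
from collections import Counter

# Hypothetical national user base: engagement mirrors sentiment that already
# exists offline (the 55/45 split is invented for illustration).
USERS = ["conservative"] * 55 + ["progressive"] * 45

def next_feed_share(feed_share):
    """One cycle: users engage with content matching their existing lean,
    and the platform re-allocates the feed in proportion to that engagement."""
    engagement = Counter()
    for lean in USERS:
        engagement[lean] += feed_share[lean]   # more exposure -> more engagement
    total = sum(engagement.values())
    return {lean: engagement[lean] / total for lean in feed_share}

share = {"conservative": 0.5, "progressive": 0.5}   # start with a balanced feed
for step in range(5):
    share = next_feed_share(share)
    print(step, {lean: round(s, 2) for lean, s in share.items()})
```

A 55/45 split offline turns into roughly a 3-to-1 split in the feed within five cycles; the loop does not invent the sentiment, it just amplifies whatever was already in the majority.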
Illyboy Posted July 17

3 minutes ago, Communion said:
Sadly, it's Germany, so the liberals in the country seem more concerned with banning pro-Palestine content from the platform than with generating content that actually engages their young voters, or with finding a way to target actual hate speech rather than kids chanting "from the river to the sea".

And I thought Spain was cooked...
NextBish90 Posted July 17

It's a Chinese app, what would we expect? It's not only in Germany, though.
Gesamtkunstwerk Posted July 17

4 minutes ago, Illyboy said:
I was about to say something, and this gives me a good opening for it: we should take content bubbles into consideration. One thing recommendation algorithms do is "personalize your experience", i.e. if you like something, they'll show you more of it. What this ends up doing is that after some time they stop showing you new perspectives and just regurgitate the same things. In other words, if a user starts liking some of the alt-right content recommended to them from the beginning (just because of how much of it there is), then the algorithm will push that content harder because the user reacted positively to it before. This is really only a problem with political content: if I like dog videos and TikTok recommends me so many dog videos that I stop getting cat or bird videos, that's not much of a "problem", but it is one when the same mechanism radicalizes people's political views.

Definitely. It's also worth noting that right-wing content creators are very good at staying in the algorithm through grifting, rage-baiting, and saying outrageous things, and that generates a HUGE amount of money for whatever social media company hosts them, so it gets pushed harder than a lot of other content.
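One way to picture the "rage bait stays in the algorithm" point: if the ranker optimizes for raw engagement (comments, shares, watch time) rather than approval, outrage content can outrank better-liked content. The scoring weights and numbers below are entirely invented for illustration and are not any platform's real formula:

```python
# Toy ranking score, assuming the platform optimizes for raw engagement
# (likes, comments, shares, watch time) rather than approval.
# All weights and numbers are invented for illustration.

def engagement_score(video):
    return (1.0 * video["likes"]
            + 3.0 * video["comments"]       # comment-section arguments count as engagement too
            + 2.0 * video["shares"]
            + 0.5 * video["watch_seconds"])

videos = [
    {"name": "calm policy explainer", "likes": 400, "comments": 30,  "shares": 50,  "watch_seconds": 9000},
    {"name": "rage-bait hot take",    "likes": 350, "comments": 900, "shares": 400, "watch_seconds": 15000},
]

for v in sorted(videos, key=engagement_score, reverse=True):
    print(f"{v['name']}: score {engagement_score(v):,.0f}")
```

The hot take "wins" despite having fewer likes, because arguments in the comments and outrage shares count as engagement just as much as approval does.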
Communion Posted July 17

7 minutes ago, on the line said:
Agreed. See the Official 2024 Presidential Thread for another example.

Your posts work as a great example of liberals building conspiracies about something insidious and nefarious working from the shadows, because they can't contend with the fact that their political views are antiquated and no longer reflected by the majority of people in their country, and thus all views but their own must be artificial. Something something Russia! Interesting that neoliberalism only seems to still thrive on Twitter, the platform where we know the owner has an editorial bias in what content gets promoted.
Digitalism Posted July 17

The thing is, people are not thaaaat easy to convince. They were probably people who were already on the fence and leaning toward the right.
Illyboy Posted July 17

14 minutes ago, Digitalism said:
The thing is, people are not thaaaat easy to convince. They were probably people who were already on the fence and leaning toward the right.

Yep, see what was written above about bubbles/feedback loops. If someone is already right-wing, social media algorithms will just show them even more right-wing content, and they'll likely radicalize. It could explain the polarization we have right now, with many people being very right-y, many people being very left-y, and the main "centre-left" and "centre-right" parties trying to keep some voters, if they haven't already failed to.

Spoiler:
(I mean, over here the main "centre-right" party did gain votes in the last elections (the EU elections), but mostly because they rallied against some specific policies that are very unpopular with pretty much everyone, basically snatching moderate voters from the main centre-left party and the leftovers of the libertarian party, which was already on the way out after failing to distinguish itself from that main centre-right party. The far right only "tanked" because a different far-right party popped up and stole potential voters from them. The far left has been tanking for years because it has split into a million pieces, so the voters never concentrate on one single party, and that means fewer seats for all of them, if they get any seats at all. For the sake of completeness, since I've already mentioned every other party: the centre-left party also lost voters, but surprisingly not many seats.)
Digitalism Posted July 17

3 minutes ago, Illyboy said:
Yep, see what was written above about bubbles/feedback loops. If someone is already right-wing, social media algorithms will just show them even more right-wing content, and they'll likely radicalize. It could explain the polarization we have right now, with many people being very right-y, many people being very left-y, and the main "centre-left" and "centre-right" parties trying to keep some voters, if they haven't already failed to.

Yeah, it's the far right trying to get the centre votes. In reality, not that many people will really change their minds, though. But it can help win an election in a close horse race.
ScorpiosGroove Posted July 17

Highly dubious Chinese spy app spreads propaganda, I'm shocked.
vale9001 Posted July 17

1 hour ago, Communion said:
TikTok is literally the most prominent platform leftist activism is being built on, so much so that the US is literally trying to ban it because anti-Israel sentiment on the app has grown to the point of intervening in national politics.

Leftist propaganda too. It's still China, Russia, and Iran behind it.
Communion Posted July 17

25 minutes ago, vale9001 said:
Leftist propaganda too. It's still China, Russia, and Iran behind it.

Do you think the frogs are gay too?