TikTok as a stage for extremist ideologies? – How TikTok and Co. are changing our reality
When we think of social networks and platforms, a familiar buzzword often comes to mind: filter bubbles. TikTok in particular is often associated with the idea that algorithms keep users trapped in their own ideology. Two young researchers from Karlsruhe have now found exciting approaches that could challenge this common assumption.
Marcel Erik Lemmer and Ioannis Theocharis from the Karlshochschule Karlsruhe are investigating how TikTok as a platform is shaping a young public and possibly promoting new forms of political debate. Their initial findings suggest that TikTok not only amplifies opinions, but could also offer space for diversity and democratic discourse – a fascinating idea that raises many questions.
In this interview, the two talk about their experiment and the dynamics they observed on TikTok. What content do we actually see – and what could this digital public sphere mean for our democracy? The answers are as surprising as they are worthy of discussion.
karlsruhe.digital (k.d.): TikTok is often perceived as a platform for superficial entertainment, but you have also found evidence in your research that it can promote political mobilization and diversity of opinion. In your opinion, what are the indications that TikTok could actually contribute to democratization – and where are the limits?
Marcel Erik Lemmer (MEL): I think we first have to recognize that large sections of democratic societies hardly have any access to day-to-day politics. People also crave larger narratives and discourses – and TikTok addresses both of these needs. Of course, it is a problem that extremists in particular have used the platform successfully to date. Low-threshold communication of political content can only be a first step towards creating a more mature political public again. TikTok has played a particularly important role in waves of protest, most recently in Georgia and Syria, by enabling rapid mobilization and the dissemination of information. At the same time, TikTok is not as simple as it seems. The complexity of jokes, memes and their political message is often underestimated as they can convey subversive and profound criticism in an accessible way. TikTok can be the best “politics for beginners” course and therefore a democratization tool.
Ioannis Theocharis (IT): Because of their low-threshold style of communication, TikTok videos may not always do full justice to the complexity of a topic. At the same time, they are part of a digital ecosystem that is constantly evolving and in which divergent perspectives coexist. It can therefore be argued that the diversity of opinions potentially increases, and with it the depth with which a topic is illuminated. It is important for us to emphasize that traditional media also “flatten” topics and present them in an under-complex way. The “depth” that could be achieved by presenting different perspectives is often missing there: the pressure to professionalize, or the desire to present socially desirable positions, frequently prevents a truly diverse presentation of opinions and viewpoints.
k.d.: So could platforms like TikTok even help to unite young people from different political camps instead of driving them further apart?
MEL: That really is key. It is important to see that TikTok not only reaches young people but also appeals to a broad section of the population. What stands out are new discourses and points of contact that did not exist until recently. TikTok has the potential to promote political education and participation, even if this potential is often overshadowed by other content. The direct confrontation is particularly exciting: political groups meet via stitches and livestreams and debate at a respectable level, often with humor and irony. On TikTok, viewers can send virtual gifts during livestreams. These gifts appear as symbols such as cowboy hats, flowers or other animations, are displayed directly in the stream, and are often humorously woven into the conversation.
k.d.: How do you explain that this deradicalizing aspect was not simply overlooked by others, but that the consensus actually pointed in the opposite direction?
MEL: We may not be that alone with our findings. Radicalization and deradicalization processes, like the concepts of filter bubbles, social bots or disinformation, are scientifically more controversial than the public debate often suggests. Overall, this seems to be primarily a problem of science communication. We can therefore be all the more grateful to formats such as karlsruhe.digital, which make such topics accessible to a wider public.
k.d.: Do you see a danger that TikTok is becoming a platform for the radicalization of young people, or does your research not support this at all?
IT: The possibility of disproportionate exposure to extremist content, at least in the initial phase of a TikTok account, cannot be completely ruled out. Social media algorithms tend to reward whatever increases the interaction rate, and for some target groups extremist content already achieves this because of its “unusualness”. This characteristic of “exceptionality” can be a decisive factor for young people. Our observations suggest that AfD TikToks, for example, can act as entry videos to extremist content; conversely, the first connection to the AfD may also be established through extremist videos. Combined with the slot-machine dynamics of these platforms, which aim to maximize interaction time with the medium, this has the potential to keep people in these political digital spaces. However, the research literature has so far done little to address the question of the extent to which people are actually radicalized, or even deradicalized, by platforms such as TikTok, particularly through the variety of content that can be presented there.

k.d.: Can you explain in more detail how the algorithm amplifies extremist content on TikTok due to its “exceptionality”?
IT: In the case of a shitstorm, for example, that is, a large number of comments with predominantly derogatory content, the algorithms “translate” this activity as significant interaction. Since interaction is the “currency” of these platforms, a wide range of interactions is rewarded, which increases the reach of particularly polarizing videos.
MEL: Shitstorms are another good example of the limits of the bubble concept. Phenomena such as “hate following” or shitstorms show that communities deliberately enter other bubbles in order to disturb their feel-good atmosphere and confront them with their content. The current understanding of the term bubbles does not cover such phenomena, which perhaps come closer to the formation of Internet gangs.
k.d.: What could TikTok do to slow down this development?
IT: As part of a conference, we had the opportunity to enter into an exchange with representatives of AlgorithmWatch. AlgorithmWatch is of the opinion that transparency of algorithms is essential. However, this intervention would have to come from higher authorities, such as the European Union. If it really is the algorithm that shapes the content, then a reform would have to be introduced that prioritizes diversity and quality more strongly. The first approach should be to persuade the platforms to take strict action against extremist content that is not compatible with our free and democratic basic order. At the same time, we believe that the responsibility lies not only with the algorithm, but also with us. We need more content from the democratic spectrum that picks up on the characteristics of the “new public sphere” – casual, rough and real. Such content can not only influence the algorithm, but also reach the audience without losing the dynamism and speed of the platform.
MEL: For both of us, the extent to which TikTok moderates at all and favors certain content is still under discussion. That all remains in the black box of the algorithm. My personal impression is that TikTok hardly intervenes at all, otherwise the platform wouldn’t work so well. TikTok itself is primarily interested in keeping users glued to their screens for as long as possible. Whether we like it or not, the platform is probably primarily a result of the preferences and needs of the many users for certain content.
k.d.: Are there differences between TikTok and other platforms such as Instagram or YouTube in terms of the promotion of political discourse and the dissemination of extremist content? What makes TikTok special in this regard?
IT: The comparison with YouTube is interesting. YouTube offers numerous video streams for users to choose from via the suggestion list. TikTok, on the other hand, is much more direct: the next video is played automatically after each swipe. In this way, TikTok deliberately reduces the active choices available to users. This direct confrontation harbors risks, such as sliding into certain content without thinking, but also offers opportunities, as users are presented with a variety of different content and formats.
MEL: Instagram has followed suit with Reels and YouTube with Shorts to copy this immediacy – that was exactly TikTok’s revolutionary approach. At the same time, TikTok remains less effective at moderating and censoring content than other platforms, probably due to its size, which makes it easier for extremist content to spread.
k.d.: What are the limits or weaknesses of your methodology? What further research would be necessary now?
IT: Our methodology is primarily qualitative and is based on the analysis of individual accounts and interactions. This allows us to gain deeper insights, but not to make statistically representative statements. A larger, quantitative study would be necessary in order to capture broader usage patterns across the entire political spectrum and to include more control groups. A detailed study of language patterns in the comments of livestreams where political groups clash would also be particularly relevant. And finally, the open questions about the algorithm and a possible bias in favor of extremist positions should be clarified.
k.d.: What long-term implications does your research have for the way social media shapes our public communication? Do you think that platforms like TikTok will have to take greater responsibility for the content they distribute in the future?
MEL: TikTok is currently setting the pace for public communication. Reels that work on TikTok also work on Instagram – and will increasingly find their place in our everyday lives in the future. When someone like Christian Lindner talks about D-Day and open field battles, or Olaf Scholz attacks him very personally in public, these narratives almost seem like products of TikTok logic. On the question of responsibility: I don’t see only the platforms as having a duty here, but also us as a society. Why are we leaving the design of these now almost essential digital infrastructures – such as social media, classified ads or car-sharing platforms – to foreign corporations? The state and civil society could take on this responsibility by making such platforms transparent and accessible to all. Social media platforms are already trying to hand responsibility back to users with fact checks and community notes. This makes the political question all the more urgent: why are the state and civil society unable to set up and manage these infrastructures independently?

Ioannis Theocharis has an interdisciplinary background in biomedical engineering, economics and social sciences. His research focuses on ecological and political economy and the analysis of mental infrastructures that maintain system path dependencies.

Marcel Erik Lemmer studied sociology and political science in Frankfurt and Vienna. His research focuses on the study of statehood, civil society and extremist movements. He is also the founder of the Young Democracy Foundation. Both work in the Future Democracies real-world laboratory.