The spread and impact of disinformation in the context of online platforms (No. 546)

This study analyzes current developments in disinformation and the dynamics of its dissemination. It evaluates a wide range of countermeasures in an international comparison and argues for a focus on hybrid and structural interventions. To strengthen the information space in the long term, the architecture and economic incentives of platforms increasingly come into focus.

The study examines the systemic causes and dissemination mechanisms of disinformation on online platforms, as well as counterstrategies against it. Beyond traditional fact-checking, it analyzes how economic incentive structures in the ad tech market and engagement-based algorithms systematically favor the spread of polarizing content.

A key finding is that disinformation is not a random anomaly but a profitable externality of the attention economy. Platforms' business models prioritize emotional content, while the opaque ecosystem of programmatic advertising unintentionally refinances disinformation actors. Technological trends such as generative AI (deepfakes as well as “AI slop”) and coordinated cross-platform strategies exacerbate this dynamic by lowering production costs and circumventing moderation filters. Psychologically, users' vulnerability stems less from knowledge deficits than from mechanisms such as “identity-protective motivated reasoning” and the passive “news finds me” effect, which can ultimately facilitate real-world harm, up to and including stochastic terrorism.

With regard to countermeasures, the evaluation of approaches such as prebunking, friction, and community notes reveals their limitations in scalability, consensus building, and possible negative side effects (the “skepticism paradox”). In the regulatory section, the study contrasts the process-oriented, risk-based model of the EU Digital Services Act (DSA) with repressive, content-focused “fake news” laws in other geopolitical contexts. The study concludes that the focus must shift from individual media literacy to structural interventions in order to curb the monetization of disinformation and enforce algorithmic transparency.