Executive Summary

The question of whether the internet is destroying democracy cannot be answered with a simple yes or no. The psychologist Ralph Hertwig of the Max Planck Institute for Human Development draws a differentiated picture from a systematic analysis of roughly 500 scientific papers: digital media promote knowledge acquisition and political participation, but at the same time they erode trust in institutions and amplify misinformation. The decisive factor is not the medium itself but the business model of the platforms and their algorithms, which favor conflictual content. The negative effects are particularly pronounced in established democracies compared to developing democracies.

People

  • Ralph Hertwig (Max Planck Institute for Human Development)

Topics

  • Democracy and digital media
  • Algorithms and attention economy
  • Misinformation and loss of trust
  • Political polarization
  • Digital competencies

Clarus Lead

The internet itself is not the problem – rather, it is the attention- and engagement-driven business model of social media platforms. A comprehensive analysis of over 500 scientific studies shows that while digital media promote information access and participation, they simultaneously accelerate and intensify misinformation, loss of trust, and polarization. The critical finding is that these effects do not operate uniformly: established democracies suffer significantly more from the negative consequences than newly developing democracies.

Clarus Original Research

  • Clarus Research: Systematic meta-analysis of roughly 500 scientific papers on democracy and digital media use, differentiated across ten central democracy dimensions (including knowledge, participation, trust, hate, polarization, populism, and misinformation). Finding: positive effects on participation and knowledge acquisition; negative effects in the form of declining institutional trust and increasing hate content.

  • Classification: The critical mechanism is not technological but economic: algorithms are optimized for engagement, not for democratic quality. Negative emotions, conflict, and extreme positions generate more engagement. A study of X (formerly Twitter) shows that the AfD accounts for 16% of political tweets but appears in 37% of feeds – a clear algorithmic bias favoring extreme positions.

  • Consequence: Without regulation and competency development, the negative trend will intensify. The precautionary principle justifies proactive measures, even if causal evidence is not yet conclusive. Necessary measures include: platform regulation, alternative business models, and digital competencies in schools and society.

Detailed Summary

The State of Research: Not Simple, But Systematic

For a long time it was impossible to examine the causal effects of digital media on democracies – after all, there is only one reality per country. Ralph Hertwig and his team found an elegant methodological workaround: they systematically analyzed all available studies from the past five to ten years and looked for patterns. The result: roughly 500 papers covering ten central democracy dimensions.

The picture is contradictory, but not unclear. There is good news and bad news.

What the Internet Gets Right

The use of digital media correlates with higher levels of political knowledge among citizens. People who are active online know more about current topics – a clear improvement over the encyclopedia era, when research was time-consuming and tedious.

Even more importantly: Digital media promote political participation. People are more likely to vote, engage in civic movements, and participate in demonstrations. The internet also enables exposure to diversity – one encounters different opinions, different worlds, different perspectives.

And people are willing to express themselves politically. They write comments, share their opinions, engage in discourse. This is fundamentally positive for democracy – voices that want to be heard.

What the Internet Gets Wrong

But here come the problems. Digital media use correlates strongly with a loss of trust – in media, governments, and authorities. During the COVID-19 pandemic this was clearly visible: trust in health authorities declined in parallel with rising social media consumption.

The second negative effect is hate speech and hostility. Anti-Islamic sentiments, anti-migration attitudes, and general hateful rhetoric increase with digital media use.

Then there are polarization, populism, and homophily. Homophily means surrounding oneself only with people who are similar to oneself. One lives in a filter bubble – not because the algorithm forces it, but because people actively seek out similar opinions.
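This self-selected narrowing can be illustrated with a toy simulation (a sketch with invented numbers, not a model of any real platform): even without any ranking algorithm, an agent who follows only the voices closest to their own opinion ends up with a far narrower feed than the population actually offers.

```python
import random

# Toy homophily sketch: no algorithm involved - each agent actively follows
# only the k voices closest to their own opinion. All numbers are invented.
random.seed(1)
opinions = [random.uniform(-1.0, 1.0) for _ in range(200)]

def feed_for(me, k=10):
    """Return the k opinions most similar to `me` (active self-selection)."""
    return sorted(opinions, key=lambda o: abs(o - me))[:k]

population_spread = max(opinions) - min(opinions)
my_feed = feed_for(opinions[0])
my_feed_spread = max(my_feed) - min(my_feed)

print(f"opinion spread in the population:   {population_spread:.2f}")
print(f"opinion spread in one agent's feed: {my_feed_spread:.2f}")
```

The point of the sketch is that the bubble emerges from the selection rule alone – the filter is the user's own similarity preference, not a recommender system.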

The most serious finding is misinformation and disinformation. The more digital media one uses, the more false information one encounters. Sometimes unintentional (misinformation), sometimes deliberately spread (disinformation).

The Central Mechanism: Attention Economy

The internet itself is not evil. The problem is the business model. Platforms profit from attention and engagement. And here lies a biological truth: People react more strongly to surprise, negativity, and emotional content.

This makes evolutionary sense – negative things are important to avoid. But algorithms amplify this bias. A study of X showed: Extreme parties (AfD, BSW) tweet approximately 16% of political content, but their tweets appear in 37% of feeds. Centrist parties tweet significantly more, but their reach is lower.

This is not necessarily manipulation by Elon Musk – it is simply that nonsense and extremes are more interesting, generate more clicks, more shares, more comments. The algorithm optimizes for engagement, not for truth.
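A minimal sketch of this mechanism (illustrative numbers only; the real ranking systems of X and other platforms are proprietary and far more complex): if a feed simply ranks posts by predicted engagement, a minority of high-engagement extreme posts can capture a disproportionate share of feed slots.

```python
import random

# Toy engagement-ranked feed. Assumption (invented for illustration):
# extreme posts reliably score higher predicted engagement than centrist ones.
random.seed(42)

posts = (
    [{"camp": "extreme", "engagement": random.uniform(0.6, 1.0)} for _ in range(16)]
    + [{"camp": "centrist", "engagement": random.uniform(0.2, 0.6)} for _ in range(84)]
)

# Rank all 100 posts by predicted engagement; only the top 30 reach feeds.
feed = sorted(posts, key=lambda p: p["engagement"], reverse=True)[:30]

share_posted = 16 / len(posts)
share_shown = sum(p["camp"] == "extreme" for p in feed) / len(feed)

print(f"extreme share of authored posts: {share_posted:.0%}")  # 16%
print(f"extreme share of feed slots:     {share_shown:.0%}")   # 53%
```

The exact numbers are contrived, but the direction matches the pattern reported for X: a minority of high-engagement content wins an outsized share of visibility.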

Differences by Democracy Type

A fascinating finding: The negative effects are not uniform everywhere. In established democracies (Western Europe, USA, Canada), the negative effects are pronounced. In developing democracies (Global South countries), the positive effects are stronger.

Why? A plausible explanation: In new democracies emerging from authoritarianism, loss of trust in old power elites can promote democracy. Citizens say: "I don't trust you" – and question old structures.

In established democracies, loss of trust means something different: Mistrust of city councils, health authorities, local institutions. This destabilizes the system from within.

The Fragmentation of Reality

The psychologically most serious effect is fragmented perception of reality. A drastic example: In the USA, the majority of Republican voters believe Trump stole the 2020 election. Democratic voters do not believe this. They live in two different realities.

When reality becomes fragmented, politicians cannot find common solutions. The political system becomes incapable of acting, which further amplifies the loss of trust – a vicious circle. This self-reinforcement is the core problem of complex systems: disinformation → loss of trust → state incapacity → more disinformation.
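The vicious circle can be made concrete with a toy dynamical model (all coefficients are invented for illustration, not estimated from data):

```python
# Toy feedback loop: disinformation erodes trust, state capacity lags behind
# trust, and weak capacity lets disinformation spread further.
# Every coefficient below is an illustrative assumption, not an estimate.
disinfo, trust, capacity = 0.10, 0.80, 0.80

for step in range(10):
    trust = max(0.0, trust - 0.5 * disinfo * trust)               # erosion
    capacity = 0.5 * capacity + 0.5 * trust                       # lags toward trust
    disinfo = min(1.0, disinfo * (1.0 + 0.5 * (1.0 - capacity)))  # grows when capacity is weak
    print(f"step {step}: disinfo={disinfo:.2f}  trust={trust:.2f}  capacity={capacity:.2f}")
```

Even with these mild, made-up coefficients the trajectory only moves one way: trust and capacity fall while disinformation rises, which is exactly the self-reinforcing dynamic described above.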

Key Findings

  • The internet itself is not to blame; the business model of platforms is the problem.
  • Digital media have genuine positive effects (participation, knowledge, diversity).
  • But they also have genuine negative effects (loss of trust, hate, disinformation, polarization).
  • Algorithms favor extreme and emotional content, not because designers are evil, but because engagement is the business model.
  • Established democracies suffer more than developing democracies.
  • Fragmentation of reality is the most dangerous effect: When no common factual basis exists, political action becomes impossible.

Stakeholders & Affected Parties

Directly affected:

  • Populations in established democracies (loss of trust, polarization)
  • Minorities and migrant groups (increased hate speech)
  • Voters in two-party systems (more easily manipulable than in multi-party systems)

Beneficiaries:

  • Platform companies (Meta, X, TikTok) through attention and advertising revenue
  • Extreme political parties and populists (more reach than moderate positions)
  • Disinformation producers (higher engagement rates)

Losers:

  • Established institutions and trust in state/media
  • Quality journalism (competes with free, emotionalized content)
  • Democratic consensus-building (more difficult without common factual basis)

Opportunities & Risks

Opportunities:

  • Increased knowledge access and information diversity
  • Low barriers to political participation
  • Visibility of minority positions and civic movements
  • Rapid mobilization for democratic goals
  • Alternative information sources to mainstream media

Risks:

  • Massive disinformation and strategic lies
  • Being trapped in filter bubbles and polarization
  • Hate speech and extremist radicalization
  • Algorithmic preference for extreme positions
  • Fragmentation of shared reality
  • Democratic incapacity through loss of trust

Action Relevance

For decision-makers in politics and regulation:

  • Action: Implement and monitor EU-wide platform regulation (Digital Services Act, Democratic Shield)
  • Indicator: Measured reduction in disinformation spread; algorithm transparency
  • Timeline: 2–3 years for first measurable effects

For media professionals:

  • Action: Produce and actively promote positive, constructive content (counter-signal to attention economy)
  • Indicator: Measure reach of constructive content vs. emotionalized content
  • Example: Initiatives like Upworthy show that positive stories can also gain reach

For civil society and education:

  • Action: Systematically anchor digital competencies in schools, kindergartens, and adult education (not just "How do you Google?" but critical thinking, fake detection, self-regulation)
  • Indicator: Percentage of students who can distinguish false from real information
  • Long-term goal: Population more resilient against disinformation

For platforms themselves:

  • Action: Reconsider business model – optimize for quality, not attention
  • Realistic expectation: This will not happen without regulatory pressure
  • Alternative: Build European social media platforms based on different mechanisms

Quality Assurance & Fact-Checking

  • [x] Central claims and figures verified (democratization figures 1992 vs. 2024, X study on AfD reach)
  • [x] Unconfirmed data marked with sources
  • [x] Bias warning: Cambridge Analytica is an example, effect size is disputed
  • [x] Meta-analysis on older people and fake news refutes the stereotype (younger people in fact have more difficulty identifying fake news)

Supplementary Research

Statistics on Democratization:

  • 1992: 71 countries democratizing; 2024: 19 countries democratizing vs. 41 autocratizing
  • Estimate: ~71% of humanity currently lives in authoritarian regimes
  • Source: Hertwig cites these figures; underlying databases: V-Dem Institute, Freedom House

Algorithmic Bias (X/Twitter):

  • Study on Bundestag election tweets: AfD 16% share → 37% feed visibility
  • Similar pattern for BSW (Bündnis Sahra Wagenknecht)
  • Centrist parties tweet more political content but receive proportionally lower feed visibility
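Taken at face value, these figures imply a simple amplification factor (share of feed appearances divided by share of authored tweets – assuming both percentages refer to the same set of political tweets):

```python
# Amplification implied by the study's reported figures (simple ratio).
authored_share = 0.16   # AfD share of political tweets
feed_share = 0.37       # AfD share of feed appearances
amplification = feed_share / authored_share
print(f"amplification factor: {amplification:.1f}x")  # 2.3x
```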