“There’s a very real danger that the digital public sphere will serve only one political project in the future: authoritarianism”

Three academics spoke at Publix about disinformation as a symptom of democratic disintegration. What responsibilities do social media and formal politics have in this situation? And how should journalists deal with X, Meta, and European digital legislation?

Legal scholar Clara Iglesias-Keller, political scientist Jeanette Hofmann, and right-wing media researcher Curd Knüpfer are or were engaged in research at the Weizenbaum Institute, focussing on disinformation campaigns. What follows are selected passages from their 90-minute podium discussion. The entire interview can be seen here.
Questions were posed on behalf of Publix by its director, Maria Exner. The transcription of the discussion has been carefully edited and linguistically adapted for this written version.

With the federal elections in four weeks’ time, why did you feel it was important, as academics, to express your views on disinformation?

Clara Iglesias-Keller: There’s an awful lot at stake. Curbing right-wing extremist politics in Europe and in Germany is our most urgent job at the moment. How does communication work on the extreme right? What do we know about disinformation? As academics, we have an enormous amount of empirical evidence that paints a very nuanced and, at times, even a somewhat contradictory picture. We know, for example, that voters rarely change their voting behaviour due to disinformation. With particular political groups, however, disinformation is very effective. Add to that the fact that digital platforms have been openly siding with the Trump administration in the last two months.

Are you referring to Mark Zuckerberg’s decision to stop working with fact checkers?

Iglesias-Keller: Meta exemplifies platforms getting rid of precisely the measures that were designed to offer some safeguard to digital communications. Anyone hoping to defend democracy and the democratic process must now work with the possibility that these platforms are no longer willing to cooperate. There’s a very real danger that the digital public sphere will serve only one political project in the future: authoritarianism.

According to a survey by the Bertelsmann Stiftung last year, 81 percent of Germans consider disinformation a significant threat to our democracy. How did disinformation come to be such a huge problem?

Jeanette Hofmann: Back in 2016, we had the Brexit referendum and a US presidential election. In both cases, the result was completely unexpected, and everyone found themselves asking how it was possible. Disinformation, we were told, foreign meddling, and voter manipulation. And how exactly did it happen? Digital platforms, social media. The culprit had been found, and it was digital communication. Unprotected, it was susceptible to manipulation by Russia, China, and others. The problem is that framing things this way has us, in many instances, barking up entirely the wrong tree.

How so?

Hofmann: First of all, disinformation has been linked to online platforms ever since. Secondly, we think of it as coming from elsewhere. In so doing, we overlook important sources in our own country that drive the spread of disinformation. Finally, we misjudge just how few people come directly into contact with disinformation via social media, at least in Germany. The few studies we have on the subject put the figure for Germany at between two and seven percent of the population. Where disinformation does gain traction – as was the case with the “Pizzagate” conspiracy – it is, unfortunately, traditional media outlets that are responsible for the real spread of such stories.

As things currently stand, four weeks before the elections, how much disinformation reaches someone like me, who gets their information from traditional media outlets as well as from online platforms?

Hofmann: People don’t actually see that much disinformation on the platforms themselves. What they do see are the headlines produced when, for example, Elon Musk does a Nazi salute at Donald Trump’s inauguration ceremony. News coverage ensures that everybody hears about incidents like these. And the same thing frequently applies to disinformation, even when it comes to truly outrageous claims. However, it is important to note that disinformation is not necessarily influential simply because it is widely spread.

What is the goal of disinformation if not to have people believe false factual claims?

Hofmann: It’s a misconception to think that everyone who consumes – and perhaps even actively participates in spreading – misinformation necessarily believes what they’re seeing or reading. Particularly in populist and right-wing political circles, disinformation serves as a means of signposting one’s belonging to a particular political group. It’s a way of expressing one’s loyalty, usually to male – it’s almost always male – politicians. It’s a way for people to explicitly reject political elites, and elites from particular fields of expertise, as well as quality journalism and, basically, experts of every kind. In such cases, disinformation lets people take a side.

You said that it’s a mistake to think that disinformation is exclusively a problem on social media.

Hofmann: We saw this already during Trump’s first term, with new disinformation spread by the president every day. And Friedrich Merz’s claims that it was impossible to get a dentist’s appointment because of refugees went to show that it was possible for a member of a well-established political party to do something very similar here in Germany.

Those claims were immediately refuted by many media outlets though.

Curd Knüpfer: With someone like Friedrich Merz, this type of public scrutiny still works. But the established media will no longer be able to do so consistently. If lies go unexamined or, indeed, if they are deliberately spread, we will have sacrificed one of the most fundamental functions of the public sphere.

Hofmann: This is where democracy is really coming under fire.

Some of the major platforms are disabling measures designed to prevent the spread of hate speech. Users seem to find it more difficult now to unfollow the official accounts of President Trump. Meanwhile, search results on Instagram for keywords like #abortion have disappeared. As academics in this field, have you been surprised by the events of the last few months, or had you already imagined that online platforms would go down this route?

Knüpfer: We have long overlooked the fact that behind these platforms – which have become an important part of the infrastructure of deliberation for our democracy – what we find are simply business models. Online platforms are neither free from ideology nor free from market interests. Under the current political circumstances, the owners of these platforms are changing sides. It is clear to them that, with Trump in office, regulation is no longer something to be worked out together with the state. The Trump administration is clearly indicating which measures it finds inconvenient, and Zuckerberg toes the line.

Social media started by promising us that it would make private communication easier. But political communication is at the centre of the current debate around such platforms. Your research indicates that the Right profits more from social media than other political groups.

Knüpfer: If I had to boil the research down to one point, I would say that the Right has an advantage. “The Right” as an ideology and as a political project. For one thing, social media benefits those with reductive solutions and a very emotive form of communication. Their messages come through much more clearly in this field of discourse. And that advantage also applies at the organisational level.

What’s your take on the influence of TikTok in this context, particularly on younger voters?

Hofmann: The results of the state elections in Thuringia, Saxony, and Brandenburg were very interesting. First-time voters, who had previously leaned green, left-wing, and environmentally conscious, suddenly swung to the right, frequently voting for the AfD. This was generally explained away by claiming that the AfD have a strong presence on TikTok, that the younger generation are on TikTok a lot, and that it was therefore TikTok’s fault. This really falls short of the mark.

So how do you explain the younger generation’s swing to the right?

Hofmann: The phenomenon of disinformation should be seen within the context of declining democracy. For a number of years now, political science research has been looking at how approval of democracy and trust in democratic institutions have been declining in many countries. Especially in the new federal states formed after the Wall fell, a lot of people are disappointed, and with good reason. The connections can be seen, for example, in the fact that there are a lot of AfD voters in places where public infrastructure is completely falling apart, where public transport is practically non-existent, or where the closest doctor is 50 kilometres away. I would look less to social networks like TikTok and more to what people think about democratic institutions and what those institutions actually provide.

Currently, efforts to curb disinformation are mostly invested in four measures: questionable content on social media is labelled as such; certain content is linked to fact checks; content is moderated, which includes deletion; and attempts are made to provide media education. Do these measures make sense? And, above all, are they enough?

Iglesias-Keller: I think they all make sense, yes. However, we need to see them with nuance and consider their limitations as well. Fact checks, for instance, provide the public with reliable information and are essential for pluralism. But in contexts of democratic erosion they can only do so much, since people accept disinformation regardless of whether it is true or false. They do so because it fits their political beliefs. I’d be happy to add one more measure to the list, though. We desperately need to discuss the concentration of economic and political power in the hands of a few entrepreneurs. With Elon Musk having bought X and Jeff Bezos the Washington Post, two tech billionaires now own important media companies. Historically, we had ownership rules designed to prevent such concentrations of market power. There needs to be political action here.

The “Digital Services Act” (DSA) — an EU regulation of online platforms — came into effect in January. What does it do to tackle these problems?

Iglesias-Keller: The DSA is an important step for Europe, but the regulation has its limits, of course. For example, it is the responsibility of the platforms themselves both to articulate what risks to democracy are posed by their content and to take appropriate measures. Moreover, the regulation does not address at all whether such platforms are allowed to use our personal data to create profiles detailing our political beliefs and to show us political advertising accordingly.

Hofmann: Nor do we know if the European Commission will act should a given platform not take appropriate measures to minimize risks for younger users. Will platforms be issued with genuinely painful penalties? What will the commission do if Donald Trump demands the regulation be lifted, and threatens to implement punitive tariffs if it doesn’t comply? That is something we will have to keep an eye on.

Knüpfer: One problem with the current form of regulation is the fundamental premise on which it hinges: namely, that these huge digital platforms will always exist, as if this were a law of nature. They exist and, therefore, must be regulated. The issue could have been approached completely differently, by preventing the formation of huge digital platforms as single corporate entities in the first place. What would it take to have major digital platforms where our greatest hope is not merely that a fine will be high enough to impress someone in Silicon Valley?

Audience Question: You mentioned that further dissemination through established media outlets is a problem when it comes to disinformation. But what can newsrooms do better, given that they cannot simply ignore a Nazi salute made by Elon Musk?

Knüpfer: It’s not a question of reporting who was there and what happened. Rather, the question is primarily one of explaining why he gave a Nazi salute on stage. That quickly raises the issue of who benefits from it and what role it plays. What he did was neither arbitrary nor merely provocative. As such, we have to provide context and look at the circumstances.

Photo: Norman Posselt
