Kaia Nisser & Andreas Önnerfors, Fojo Media Institute © 2024
In this second part of our conversation with Nina Jankowicz we focused on what possibly lies ahead of us in the light of the incoming US “broligarchy”, committed to disrupting, dismantling and discrediting work against disinformation in the public interest. We discussed the role of tech bosses and social media giants, aspects of identity-based disinformation, the case of Sweden and possible ways forward.
The threat of the second Trump-presidency
From the harassment Jankowicz experienced, it emerges that the American right has an obsession with the so-called “censorship-industrial complex”, as it is called in an official report from the “Committee on the Judiciary and the Select Subcommittee on the Weaponization of the Federal Government”, headed by Jim Jordan, Republican congressman from Ohio. The report reads like a surreal conspiratorial fairytale wherein asking huge internet platforms to moderate malign or fabricated content (and obvious dis-/misinformation) amounts to censoring Americans and suppressing the truth.
To Jankowicz, the work of the Subcommittee is “absurd because what he’s doing is weaponizing the federal government in order to quash public interest research” by researchers who “they say are working, conspiring with social media platforms to censor conservatives, even though the data shows that that’s not happening”. What ASP has tracked in this area of controversy is the “information laundering cycle, where a fringe blogger will make a crazy allegation that spurs an investigation by this committee, which has subpoena power” (see box below and the ASP’s report on the topic). This power is then used “so they can request documents that public institutions like the University of Washington are required to provide to third-party entities like a group called America First Legal”. The pressure against independent institutions increases when “you throw in tech moguls like Elon Musk, who is very close to Jim Jordan, and it gets even more complex. And it’s a lot of pressure for academic institutions and small nonprofits to withstand”.
During the second Trump presidency, there will likely be more to come. Jankowicz told us about a “video of Trump saying that on day one, if he were reelected, he would ban the terms mis- and disinformation, ban the government from funding any research of disinformation, and he’d instruct the DOJ, the Department of Justice, to investigate anyone involved with this censorship”. And after his re-election, “Trump, Musk, and Marc Andreessen, who’s a tech investor, have all reshared this video […] It’s not just putting pressure on these academics and researchers. It could become something much scarier”. In an episode of Alyssa Milano’s podcast, ‘Sorry Not Sorry’ (aired in June), Jankowicz was asked if she believed we now live in a post-truth era or if the truth still matters. She responded that truth means we perhaps don’t always have the answer and have different opinions, but “we still share that reality”. In the light of recent developments and increasing divisions, does Jankowicz still believe in this ‘shared reality’? She confessed that the question was “a hard one … especially given the political realities in the United States right now, and I don’t say this as a partisan: the majority of the Republican party, and frankly some Democratic voters as well who voted a split ticket, seem to either be subscribed to a different version of reality or they just don’t care about the lies and are putting their own self-interest above the truth. And that’s really worrisome to me”.
The question now becomes “how do you educate people who just don’t seem to care that someone is lying, and maybe they don’t need to be educated, but it’s about kind of returning to a shared set of values, which I think have been totally flipped on their head by Trump. The normalization of lying, the normalization of the sort of racist, sexist rhetoric that he prefers. Nine, ten years ago, all of that would have been shocking to us. And now we don’t even blink an eye”. It is still important “to have the kind of base idea about what is reality. And nobody can escape from that, but it seems to be harder and harder to do that, and we maybe switch from a, you know, rational approach to information to a trust approach, but I don’t have a solution”. Considering the threat that Trump poses, what are the global repercussions of this trend? To Jankowicz, Trump’s rhetoric “created a permission structure for other would-be autocrats around the world. And we saw this during his first term as well. It empowered Orbán, it empowered Modi, it empowered Bolsonaro. I mean, who knows what generation of Trump wannabes are coming down the pike in Europe or elsewhere?”. This sort of permission structure, Jankowicz adds, “will replicate itself around the world, particularly as other autocratic leaders are raised up in the image of Trump”.
Responsibility of Very Large Online Platforms (VLOPs)
We were curious what Jankowicz had to say both about the role of social media and about the relationship of many fact-checking and disinformation initiatives with big tech. To her, “it just feels like a lost cause, honestly, because Musk has created this race to the bottom for the tech companies, where he has decimated what Twitter used to be so much that if Zuckerberg or any of the other tech executives just put a modicum of effort forward, it looks like they’re doing a really great job”. Users on all platforms have suffered declining trust and safety as these companies have “rolled back a lot of their policies on disinformation, even disinformation that harms public health and public safety or directly denies the result of an election”.
The core problem is algorithmic amplification, which can be summed up in a simple formula: “the more enraging, the more emotional the content, the more engaging the content”. This leads “to people posting things that are either fact-free or light on facts, or are just emotional narratives meant to drive people to like, share, comment, and push that up further in kind of the virality of the algorithm. They’re doing this knowingly”. Another blow is the restriction of “access to data by third-party organizations”. One example is the monetization of the API (‘Application Programming Interface’) by X, which obstructs large-scale research into the platform and is combined with threats of prosecution. But other platforms are no better: “Facebook killed CrowdTangle, which was the social listening service that we were able to use on Facebook and Instagram. TikTok’s data access has always been poor. So nothing has changed too much there”. The proximity of Musk and Trump, branded by Jankowicz’s friend and colleague Dr. Julie Posetti as the ‘bro-ligarchy’, can be likened to Russia, where media ownership and political loyalty are systematically amalgamated.
The “symbiosis between fact-checking organizations and disinformation researchers and the platforms” had always made Jankowicz feel “a little bit uncomfortable”. Plenty of “organizations around the world in high-risk environments have their lifeline through Facebook”, but “sometimes you wonder if they’re going to be as critical as they need to be of the platforms, or potentially find themselves out of a grant that is going to be funding their work for many years”. Another example was the partnership between the Atlantic Council and Facebook aiming at preventing abuse of the platform during elections. Again, Jankowicz has “the same worry when there is that special kind of relationship and the access to data isn’t being made available to everyone, that, you know, perhaps you’re not as critical as you might be otherwise”.
Elon Musk’s grotesque remarks related to the racist riots in England in the summer of 2024 (and the pushback against such harmful and inciting comments from EU officials) were defended by vice-president-elect Vance, who threatened to withdraw support from NATO if Europe tries to regulate Musk’s platforms. In light of this we were curious about Jankowicz’s opinion on legislation like the EU Digital Services Act (DSA).
To Jankowicz, such regulations will create a more civil internet: “I think, from what I understand, the Republicans view the DSA and certain other regimes that have been introduced, in particular the Online Safety Act in the UK and the bill of the exact same name in Australia, although with different provisions, they view that as draconian anti-free speech law … when in reality, what it’s doing is empowering more people to speak. Because if platforms have a responsibility to deal with racist content, terrorist content, sexist content that already goes against their terms of service anyway, then we’re going to have a more civil internet where people can actually speak their minds”. Considering the current development, Jankowicz encourages the EU to stand strong and thinks even Musk “would be silly to withdraw” from the EU market despite his ongoing threats against European lawmakers.
A gendered (dis-)information war about identities?
Considering that political mobilization during the US presidential elections and conversations thereafter ran along gendered lines (such as the current rise of the 4B movement in a Western context) we were interested in Jankowicz’s view on the matter. Her book “How to Be a Woman Online: Surviving Abuse and Harassment and How to Fight Back” (2022) could serve as inspiration as she battled these issues firsthand. She told us that she was on her way to a conference organized by the EU External Action Service, “Identity-Based Disinformation in FIMI: countering the weaponisation of who we are” where phenomena such as gendered disinformation were going to be addressed. In the US, and particularly during the presidential elections, “misogyny was still very, very, very strong” and “the manosphere played a huge part” with figures such as Andrew Tate exercising huge influence on conversations happening among young people. The disturbing fact is that such conversations “replicate online when there are no barriers and no consequences”. So what does Jankowicz recommend against growing gender polarization online?
“In the face of the platforms not doing very much, because they’ve never really done very much for women, and in an environment where this sort of thing is becoming normalized, the only thing that I can see to do (and this is incredibly depressing) is to teach women how to better protect themselves online. And, do not get me wrong, I do not believe the onus should fall on women to protect themselves, but if we are going to maintain or gain ground in making sure that our ideas are heard, it’s the first thing that we have to do. And that’s why my second book is all about online safety. It needs to be updated now because of Musk and what he’s done to the internet”.
There is a continued need to “pressure tech platforms to treat women like equal citizens online. Because right now we are second-class citizens. I feel pretty firmly about that. And bills like the Online Safety Act in the UK and Australia’s version, which preceded the UK’s, are quite good, I think, in balancing the free speech implications of hate speech and misogynistic speech online, but it’s still something that not enough governments are talking about and tackling”. One key area Jankowicz is deeply concerned about is deepfake pornography, since it demonstrates that “we’re going to repeat all the same mistakes that we made when tech was booming and we’re not going to think proactively about how these technologies might be used to harm”. Deepfake pornography “has exploded and been used against women, almost entirely women and young girls, and teachers as well. So I don’t think it is a very bright future in the next couple of years, particularly given the resurgence or explosion in manosphere rhetoric that Trump exemplifies”.
In a choice between returning to more analogue modes of communication and staying online, Jankowicz tends “toward giving women the armor they need to continue to exist in spaces that are dominated by men, whether that is at a table that’s a national security discussion or something like Twitter” (which she has now left for Bluesky). Her book was about going through “just very basic online security and operational security steps that I think every woman needs to know. And when I was going through the worst of what I was going through, this former intelligence official who was helping me with personal security and scanning the dark web for threats against me said: it’s really good that you do what you do and you’re an expert in what you’re an expert in, because if you hadn’t taken all these precautions ahead of time, you would have been hacked, you would have been doxxed even earlier. Your personal information and pictures would be out there, like people were actively trying to find nude images of me”.
For women it is important to recognize “what the risks are so that you can be prepared for them. Because I think a lot of women, maybe they know and they recognize it, but they’re not necessarily prepared in all the ways that they could be”. One such rule is to preserve content “that would have helped with legal cases and things like that”, pointing at the importance of digital archiving. The risk is that we are losing a generation of younger women who choose to opt out of being in public: “It means that our positions and our lived experiences aren’t going to be represented to the public if women are self-selecting out of public-facing positions. That’s the real implication”. Jankowicz remains undeterred here: “And it’s one of the reasons I have felt so strongly about continuing to hold my ground online, even though I wish that I could just turn it off sometimes. But I’m not going to let the bad guys win”.
But the entire topic extends also to intersectional identities facing increased online abuse. The very idea of diversity is attacked by “domestic disinformers and abusers”, but also “used by adversaries of democracy who look at the diverse societies that we have and see cleavages. And in some cases, those cleavages are cleavages that we’ve created ourselves and not healed ourselves, like racism in the United States, for instance”.
Is Sweden a good example in the fight against disinformation?
To Jankowicz, Sweden’s informational preparedness in our current times stands out as a positive: “you also have a population that is better primed, that trusts in their government a little bit more, a lot more than the United States does or other Western democracies do. And I think that matters.” Information literacy can be seen as part of societal resilience, even if “it’s hard to compare a population that is more homogeneous and smaller to a diverse population like the United States. But you’ve got really strong media, you’ve got trust in government, and that means that putting an agency like that [the Psychological Defence Agency, MPF] at the helm of these efforts is likely to bear more [fruit] than it would in some other countries. But I think they’re [MPF] admirable.”
Sweden, Finland and Estonia in particular “are viewed, rightly so, as kind of the pinnacle of counter-disinformation work, but I don’t think it’s as plug-and-play for other democracies as people may think it is”. There are many lessons to be learned: “step one is getting, working on that trust, which has fallen so far in the United States”. There, it wasn’t even possible to agree upon policy coordination against disinformation. Trust building has to start with removing gridlocks in Congress, where “the opposite party has demonized the other for so long in the United States that people see, in particular, the federal government as just not responsive to their needs at all”. And it’s not realistic to legislate good governance: “we can’t fact-check our way out of this”.
Community-based responses
Throughout our interview, we returned to the question of how to mitigate the ongoing information disorder. To Fojo and its project to boost fact-checking and verification skills in journalism, it was important to ask Jankowicz about her opinion: “I think it’s hugely important for journalists, and I think it plays an important role for the public as well, to understand that fact-checking is part of the process of reporting out stories. And I think more journalistic organizations need to pull back the curtain on how that process works and earn readers’ and viewers’ and listeners’ trust”. However, previously there was a tendency “to rely on fact-checking as the panacea to the disinformation problem”, and those engaged in these practices frequently seemed to preach to the choir: “it can’t be the sole solution. It has to be part of a set of solutions”.
Jankowicz suggests it is in civil society where fact-checking or the broader ecosystem of “accountability projects” should live since “civil society organizations [CSOs], when they are local, have a better understanding of what makes their community tick”. For example, the ASP’s information literacy efforts will not be effective if they are “parachuting in from Washington to Idaho to do this stuff. We’re going to find a local partner that we’re going to do it with. And we’ll design the curriculum hand in hand with them because what appeals to somebody in Idaho is going to be very different than what appeals to somebody in West Virginia, what appeals to somebody in Florida”. However, “it is harder to do it that way rather than just a plug and play parachute curriculum”.
But it is also about strengthening individual resilience in a volatile information environment. Jankowicz pointed to two recent studies: one in which AI technology (a chatbot called DebunkBot) made people question their own conspiratorial beliefs, and another where participants were inoculated against emotional manipulation techniques. Understanding these techniques makes us all better equipped: “If you’re getting really mad or worked up about something, whether that’s positively or negatively, you might be being manipulated. And just having that awareness about how you’re consuming that information, whether it’s, you know, through mainstream media, fringe media, social media, understanding that, and then doing a little bit of legwork on the back end, seeing if anybody else is reporting that story the same way, seeing if you can find another source for it, doing a Google image search to see if an image has been misattributed, all this sort of stuff slows you down a little bit and makes your media consumption so that you’re not just kind of reacting emotionally to that. So that gives me hope”.
However, not least after the US elections, Jankowicz also thinks “less about technological solutions or new social media platforms or any reliance on the platforms at all” but rather “about this kind of community-based approach and getting back to basics and putting people face-to-face across tables to talk about things … that’s not a scalable solution necessarily, but it’s one that I think we need to return to”. One example that strikes her is a social platform developed to boost local community building in Vermont (“a very weird but cool state”), the Front Porch Forum. The point of such initiatives would be that “we can return that humanity and that local based connection to these interactions. And it’s just about figuring out the right way to do it for each community”.
A lot depends on restoring interpersonal relations. It is difficult to pick up an argument when people display a conspiratorial mindset or “very entrenched political views that are very different than my own”. If they are close, Jankowicz just tries “to approach them with love and say, this is why I believe what I believe. Can we find some common ground here? And when you do it with somebody you love and you’re not getting anywhere, it’s really frustrating. It’s frustrating for both sides, I think”. It is here that a chatbot like DebunkBot comes in: such a “neutral arbiter of AI is one that we should be exploiting more. But of course, there’s a lot of baked-in biases in large language models, so I’m not sure that it’s the best, foolproof solution, but worth exploring”.
Community-building requires meeting on the level and parting on the square, whereas “much of what happens on the internet happens because people are so desensitized and don’t see a human behind the screen”. Jankowicz suggests we should be more open towards those we meet online and connect on issues that unite us: “and so when we talk about community building and town halls, or like civic kind of gatherings like that, I do think that it reminds people of what we have in common”, even in strongly contested areas such as immigration, since many of us share existential experiences. The crucial effort depends on “re-injecting humanity as much as possible, even if you have to do it in this kind of digital ecosystem; the ways that I’ve had to do it can be really fruitful. It just is really hard to scale, especially as an individual”.
Diversity and disinformation
In the last part of our conversation, we returned to the question of identity-based disinformation, not least since the Quran burnings in Sweden and the campaign against Swedish social services (the “LVU campaign”) have revealed how minority issues can be abused to stoke societal tensions. Information trust in society has to move from top-down vertical to more inclusive bottom-up horizontal approaches, it seems. To address such challenges, Jankowicz referred to studies by communications scholar and sociologist Damon Centola, who has extensively researched the similarities between contagion and the spread of (reliable) information. Social change doesn’t only happen when new ideas are induced top-down (for instance through community leaders) but when they are disseminated across different points of social interaction. Thus, building information trust in vulnerable communities (or in general, where localized disinformation is prevalent) would require a more comprehensive approach.
So what can we do to remove institutional blindness among the many actors in the information ecosystem, ranging from government agencies to academia and newsrooms? Jankowicz admits that she has no good answer to that, since a lot of her work has focused on policy advice. However, it is important to her to shift towards “the bootstrap civil society work”, such as impactful training where you meet audiences whose behaviour actually might change as a result. More out-of-the-box approaches are needed “because the slate is so blank, we have to start from scratch and, in a way, it’s exciting, right? It’s an opportunity to see what might work”. Placemaking is important in our digitally interconnected societies, and “getting back to basics” might for instance imply “libraries as kind of the central space in communities, and there’s some interesting media literacy research that’s been done via libraries as well”. To Jankowicz it is crucial to “empower local organizations rather than kind of a top-down approach either from the government or from, you know, media organizations”, but it will take research and hard work to build such communities.
To close, let us return to the Fleming palace, in the shadow of the royal castle, where we had our conversation. Around 1900 it turned into Stockholm’s main telegraph station: a Swedish hub for the first global electronic communication network. Today it is a coworking space for knowledge-based companies and organizations living in the digital age. Maybe it is in places like this, where people meet and exchange ideas in person, that change towards a more civilized societal discourse can start.
The information laundering cycle is a process by which false or misleading information is spread and legitimized. Here’s how it typically works:
Origin: A fringe blogger or a less credible source makes a sensational or false claim.
Amplification: This claim is picked up by fringe media outlets and shared on social media platforms.
Mainstreaming: As the claim gains traction, it is picked up by more mainstream media outlets (with a pipeline from low-quality alternative to high-quality established media), giving it a veneer of credibility.
Political Use: Politicians or public figures may then reference these claims, further legitimizing them and using them for political gain.
Institutional Pressure: Investigations or lawsuits may be launched based on these claims, putting pressure on institutions and organizations to respond or retreat.
This cycle allows false information to gain credibility and influence public opinion, making it challenging to combat disinformation effectively.
The information laundering cycle. Adapted from ASP and Nina Jankowicz with assistance from Microsoft Copilot.
Summary of part 2 (prepared by Microsoft Copilot, checked by Andreas Önnerfors)
Nina Jankowicz, a global disinformation expert, shares her experiences and insights in an interview with Kaia Nisser and Andreas Önnerfors from Fojo Media Institute, Sweden.
Key Points:
Threat of a Second Trump Presidency: Jankowicz discusses the potential impact of a second Trump term on disinformation efforts, highlighting the “censorship-industrial complex” narrative pushed by the right. She expresses concern over the normalization of lying and the erosion of shared reality in the U.S.
Responsibility of Very Large Online Platforms (VLOPs): The role of social media companies in combating disinformation is examined, with criticism of their current efforts. Jankowicz points out the problematic relationship between fact-checking organizations and tech platforms, which may hinder critical assessments.
Gendered Disinformation and Online Safety: The conversation addresses the gendered nature of online abuse and disinformation. Jankowicz emphasizes the need for women to protect themselves online and calls for tech platforms to treat women as equal citizens.
Global Repercussions and Autocratic Trends: The influence of Trump’s rhetoric on global autocratic leaders is discussed, with examples from Hungary, India, and Brazil. Jankowicz warns of the potential rise of similar leaders in Europe and elsewhere.
Legislation and Regulation: The EU Digital Services Act (DSA) and other regulatory efforts are seen as steps towards a more civil internet. Jankowicz supports these regulations despite opposition from some U.S. political figures.
Community-Based Responses: The importance of local, community-based approaches to combating disinformation is highlighted. Jankowicz advocates for building individual resilience and fostering face-to-face interactions to counteract the effects of disinformation.
Sweden’s Role in Disinformation Preparedness: Sweden is praised for its strong media and government trust, which contribute to its effective counter-disinformation efforts. Jankowicz notes that while Sweden’s model may not be directly applicable to other countries, there are valuable lessons to be learned.
Diversity and Disinformation: The document discusses how identity-based disinformation exploits societal divisions. Jankowicz calls for more inclusive, bottom-up approaches to building information trust in diverse communities.
Conclusion: The conversation with Nina Jankowicz underscores the multifaceted nature of the fight against disinformation, emphasizing the need for robust regulatory frameworks, community-based initiatives, and greater online safety measures, especially for women. It also highlights the global implications of disinformation and the importance of maintaining a shared reality in democratic societies.