Anders Fogh Rasmussen, who served as Denmark's Prime Minister and as NATO's Secretary General, discusses disinformation campaigns, media literacy, and the race to develop 5G networks.
Anders Fogh Rasmussen served as Prime Minister of Denmark from 2001 to 2009. Later in 2009, the longtime Danish parliamentarian became Secretary General of the North Atlantic Treaty Organization, a post he held until 2014. He now heads Rasmussen Global and serves on the board of the Alliance of Democracies, a Copenhagen-based organization he founded in 2017 to promote democracy and free markets.
He spoke in early September with Lindsay Lloyd, the Bradford M. Freeman Director of the Human Freedom Initiative at the George W. Bush Institute; Chris Walsh, Senior Program Manager for the Human Freedom Initiative; and William McKenzie, Senior Editorial Advisor at the Bush Institute. They discussed the challenge that disinformation campaigns present to democracies, the ways in which consumers can detect fake news, and the need for Western democracies to prevail over China in the development of technologies like 5G networks. In the end, he contends, new technologies can help strengthen democracies.
What do you see as the biggest threats from disinformation in terms of elections, both in the United States and elsewhere? And where is this disinformation coming from?
I consider disinformation a new kind of warfare. It comes from many sources. The first player in this field was Russia, but we have seen other autocratic governments picking up the playbook.
During the COVID-19 pandemic, we have seen an increase in disinformation campaigns from China. But I’m also seeing that North Korea, Iran, and Venezuela have adopted parts of this playbook. I think we will see disinformation campaigns as a major challenge, at least, for free societies in the coming years.
More specifically on Europe, what are you seeing in terms of disinformation in places like Poland, Hungary, or the Baltic countries, which obviously are of great concern to Denmark and Europe more broadly?
We have seen attempts to derail election campaigns and referendum campaigns all over Europe. I don’t think any society is shielded from disinformation campaigns.
We saw that during the Brexit referendum in the UK. We saw it in the so-called “Catalonia referendum” in Spain. We saw it in North Macedonia, where they had a referendum on solving the name dispute with Greece. We even saw it in Italy, which is already a very pro-Russian society. Nevertheless, Russia tried to polarize political discourse and debate with the aim of strengthening the extremist parties in Italy, and they succeeded. It’s happening all over Europe.
The Alliance of Democracies has talked about how some domestic actors are using a “Cross-border Disinformation Playbook” to create and accentuate divisions in authoritarian and democratic societies. What is this playbook? And how might democracies best respond to it?
We have seen that domestic actors use the same tools as state actors such as Russia. They try to polarize the domestic debate in the run-up to elections with the aim of influencing the final election outcome.
How do we counter those attempts to derail a free and fair democratic process? We have to use a lot of tools. First, we have to raise awareness so that the public understands the risk of disinformation campaigns. The more awareness, the more resilient the electorate will be.
We have to deploy what I would call “election task forces” to monitor the election campaigns. Our organization’s Transatlantic Commission on Election Integrity has deployed task forces, for instance, in Ukraine and other countries to monitor the election campaign and to detect disinformation campaigns.
We should also involve the tech companies and develop innovative tools to detect not only fake news, but also disinformation campaigns. We should do what we can to counter the false narratives.
We should counter disinformation, for instance, by setting up alternative media. I have recommended that the Baltic States set up Russian-language media to counter the disinformation coming from Russia.
The most efficient tool would be for democratically elected leaders to increase their own credibility by actually delivering on what they promised voters during the election campaign.
What, if anything, might we have learned from the pandemic in terms of disinformation?
China has been quite sophisticated in using the pandemic as a disguise for a real disinformation campaign. We all know that COVID-19 originated in China, but China has tried to profit from this tragedy by using “health diplomacy.”
They have helped vulnerable countries with medical equipment and in other ways in order to change the narrative into a more positive story about China quickly regaining control of the situation. They helped vulnerable European countries like Italy, and they have also helped countries in the Balkans and gained some popularity there. At the same time, they have tried to spread fake news about American involvement in spreading this virus.
We have learned from this COVID-19 pandemic that we should be very quick to counter false narratives. Maybe we should set up a center to counter them.
Your organization’s “Disinformation Diaries” tries to teach people about media literacy. What can consumers of information do to learn more about the reliability of information?
We should start in schools. When I went to school, we were taught how to look at traditional ads with a critical eye to see what’s behind them. We should do exactly the same: teach our children how to distinguish between true information and false information. We should start in schools to stimulate a critical approach to everything we read and hear.
You mentioned it would be good if social media companies had tools to help us as consumers discern the reliability of information. What might some of those tools be?
There may be a wide range of tools, but one tool would be to develop algorithms that could detect fake news. The Transatlantic Commission on Election Integrity developed three videos showing President Trump. We showed a true Trump. We showed a Trump where an actor pretended to speak like Trump. And we showed Trump with an electronic voice, which sounded like him.
We then asked a group of people to detect the true Trump and the fake Trump. They couldn’t.
Today, we have techniques and algorithms that can detect whether something is fake news. We should then label it and say, “There’s a clear risk that what you are watching may be fake news.” Then, let people judge.
The same could be done with more traditional media. We have met with companies that specialize in labeling the trustworthiness of different media. They look at an outlet’s transparency: Does it disclose conflicts of interest? Where did it obtain the information? How is the information verified? Does it allow links to sources? Then you can label an outlet so that we, as consumers of news, can judge for ourselves whether we believe it’s true or whether it might be fake news.
Beyond social media companies, what can or should private organizations do to combat disinformation? For example, you’re on the board of NewsGuard. They have SWAT teams that scour the web for emerging websites that might be promoting misinformation.
NewsGuard is an excellent example. It’s a private organization whose purpose is to label media according to their trustworthiness. It is a bipartisan organization that uses objective criteria to label different media. They don’t distinguish between liberal and conservative media. We look at concrete, objective criteria. Of course, it is not a 100 percent guarantee, but that’s the best private organizations can do.
I’d like to revisit something you mentioned earlier about China making inroads into some European democracies. One point of struggle has been over 5G networks. China is forcing these European nations to consider how willing they are to buy these types of advanced technologies from a superpower whose values they don’t necessarily share and may even oppose. So how should liberal democracies respond to the new cyber challenges posed by 5G networks and infrastructure?
The best overall solution would be to create an alliance of democracies. That’s why I established my foundation in 2017.
One of its purposes is to develop global norms and standards for the deployment and use of information and communication technology. In very concrete terms, we are trying to develop common norms and standards within the European Union.
Individual European countries have already decided not to accept Huawei as an operator within information and communication technology. And I think that’s the right way forward. Maybe you shouldn’t mention specific companies, but you can set up a number of standards that Huawei clearly does not fulfill.
From a security point of view, I consider it unacceptable to allow into our networks a big tech company with close links to the Chinese Communist Party, one that is supported by the government, including financially. That would be letting a spy in through our back door.
We should aim for a common European approach to Huawei, and I think that will be the final outcome of this. We have seen how the UK has now excluded Huawei from its networks, or will do so. My own country, Denmark, has already decided not to accept Huawei. Other European countries do not accept Huawei. We still lack a couple of the bigger players in Europe, but I think eventually we will agree on common standards that will be a death sentence for Huawei in Europe.
Having said that, I also think we need European-American cooperation here. If we start a competition between Europe and the U.S., we will eventually lose that race. We need to set global norms and standards for the use of information and also for the use of artificial intelligence. If we don’t do that, then the autocratic regimes in Moscow, Beijing, and other places will dictate how to use those new technologies.
You mentioned artificial intelligence. When you think about artificial intelligence and other new technologies, what impact are they likely to have on democracies around the world? They have certainly led to positive innovations, but they also have produced some economic disruptions and other challenges.
Basically, new technology is a big step forward in strengthening democracies. New information and communication technologies, including social media, have made our democracies even more democratic. As citizens, we all have a chance to communicate directly with one another without the filters we have been used to with traditional media.
Of course, they raise a lot of new challenges that we have discussed.
But instead of looking at the new technology as a problem, we should use it as part of the solution. We should use the new technology to counter disinformation. And when it comes to artificial intelligence, it’s important for the world’s free societies to set international norms and standards that protect individual rights and privacy.
We can never accept the Chinese approach, where they will use artificial intelligence to strengthen their surveillance of people. That’s why the U.S. and Europe should collaborate. I would compare this with the famous Manhattan Project or the moon landing. The U.S. and Europe have a vital interest in investing huge amounts of money to ensure that we, and not the Chinese, win this technological race.
How do societies that believe in free speech balance these challenges arising from the evolutions in information?
That’s a very important question. I have seen many left-wing people use the situation as leverage for restricting free speech and limiting people’s right to speak their minds. We have also seen religious forces globally, including within some of the more extreme Islamic states, trying to restrict free speech.
It’s very important to strike the right balance: you have the right to speak your mind freely, but you do not have the right to spread fake news. Now, I don’t have a clear answer as to how we could strike that balance in the right way. But I would caution against appointing commissioners or agencies to monitor our debate and detect disinformation. It’s not always easy to say what’s information and what’s disinformation.
Basically, we should let people judge for themselves. But we should give people better tools to judge what’s fake and what’s true. And then we should let people decide for themselves.