A national counter-disinformation strategy, revisited
On reality czars, rebuilding our capacity, and plain language.
Marc Ambinder | Feb 4
Right, so it’s probably time to look at the prospects for a national counter-disinformation strategy. Six months ago, I took some chalk and outlined what such a plan might look like, what it should encompass, and what it should avoid. I’ll reproduce a few paragraphs here:
First, here’s what it must avoid. It should not focus on specific content, or prescribe punitive measures, or threaten regulation, or be put in a position where bureaucrats would have to make judgment calls about the relative harm of a particular speech act. It should not focus on what Stanley Fish has called “the tug-of-war between balance and principle.” A counter-disinformation strategy would not REGULATE lies and disinformation. It would follow the example of Taiwan: complete transparency, “tolerant of differences and dissent, democratic and free.”
The mission statement should instead focus on building a nationwide capacity to counter disinformation by targeting its spread, by providing mechanisms to interrupt the network effects that allow it to zap from platform to platform so rapidly, and by rebuilding a shared sense of truth around a select set of issues deemed critical to democracy and to the smooth functioning of government. I would choose three subjects: the integrity of elections, public health, and national security emergencies. This is the only way to reach into the most efficient vectors for the spread of misinformation.
For pandemics, Ron Klain, Joe Biden’s former chief of staff and Ebola czar, proposed creating a Public Health Emergency Management agency, which would marry logistics (which FEMA does well) with health expertise (which the CDC does well). But what about communication? During the Ebola crisis, the National Security Council coordinated “messaging” among government agencies. But “messaging” is a small part of a counter-disinformation strategy. When Ebola hit in 2013 and 2014, the disinformation architecture that Russia built (and which sophisticated companies, brands, and politicians now emulate) existed only in clapboard form.
So now Ron Klain is once again the chief of staff, and he has surrounded himself with able communicators. The priority is the pandemic. Broad-stroke national strategy documents will come later.
As we move into the liminal phase between despair and hope, there has been some reckoning with the idea again, as well as a healthy dollop of criticism. Efforts to counter disinformation about the 2020 elections were numerous – I was part of one – but we don’t really know if they worked, except to say that one metric we might use – turnout – is best explained by many interconnected variables. And, of course, the moment the very issue of election integrity became sectarian, it seemed not to matter at all what any entity – academic, federal (CISA), or otherwise – did.
In a withering rebuke to techno-pessimists, Matt Welch of Reason offers a persuasive argument as to why “truth” itself should not be, and cannot be, in practical terms, the goal of counter-disinformation campaigns.
There's a reason why U.S. officials can't gin up the courage to call the century-old Turkish genocide of more than 1 million Armenians a "genocide," yet are currently characterizing China's brutal, though non-mass-murderous, suppression of its Uighur minority with a G-word even while several human rights groups do not (see also: "states that sponsor terrorism"). The Food Pyramid and its antecedents have been many things, but revealed truth is not one of them. The Centers for Disease Control, name-checked in Roose's article, changed its recommendations on masks based more on behavioral effects than science. War is a perpetual lie-making machine, and that includes the War on Drugs.
The messy reality of overlapping bureaucracies and their conflicting interests may be one reason why pundit imagineers are tempted by "centralization" and the notion of a "czar." It's the eternal lure of a single magic wand. And about as childish.
But surely there is value, even to the bureaucracy, in providing help to those searching for truth and common ground, even if the truth in any particular situation is recognized as politically contested! This is not an argument against a national strategy; it is an argument in favor of a type of public communication: do not appeal to the public you imagine – appeal to the public as it is.
Do not use gauzy or hazy language to convey a set of facts that are gauzy and hazy; doing this is an immediate cue to your audience that you are lying or shading.
Here’s an example: do the coronavirus vaccines reduce community transmission of Covid-19? Before today’s AstraZeneca news, virtually every infectious disease specialist would say that most of the vaccines approved globally probably do – that is, they make it harder for an individual person to be infected, and for those who are still infected, they probably make those individuals less likely to quickly shed virus.
Doctors say this because many other vaccines – not all, but many – reduce the rate of transmission. Doctors say this because the vaccines’ mechanisms of action reduce the production of coronaviruses that can move out of your nose and mouth.
So as a public health communicator, you can say: “We don’t know, so you should still mask up and stay socially distant.”
Or, you can say, as Johns Hopkins University’s specialists do, “It is likely they reduce the risk of virus transmission but probably not completely in everyone.” So, you should still mask up and stay socially distant.
Every bone in your body might tell you that the first comment would facilitate greater compliance. But there’s just no evidence of that. And it’s not the full truth. You don’t know, but you’re pretty sure… and, in fact, if you’re pretty sure, you can give comfort to some folks with vaccine hesitancy, and if you’re pretty sure, you can communicate correctly about how science works: it is an iterative process, where evidence and assumptions mix to lead to conclusions. If you’re going to be honest, you’re also going to acknowledge that science and social utility are linked, and that social utility is linked to the political imperatives of the moment. Johns Hopkins has it right.
This is an argument against truculence and expert-speak. It is an argument in favor of saying:
We think this is how you might be feeling.
So, this is what we know,
this is what we think,
and this is why we think it,
and this is what we think you should do,
because this is what effects your actions might have.
Interestingly enough, that little construct covers a lot of ground. It acknowledges uncertainty. It sanctions expertise, but it allows everyone to examine the grounding for the expertise. It does not wrap the expert in a shroud of prestige games. It is inclusive. When you’re aiming your attention at a community that has closed its mind, this is the way to open it: merely practice cognitive empathy.
Writing in the Times, Kevin Roose surveyed a number of counter-disinformation specialists and pulled from them a number of provocative ideas. A truth commission. A reality czar. Mandatory algorithmic audits. More attention to the root causes of alienation.
National counter-disinformation strategies are as numerous as state-sponsored disinformation campaigns. There is no paradigm. China heavily censors. Sweden and Denmark developed national action plans, proactively reached out to platforms, and invested heavily in their intelligence services. Virtually all major media companies and political parties bought in to these plans. (It is hard to see the GOP agreeing to anything like that here.) Taiwan and Singapore have advanced digital cultures; the former’s plans are well developed, highly technocratic, and leverage beliefs about national unity. The latter relies on early childhood education as a foundation, and on the coercive power of the state higher up the chain.
What might work in the United States?
Well -- I have no particular objection to policy czars, but I would rather see action taken to redress current deficits in our truth accounts, action that doesn’t require much more than a policy directive and political appointees’ attention spans. We once had a robust local news infrastructure, an army of people whose literal job was to find and report hard truths. We have lost 40,000 truth-tellers to technological change, corporate pressure, changing habits, the death of print – I could go on, but we can also create the conditions for repopulation. I don’t know how this would work, exactly, but if we set as a national goal the training and peopling of local newsrooms – maybe 40,000 new local journalism jobs by 2030 – and we ask people like Jon Ralston and Tina Rosenberg to help figure out how to sustain the enterprises after initial funding, we’ll have made a start. The federal government will have to be involved at some level, and that will create the usual blowback and backlash and elite-resentment arguments. That’s fine because, in the end, I don’t have a big objection to an association between the state and a program that produces people who will hold it to account!
From my earlier post, here are some other proposals:
It would gather the best political and social science about countering misinformation into one place and offer grants for further research.
It would allow the private sector to train thousands of ordinary individuals on open source digital forensics tools, and deploy them, like a special forces detachment, to train election officials, campaign officials, companies, small businesses, and others on what they can do.
It would encourage the platforms to create an open-access repository of accounts and claims that fall into the category of harmful misinformation; it would encourage platforms to share, as quickly as possible, evidence of coordinated inauthentic activity.
It would fund and encourage start-ups that work on flagging and tracking disinformation campaigns. It would provide significant tax incentives for those who start local news companies in news deserts.
It would provide money to state and local officials to boost their communications budgets and develop in-house resources to fight malicious information locally.
It would develop and implement national crisis communication plans and serve as the focal point and coordination center for informatics during future disasters.
What are your ideas? What’s doable? What’s not doable unless conditions are met? What should be the priority? I’d like to hear your thoughts.