Author: Ambassador Bruce Wharton, Acting Under Secretary of State for Public Diplomacy and Public Affairs
Source: Department of State
Date: March 20, 2017
- There has been much discussion in the media, academia, and within the U.S. government about living in a “post-truth” or “post-factual” society and how to operate in it.
- In such a world, the public policy debate is framed largely by what “feels” true and what correlates with people’s pre-existing set of beliefs and prejudices, which can often be disconnected from actual facts and the specifics of policy.
- . . .in our age, social media has exacerbated the problem, accelerating the speed at which false stories spread, creating “digital wildfires” of misinformation. By the time a false story is out there, it is often too late to mount an effective rebuttal based on facts.
- Compounding the problem is the active work of non-state and state actors who aim not only to disseminate misinformation but, most damaging, to erode trust in traditional sources of information.
- . . . I would like to contest the view that we are living in a “post-truth” society — if by that we mean truth and facts no longer matter. Facts do exist. They are out there; we cannot operate without them.
- There are also dangers in accepting a post-truth paradigm. Communicators, experts, and officials may feel overwhelmed and succumb to inaction or, worse, be seduced into adopting “post-truth techniques” that appeal only to emotion, sidelining facts that challenge audiences’ beliefs.
- There is also the temptation to counter the barrage of misinformation by attempting to rebut every false story, but this is a losing proposition. There are too many of them, they spread too quickly, and there are too few of us to chase them.
- The way to counter pseudo-facts and misinformation is to present a compelling narrative of our own, one that is true, defensible, and based on the enduring values and goals that people share, not the least of which is strengthening our collective security and prosperity. To gain credibility and make our narrative relevant, we must also listen to and acknowledge our audiences’ underlying fears, grievances, and beliefs.
- But it is not just a matter of telling a good story; the narrative must be tied to action.
- In short, we’ve got to “walk the talk,” or risk losing credibility. This is not to say countering disinformation is easy. It requires strategic thought, creative tactics, and sustained investment.
Case study – State’s approach to fighting extremist ideology
- One approach aimed at mass appeal was the $15 million “Shared Values” campaign featuring Muslims living happily in the United States. As well intended as this was, the messaging did not acknowledge underlying grievances and was not considered effective in reaching young Muslim audiences overseas.
- Another idea you may remember from just a few years ago was the “Welcome to ISIS Land” video, which went viral for all the wrong reasons.
- . . . the former Center for Strategic Counterterrorism Communications (CSCC), which was established in 2010 to counter extremist ideology, could point to the size of its Facebook and Twitter followings — and the death threats and attempts to shut down its accounts were evidence that the center had gotten under ISIS’s skin — but it could not measure effectiveness.
- The CSCC was also under-resourced. Its budget hovered in the range of $5–6 million per year, while the Pentagon was spending about $150 million on similar efforts and the CIA even more.
- This experience provided us with a wealth of valuable lessons for charting a new way forward in countering false narratives, including:
Not imitating the enemy,
Having a credible message based on facts and evidence that acknowledge underlying grievances,
Partnering with credible, independent, trusted messengers,
Using technology to identify the right audiences and the best approaches for reaching them,
Employing analytics to evaluate effectiveness and feeding that information back into the process, and
Securing political and bureaucratic support, including sufficient funding and personnel.
- On the technology front, I am particularly enthusiastic about the potential to use tools such as social graph analysis (SGA) to help us identify credible individuals who drive and shape online opinion within each country. Network analysis can provide information in two critical areas: 1) topics important to people in target audiences and 2) the most uniquely influential people within those topical clusters.
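To make the idea concrete, influence within a topical cluster can be estimated with standard network-analysis tools. The sketch below uses the open-source NetworkX library and PageRank as one common proxy for “uniquely influential” accounts; the graph, account names, and choice of metric are illustrative assumptions, not a description of the State Department’s actual SGA tooling.

```python
import networkx as nx

# Hypothetical engagement graph: an edge points from a user to an
# account they follow, mention, or share. All names are invented.
G = nx.DiGraph()
G.add_edges_from([
    ("user_a", "blogger_1"), ("user_b", "blogger_1"),
    ("user_c", "blogger_1"), ("user_c", "blogger_2"),
    ("user_d", "blogger_2"), ("user_e", "journalist_1"),
    ("user_f", "journalist_1"), ("user_a", "journalist_1"),
])

# PageRank rewards accounts that attract engagement, especially from
# accounts that are themselves well connected -- a rough stand-in for
# "who drives and shapes opinion" in this cluster.
scores = nx.pagerank(G)
top_influencers = sorted(scores, key=scores.get, reverse=True)[:3]
print(top_influencers)
```

On this toy graph, the accounts with the most inbound engagement surface at the top; on real data, the same machinery applied within each topical community would yield the per-cluster influencer lists the passage describes.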
A new approach
- The beneficiary of these lessons is the State Department’s new Global Engagement Center (GEC), which is legislatively given the task “to lead, synchronize, and coordinate efforts of the Federal Government to recognize, understand, expose, and counter foreign state and non-state propaganda and disinformation efforts aimed at undermining United States national security interests.”
- In terms of resources, the GEC is funded at approximately $16 million for FY-17 and is slated to have an additional $19.8 million in supplemental funding in FY-18. Further, Congress has authorized – although not mandated – the Department of Defense to transfer up to $60 million a year, in both FY-17 and FY-18, to support GEC activities.
- For instance, the GEC’s “Defectors” campaign used content from 14 Coalition countries that highlighted the lived experiences of ISIS defectors and the effects of their recruitment on their families. In just one week, the campaign reached 2.4 million people who watched over one million minutes of video. Ultimately, the Defectors campaign reached seven million individuals and garnered 780,000 “click-throughs” from people identified as being at risk for recruitment by violent extremists. Despite the impressive numbers, the cost of this data-driven campaign was only $15,000.
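The cost-effectiveness claim can be checked with back-of-envelope arithmetic on the figures cited above; this is simply a restatement of those numbers, not additional campaign data.

```python
# Unit economics of the Defectors campaign, per the figures above.
total_cost = 15_000          # USD
people_reached = 7_000_000
click_throughs = 780_000

cost_per_person = total_cost / people_reached   # ~ $0.002 per person
cost_per_click = total_cost / click_throughs    # ~ $0.02 per click-through

print(f"${cost_per_person:.4f} per person reached")
print(f"${cost_per_click:.3f} per click-through")
```

At roughly a fifth of a cent per person reached, the campaign's cost per contact is far below what traditional broadcast outreach typically achieves, which is the point the passage is making.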
- . . . we must remember that many other PD tools play a vital role in sharing the truth, such as educational and cultural exchanges, youth initiatives, and English teaching programs.
- Training and education programs that both cultivate a questioning mindset and build the skills of information consumers to separate the wheat from the chaff are vital. One way we are doing this is through TechCamps focused on disinformation.
- To be truly effective, however, we must start at a younger age. A recent study by Stanford showed that students at most grade levels cannot tell the difference between fake and real news as they often lack the critical thinking skills needed to separate truth from misinformation.
Donald M. Bishop is the Bren Chair of Strategic Communications at Marine Corps University in Quantico, Virginia.
Mr. Bishop served as a Foreign Service Officer – first in the U.S. Information Agency and then in the Department of State – for 31 years. Specializing in Public Diplomacy, political-military affairs, and East Asia, he attained the rank of Minister-Counselor in the career service. He was President of the Public Diplomacy Council from 2013 to 2015 and is now a member of the Board of Directors.
Author: Donald M. Bishop