Our problem isn’t ‘fake news.’ Our problems are trust and manipulation.

“Propaganda is the executive arm of the invisible government.”
— Edward Bernays, Propaganda (1928)

“Fake news” is merely a symptom of greater social ills. Our real problems: trust and manipulation. Our untrusted — and untrustworthy — institutions are vulnerable to manipulation by a slew of bad guys, from trolls and ideologues to Russians and terrorists, all operating with varying motives but similar methods.

Trust is the longer-term problem — decades- or even a century-long. But if we don’t grapple with the immediate and urgent problem of manipulation, those institutions may not live to reinvent themselves and earn the public’s trust back with greater inclusion, equity, transparency, responsiveness, and honesty. At the News Integrity Initiative, we will begin to address both needs.

Here I want to examine the emergency of manipulation with a series of suggestions about the defenses needed by many sectors — not just news and media but also platforms, technology companies, brands, marketing, government, politics, education. These include:

  • Awareness. As Storyful asks, “Who’s your 4chan correspondent?” If we do not understand how we are being manipulated, we become the manipulators’ agents.
  • Starving the manipulators of money and, more importantly, of attention, exposing their methods without giving unwarranted attention to their messages.
  • Learning from the success of the manipulators’ methods and co-opting them to bring facts, journalism, and truth to the public conversation where it occurs.
  • Bringing greater transparency and accountability to our institutions. In journalism’s case, this means showing our work, recognizing the danger of speed (a game the manipulators will always win), and learning to listen to the public to reflect and serve communities’ distinct needs. In the platforms’ case, it means accounting for quality in algorithmic decisions and helping users better judge the sources of information. In the ad industry’s case, it means bringing tools to bear so we can hold brands, agencies, and networks responsible for what they choose to support.

I will explore these suggestions in greater detail after first examining the mechanisms and motives of manipulation. I claim no expertise in this; I’m just sharing my learning as it occurs.

I became convinced that manipulation is the immediate emergency thanks to danah boyd of Data & Society, which recently issued an excellent report by Alice Marwick and Rebecca Lewis, “Media Manipulation and Disinformation Online.” They catalog who is trying to manipulate media — trolls, ideologues, hate groups, conspiracy theorists, gamergaters; where and how they do it — via blogs, sites, message boards, and social media; and why they do it — for money, for hate, for power, to cause polarization, to disrupt institutions, or for the lulz.

In discussing manipulation, it is important to also examine Russian means and methods. For that, I recommend two provocative reports: the NATO Defense College’s “Handbook of Russian Information Warfare” by Keir Giles and the RAND Corporation’s “The Russian ‘Firehose of Falsehood’ Propaganda Model” by Christopher Paul and Miriam Matthews.

Says danah: “Our media, our tools, and our politics are being leveraged to help breed polarization by countless actors who can leverage these systems for personal, economic, and ideological gain. Sometimes, it’s for the lulz. Sometimes, the goals are much more disturbing.”

In short: We are being used.

Russian manipulation

I found the NATO manual particularly worrying, for in examining what Russia has done to manipulate information in Ukraine and elsewhere, we see the script for much of what is happening now in the United States. I’m not suggesting Russia is behind this all but instead that all the manipulators learn from each other while we in media do not.

NATO emphasizes that Russia does not think of this as cyber warfare but instead as information warfare. “Russia refers to ‘information space,’” RAND says, “and includes in this space both computer and human information processing, in effect the cognitive domain.” Human psychology, that is. Thus, Russia’s weapons work neatly not only online but also in mainstream media, enabling it to “steal, plant, interdict, manipulate, distort or destroy information.”

“Information,” says the author of a Russian paper NATO cites, “has become the same kind of weapon as a missile, a bomb, and so on [but it] allows you to use a very small amount of matter or energy to begin, monitor, and control processes whose matter and energy parameters are many orders of magnitude larger.”

Russia has weaponized a new chain reaction in social media. To what end?

“The main aim of information-psychological conflict is regime change,” says another Russian paper, “by means of influence on the mass consciousness of the population — directing people so that the population of the victim country is induced to support the aggressor, acting against its own interests.”

Sound familiar? Sound chilling? See, our problem is not just some crappy content containing lies and stupidity. The problem is a powerful strategy to manipulate you.

Our institutions help them. “Russia seeks to influence foreign decision-making by supplying polluted information,” NATO says, “exploiting the fact that Western elected representatives receive and are sensitive to the same information flows as their voters.” That is, when they play along, the journalism, the open internet, and the free speech we cherish are used against us. “Even responsible media reporting can inadvertently lend authority to false Russian arguments.” Therein lies the most insidious danger that danah boyd warns against: playing into their hands by giving them attention and calling it news.

Their goal is polarization — inside a nation and among its allies — and getting a country to eat its own institutions. Their tactics, in the words of former NATO press officer Ben Nimmo, aim to “‘dismiss, distract, dismay’ and can be achieved by exploiting vulnerabilities in the target society, particularly freedom of expression and democratic principles.” They use “‘mass information armies’ conducting direct dialogue with people on the internet” and describe information weapons as “more dangerous than nuclear ones.” Or as the Russian authors of a paper NATO cites say: “The mass media today can stir up chaos and confusion in government and military management of any country and instill ideas of violence, treachery, and immorality and demoralize the public.”

Feel demoralized these days? Then it’s working.

The Russian paper on information-psychological warfare that NATO quotes lists Russia’s key tactics, which — like their goals and outcomes — will sound eerily familiar:

The primary methods of manipulating information used by the mass media in the interests of information-psychological confrontation objectives are:

  • Direct lies for the purpose of disinformation…;
  • Concealing critically important information;
  • Burying valuable information in a mass of information dross…;
  • Terminological substitution: use of concepts and terms whose meaning is unclear or has undergone qualitative change, which makes it harder to form a true picture of events; [see “fake news”]
  • Introducing taboos on specific forms of information or categories of news…; [see “political correctness”]
  • Providing negative information, which is more readily accepted by the audience than positive.

More tactics: Trolls and bots are used to create a sense of public opinion so it is picked up by media. Journalists are harassed and intimidated, also by trolls and bots. They exploit volume: “When information volume is low,” says RAND, “recipients tend to favor experts, but when information volume is high, recipients tend to favor information from other users.” And they exploit speed: “Russian propaganda has the agility to be first,” RAND observes. “It takes less time to make up facts than it does to verify them.” And the first impression sets the agenda.

At the highest level, they attack truth. “Multiple untruths, not necessarily consistent, are in part designed to undermine trust in the existence of objective truth, whether from media or from official sources,” says NATO. “This contributes to eroding the comparative advantages of liberal democratic societies when seeking to counter disinformation.” [My emphasis]

What do we in journalism do in response? We fact-check. We debunk. We cover them. But that’s precisely what they want us to do, for then we give them attention. Says former U.S. Ambassador to Ukraine Geoffrey Pyatt: “You could spend every hour of every day trying to bat down every lie, to the point where you don’t achieve anything else. And that’s exactly what the Kremlin wants.”

Western manipulation

Again, I am not saying that Russia is behind all media manipulation. Far from it. But as Hillary Clinton suggested, the Macedonian fake-news factory that went after her learned its tricks somewhere. Trolls and manipulators learn from each other, and so we must learn about them ourselves.

In their Data & Society report, Marwick and Lewis do considerable forensic research into the dissemination of pro-Trump populist messages, which spread (1) “through memes shared on blogs and Facebook, through Twitter bots, through YouTube channels;” sometimes passing even through (2) Trump’s own Twitter account; until they are then (3) “propagated by a far-right hyper-partisan press rooted in conspiracy theories and disinformation” (read: Breitbart et al.); until (4) “they influenced the agenda of mainstream news sources.” From 4chan and 8chan to Alex Jones to Breitbart to Trump to Fox to CNN to you.

Just as “fake news” is a sloppy label, so is “alt-right.” Marwick and Lewis dissect that worm into its constituent segments: “an amalgam of conspiracy theorists, techno-libertarians, white nationalists, Men’s Rights advocates, trolls, anti-feminists, anti-immigration activists, and bored young people.” They “leverage both the techniques of participatory culture and the affordances of social media to spread their various beliefs” and “target vulnerabilities in the news media ecosystem to increase the visibility of and audience for their messages.” What ties them together is some measure of belief — anti-establishment, anti-multiculturalism, anti-globalism, anti-feminism, anti-Semitic, anti-political correctness, and nationalist and racist ideologies. But what mostly links them is their techniques. As trolls, they aim for reaction for reaction’s sake. They mock the “type of tragedy-of-the-week moral panic perpetrated by talk shows and cable news,” as observed by net scholar Whitney Phillips. And they exploit Poe’s Law, playing “with ambiguity in such a way that the audience is never quite sure whether or not they are serious.”

They hack social media, media, and ultimately attention and democracy.

And therein lies the paradoxical vice in which we find ourselves: When we address, check, and attack them, we feed them with attention. Hillary Clinton learned the hard way that “by addressing [fringe] ideas, she also gave them new visibility and legitimacy.” She “inadvertently cemented their importance.” Say Marwick and Lewis: “By getting the media to cover certain stories, even by debunking them, media manipulators are able to influence the public agenda.”

And it is only going to get worse. At a World Economic Forum discussion on the topic that I facilitated in San Francisco, I heard a few frightening predictions: First, the bad guys’ next targets will be “pillars of society” — doctors, pastors, auditors, judges. Second, communities will devolve into “belief tribes” where anyone who disagrees with an orthodoxy of opinion will be branded a shill. Third, augmented reality will make it easier to fake not just text and photo but also audio and video and thus identity. And fourth, what I am coming to fear greatly: a coming Luddite rebellion against technology will separate us into “connected and disconnected tribes.”

At another WEF discussion, I heard from executives of NGOs, governments, banks, consumer brands, pharma, accounting, and media that they are beginning to recognize the emergency we face. Good.

So WTF do we do?

We in media and other institutions must develop new strategies that account for the very new tactics undertaken by our new enemies. We must go far beyond where we are now.

Today, some are tackling falsehoods by fact-checking. Some want to enhance the public’s critical thinking through so-called news literacy. Some are compiling lists and signals of vice (NII is collaborating with Storyful and Moat in one such effort) and of virtue in sources. Google is seeking to account for the reliability, authority, and quality of sources in its ranking. Facebook is killing the fake accounts used to mimic public conversation. (If only Twitter would get aggressive against malevolent bots and fakes.) I’ve seen no end of (quixotic, I believe) efforts to rank sites for quality.
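
To make the lists-and-signals idea concrete, here is a minimal sketch of how source-level signals might be combined into a quality score. The signal names and weights are entirely my invention for illustration — nothing here describes how Storyful, Moat, Google, or anyone else actually scores sources:

```python
# Hypothetical sketch: combining source-level signals into a quality score.
# Signal names and weights are invented for illustration only.

SIGNAL_WEIGHTS = {
    "has_masthead_and_corrections_policy": 2.0,   # transparency signal (virtue)
    "cited_by_established_outlets": 1.5,          # reputation signal (virtue)
    "domain_age_years": 0.1,                      # small bonus per year of history
    "flagged_by_fact_checkers": -3.0,             # vice signal
    "runs_undisclosed_sponsored_content": -1.5,   # vice signal
}

def quality_score(signals: dict) -> float:
    """Weighted sum of whatever signals we have observed for a source."""
    return sum(SIGNAL_WEIGHTS[name] * value
               for name, value in signals.items()
               if name in SIGNAL_WEIGHTS)

# Example: a 12-year-old site with a corrections policy, some citations,
# and one fact-checker flag scores 2.0 + 1.5 + 1.2 - 3.0 ≈ 1.7.
print(quality_score({
    "has_masthead_and_corrections_policy": 1,
    "cited_by_established_outlets": 1,
    "domain_age_years": 12,
    "flagged_by_fact_checkers": 1,
}))
```

Even a toy like this shows why I call such rankings quixotic: every weight is a contestable editorial judgment, and that ambiguity is exactly what manipulators exploit.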

These are fine as far as they go, but rather than attacking the facts, sources, and accounts — merely tactics — we need to go after the real symptom (manipulation) and the real ill (trust). “The first step is to recognize that this is a nontrivial challenge,” RAND understates. Some of the suggestions I’m thinking about:

Build awareness: News media must recognize how and when they are the objects of manipulation. I so like Storyful’s idea of hiring a 4chan correspondent that we at NII are thinking of underwriting just such a journalist to help news organizations understand what is happening to them, giving them advance notice of the campaigns against them.

I also want to see all the experts I quote above — and others on my growing list — school media and other sectors in how they are being manipulated and what they could do in defense. Without that, they become trolls’ toys and Putin’s playthings.

Share intelligence: Besides 4chan correspondents, major newsrooms should have threat officers whose job it is to recognize manipulation before it affects news coverage and veracity. These threat officers should be communicating with their counterparts in newsrooms elsewhere. At the WEF meetings, I was struck by how major brands staff war rooms to deal with disinformation attacks yet don’t share information among themselves. So let’s establish networks of security executives in and across sectors to share intelligence, threat assessments, warnings, best practices, and lessons. To borrow the framing of NII supporter Craig Newmark, these could be NATOs for journalism, media, brands, and so on, established to both inform and protect each other. NII would be eager to help such efforts.
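
As a thought experiment, here is what a minimal shared threat report might contain, sketched in Python. Every field and value is my own invention — this is not an existing standard or an NII product:

```python
# Hypothetical sketch of the record a newsroom threat officer might share
# with counterparts in an intelligence-sharing network.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ThreatReport:
    campaign_id: str       # shared identifier so newsrooms can correlate sightings
    origin: str            # where the campaign was first spotted
    method: str            # the manipulation technique (report methods, not messages)
    indicators: list[str] = field(default_factory=list)  # hashtags, domains, bot patterns
    first_seen: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

# Example report passed along the network (all values are placeholders):
report = ThreatReport(
    campaign_id="hypothetical-campaign-001",
    origin="message boards",
    method="bot-amplified hashtag aimed at baiting mainstream coverage",
    indicators=["#hypothetical_tag", "example-dross-site.com"],
)
```

The design point is the same one NATO makes about Russia’s information armies: the attackers already coordinate; the defenders mostly don’t.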

Starve them: There are lots of efforts underway — some linked above — to starve manipulators of their economic support through advertising, helping ad networks, agencies, and brands avoid putting their money behind the bad guys (and, I hope, choosing instead to support quality in media). We also need to put the so-called recommendation engines (Revcontent, Adblade, Newsmax, as well as Taboola and Outbrain) on the spot for supporting and profiting from fake news — likewise for the publishers that distribute their dross. Even this takes us only so far.
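
Mechanically, the advertising side of this can start as simply as screening each prospective placement against shared vice and virtue lists before any money flows. A toy sketch, with placeholder domains that assess no real site:

```python
# Toy sketch: screening ad placements against shared vice/virtue lists.
# The domains below are placeholders, not assessments of any real site.
KNOWN_BAD = {"example-fake-news-farm.com", "example-dross.net"}
KNOWN_QUALITY = {"example-quality-daily.com"}

def placement_decision(domain: str) -> str:
    if domain in KNOWN_BAD:
        return "block"    # starve manipulators of ad revenue
    if domain in KNOWN_QUALITY:
        return "prefer"   # steer budgets toward quality media instead
    return "review"       # unknown sources go to a human or to further signals

for d in ("example-fake-news-farm.com", "example-quality-daily.com", "unknown-blog.org"):
    print(d, "->", placement_decision(d))
```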

The tougher challenge — especially for news organizations — is starving the manipulators of what they crave and feed upon: attention. I can hear journalists object that they have to cover what people are talking about. But what if people aren’t talking about it; bots are? What if the only reason people end up talking is because a polluted wellspring of manipulation rose from a few fanatics on 4chan to Infowars to Breitbart to Fox to MSM and — thanks to the work of a 4chan correspondent — you now know that? I can also hear journalists argue that everything I’ve presented here makes manipulators a story worth covering and telling the public about. Yes, but only to an extent. I’ll argue that journalism should cover the manipulators’ methods but not their messages.

Get ahead of them: RAND warns that there is no hope in answering the manipulators, so it is wiser to compete with them. “Don’t direct your flow of information directly back at the firehose of falsehood; instead, point your stream at whatever the firehose is aimed at, and try to push that audience in more productive directions…. Increase the flow of persuasive information and start to compete.” In other words, if you know — thanks to your intelligence network and 4chan correspondent — that the bad guys are going to go after, say, vaccines, then get there first and set the public agenda with journalism and facts about it. Warn the public about how someone will try to manipulate them. Don’t react later. Inform first.

Learn from them: We in media continue to insist that the world has not changed around us, that citizens must come to us and our fonts of information to be informed. No! We must change how we operate, taking journalism to the public where and when their conversation occurs. We should learn from the bad guys’ lessons in spreading disinformation so we can use their techniques to spread information. We also should not assume that all our tried-and-true tools — articles, explainers, fact-checking — can counteract manipulators’ propaganda. We must experiment and learn what does and does not persuade people to favor facts and rationality.

Rebuild yourself and your trust: Finally, we move from the symptoms to the ailments. Our institutions are not trusted, for many reasons, and we must address those reasons. Media — not just our legacy institutions but also the larger media ecosystem — must become more equitable, inclusive, reflective of, and accountable to many communities. We must become more transparent. We must learn to listen first before creating the product we call content. Brands, government, and politics must also learn to listen first. These are longer-term goals.
