My Facebook op-ed

Aftenposten asked me to adapt my Medium post about the Facebook napalm photo incident as an op-ed. Here it is in Norwegian. Here is the English text:

Facebook needs an editor — to stop Facebook from editing.

An editor might save Facebook from making embarrassing and offensive judgments about what will offend, such as its decision last week requiring writer Tom Egeland, Aftenposten editor Espen Egil Hansen, and then Norwegian Prime Minister Erna Solberg to take down a photo of great journalistic meaning and historic importance: Nick Ut’s image of Vietnamese girl Kim Phúc running from a 1972 napalm attack after tearing off her burning clothes. Only after Hansen wrote an eloquent and forceful front-page letter to Facebook founder Mark Zuckerberg did the service relent.

Facebook’s reflexive decision to take down the photo is a perfect example of what I would call algorithmic thinking, the mindset that dominates the kingdom that software built, Silicon Valley. Facebook’s technologists, from the top down, want to formulate rules and then let algorithms enforce those rules. That’s not only efficient (who can afford the staff to make these decisions when more than a billion people post every day?) but also, they believe, fair: the same rules, equally enforced for all. As they like to say in Silicon Valley, it scales.

The rule that informed the algorithm in this case was clear: If a photo portrays a child (check) who is naked (check), then the photo is rejected. The motive behind that rule could not be more virtuous: eliminating the distribution of child pornography. But in this case, of course, the photo of the naked girl was not child pornography. No, the pornography here is war, which is what Ut’s photo so profoundly portrays.
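
To make the bluntness of such a rule concrete, here is a minimal, purely hypothetical sketch; the function and fields are my invention for illustration, not Facebook’s actual code:

```python
# Hypothetical illustration of the context-free rule described above.
# None of these names come from Facebook; they exist only to show how
# a rule applied identically to every photo has no notion of context.

def violates_nudity_rule(photo: dict) -> bool:
    """Reject any photo depicting a naked child -- nothing else is checked."""
    return photo["depicts_child"] and photo["depicts_nudity"]

# Nick Ut's 1972 photo, as the rule "sees" it:
napalm_girl = {
    "depicts_child": True,
    "depicts_nudity": True,
    "historic_significance": True,  # real, but invisible to the rule
}

print(violates_nudity_rule(napalm_girl))  # True: rejected, context unseen
```

By its own terms the rule answers correctly; the failure is that its terms leave out everything an editor would weigh.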

Technology scales but life does not, and that is a problem Facebook, of all companies, should recognize, for Facebook is the post-mass company. Mass media treat everyone the same because that’s what Gutenberg’s invention demands; the technology of printing scales by forcing media to publish the exact same product for thousands unto millions of readers. Facebook, on the other hand, does not treat us all alike. Like Google, it is a personal services company that gives every user a unique service, no two pages ever the same. The problem with algorithmic thinking, paradoxically, is that it continues the mass mindset, treating everyone who posts and everything they post exactly the same, under a rule meant to govern every circumstance.

The solution to Facebook’s dilemma is to insert human judgment into its processes. Hansen is right that editors cannot live with Zuckerberg and company as master editor. Facebook would be wise to recognize this. It should treat editors of respected, quality news organizations differently and give them the license to make decisions. Facebook might want to consider giving editors an allocation of attention they can use to better inform their users. It should allow an editor of Hansen’s stature to violate a rule for a reason. I am not arguing for a class system, treating editors better than the masses. I am arguing only that recognizing signals of trust, authority, credibility, and quality will improve Facebook’s recommendations and service.

When there is disagreement, and there will be, Facebook needs a process in place — a person: an editor — who can negotiate on the company’s behalf. The outsider needn’t always win; this is still Facebook’s service, brand, and company, and in the end it has the right to decide what it distributes just as much as Hansen has the right to decide what appears in these pages. That is not censorship; it is editing. But the outsider should at least be heard: in short, respected.

If Facebook hired an editor, would that not be the definitive proof that Facebook is what my colleagues in media insist it is: media? We in media tend to look at the world, Godlike, in our own image. We see something that has text and images (we insist on calling that content) with advertising (we call that our revenue) and we say it is media, under the egocentric belief that everyone wants to be like us.

Mark Zuckerberg dissents. He says Facebook is not media. I agree with him. Facebook is something else, something new: a platform to connect people, anyone to anyone, so they may do what they want. The text and images we see on Facebook’s pages (though, of course, it’s really just one endless page) are not content. They are conversation. They are sharing. Content as media people think of it is allowed in, but only as a tool, a token people use in their conversations. Media are guests there.

Every time we in media insist on squeezing Facebook into our institutional pigeonhole, we miss the trees for the forest: We don’t see that Facebook is a place for people — people we need to develop relationships with and learn to serve in new ways. That, I argue, is what will save journalism and media from extinction: getting to know the needs of people as individuals and members of communities and serving them with greater relevance and value as a result. Facebook could help us learn that.

An editor inside Facebook could explain Facebook’s worldview to journalists and explain journalism’s ethics, standards, and principles to Facebook’s engineers. For its part, Facebook still refuses to fully recognize the role it plays in helping to inform society and the responsibility — like it or not — that now rests on its shoulders. What are the principles under which Facebook operates? It is up to Mark Zuckerberg to decide those principles, but an editor — and an advisory board of editors — could help inform his thinking. Does Facebook want to play its role in helping to better inform the public or just let the chips fall where they may (a question journalists also need to grapple with as we decide whether we measure our worth by our audience or by our impact)? Does Facebook want to enable smart people — not just editors but authors and prime ministers and citizens — to use its platform to make brave statements about justice? Does Facebook want to have a culture in which intelligence — human intelligence — wins over algorithms? I think it does.

So Facebook should build procedures and hire people who can help make that possible. An editor inside Facebook could sit at the table with the technologists, product people, and PR people to set policies that will benefit the users and the company. An editor could help inform its products so that Facebook does a better job of enlightening its users, even warning them when they are about to share the latest rumor or meme that journalists’ fact-checking has already proven false. An editor inside Facebook could help journalism survive by informing the news industry’s strategy, teaching us how we must go to our readers rather than continuing to make our readers come to us.

But an editor inside Facebook should not hire journalists, create content, or build a newsroom. That would be a conflict of interest, not to mention a bad business decision. No, an editor inside Facebook would merely help make a better, smarter Facebook for us all.

Who should do that job? Based on his wise letter to Mark Zuckerberg, I nominate Mr. Hansen.
