Friday, May 31, 2019

Want to Know How to Build a Better Democracy? Ask Wikipedia


Pity the poor public-relations specialist hired to influence what is said about his clients on Wikipedia. The sprawling, chaotic storehouse of knowledge is governed by thousands of independent-minded volunteers committed to being neutral and allergic to self-serving manipulators.

The barriers are formidable, but so is the temptation to do some reputational polishing there. What appears on Wikipedia matters. Daily traffic to the English site has barely grown in years, but that is because Wikipedia articles are so reputable that they are baked into the Internet—particularly Google's results pages. A biographical capsule Google publishes on me, for example, has all its facts taken straight from Wikipedia, except for my age, listed as 20, which Google came up with on its own. When YouTube tried to contain proliferating conspiracy theories, it turned to Wikipedia. Of course men landed on the moon; it says so right here on Wikipedia!

Attempts to influence the site are, as the recent college admissions scandal shows, sadly inevitable; there are few areas immune to the power of wealth and status. How long can Wikipedia resist?

Noam Cohen



About

Noam Cohen is a journalist and author of The Know-It-Alls: The Rise of Silicon Valley as a Political Powerhouse and Social Wrecking Ball, which uses the history of computer science and Stanford University to understand the libertarian ideas promoted by tech leaders. While working for The New York Times, Cohen wrote some of the earliest articles about Wikipedia, bitcoin, Wikileaks, and Twitter. He lives with his family in Brooklyn.

Throughout Wikipedia's history, people have tried to nudge its content in their favor. There have been elaborate nonprofessional campaigns to promote nationalistic causes, such as what to call the Sea of Japan/East Sea. Likewise, there have been examples of stealth editing, presumably by the subjects of Wikipedia articles, as well as contributors secretly paid to polish the reputations of certain clients. These actions are considered conflicts of interest, prohibited along with a bunch of other sketchy practices as a threat to Wikipedia's ideal of a neutral point of view.

A recent account in The Huffington Post highlighted a novel approach by one marketing executive hired to influence what appears on Wikipedia: Instead of paid editing, Ed Sussman provides paid advocacy. Sussman, who is CEO of the marketing firm Buzzr.com, represents a range of clients, including the Axios news website, NBC, and the Facebook PR team. For NBC, he has focused on managing controversies, such as the question of whether NBC News handled allegations against Matt Lauer properly. In the case of one Facebook executive, Sussman's goal was to get an article about her published.

For his fee, Sussman does not personally publish or edit the articles his clients care about; he won't do that, he explains, because he has an obvious conflict of interest. As he writes on his Wikipedia user page: "If you ever think any of my work doesn't conform to Wikipedia policy, please let me know and I’ll do my best to fix it!"

Instead, Sussman, who is a lawyer by training, prepares drafts of revised articles, or in the case of the Facebook executive, the entire article, which he posts on the pages used to discuss how to improve Wikipedia. His work is well written and well sourced. He then tries to persuade editors to make those changes themselves. After all, a frequent concern of Wikipedia editors is that articles are too short and too thinly sourced, and Sussman is doing his part to reduce that problem.

Indeed, for many dedicated volunteers, Sussman poses few problems, because he is so transparent about his motives. On reading the HuffPo headline, one Wikipedia administrator, Swarm, wrote that the news seemed "extremely alarming, and I was ready to crucify this guy." Digging deeper, Swarm came to the opposite conclusion: "Most of the supposed 'whitewashing' seems to be mundane matters that don't harm articles at all, if not actual improvements."

The flip side of Sussman's embrace of transparency, however, is that Wikipedia editors have tried, and in at least one case succeeded, in transparently informing readers that an article has been shaped by a paid advocate. The Axios article was edited to mention the news site had hired an advocate to "beef up its Wikipedia page (mostly with benign—if largely flattering—stats about Axios' accomplishments)." Including such a sentence, of course, somewhat defeats the purpose of hiring an advocate; the best lobbyists blend into the background.

When Wikipedia editors complain about Sussman, they in essence say he is behaving like an overly excited, and legally trained, flack. His arguments are long and have oodles of sources. One editor, kashmiri, a non-native English speaker, pleaded for mercy: "May I kindly ask you to be more concise? I agree English is a beautiful language, but requiring other editors to read walls of text from you on every single issue is tad daunting, sorry." While a good advocate tries to make every argument they can think of, in case one of them sticks, among Wikipedians the tactic is sometimes frowned on.

Taking a step back, what could be wrong with making a case for a client with rigor and a broad range of sources, hoping that it gets adopted by the community? It's not the careful attention that is the problem, but that the careful attention only goes to those who can pay. When different standards apply based on status and wealth, in areas as important as education and criminal justice, as well as relatively trivial ones like Wikipedia, poof, there goes the fairness crucial to a functioning democracy.

Wikipedia's approach is collective, not individualistic. To come up with a solution, the community deliberates and seeks a consensus. Those deliberations, ideally, are driven by people far removed from the issues and parties involved. There is a belief in a type of karmic justice for those who try to game the system, which played out in the Axios article. It's called the Streisand effect, so named in the wake of Barbra Streisand's attempt to suppress photos of her Malibu home. Her efforts to deny access to those images only created more interest. Imagine a world where the more you try to manipulate the system, the more you are exposed!

By contrast, we know that large social networks respond to manipulation by those who have power and ignore those who don't. Facebook, for example, fails to hire translators as genocide rages in Myanmar, yet its chief executive personally apologizes in front of Congress when called out by conservatives for determining that the extreme rhetoric from a pair of Trump supporters, Diamond and Silk, was not safe for its community. Likewise, Twitter's decision to allow President Trump to break its community standards for harassment and bullying, because as president what he says is newsworthy, is the ultimate example of a two-tiered system.

Democratic presidential candidate Elizabeth Warren has witnessed firsthand how Facebook bends in the face of a powerful critic—herself. Facebook took down a Warren ad for supposed technical violations and then quickly restored it after an uproar. The experience left a bad taste: "You shouldn’t have to contact Facebook’s publicists in order for them to decide to 'allow robust debate' about Facebook," she wrote on Twitter. "They shouldn't have that much power."

Perhaps the just-the-facts folks at Wikipedia can teach us all something.


Updated 04/16/2019, 8:30 pm EDT: This story was updated to clarify Ed Sussman's work for NBC and to clarify how Wikipedians police edits to entries.




