If Google goes to China, will it tell the truth about Tiananmen Square? | Opinion


Google’s plan to relaunch search in China, the world’s largest market, is facing pushback from employees, human rights defenders and politicians. With good reason. The Chinese government will insist that the search engine suppress results related to the Tiananmen democracy protests of 1989, in which several hundred peaceful protesters were shot by the army.

But international norms oblige companies to treat human rights atrocities such as the Tiananmen Square massacre differently. Suppressing information about these atrocities undermines the individual and collective right to truth that is increasingly recognized in human rights law.

Google and other internet companies, particularly those that have positioned themselves as defenders of privacy and freedom of expression, need to bear witness to these atrocities, not succumb to state pressure to bury them.

In the past, religious and political partisans burned opponents’ books to suppress unwanted truths and spread falsehoods. Today’s extremists and conspiracy-mongers don’t need a match to destroy the truth; they prefer to bury it with search engine optimization. Some have mastered elevating their views to the top of Google searches, moving more credible sources down the page. It’s information warfare, but with newer weapons.

When pressured, Google can and does respond to criticism related to its human rights impacts. Two years ago, Carole Cadwalladr wrote about how Google enabled the spread of misinformation that maligned and harmed women, black people, Muslims and Jews. She cited several search results, one of which concerned the Holocaust. When she typed “did the hol” into the search box, Google offered to complete the inquiry as “did the Holocaust happen” – and the top result was a white nationalist site that denied the documented historical truth of the Holocaust.


Though it publicly distanced itself from the hateful content, Google first responded that its search algorithms were objective and aimed to “provide the most relevant and useful results for our users”. But the company, along with others in the ecosystem, also subsequently changed the script so that reliable sources moved to the top.

Big tech companies can no longer avoid these controversies, as the recent Alex Jones case demonstrates. Jones shamelessly promotes false and harmful conspiracy theories, contending, for example, that the Sandy Hook elementary school massacre was a liberal hoax staged by actors – instigating sustained harassment of the parents of the murdered children. Against the loud outcry of his “alt-right” supporters, Facebook, YouTube, and others (notably belatedly, and perhaps only partially, Twitter) banned him from their sites.

These controversies often emanate from a profound difference between how online platforms traditionally define their role and how the public actually perceives them. For most software engineers, the prevailing ethos is to create “neutral” platforms that enable a vigorous exchange of disparate views. Like other big platforms, Google is loath to decide whose views should prevail on its sites, an approach that resonates with America’s commitment to free speech. The public, on the other hand, treats Google search results as curated sources of authority or grants of legitimacy – much as it would materials in an archive or library. On occasion, Google has even described itself in those terms – as a repository of the world’s knowledge, in the tradition of the ancient library at Alexandria.


Visitors gather at a display booth for Google at a 2016 conference in Beijing. Photograph: Andy Wong/AP

Surely, Google and the giant social networks are right to avoid the role of referee in most disputes over what is true and good. In cases of genuine views held by real people – unlike Russian hackers – who is to say they shouldn’t be heard, even if noxious to most? With their enormous power over what is seen and heard, it’s easy to envision a slippery slope toward the kind of dangerous scenarios imagined by conspiracy theorists. No company or set of companies should be able to exercise that kind of power in a democracy.

But search engines – particularly Google’s – already play an extraordinarily important role in shaping our social understanding of what is worth knowing. Google acknowledges this, in part through its membership in the Global Network Initiative, a multi-stakeholder organization dedicated to protecting and advancing freedom of expression and privacy for internet users. And modern democratic societies recognize that a shared social understanding and memory of human rights atrocities is a special case. International truth commissions and courts, starting at Nuremberg, have firmly established the facts of many human rights atrocities. Archives, such as the Muro de la Memoria (Wall of Memory) of the desaparecidos (the disappeared) in Chile, demonstrate the importance of sustaining a collective memory of well-documented events. This “right to truth” is increasingly recognized in human rights law as both an individual and a collective right. Allowing sites that deny human rights atrocities to “outrank” those that accurately document them imperils that right to truth and memory.

Fortunately, search engines and social networks needn’t be the arbiters of the truth of human rights claims. In deciding how to respond to malicious propaganda or state pressure to censor, they can rely on international guidelines. These guidelines, especially the UN’s Guiding Principles on Business and Human Rights, can bolster their motivation and ability to act – including in China. They can also rely on third parties, such as truth commissions and expert public investigative bodies, to act as the arbiters. The Inter-American Commission on Human Rights and similar institutions have helped develop jurisprudence on the right to truth for individual victims – and for society as a whole. Google has no obligation to search for truth, but it does have a responsibility to faithfully surface the literal fact of a human rights atrocity, such as the Holocaust or the massacre at Tiananmen Square. To do so, search engines need to create a new algorithmic script. Call it a “bearing witness” script.

A bearing witness script is wholly consistent with engineers’ commitments and capabilities. Relying on factual accounts of human rights atrocities produced by expert public bodies is well aligned with democratic values and avoids the slipperiness of in-house determinations. It is, in fact, an embodiment of the new motto of Alphabet, Google’s parent company: do the right thing.

Deirdre K Mulligan is an associate professor at UC Berkeley’s School of Information, a faculty director of the Berkeley Center for Law and Technology and a founding member of the Global Network Initiative. Daniel S Griffin is a PhD student at UC Berkeley’s School of Information.


