Google Allegedly Used Homeless People to Train the Pixel Phone


Managers reportedly encouraged contractors to mischaracterize the data collection as a “selfie game,” akin to Snapchat filters such as Face Swap. College students who agreed to the scans later told the Daily News that they didn’t recall ever hearing the name Google and were simply told to play with the phone in exchange for a gift card. To entice homeless users in L.A. to consent, contractors were allegedly instructed to mention a California law that allows the gift cards to be exchanged for cash. The whole episode is, in a bleak way, an apparent attempt to diversify AI training data while paying people for their information. But the result is completely dystopian.

According to The New York Times, Google temporarily suspended the data collection, pending an internal investigation. In an emailed statement to The Atlantic, a Google spokeswoman said, “We’re taking these claims seriously and investigating them. The allegations regarding truthfulness and consent are in violation of our requirements for volunteer research studies and the training that we provided.”

It’s baffling that this purported scheme, which the Daily News’s reporting suggests commodified black and homeless Americans, was intended to reduce racial bias. But as the Harvard technologist Shoshana Zuboff has argued, people have always been the “raw materials” for Big Tech. Products such as the Pixel and the iPhone, and services such as Google and Facebook, collect our data as we use them; companies refine that data, and, with each new generation, sell us more advanced products that collect more useful data. In this framework, our habits, our choices, our likes, and our dislikes are not unlike soybeans or petroleum or iron ore—natural resources that are extracted and processed by huge firms, for massive profit.


Sometimes this looks like a smart thermostat getting better at predicting how cool you like your home, and sometimes it looks like a $1 trillion company allegedly offering $5 gift cards to homeless black people to better sell a $1,200 phone.

As the techlash continues, some lawmakers are seeking to empower their constituents to demand that companies such as Google pay users for their data. California and Alaska have debated legislation to charge companies for using people’s personal data. Andrew Yang, the 2020 Democratic presidential candidate, has advocated treating data as a “property right.” The Facebook co-founder Chris Hughes suggests a “data dividend,” a revenue tax on companies monetizing enormous amounts of public data, paid out to users across the country, like universal basic income.

But following that line of thinking makes it clear that we still have no ethical or economic framework for valuing data collected from people across different social contexts. Should tech companies pay more for dark-skinned subjects because they’re underrepresented in training data? If our bodies are commodities, what’s a fair price, and who should set it? The data-ownership idea is, fundamentally, limited: Even if we manage, with the help of Hughes or Yang or state legislatures, to negotiate a high price for our data, we’re still for sale.

In a backward way, movements to pay users for the data that tech companies take from them only legitimize the process by which Silicon Valley turns our faces into commodities. Imagine an unregulated, race-to-the-bottom market in which companies target the most vulnerable people for their data, restrained only by an alarmingly low bar for consent in the name of improving their products. It would look a lot like paying homeless people $5 for a face scan.




