Report: Google Contractors Used Shady Methods to Scan Dark-Skinned People’s Faces for New Pixel 4 Feature

Google’s marketing video for the Pixel 4’s face unlock given this news is, uh, not great.

With the Pixel 4, Google will reportedly be introducing a face unlock feature akin to Apple’s Face ID for the iPhone. A few months ago, it was reported that Google was actually hitting the streets for “field research,” paying volunteers in $5 Amazon or Starbucks gift cards to scan their faces to improve the upcoming feature. Now, a Daily News investigation has found that Google contractors targeted people with ‘darker skin’—including the homeless and college students—all while playing fast and loose with consent.

For context, building facial recognition tech requires massive databases to sample from. In the past, facial recognition software hasn’t been the best at being inclusive, struggling in particular with darker skin tones and with gender. In an email to Gizmodo, Google acknowledged that addressing this gap was its intent.

“For recent studies involving the collection of face samples for machine learning training, there are two goals. First, we want to build fairness into Pixel 4’s face unlock feature,” a Google spokesperson told us. “It’s critical we have a diverse sample, which is an important part of building an inclusive product. And second, security. Face unlock will be a powerful new security measure, and we want to make sure it protects a wide range of people as possible.”

Sure, sure. But how Google’s contractors went about it is troubling. The Daily News cited several sources who said that, to get the diverse data, they were instructed to target homeless populations in Atlanta, BET Awards attendees in Los Angeles, and college students across the U.S.

The contractors were employed through Randstad, a third-party staffing company. According to the Daily News, Randstad supervisors told contractors to go after people of color and mislead volunteers as to what exactly was happening. Sources say they were instructed to tell suspicious volunteers things like “Just play with the phone for a couple minutes and get a gift card,” or outright say volunteers were not being recorded. One source said they were instructed to say “oh not really” if a person asked if the software was taking a video. Other sources said they were told to distract volunteers from asking questions and to rush through the entire process to keep subjects from properly reading the accompanying consent form.

Another disturbing detail: contractors in California were reportedly told they could entice financially strapped people to participate by mentioning a state law that lets you trade gift cards worth under $10 for cash. That would obviously incentivize contractors to prey on vulnerable populations, like the homeless or unsuspecting students. The Daily News also reported finding two contractors going undercover as students at California State University, Long Beach earlier in August, offering $5 gift cards for face scans. While some students were aware of Google’s involvement, many were not.

A follow-up Daily News report also includes a photo of homeless people in Atlanta lining up to partake in Google’s field research. The photo was reportedly taken by a city employee, who then learned the project was Google offering $5 incentives in exchange for 3D face scans.

It would be one thing if Google had its contractors explicitly state what was going on and inform participants of Google’s data retention policy. A photo obtained by the News shows Google could keep the 3D face scans for as long as five years. (Google told The Verge in July the scans would be held for only 18 months.) If you told potential volunteers this, fewer people might be inclined to participate—but at least that would be an informed choice, and everything would be above board. Certainly, that’s how Google presented its field research project to The Verge.

However, the reality seems far removed from a straightforward process where participants were well-informed about what they were agreeing to. Targeting darker-skinned people without stating why, and then taking advantage of vulnerable populations? That’s skeezy no matter how you look at it. On top of that, purposefully rushing through consent forms and misleading subjects as to whether they were being recorded is plainly unethical.

As for which party—Google or Randstad—is ultimately responsible? The News report says that Google managers were regularly looped in on conference calls and issued the mandate to seek “darker skin tones.” However, it’s not clear if they were aware of the directives to target the homeless. As for Randstad, many sources cited by the News claim Randstad managers pressured them to target cash-strapped people, rush through consent forms, and mislead participants as to the purpose of the scan, saying it was a new “mini game” or “app” in need of testing.

“We’re taking these claims seriously and investigating them,” Google told Gizmodo. “The allegations regarding truthfulness and consent are in violation of our requirements for volunteer research studies and the training that we provided.”

Gizmodo reached out to Randstad but did not immediately receive a response. However, the firm did give a statement to the Daily News that said when it became aware of concerns, “the data collection project was suspended for several weeks. During that time, a number of steps were taken to ensure all policies and procedures were understood by those working on the project and enforced by the team leading it. Team members involved in the project went through a retraining program.”