Gmail Accused of Racial Profiling

It all started last year, when tech journalist Nathan Newman conducted an experiment with Gmail.

Newman's experiment, while not exactly scientific in its methodology, yielded some interesting results: the email service served markedly different targeted ads depending on whether a name sounded ethnic. For instance, when writing about buying a car, Jake Yoder received several car dealer ads; Malik Hakim received those same ads along with one for "Bad Credit Auto Loans." For education-related emails, accounts with ethnic-sounding names were less likely to be shown ads for colleges and more likely to be shown ads for technical or vocational schools.

This year, the U.K.'s Telegraph decided to repeat the experiment.

The results were stark and similar to those of the original experiment. For example, an email from "Robert Howe" saying "Need Cash" drew ads for business foreign-exchange services, while the same email from "Segun Akinkube" drew ads for payday loans. There was no overlap between the two: Segun and Robert were served completely different ads even though every other factor was the same. Why? That's a question for Google to answer.

Google disputed Newman's findings last year, saying he used flawed methodology. "We do not select ads based on sensitive information, including ethnic inferences from names," a spokesperson said.

However, Google does use some information from your email to target its ads, so where does that stop? Its algorithm is still a mystery, but it can be surprisingly intuitive: according to the Telegraph, it once served a bride-to-be ads for maternity bridal gowns.

Newman's research, for its part, did take geolocation into account. The Gmail algorithm proved so sensitive that, depending on where he was in New York City, the ads ranged from inheritance loans to payday loans.

So is Google's algorithm smart enough to have learned discrimination? We would like to think not, but perhaps it has simply become more human after all.