
Amazon just showed us that 'unbiased' algorithms can be inadvertently racist

A Bloomberg report Thursday revealed that Amazon's same-day delivery service offered to Prime users around major US cities seems to routinely, if unintentionally, exclude black neighborhoods.


The maps, which you should check out on Bloomberg's site, show that in cities like Chicago, New York, and Atlanta, same-day delivery covers just about every zip code at this point — except the majority black ones.

[Photo: Chicago skyline at night. Chicago was one of the cities highlighted in Bloomberg's report. Kiichiro Sato/AP]

But here's the thing: Amazon's defense is entirely plausible. Director of PR Scott Stanzel wrote in an email to Tech Insider:

There are a number of factors that go into determining where we can deliver same-day. Those include distance to the nearest fulfillment center, local demand in an area, numbers of Prime members in an area, as well as the ability of our various carrier partners to deliver up to 9:00 pm every single day, even Sunday.

In other words: The data did it.


Amazon, in rolling out same-day delivery zip code by zip code, appears to have resorted to the data-and-algorithm-driven thinking that prevails in the tech world. The idea is that if you can feed enough discrete facts into a decision-making process, whether technological or corporate, you can not only make the most profit-maximizing decisions but erase the evils of human bias and malice.

But this thinking ignores the biases involved in building any data-driven analysis, the biases in deciding which data gets included in it, and the biases inherent to a world scarred by centuries of ongoing racism and other discrimination. The algorithms don't self-assemble. People make them.
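To make that concrete, here is a deliberately simplified, hypothetical sketch. The zip codes, numbers, weights, and field names below are invented for illustration, not Amazon's actual logic; the point is only that a rollout model built entirely on "neutral" operational inputs can still reproduce historical segregation, because those inputs themselves reflect it.

```python
# Hypothetical illustration: a "neutral" rollout score that never looks at race,
# but ranks zip codes on inputs shaped by decades of segregation and redlining.

from dataclasses import dataclass

@dataclass
class ZipCode:
    code: str
    miles_to_warehouse: float    # warehouses were sited near existing affluent demand
    prime_members_per_1k: float  # membership tracks income and past marketing reach
    orders_per_1k: float         # demand reflects who was served well in the past

def rollout_score(z: ZipCode) -> float:
    """Higher score = earlier same-day rollout. No demographic field is used."""
    return (z.prime_members_per_1k * 0.5
            + z.orders_per_1k * 0.4
            - z.miles_to_warehouse * 2.0)

# Invented example data: the "south_side" row lags on every proxy because the
# proxies encode historical underinvestment, not because of any explicit rule.
candidates = [
    ZipCode("north_side", miles_to_warehouse=6,  prime_members_per_1k=180, orders_per_1k=240),
    ZipCode("loop",       miles_to_warehouse=4,  prime_members_per_1k=210, orders_per_1k=260),
    ZipCode("south_side", miles_to_warehouse=14, prime_members_per_1k=70,  orders_per_1k=90),
]

for z in sorted(candidates, key=rollout_score, reverse=True):
    print(z.code, round(rollout_score(z), 1))
# The model "fairly" ranks south_side last — the bias arrived with the inputs.
```

The problem isn't that any single weight is wrong; it's that choosing these proxies at all is a human decision with foreseeable consequences.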

This has happened to other major tech companies too.

Google's image-recognition software was found to tag black people with a racial slur. Searches for typically black names have turned up ads for services that let you look up arrest records. Search images for the identity "Latina" (without SafeSearch) and you'll encounter a whole lot of porn. Facebook, Google, and Amazon don't intend racism or malice, but relying too heavily on data and algorithms can produce racist and malicious results.


That's because not thinking about racism and other issues of bias is not the same thing as solving them. Racism doesn't always mean being cartoonishly villainous toward people with a particular skin tone. And not being racist doesn't mean being colorblind.

Racism, as researchers have documented in countless studies and reports, is the systemic marginalization of minority communities. It's the aggregate of small and large effects that make it harder for people to find housing, accumulate wealth, avoid the criminal justice system, and succeed in school if they aren't white. Data and algorithms selected in deliberate ignorance of racism can pick up and reinforce that systemic problem.

In this instance, it appears that Amazon's rollout reflects ongoing economic disparities and segregation between white and black communities created by decades of redlining. In cities where black people are in the majority and Amazon offers same-day delivery just about everywhere, like Los Angeles, Amazon happily offers services to more black customers than white. But in cities like Boston where the rollout has been slower, white neighborhoods get served first.

Of course, denial of a premium delivery service by an online retail behemoth isn't exactly the greatest harm ever done by race-blind decision making. But it's a bad look.


So what can companies do to avoid problems like this? Having a diverse staff with the background to notice, "Hey, maybe leaving out the Bronx, South Side, and Roxbury isn't the best branding idea," seems like a good start.

As Facebook's global director of diversity told Business Insider in January:

When we get a new class of hires, I say to them, "I don't want you to come in here and think that you need to use 'blind' as a suffix. That you need to describe people as 'just my colleagues' or say things like, 'I don't see race. I don't see gender. I'm colorblind. Sexual-orientation blind.' In doing so you're neutralizing a part of a person that is an asset. I want you to see those characteristics and see them as adding value."

Data and algorithms serve us in plenty of important ways. But the people who champion them should recognize that just because the numbers suggest something doesn't mean it's a good idea, especially when they picked out the numbers themselves.
