Editorials

Self Driving Cars Programmed With Racial Bias


11 March 2019

By Bronwen

There is a huge diversity problem in the technology sector, and it doesn't just affect revenue and creativity – it creates flaws in products that could have been avoided had more women and people from minority ethnic groups been involved in the design process.

What do the statistics say?

Let’s start with some statistics. In 2016, Google’s diversity report showed that 56% of its overall workforce was white and only 3% of its new hires were black. The proportion of men to women was also unsurprising, with males making up 69% of the workforce.

Graph showing race and gender proportions at Google in 2016.

It’s not just Google whose figures look like this. In 2017, women made up just 23% of employees in tech roles at Apple and 54% of employees were white.

The whiteness and maleness of these sorts of companies can lead to flawed products, such as voice recognition software that has trouble understanding female voices and a soap dispenser that fails to detect dark skin tones.

Systems with in-built prejudice?

Now, as companies are beginning to seriously develop self-driving cars, research is finding that these cars are better at detecting light-skinned pedestrians. If left unfixed, this could lead to countless deaths and injuries of black and other dark-skinned pedestrians once these cars become mainstream.

A study from the Georgia Institute of Technology found that the object detection systems used by automated vehicles were more likely to fail to spot somebody with a darker skin tone, and could therefore drive into them. According to their findings, these systems were around 5% less accurate at detecting dark skin tones than light skin tones. However, the systems tested were those created by academics rather than those of big companies such as Google.
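To make the 5% figure concrete, here is a minimal sketch of how a detection-rate gap between skin-tone groups is measured. The counts below are made up for illustration and are not the study's data; only the size of the gap loosely mirrors the reported finding.

```python
# Sketch: measuring a detection-accuracy disparity between two groups.
# All counts are hypothetical, chosen only to illustrate a ~5-point gap.

def detection_rate(detected, total):
    """Fraction of pedestrians the detector successfully spotted."""
    return detected / total

light_rate = detection_rate(95, 100)   # detector spots 95 of 100 light-skinned pedestrians
dark_rate = detection_rate(90, 100)    # detector spots 90 of 100 dark-skinned pedestrians
gap = light_rate - dark_rate           # disparity of 5 percentage points

print(f"light: {light_rate:.0%}, dark: {dark_rate:.0%}, gap: {gap:.0%}")
```

A gap like this matters because it compounds over millions of road encounters: a system that is "95% accurate overall" can still be systematically less safe for one group of pedestrians.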

The authors conclude the study with a warning message for tech companies who do not consider skin tones when programming recognition technology:

“We hope this study provides compelling evidence of the real problem that may arise if this source of capture bias is not considered before deploying these sort of recognition models.”

Is the technology sector’s inequality having a much wider impact? Let us know what you think below.
