Bias in Data Science? How?

Feb 24, 2021

By Yetunde Obasade

There are many interesting things about data science, but recently I came across the topic of racial and gender bias in machine learning. How does this happen? One example is algorithms that screen candidate resumes for hiring. These models are typically trained on the choices hiring staff have already made, and from there they fine-tune their selections. Such models were found to select names that sounded “Black” or ethnic far less often than traditional Eurocentric names, even when those candidates had similar or better credentials.

This bias carries over into facial recognition software. Such software, particularly older systems, does not recognize non-white faces very well. The same is true for identifying female faces, especially those of women of color. In one widely cited audit, an AI algorithm from Amazon failed to recognize photos of Oprah Winfrey, Serena Williams, and Michelle Obama as women (we all know what they have in common). Joy Buolamwini, a researcher well versed in data, among many other things, founded an organization to fight bias in machine learning: the Algorithmic Justice League.

So, how can we fix some of the bias in artificial intelligence? We can start by diversifying the data used to train these algorithms; data that is not inclusive is often the reason these programs perform poorly for some groups. We can also diversify the room, bringing people from different backgrounds to the table. Decisions about AI should be made by more than just one demographic, and deliberate efforts to make these rooms inclusive are needed to improve outcomes.
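One concrete (if simplified) first step is to audit how well each group is represented in a training set before fitting a model. Here is a minimal sketch; the group labels and counts are invented for illustration.

```python
# Hypothetical sketch: audit the demographic balance of a training set.
# The group names and counts below are made up for illustration only.
from collections import Counter

def representation_report(groups):
    """Return each group's share of the dataset as a fraction."""
    counts = Counter(groups)
    total = sum(counts.values())
    return {group: count / total for group, count in counts.items()}

# Toy example: a resume dataset heavily skewed toward one demographic.
training_groups = ["group_a"] * 800 + ["group_b"] * 150 + ["group_c"] * 50

report = representation_report(training_groups)
for group, share in sorted(report.items()):
    print(f"{group}: {share:.0%}")
# group_a: 80%
# group_b: 15%
# group_c: 5%
```

A skew like this does not prove a model will be biased, but it is an early warning sign that some groups may be poorly served.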

Ultimately, bias cannot be completely eliminated. Even when sensitive features and factors are removed from a program, bias still creeps in through variables that correlate with them. It will take much fine-tuning and continued data analysis to find ways of reducing, and eventually eliminating, bias in the artificial intelligence of the future.
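What might that continued analysis look like? One common check is to compare how often a model selects candidates from different groups. The sketch below uses invented decision data and the “four-fifths rule” from US employment guidance as a rough threshold; it is an illustration, not a complete fairness test.

```python
# Hypothetical sketch: compare selection rates across two groups.
# The decision lists (1 = selected, 0 = rejected) are invented data.
def selection_rate(decisions):
    """Fraction of candidates the model selected."""
    return sum(decisions) / len(decisions)

group_a = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]  # 70% selected
group_b = [1, 0, 0, 0, 1, 0, 0, 0, 1, 0]  # 30% selected

ratio = selection_rate(group_b) / selection_rate(group_a)
print(f"Selection-rate ratio: {ratio:.2f}")
# Selection-rate ratio: 0.43
# A ratio below 0.8 (the "four-fifths rule") flags possible disparate impact.
```

Running checks like this regularly, on real outcomes, is part of the fine-tuning the paragraph above describes.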

For more information on this topic, check out the links below.

References

McMahan, S. (2020, July). Racism and Gender Bias in Machine Learning, Fixes Not Always Easy. Data Science Examples.

Buolamwini, J. (2019, February 7). Artificial Intelligence Has a Problem With Gender and Racial Bias. Here’s How to Solve It. TIME. Retrieved from https://time.com/5520558/artificial-intelligence-racial-gender-bias/

Racial bias and gender bias examples in AI systems. (2018, September 2). Medium. Retrieved from https://medium.com/thoughts-and-reflections/racial-bias-and-gender-bias-examples-in-ai-systems-7211e4c166a1
