Machine Learning

  • Sexism in Machine Learning

    Could our technology be partly to blame for the sexism in the workforce?

    Gender inequalities in the workplace are nothing new. On average, women are paid less and make fewer advances in their careers. The New York Times reported in 2015 that more men named John ran S&P 1500 firms than women did. Sexism and racism are rife throughout tech companies. The pay gap is closing, but at the current rate women will not be paid equally until the year 2119. And as of late, what goes on in these companies is not staying in these companies: anyone using the technology can be affected by it, not just those who make it.

    The study “Man is to Computer Programmer as Woman is to Homemaker” found that machine learning can amplify “female/male stereotypes to a disturbing extent”. Google’s natural language processing model was trained on Google News articles, and researchers found that the resulting neural network, called word2vec, had absorbed the same sexist and racist biases as our very own news.

    To understand how this can happen, we need to delve into how word2vec actually works.

    The letters A, P, P, L and E form the word apple, but on their own they tell us little about what an apple is, or whether we are talking about the company or the fruit. Because words are simply combinations of characters that hold very little information when looked at individually, processing them is difficult. Word2vec was born to address this, and with it a computer’s ability to make sense of human language. It works by compressing each word into a numeric representation, a vector, that captures the information you would need to recreate how that word is used. The program was then trained to comb through Google News articles in order to learn the relationships between words.
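
    To make that concrete, here is a minimal sketch of what those numeric representations look like in practice. It assumes the open-source gensim library and a pretrained word2vec file; the file name below is illustrative, based on the vectors Google released from its Google News training run, and this is not the exact code used inside Google.

    ```python
    # A minimal sketch, assuming the gensim library and a pretrained word2vec
    # file (the file name is illustrative).
    from gensim.models import KeyedVectors

    # Load the pretrained vectors: every word in the vocabulary maps to a
    # dense numeric vector, here 300 numbers long.
    vectors = KeyedVectors.load_word2vec_format(
        "GoogleNews-vectors-negative300.bin", binary=True
    )

    print(vectors["apple"].shape)                  # (300,) -- the word as numbers
    print(vectors.most_similar("apple", topn=3))   # words used in similar contexts
    ```

    Words that appear in similar contexts end up close together in this vector space, which is what lets the model reason about relationships between them.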

    “King is to man as queen is to woman” and “Paris is to France as Tokyo is to Japan” are examples of the analogies the system could come up with. But it didn’t stop there. One would have hoped it would exhibit few gender stereotypes, as the articles the neural network was learning from were written by professional journalists. That was not the case. The neural network simply reflected the Google News data it was built from, returning relationships such as “Man is to woman as computer programmer is to homemaker,” or “Man is to architect as woman is to interior designer.”
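
    The analogies above come from simple arithmetic on those vectors. A hedged sketch, reusing the gensim setup from the previous example:

    ```python
    # Analogy arithmetic: "king" minus "man" plus "woman" lands near "queen"
    # in the vector space.
    print(vectors.most_similar(positive=["king", "woman"], negative=["man"], topn=1))

    # The same arithmetic applied to occupations is what surfaced the biased
    # analogies; the "computer_programmer" token follows the underscore style
    # of the Google News vocabulary and is an assumption here.
    print(vectors.most_similar(
        positive=["computer_programmer", "woman"], negative=["man"], topn=3
    ))
    ```

    Run against vectors trained on news text, the top answers to the occupation query tend to echo the stereotyped pairings reported in the study.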

    The effects of this can be huge.

    More and more studies are showing that this affects job searches for women, serving them lower-paid options. Another shocking report claimed that a computer program used by US courts for risk assessment was biased against black prisoners.


    These flaws may seem small and easy to mitigate, but they paint a picture of what our tech industry looks like: an industry that is often out of touch with the people using its products. Without fundamental change it will stay this way. Although a lot of progress has been made, we still have far to go. Only when the hostility to diversity is removed can we start to build incredible technology that works for everyone.


    At Quidtree we pride ourselves on being diverse. Everyone is treated equally.
