Is your job search
gender-biased?


When thinking about the gender wealth gap and how to address it, one key consideration for the future stands out: the role of AI in perpetuating unconscious bias.

A recent study by the University of Melbourne, commissioned by UniBank, has found a pattern of gender bias forming within recruitment algorithms. With AI gradually reaching every touch point of our lives, this research comes at a crucial time, while these systems and technologies are still being developed. Considering that most job recruitment sites already use algorithms to shortlist top candidates, you may be wondering how this could affect your next job search.

To test for gender bias in recruitment, the study assembled a recruitment panel to hire for three roles: a Data Analyst, a Financial Officer and a Recruitment Officer, with the Recruitment Officer being the only position drawn from a female-dominated industry. For all three roles, the panel agreed that suitability would be assessed on relevant experience, keyword matches and education. Yet when reviewing real-world resumes, the panel consistently ranked men's resumes above women's in both the male-dominated and the gender-balanced industries. The criteria set for selecting the best candidates were inconsistent with the final candidates chosen, particularly for the Financial Officer role, suggesting the results were influenced by unconscious bias.

The study also attempted 'gender-flipping' by swapping the names on resumes. The results show that the recruitment panel only preferred the CVs with women's names "when their resumes had men's experience."

Now, here is where the process becomes relevant for every working woman. When the human recruitment process is replicated in algorithms, the bias is carried into the data systems that learn from it. So, if we don't address gender disparity in the technology and the data used to train it, we risk entrenching gender bias for generations to come.
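To make this concrete, here is a minimal sketch of how a model trained on biased hiring records can reproduce that bias. Everything here is invented for illustration: the feature names, the tiny synthetic dataset, and the deliberately naive scoring rule. The point is that a gender-correlated proxy feature, not merit, ends up driving the ranking.

```python
# Illustrative sketch with synthetic data: a naive scoring "model" trained on
# biased historical hiring decisions learns to reward a gender-correlated
# proxy feature. All feature names and records are hypothetical.

from collections import defaultdict

# Synthetic history of (candidate features, hired?). "mens_club" stands in for
# any attribute that correlates with gender but not with job performance.
history = [
    ({"degree": 1, "experience": 1, "mens_club": 1}, 1),
    ({"degree": 1, "experience": 1, "mens_club": 1}, 1),
    ({"degree": 1, "experience": 1, "mens_club": 0}, 0),
    ({"degree": 1, "experience": 0, "mens_club": 1}, 1),
    ({"degree": 0, "experience": 1, "mens_club": 0}, 0),
    ({"degree": 1, "experience": 1, "mens_club": 0}, 1),
]

# "Training": weight each feature by how often it co-occurs with a hire.
weights = defaultdict(float)
for features, hired in history:
    for name, present in features.items():
        if present:
            weights[name] += 1 if hired else -1

def score(candidate):
    """Rank a candidate by summing the learned feature weights."""
    return sum(weights[k] for k, v in candidate.items() if v)

# Two resumes identical on merit; only the gender-correlated proxy differs.
resume_a = {"degree": 1, "experience": 1, "mens_club": 1}
resume_b = {"degree": 1, "experience": 1, "mens_club": 0}

# The proxy alone changes the ranking: bias in, bias out.
print(score(resume_a), score(resume_b))  # → 7.0 4.0
```

Nothing in the scoring rule mentions gender, yet the model penalises the second resume purely because past hires skewed toward the proxy. This is the same mechanism, in miniature, behind the biased datasets the study describes.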

The study also refers to an incident at Amazon, uncovered by Reuters in 2018. The company had operated a machine-learning recruitment tool for a number of years before discovering a flaw in it: because of male dominance within the tech industry, the Amazon system "taught itself that men were preferable." The University of Melbourne study attributes this case to being "a consequence of the biased datasets that mirror the existing gender inequality in the workplace."

Gender bias occurs "when specific traits tied to broader expectations of gender are applied to individuals," regardless of whether or not a person actually has those traits. Bias across industries, in the human hiring process and in the individuals doing the recruiting, is an issue that will only be minimised over time and with changing social dynamics. But if we don't acknowledge the flaws in our thinking before they are replicated into AI, we risk perpetuating these issues for the next generations. If more equitable systems can be built to level the playing field, we should make sure they are delivered.

In a recent and controversial case, Timnit Gebru, the co-lead of Google's ethical AI team, was fired after she attempted to publish a research paper. The paper raised key concerns about the development of AI language models, including their absorbing discrimination via language, their missing nuance in language, and their being trained only on what's available on the internet, leaving those without internet access out of the equation. The findings cut against one of Google's largest revenue-earning areas and would require big shifts in its operations to begin solving some of these issues.

The situations at Google and Amazon are prime examples of why these considerations need to be factored in, even when doing so is uncomfortable. Our futures and careers will rely heavily on AI, so let's not create more problems from the outset!


Important: This content has been prepared without taking account of the objectives, financial situation or needs of any particular individual. It does not constitute formal advice. Consider the appropriateness of the information in regard to your circumstances.

