Amazon built an artificial intelligence for hiring that discriminated against women

In 2014, Amazon decided to place a large part of its personnel selection and hiring processes in the hands of artificial intelligence developed in-house, aiming to streamline talent acquisition and find the ideal candidate for each position. The philosophy behind the project was simple: apply the same automation that had taken the company to the top of global e-commerce to the selection of its own employees. However, the company discovered that this artificial intelligence systematically discriminated against women.

Mimicking the product rating system on its website, Amazon's artificial intelligence scored candidates from one to five stars.

“It was like the Holy Grail; everyone wanted it,” one Amazon employee told Reuters. “They wanted it to be a system where you feed in 100 résumés and it picks out the best five, who would be the ones hired,” he says. However, a year after launching the project, its developers discovered that the program systematically discriminated against women and preferred to hire men, especially for technical and software development positions.

The reason behind this discrimination appeared to be that the artificial intelligence had been trained on the profiles of people who had applied to the company over the previous 10 years, a pool that reflected male dominance in the sector.

Amazon’s artificial intelligence learned that men were better candidates than women and penalized résumés containing words such as “women” or “women’s club captain.” It also penalized candidates who had attended all-women’s colleges.
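To see how a model can pick up this kind of bias without anyone programming it in, consider a minimal sketch. The data, token choices, and scoring function below are all hypothetical and greatly simplified, not Amazon's actual system: a naive frequency-based scorer trained on a historically skewed set of hiring decisions ends up assigning a rock-bottom weight to a token like "women's" simply because past résumés containing it were rejected.

```python
# Hypothetical sketch: how skewed training history encodes bias.
# None of this data or logic comes from Amazon's real system.
from collections import defaultdict

# Toy hiring history: (resume tokens, hired?) -- skewed, as the
# article describes, toward a male-dominated applicant pool.
history = [
    (["software", "engineer", "chess", "captain"], True),
    (["software", "developer", "chess"], True),
    (["engineer", "robotics"], True),
    (["software", "engineer", "women's", "chess", "captain"], False),
    (["developer", "women's", "coding", "club"], False),
    (["software", "robotics", "captain"], True),
]

def token_scores(data):
    """Per-token hire rate: the 'weight' a naive frequency-based
    model would learn for each word from past decisions."""
    hired = defaultdict(int)
    seen = defaultdict(int)
    for tokens, was_hired in data:
        for t in set(tokens):
            seen[t] += 1
            hired[t] += int(was_hired)
    return {t: hired[t] / seen[t] for t in seen}

scores = token_scores(history)
# Every résumé in this toy history containing "women's" was rejected,
# so its learned score is 0.0: the model has encoded the historical
# bias of the decisions, not anything about candidate quality.
```

Removing the offending token from the vocabulary, as Amazon's engineers did, does not fix the underlying problem: any other token correlated with gender in the history (here, for instance, "club") inherits the same skewed score.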

When the engineers realized this trend, they reconfigured the system's logic so that it ignored those terms. However, there was no guarantee that the algorithm would not find other ways to discriminate against candidates based on gender. The team in charge of the project was eventually disbanded, although recruiters continued to use the tool, only as a support and weighing other considerations beyond the artificial intelligence's ranking.
