How sexist is artificial intelligence?

Women are still underrepresented in AI. This has real consequences: technologies developed by male-dominated companies can reflect their developers' biases.

By Katharina Wilhelm, ARD Studio Los Angeles

In 2019, Apple launched its own credit card, using artificial intelligence (AI) to decide who qualifies and on what terms. Soon after, reports mounted that women and men were apparently being rated differently – for example, women with the same financial circumstances were offered lower credit limits than men. Even the wife of Apple co-founder Steve Wozniak was affected, as he reported on Twitter at the time.

Another example: the retail giant Amazon used artificial intelligence in its hiring process. After a while it emerged that women were being screened out more often – the algorithm had developed a preference for male candidates.

AI excludes women

These two examples are not the only cases that raise the question: just how sexist is artificial intelligence – and why? "If you look at the composition of the tech industry, it's mostly white and male. These people make decisions for the rest of humanity," says Mia Shah-Dand. She founded a consulting firm in California that advocates for greater equity in the tech world, particularly in the field of artificial intelligence.

Artificial intelligence and machine learning are used in facial recognition software, for example – and there the software has proven to be racially and gender biased: it was particularly poor at recognizing the faces of Black women, a problem that can be genuinely dangerous in law enforcement. "What does that mean for women? These systems were not created for us. Especially for women who are not white," says Shah-Dand. "Without representation, these systems cannot do justice to these groups."
