Patriarchal AI: How ChatGPT can harm a woman’s career

All AI chatbots have inbuilt biases from their training data. And that can perpetuate patriarchy, according to this post from the London School of Economics:

Take an example of something an HR manager might use ChatGPT for in their everyday work. I asked ChatGPT to write a performance review for John, an accounts manager and then subsequently for Jane, an accounts manager. The only change in my question was ‘John’ to ‘Jane’. No other details were specified.

Yet the output given by ChatGPT couldn’t have been more different.

There’s a lot more here – and this is an important read.

Leave a comment