To be fair, I used a Chinese AI picture generator app with my face and it made me look more Asian. It’s obvious that every piece of software carries the biases of the people who made and trained it. It’s not good, but it’s expected, and it’s happening everywhere.
Ok, but she asked it to make her look professional and the only thing it changed was her race. Not the background, not her clothes. Last I checked, a university sweatshirt wasn’t exactly professional wear.
Machine learning is biased towards its training data. If the image generation algorithm (notice I’m not saying AI) is trained on photos of “professionals” who are mostly of a certain demographic, that’s what it will prefer when it’s generating an image.
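To make the mechanism concrete, here’s a toy Python sketch (the label distribution below is invented for illustration, not taken from any real dataset): a generator with no debiasing step just samples from the empirical distribution of its training data, so whatever demographic dominates the “professional” photos dominates the output.

```python
import random
from collections import Counter

# Hypothetical demographic labels attached to images tagged "professional"
# in an imagined training corpus -- heavily skewed on purpose.
training_labels = ["white"] * 80 + ["asian"] * 10 + ["black"] * 6 + ["latino"] * 4

def sample_generated_demographics(labels, n=1000):
    # With no debiasing, generation simply mirrors the training distribution.
    return Counter(random.choices(labels, k=n))

print(sample_generated_demographics(training_labels))
# Roughly 80% of outputs come back "white": the model "prefers" whatever
# dominated the photos it learned from.
```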
So these shocking exposés should really just read: “this image generator was trained on biased data.” But building biases is part of the human condition, so we’re never really going to get away from that.