Meta AI is consistently unable to generate accurate images for seemingly simple prompts like “Asian man and Caucasian friend” or “Asian man and white wife,” The Verge reports. Instead, the company’s image generator appears biased toward creating images of people of the same race, even when explicitly prompted otherwise.
Engadget confirmed these results in our own testing of Meta’s web-based image generator. Prompts for “an Asian man with a white woman friend” or “an Asian man with a white wife” generated images of Asian couples. When asked for “a diverse group of people,” Meta AI generated a grid of nine white faces and one person of color. On a couple of occasions it produced a single result that reflected the prompt, but in most cases it failed to do so.
As The Verge points out, there are other, more “subtle” signs of bias in Meta AI, like a tendency to make Asian men appear older and Asian women appear younger. The image generator also sometimes added “culturally specific attire” even when that wasn’t part of the prompt.