Meta’s AI image generator struggles to create images of couples of different races


Meta AI is consistently unable to generate accurate images for seemingly simple prompts like “Asian man and Caucasian friend” or “Asian man and white wife,” The Verge reports. Instead, the company’s image generator seems to be biased toward creating images of people of the same race, even when explicitly prompted otherwise.

Engadget confirmed these results in our own testing of Meta’s image generator. Prompts for “an Asian man with a white woman friend” or “an Asian man with a white wife” generated images of Asian couples. When asked for “a diverse group of people,” Meta AI generated a grid of nine white faces and one person of color. There were a couple of occasions when it created a single result that reflected the prompt, but in most cases it failed to depict the prompt accurately.

As The Verge points out, there are other more “subtle” signs of bias in Meta AI, like a tendency to make Asian men appear older while making Asian women appear younger. The image generator also sometimes added “culturally specific attire,” even when that wasn’t part of the prompt.

It’s not clear why Meta AI is struggling with these types of prompts, though it’s not the first generative AI platform to come under scrutiny for its depiction of race. Google paused its Gemini image generator’s ability to create images of people after it overcorrected for diversity in response to prompts about historical figures. Google said that its internal safeguards failed to account for situations when diverse results were inappropriate.

Meta didn’t immediately respond to a request for comment. The company has previously described Meta AI as being in “beta” and thus prone to making mistakes. Meta AI has also struggled to accurately answer questions about current events and public figures.
