Meta’s AI Image Generator Will Not Depict an Asian Man With a White Woman

Mike Powers


Meta released its own AI-powered image generator back in December called Imagine. Like other image generators of its kind, Imagine seems to have some very strange racial hangups. Namely, the application can’t seem to envision a world where Asian men are capable of being with white women.

Let me explain: The Verge was first to notice that Meta's image generator did not seem to be able to generate images of white women and Asian men together. Journalist Mia Sato describes her futile attempts to generate such interracial images. Despite trying "dozens of times to create an image using" a variety of prompts, Sato notes that she was only successful once.

We did our own test and had even less luck. In my experiments with the image generator, it was basically impossible to get the application to create images of a white woman and an Asian man together. I tried every combination of prompts I could think of and was only ever successful in creating images of an Asian man and an Asian woman together. Other interracial couples could easily be generated, including a Black man and a white woman, a white man and an Asian woman, and a white woman and a Middle Eastern man, but when it came to Asian men, they were apparently shit out of luck.

[Image: Screenshot: Meta]

While the image generator failed to generate images of white women and Asian men together, it did manage to respond to other, much weirder prompts, such as when I asked it to show me a dog who is friends with a robot.

[Image: Screenshot: Meta]

Or a fish who is addicted to gambling:

[Image: Screenshot: Meta]

This follows on the heels of a similar, arguably worse scandal involving Google, in which the company's Gemini image generator struggled to generate pictures of white people at all. The application also produced a multitude of historically inaccurate images, including Black Vikings and Black Nazis. Google ultimately had to pause the feature and promised to bring it back after the bugs had been worked out.

Right-wing social media influencers have predictably jumped all over this trend to bolster some larger, catastrophizing commentary about the excesses of liberal ideology. I won’t go in for that but, honestly, what gives? How is this a thing? Why are these companies so bad at creating platforms that deal with race? Is this some terrible marketing stunt? Are we all being punked? What is happening?

Gizmodo reached out to Meta to get more information about why its app is screwed up and will update this story if it responds.


