Have you ever seen an Asian person with a white person, whether that’s a mixed-race couple or two friends of different races? Seems pretty common to me — I have lots of white friends!

To Meta’s AI-powered image generator, apparently this is impossible to imagine. I tried dozens of times to create an image using prompts like “Asian man and Caucasian friend,” “Asian man and white wife,” and “Asian woman and Caucasian husband.” Only once was Meta’s image generator able to return an accurate image featuring the races I specified.

Images: Mia Sato/The Verge

Tweaking the text-based prompt didn’t seem to help. When I asked for an “Asian man and white woman smiling with a dog,” Meta’s image generator on Instagram gave me three back-to-back pictures of two Asian people. When I changed “white” to “Caucasian,” it did the same. “Asian man and Caucasian woman on wedding day” gave me an Asian man in a suit and an Asian woman in a traditional-looking garment… which, upon closer inspection, appears to be a mix of a qipao and a kimono. Multiculturalism is amazing.

Qipao or kimono? Up to you!
Image: Mia Sato/The Verge

The image generator also didn’t like it when I asked for representations of platonic relationships, like “Asian man with Caucasian friend” and “Asian woman and white friend.” Each time, it returned images of two Asian people. When I asked for a picture of an “Asian woman with Black friend,” the AI-generated image showed two Asian women. Tweaking the prompt to “Asian woman with African American friend” yielded more accurate results.

Interestingly, the tool performed slightly better when I specified South Asian people. It successfully created an image using the prompt “South Asian man with Caucasian wife” — before immediately creating an image of two South Asian people using the same prompt. The system also leaned heavily into stereotypes, adding elements resembling a bindi and sari to the South Asian women it created without my asking.

Images: Mia Sato/The Verge

The image generator’s inability to conceive of Asian people standing next to white people is egregious on its own. But there are also subtler indications of bias in what the system returns automatically. For example, I noticed Meta’s tool consistently represented “Asian women” as East Asian-looking with light complexions, even though India is the most populous country in the world. It added culturally specific attire even when unprompted. It generated several older Asian men, but the Asian women were always young.

The one image it successfully created used the prompt “Asian woman with Caucasian husband” and featured a noticeably older man with a young, light-skinned Asian woman — odd, since I wasn’t trying to wade into the age-gap discourse. Immediately after, I generated another image using the same prompt, and it reverted to showing an Asian man (also older) with an Asian woman.

Meta didn’t immediately respond to a request for comment.

Meta introduced its AI image generator tools last year, and its sticker creation tool promptly went off the rails as people made things like nude images and Nintendo characters with guns.

AI systems reflect the biases of their creators, trainers, and the datasets they use. In US media, “Asian” is usually taken to mean an East Asian person, as opposed to people from other parts of the continent — so perhaps it’s not surprising that Meta’s system assumes all “Asian” people look the same when, in fact, we’re a diverse collection of people who often have little in common besides ticking the same census box.

Asians who don’t fit into the monolith are essentially erased from the cultural consciousness, and even those who do are underrepresented in mainstream media. Asians are homogenized, exoticized, and relegated to “perpetual foreigner” status. Breaking type is easy in real life and impossible in Meta’s AI system. Once again, generative AI, rather than allowing the imagination to take flight, imprisons it within a formalization of society’s dumber impulses.
