While examples of the human biases embedded in AI are common, we just stumbled on a particularly graphic example.
We tried using the new DALL-E 3 integration with ChatGPT to create an image that captures the spirit of Racery, the virtual race platform. Unfortunately, the AI kept portraying underclad women — despite numerous and vigorous instructions to put them all in business attire.
We felt like Freudian analysts, posing a series of questions and scenarios to try to expose the edges and gradients of some unarticulated biases that the subject itself isn’t aware of.
To sum up, it seems like AI was trained on catalogs of serious men wearing two-piece suits, while the women have ponytails and are mostly swimsuit and bra models.
We started with a simple request to “illustrate a virtual race with runners” and got these two images.
That seemed too futuristic. So we gave this instruction: “Make the figures look like human photographs” and got…
After a couple more rounds of removing the background clutter and adding human diversity — race, age, sex, size and activity type — we got these next images.
Hmmm. There are some nearly nude people in there. Let’s get them fixed up with clothes. We told AI: “put work clothes on some of the figures. no nudity!” Well… hmmm.
In the first image, many of the women are in bras or bikinis. In the second, there are almost no women. Both decisions seem kinda sketchy, revealing some unacknowledged bias.
So we gave a simple, unambiguous instruction: “put shirts on everyone.” In the result below, two guys are shirtless; six women are in bras or bathing suits.
Oops. What? A fair number of people, particularly the women, still don’t have “shirts.”
Noticing a bias against giving women sufficient clothing, we got more explicit: “Put all the women in business attire.” The result, below, seemed like a huge step backward.
AI seemed to think it was doing its job, describing the image like this: “The image has been updated to depict all the female avatars in business attire, such as suits and professional dresses, while they are engaged in various activities like running, swimming, yoga, and gardening.”
Nope. Nope. Nope. Things are just getting worse. We gave up after a couple more tries.
We decided to give AI another shot. Maybe, as tech observer Mike Butcher suggested when we shared some preliminary observations about our odd AI experience on Facebook, the idea of a race had stuck in AI’s mind, consistently diverting the women toward being underclothed.
So we opened a new thread/folder and entered this instruction: “Please create an image of people doing various exercises — walking, running, swimming, wheel chairing, yoga, archery. Everyone should be dressed in business clothes, whether white collar or blue collar. The background should be a white space.”
We got three guys in two-piece suits and ties. One guy in a sweater. One guy wearing athleisure. One woman in leggings and a button-down. And four women in sports bras.
What’s particularly odd: AI thinks it’s following instructions. In fact, it seems almost proud of its adherence to our requests. Here’s AI’s own description of the scene above: “A photo depicting a diverse group of people engaging in various exercises in a white space background. A Black man in a business suit is briskly walking, a South Asian woman in a smart business attire is jogging, a Caucasian man in a wheelchair is racing, an East Asian woman in a formal office dress is performing a yoga pose, a Hispanic man in blue-collar work clothes is practicing archery. They are all dressed in professional attire, highlighting the blend of fitness and business.”
1) A day after we posted this short exploration, the Washington Post did a deep dive into the massive human biases that creep into AI’s output, with lots of good images and examples. One example in particular parallels our findings: “For example, in 2020, 63 percent of food stamp recipients were White and 27 percent were Black, according to the latest data from the Census Bureau’s Survey of Income and Program Participation. Yet, when we prompted the technology to generate a photo of a person receiving social services, it generated only non-White and primarily darker-skinned people. Results for a ‘productive person,’ meanwhile, were uniformly male, majority White, and dressed in suits for corporate jobs.”