Washington’s Lottery pulls promotional AI app after it accidentally produces a nude

Mike Powers

It’s all fun until someone loses their clothes: Generative AI is sweeping the tech industry, with companies rushing to use it to solve problems that don’t exist. One might even argue that the technology causes more problems than it solves, especially with companies insisting on shoveling out products that are not ready for public consumption. The Washington State Lottery learned this the hard way earlier this week.

In a promotional campaign called "Test Drive a Win," Washington State Lottery officials created an interactive web app to help customers visualize themselves spending their winnings on their dream vacations. Users could upload a selfie, and the AI would generate a vacation-themed image with the user's face as the subject. The whole project went awry when the AI spat out a nude.

A woman from Olympia named Megan (last name withheld) told the Jason Rantz Show that she used the app to create a "Swimming with the Sharks" vacation image. She says she was shocked when it produced her face on the body of a topless woman sitting on a bed. The picture shows AI-generated fish and coral decorating the scene, with the Washington State Lottery logo in the lower right-hand corner indicating it came from the promotional site.

“Our tax dollars are paying for that! I was completely shocked,” Megan told Rantz on KTTH. “It’s disturbing, to say the least. I also think whoever was responsible for it should be fired.”

She provided Rantz with a censored version of the photo, but KTTH could not authenticate it.


Megan said she recounted the incident to a friend who knows someone who works for the lottery. Presumably, that person passed the anecdote along to lottery officials, because a spokesperson later confirmed the commission had received the report.

Rantz also attempted to reach the lottery commission, and three hours later, the web app was offline. A spokesperson later provided Rantz with a statement saying that the commission had worked with an unspecified third-party AI provider and established a "comprehensive set of rules" for the AI to follow. These conditions explicitly prohibited nudity.

“Prior to launch, we agreed to a comprehensive set of rules to govern image creation, including that people in images be fully clothed,” the spokesperson said. “We have not seen the image that the AI provided to this user, but out of an abundance of caution we took the website down.”

Rantz asked when precisely officials received the nudity report and why they didn't shut down the app immediately. In a later statement, the commission said it had heard about the incident earlier in the week and immediately contacted the developers to confirm the settings and safeguards were in place, which they were.

However, after further discussion, officials concluded that even one incident like this was one too many and took the website down, ensuring it would not happen again.
