ChatGPT's answers might soon be NSFW as OpenAI opens the door to AI porn

  • OpenAI is exploring whether to allow its AI models to generate NSFW content.

  • OpenAI products such as ChatGPT and DALL-E currently ban users from creating explicit content.

  • AI porn is flourishing elsewhere, prompting ethical concerns about deepfakes.

OpenAI is exploring whether to allow its AI models to generate explicit content.

In a move first reported by Wired, OpenAI's recently released Model Spec document opens the door for a policy change on adult content.

"We're exploring whether we can responsibly provide the ability to generate NSFW content in age-appropriate contexts through the API and ChatGPT," the company said in the Model Spec. "We look forward to better understanding user and societal expectations of model behavior in this area."

An OpenAI representative told Business Insider: "We have no intention to create AI-generated pornography. We have strong safeguards in our products to prevent deepfakes, which are unacceptable, and we prioritize protecting children. We also believe in the importance of carefully exploring conversations about sexuality in age-appropriate contexts."

The models that power OpenAI products such as ChatGPT and DALL-E prohibit users from creating explicit content, but that hasn't stopped AI porn from flourishing elsewhere.

Unstable Diffusion, the NSFW AI image generator that was booted off Kickstarter, has gained a strong following in the murky world of AI porn.

The platform takes its name from Stability AI's open-source AI image generator, Stable Diffusion. Unlike Stability's platform, it has minimal content restrictions and allows users to create pornographic images.

Last year, its CEO and cofounder, Arman Chaudhry, told Business Insider the program was generating more than 500,000 images every day. Chaudhry said Unstable founded the original Discord group "as a refuge for artists who wanted to create AI art without limitations, including those related to adult content."

However, experts have raised concerns about AI tech being used for explicit content, as users could theoretically create deepfake pornography or content depicting minors engaging in sexual acts.

While platforms such as Unstable Diffusion claim to have content filters to prevent people from using them this way, those filters aren't always foolproof.

Read the original article on Business Insider