OpenAI, the Microsoft-backed artificial intelligence company behind the popular image generator DALL-E, has launched a new tool to detect whether digital images have been created by AI.
Authentication has become a major concern amid the rapid development of AI, with authorities worried about a proliferation of deepfakes that could disrupt society.
OpenAI’s image detection classifier, which is being tested, can assess the likelihood that a given image originated from one of the company’s generative AI models like DALL-E 3.
OpenAI said that during internal testing on an earlier version, the tool accurately detected around 98% of DALL-E 3 images while incorrectly flagging fewer than 0.5% of non-AI images.
However, the company warned that modified DALL-E 3 images were harder to identify, and that the tool currently flags only about 5% to 10% of images generated by other AI models.
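The figures above correspond to two standard classifier metrics: the true-positive rate (share of AI images correctly detected) and the false-positive rate (share of real images wrongly flagged). A minimal sketch of how those rates are computed, using invented counts chosen to mirror the reported numbers rather than OpenAI's actual data:

```python
# Hypothetical illustration of the reported accuracy figures.
# The counts below are invented for illustration; they are not OpenAI's data.

def detection_rates(true_positives, total_ai_images,
                    false_positives, total_real_images):
    """Return (true-positive rate, false-positive rate) as fractions."""
    tpr = true_positives / total_ai_images
    fpr = false_positives / total_real_images
    return tpr, fpr

# Counts chosen to mirror the reported ~98% detection rate and <0.5% false-flag rate.
tpr, fpr = detection_rates(true_positives=980, total_ai_images=1000,
                           false_positives=4, total_real_images=1000)
print(f"Detected {tpr:.1%} of AI images; flagged {fpr:.2%} of real images")
```

The trade-off the article describes, strong detection of the company's own model but weak detection of other generators, shows up as a high true-positive rate on one image source and a much lower one on others, even while the false-positive rate stays small.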