OpenAI announces new tool to detect its own AI images


As AI use becomes more prevalent, calls for regulation and oversight are growing louder. AI platforms can now generate images and music, a development not everyone welcomes. In response, OpenAI is developing methods to detect images created by its own AI engine.

According to the press release (via Reuters), OpenAI has announced a new tool to detect images created using its DALL-E 3 AI engine. The company claims the tool can identify DALL-E 3 images with 98% accuracy. Keep in mind, though, that this figure comes from internal testing, meaning a controlled evaluation using images whose AI origins were already known. Real-world performance on images found in the wild may therefore differ.

Besides that, the tool reportedly remains effective even after image modifications like compression, cropping and saturation changes. OpenAI also plans to add tamper-resistant watermarks to images created via its AI engine. Specifically, the company wants to embed a 'signal' in photos or videos that would be hard to remove, likely to make AI-generated content easier to identify.

Sometimes it can be hard to tell whether something was made by AI, so detection tools like this could be vital in preventing AI abuse in the future. But what do you think? Please share your thoughts about the topic on our Facebook page, and stay tuned to TechNave for more news like this.