Be The Light

Meta to label all AI-generated fake images on Facebook and Instagram.

February 7, 2024 ENKI Team

Meta plans to introduce technology capable of detecting and labeling images generated by other companies’ AI tools on its platforms, including Facebook, Instagram, and Threads.

While Meta already labels AI-generated images from its own systems, it aims to extend this capability to images from other sources to combat AI fakery. However, an AI expert cautioned that such tools may be easily circumvented.

Although the technology is still in development and not fully mature, Meta intends to expand its labeling of AI fakes in the coming months to encourage industry-wide efforts to address the issue.

Prof. Soheil Feizi of the University of Maryland’s Reliable AI Lab raised concerns about the effectiveness of Meta’s AI detection system, cautioning that it could be easily circumvented by lightweight image processing techniques and, at the same time, could mistakenly flag genuine images as AI-generated, producing a high false positive rate.

While Meta’s tool won’t address AI-generated audio and video, users will be asked to label their own content, with potential penalties for non-compliance.

Sir Nick Clegg, Meta’s president of global affairs, acknowledged the difficulty of detecting AI-generated text and admitted the limitations of Meta’s current media policy.

The Oversight Board criticized Meta’s policy on manipulated media as incoherent and lacking justification, particularly in light of a recent ruling regarding a video of President Joe Biden. Despite the criticism, Sir Nick broadly agreed with the ruling and acknowledged the need for updated policies to address the increasing prevalence of synthetic and hybrid content.

Since January, Meta has required political adverts using digitally altered media to be labeled accordingly.


Plan your next project or tech strategy to help secure your business’s future in a rapidly changing environment.

info@enki.tech

ENKI Inc.

100 Wilshire Blvd, Santa Monica, CA 90401

(213) 814-2332

ENKI Inc. © 2015 – 2024. All rights reserved.