Meta's New AI Training Policy Using Public User Data
Starting June 26, Meta will use all public user data, including texts, photos, and comments, to train its AI.
This decision has sparked controversy for several reasons:
Privacy Invasion
Many users worry that their photos and texts will be used without explicit consent, which they see as an invasion of privacy. Although users have agreed to the terms and conditions, most are unaware of this specific use of their data.
Opt-Out Mechanism
Meta allows users to request that their data not be used for AI training. However, **the process is complicated** and designed to discourage use, a practice often called a "dark pattern" or "deceptive design". The forms are hard to find and complete, and there is no guarantee the request will be accepted.
Request Limitations
Even if a user's request to exclude their data is approved, it will not prevent photos of them uploaded by friends who have not made the same request from being used for AI training. The request applies to the user's account, not to the individual person.
Future of Privacy
It is anticipated that users may have to pay for privacy in the future. The more one pays, the more anonymous one can be. This is seen as the price for using "free" platforms.
Impact in Europe
It remains to be seen how this measure will affect Meta in Europe, especially regarding GDPR and other privacy regulations. Users’ perception of Meta as an "evil company" is growing.
(Image credit: Ricardo Tayar López)