Meta to Launch AI-Powered 'Adult Classifier' Tool on Instagram to Protect Young Users
Social media can be a risky place for young people, and harmful situations can escalate quickly. In response, Meta is stepping up its efforts to protect teens on Instagram: the company has announced plans to roll out an AI-powered 'adult classifier' tool next year. The tool will analyze user behavior, such as the accounts a person follows and the content they interact with, to help determine whether they are under 18 and ensure a safer experience for younger users.
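Meta has not described how the classifier is built, so the following is only a minimal sketch of the general approach the announcement implies: behavioral signals feeding a binary classifier that estimates whether an account belongs to someone under 18. The feature names, training data, and model choice here are entirely hypothetical and are not Meta's.

```python
# Hypothetical illustration only: Meta has not disclosed its model or features.
# The sketch maps assumed behavioral signals to a probability of being under 18.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Made-up behavioral features per account:
# [share of followed accounts popular with teens,
#  share of interactions with youth-oriented content,
#  account age in years,
#  average session length in minutes]
X_train = np.array([
    [0.80, 0.75, 0.5, 45.0],   # labeled under-18 account
    [0.70, 0.60, 1.0, 60.0],   # labeled under-18 account
    [0.10, 0.05, 8.0, 20.0],   # labeled adult account
    [0.20, 0.15, 6.0, 25.0],   # labeled adult account
])
y_train = np.array([1, 1, 0, 0])  # 1 = under 18, 0 = adult

clf = LogisticRegression()
clf.fit(X_train, y_train)

# Score a new account; a high probability could trigger a switch to a
# restricted teen account, ideally with a way to appeal false positives.
new_account = np.array([[0.65, 0.70, 0.8, 50.0]])
print(clf.predict_proba(new_account)[0][1])  # estimated probability of under 18
```

Any real system would rely on far richer signals and far more data, but the basic pattern is the same: the model produces a confidence score, and the platform decides what to do when that score crosses a threshold.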
If the classifier concludes that a user is underage, their account will be switched to a teen account with tighter privacy settings, regardless of the age they provided when signing up.
Although the feature is intended to protect young users, it raises concerns about the classifier's accuracy and the possibility of false positives. Meta has not yet disclosed how accurate the tool is, and it is unclear how users who are incorrectly flagged as underage will be able to appeal.
To further strengthen age verification, Meta will also put safeguards in place to stop teenagers from manually changing their stated age. Users who attempt to do so will have to verify their age with a government-issued ID or a video selfie.
The decision comes as parents and US lawmakers grow increasingly concerned about the effects of social media on young people.
As Meta tightens its grip on underage accounts, concerns about digital surveillance follow close behind. The intention may be noble, but the potential for misuse and overreach looms large. Only time will tell whether this move will truly protect young people or simply create a new set of problems.