Instagram’s age verification stops teens from impersonating adults

Instagram may ask you to prove your age, but only in certain circumstances.

The photo- and video-sharing platform announced Thursday that it would start testing new age verification tools when accounts listed as belonging to users under 18 try to change their age to 18 or older.

But users can still misrepresent their dates of birth when creating an account.

The change comes as Instagram’s parent company, Meta, faces increased scrutiny over the presence of children under 13 on its apps. A federal privacy law prohibits collecting data on people under 13 without parental permission, but only if the platforms know it is happening, which privacy advocates say allows companies to look the other way. State and federal lawmakers have proposed a variety of bills, including the Senate’s Kids Online Safety Act, the Senate’s Children and Teens’ Online Privacy Protection Act and California’s Age-Appropriate Design Code Act, that would dramatically increase tech companies’ legal responsibility to protect children online.

Thursday’s change isn’t meant to keep young children off Instagram, said Erica Finkle, Meta’s director of data governance and public policy, but to ensure that teenage accounts reflect users’ real ages and receive the right safeguards. Teen accounts can’t receive direct messages from adults they aren’t connected with, for example, and they’re protected from certain types of ad targeting. Instagram also recently began prompting teen accounts to turn on its “Take a Break” feature, which reminds them to stop scrolling through the platform’s infinite feeds.

“I’ve worked with policymakers, with regulators, and we all share the same goals,” Finkle said. “What really drives all of this, regardless of the specific legislation, is to make sure teenagers have appropriate online experiences and are safe and protected.”

Accounts listed as belonging to adults that attempt to change their age to under 18 already have to provide identification for verification. Now the company is testing new ways to verify age. People can still submit an accepted form of ID, which Meta says it will store securely and delete within 30 days. They can ask three Instagram friends to vouch for their age; these friends must be adults, must respond within three days and cannot vouch for anyone else. Or they can submit a “video selfie,” and Meta will use AI from the digital identity company Yoti to estimate their age, the company says.

Yoti says its AI is trained on images of the faces of people around the world who have given “explicit and revocable” consent.

For parents and others concerned about young people on Instagram, the move may seem incomplete, said Irene Ly, policy adviser for Common Sense Media, an organization that advocates for child-friendly media policies. In 2021, documents leaked by whistleblower and former Meta employee Frances Haugen suggested the company had buried internal research into Instagram’s harmful effects on young women.

“While it is good that Instagram is trying to experiment with age verification technology that would not compromise user privacy, Instagram still needs to make changes to the design of the platform so that it’s safer for younger users,” Ly said. “It won’t address the fact that their algorithm amplifies harmful content that promotes eating disorders, self-harm or substance abuse.”

Meta spokeswoman Faith Eischen pointed to the app’s existing guidelines for algorithm-recommended content and its Sensitive Content Control, which allows people to reduce or increase the amount of guns, drugs, naked bodies and violence they see on Instagram.

Even the idea that Instagram’s age-estimating AI protects privacy is debatable, said Mutale Nkonde, founder of the algorithmic justice organization AI for the People and a member of TikTok’s content moderation advisory board. Meta is known to share data internally to build detailed profiles of its users, she said. How can people be sure that Meta and Yoti aren’t using their video selfies for anything other than age verification? In addition, Nkonde said, one of the reasons Instagram can run its age verification test in the United States is that, unlike in the European Union, companies here can collect data from people under 16. Why, she asked, are teenagers the guinea pigs for Meta’s face-scanning partnership with the digital identity company?

“Children are a protected class, so every precaution should be taken to protect them rather than using them to test new technology,” she said. “This is an inappropriate use case.”

Eischen said Yoti is a respected company with in-house research supporting the accuracy and fairness of its AI. The system doesn’t recognize individual faces; it only estimates age, she noted. And Meta will never use the data it collects for age verification for any other purpose, she said.
