So What Exactly are Instagram Teen Accounts?
Instagram Teen Accounts will be automatically applied to individuals under the age of sixteen seeking to obtain an Instagram account. Teen Accounts will be more restrictive than regular Instagram accounts because content will be limited by Meta’s sensitive content control, which will hide more information relating to self-harm, suicide, and eating disorders. Meta does not specifically address how it plans to accomplish this goal when content creators work around current content restrictions by using varying characters and emojis to convey their message, e.g., a gun emoji or “$uiC!de.” However, Meta claims it has spent the last two years creating effective tools to regulate sensitive content for Teen Accounts.
Additionally, Meta claims to be “building technology to proactively find accounts belonging to teens, even if the account lists an adult birthday.” Again, this claim is offered without supporting facts or evidence. Meta has expanded its age verification tools and technology since 2022 and has tested tools, including video selfies and social vouching, where mutual followers vouch for the user’s purported age. These tools have not been rolled out across the application, but tests were expanded to countries in Europe, Mexico, Canada, South Korea, Australia, and Japan in March of 2023. Meta’s age verification page mentions these tests will be rolled out to additional countries in the coming months, but Meta has not posted an update regarding age verification tools or tests since March 2023.
Further, Meta is introducing time limit reminders and sleep mode for Teen Accounts. Time limit reminders will send the user notifications telling them to leave the app after sixty minutes of use each day. Although Meta does not elaborate further, time limit reminders appear to be limited to notifications and do not actually stop app use after a certain amount of time. Additionally, sleep mode will be activated from 10:00 p.m. to 7:00 a.m. and will mute notifications and send auto-replies to direct messages that the user receives. While Meta does not expand on what will be said in these auto-reply messages, it is presumed that the message will refer to the sleep mode time and note that the direct message notification has been muted until the next morning.
The Legal Springboard for Teen Accounts
In 2023, thirty-three states filed a class action lawsuit against Meta for allegedly exploiting and manipulating teens and children. The core of the complaint alleges that Meta’s business model focuses on maximizing young users’ time spent on the platform through psychologically manipulative product features. The Plaintiffs claim that both internal and publicly available research shows Meta is aware of the substantial harm being done to youths’ physical and mental health as a result of its social media platforms. Additionally, the Plaintiffs contend that Meta intentionally misrepresents this harm to the public and refuses to abandon its use of these harmful features.
The implementation of Instagram Teen Accounts is likely a response to the alleged harm Instagram has caused to young users. Meta even describes the features of Teen Accounts as including “built-in protections” that can reassure parents that their teen is having a safe experience. One source of the harm is teen interaction with sensitive content, which Meta defines as including content discussing suicide, self-harm, and eating disorders. Although these features correlate closely with the issues raised in the lawsuit, it is unclear whether they sufficiently respond to those issues.
Meta has attempted to minimize teen use of its platform in two ways. First, Meta set a notification to remind teen users to leave the app after scrolling for sixty minutes each day. However, it is unclear whether this truly addresses the concern in a meaningful way. Teens can easily ignore notifications, and without anything to supplement the notification, there is no incentive to leave the app. Second, Meta added additional supervision tools, though these tools are not enabled by default. If enabled, teen users will automatically lose access to the app after hitting a time quota set by a parent. This option would substantively address the issue of limiting time spent on Instagram: by limiting access to the platform, teen users will necessarily be exposed to less content, and therefore to less sensitive and harmful information. This comes with a caveat, however. A parent must enable and configure the feature, which places the responsibility for reducing the risk of harm on parents rather than on Meta.
Arguably, Meta itself should control and set limits on Teen Accounts, given its expertise in what amount of time is appropriate for a teen to spend engaging with its platform. However, this issue clearly requires a balance between private and public interests. Meta gains value from teen users in the form of data collection and advertisements. This value is gained to the detriment of the public, as teen users are exposed to more sensitive content at an earlier age, which can greatly affect their mental health as they engage with the app. Meta added another key feature that automatically places teens’ accounts in the most restrictive setting of Instagram’s content control. This setting will make it more difficult for teens to come across potentially sensitive content or accounts during their use of the platform. Accounts that are marked as potentially sensitive will appear lower in search results and, in certain cases, be removed completely.
For posts regarding self-harm, Meta demonstrates its stance and purpose for Teen Accounts’ limited content: “This is an important story, and can help destigmatize these issues, but it’s a complex topic and isn’t necessarily suitable for all young people.” Therefore, the new content policies would actually block the post from showing up on a teen’s feed even if the post is from someone they follow. However, this could create a different risk of harm where a suicidal or otherwise struggling teen is now automatically blocked and unable to get support from their peers. Meta addresses this by saying that it would still provide expert resources to that user when it detects the post is about content like self-harm or eating disorders.
Meta most likely implemented this feature in response to studies tracing self-harm and suicidal thoughts back to Instagram, along with other mental health issues. It remains to be seen if these restrictions on sensitive content will actually reduce the harm teens experience using Instagram. Harm reduction will depend on whether Meta can effectively identify and restrict sensitive content and whether the restriction of sensitive content will actually result in mitigating harm to teens’ mental health.
Business Implications of Instagram Teen Accounts and Ensuing Litigation
While Meta may care about the teens using its platforms, it ultimately cares about its business. The recent class action filed in August 2024 alleges that Meta “deliberately entice[s] minors into a harmful obsession with social media.” These allegations cut to the core of Meta’s reputation and business. Meta is now trying to protect both by showing the world that it wants to protect teens using its services, as evidenced by its September launch of Instagram Teen Accounts.
It is too early to tell whether the introduction of Teen Accounts will sufficiently address consumer concerns regarding teen use of Meta’s social media platforms. However, the issues brought up in the August lawsuit and subsequent changes being added to Instagram account management will inherently affect the business implications of running a social media platform like Instagram. Players like Meta and other social media operators, investors in these companies, and consumers will be affected by these new changes.
Despite the uncertainty regarding Meta’s success with Instagram Teen Accounts, the changes aiming to protect teen users will hopefully be a positive first step in protecting these vulnerable users. The changes may make parents more comfortable allowing their teens to use the platform, and investors more comfortable backing Meta’s efforts. In fact, Meta’s stock price over the last three months indicates a warm reception to these changes.
Conclusion
It is unclear whether Meta will be able to make a substantial impact on teens’ current use of Instagram or anticipate how teens will use Instagram Teen Accounts. While it is a positive step that Teen Accounts will filter sensitive content depicting harmful topics or violence, there is less reason for optimism about Meta’s ability to accomplish this goal. Additionally, the pending class action lawsuit against Meta may preclude it from taking substantive action to protect teens on Instagram in the meantime.
*The views expressed in this article do not represent the views of Santa Clara University.