Social media platforms have faced criticism over their influence on young minds for at least a decade, and now Meta, the parent company of Instagram and Facebook, is rolling out new features across its platforms to address those concerns, including the impact on teens’ mental health. The additions include new parental supervision tools and privacy controls.

Many adults would probably agree that this is a step in the right direction, but the proposed tools have raised questions. The main concern is that many of the new features require the child to opt in, which may hamper their effectiveness.

Instagram will now send a notice to a child after they block someone on the platform, encouraging them to invite a parent to supervise their account. You may be confused (I know I was) about the rationale behind this, but Meta explains that this is the moment when kids and teens are open to seeking parental guidance.

Leaving it to the parents

Now, personally, I can see a lot of kids and teens simply choosing not to opt in (I know I wouldn’t have), but what happens if a child does? The parents gain some control over the account: they can impose a time limit, see and track who their teen follows and is followed by, and monitor how much time their child spends on Instagram. However, parents will not be privy to message content, which remains private and encrypted.

While some parents may not like the idea that parts of their child’s account remain off limits to them, this appears to be an attempt to make compromise and discussion about online safety easier and more transparent: the child gets to use social media, the parent(s) get to see an outline of their usage, and the child knows the parent has that information. This is also what Meta itself suggests, claiming it seeks solutions “that balance teen safety and autonomy.”

However, this is likely not the last of such efforts by Meta, as the new features join parental supervision tools and resources that are already in place. So far, Meta has not disclosed how many under-18 users have opted into these features, so we don’t know the scale of engagement with them.

Also, these features are only available in the United States, Canada, and the UK, though Meta has said that it will be rolling them out to more regions in the coming months (you can check out our pick of the best parental control software in the meantime).

While we don’t have opt-in figures yet, criticism has persisted despite the announcement. Experts say these are useful tools for families that are already communicative and on top of their children’s social media use. However, that isn’t the reality for many families (I can attest to that: my parents had no idea what I did online).

One commonly raised issue is that kids under thirteen technically aren’t allowed on social media, but that rule is easily circumvented by lying about your age at sign-up. Another is that social media platforms, and technology in general, are evolving so rapidly that the average adult, especially one with children, may not have time to fully understand each new development, let alone how it might be affecting their child’s psychological development.

These critics say that such fundamental problems are being left unaddressed and left for parents to figure out. Many of these features also require the children themselves to turn them on, perhaps expecting a level of knowledge and self-control you don’t always see even in adults.

It seems to me that the psychological problems caused and exacerbated by social media affect many demographics and age groups, with children among the most affected.

Social media companies still have plenty of incentive to keep us on their platforms for as long as possible, and this desire to keep us engaged appears to conflict with their responsibility to safeguard their users, especially vulnerable ones. If users really want to make sure they and their loved ones stay safe online, it seems the responsibility once again falls on them.
