The EU Could Follow Australia With a Social Media Under-16 Clampdown Impacting CX


Australia has been in the headlines this week as teenagers go to court to win back their social media access. The move comes after the Aussie government passed a law banning under-16s from accessing social media services.

The country’s Online Safety Act provisions kick in on December 10th. They apply to all social media platforms, including Facebook, Instagram, TikTok and Reddit. In practice, platforms must take “reasonable steps to prevent Australians under 16 from creating or maintaining accounts.”

Now Europe could follow in its footsteps, blocking access to social sites for the young, with one key difference: access could still be granted with parental consent. But there is still likely to be a major argument over how to implement such laws.

The European Approach to Social Management

In Europe, one area of focus is the social media CEOs themselves, such as Elon Musk, with one suggestion making them personally liable should their platforms consistently violate the EU’s provisions, whenever those rules go live.

A report into the protection of minors online aims to fill the gaps in current EU legislation, and the rules, if adopted, could fit into an update to the Digital Services Act.

The report highlights key features that prey on minors’ curiosity and lack of self-control, stressing that “features such as ‘infinite scrolling’, ‘auto play’, ‘pull to refresh’, disappearing stories… excessive push notifications, gambling-like mechanics (loot boxes) and harmful gamification practices are aimed, by design, at influencing minors’ decision-making, drawing them in with manipulative strategies that are aimed at increasing their engagement and the amount of time and money they spend online and can increase addiction.”

While the UK sits outside EU legislation, a similar move might tempt the ruling Labour Party as a way to clamp down on social media-enabled bullying, gang activity and so on. Other suggested benefits include a boost to young people’s mental health.

The UK could update its own Online Safety Act, which came into effect in 2023 (and was updated this year), putting the onus on social media firms to protect minors from illegal and harmful content online.

The Customer Experience Angle

Minors have always been a carefully protected section of society, with advertising rules preventing marketing that makes “direct appeals” to them. Even so, today’s children typically have bank accounts, smartphones and other avenues to form a relationship that could become a long and valuable one for businesses.

The younger elements of Generation Z are getting used to AI and automation in customer service, in ways that even the older members of their cohort may find challenging to keep up with. And when we get to Generation Alpha, born after 2010, they will be AI-native in ways we cannot yet comprehend.

If the social media rules come into force, conversations outside of social will be critical. And if the rules spread omnichannel, businesses will need a fresh layer of protection at all contact points.

Brands and businesses can use age gates and identity verification as reasonable steps to establish age, while in the real world, AI cameras can estimate age to prevent minors buying products they shouldn’t or entering licensed premises.
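At its simplest, an online age gate is a date-of-birth check against a legal threshold. The sketch below is a minimal, illustrative example only: the 16-year threshold follows the Australian rules described above, and the function names (`age_on`, `passes_age_gate`) are hypothetical, not from any platform's API. Real “reasonable steps” involve far more than self-declared birthdates, such as document checks or age-estimation services.

```python
from datetime import date

MINIMUM_AGE = 16  # threshold under the Australian rules; adjust per jurisdiction

def age_on(dob: date, on: date) -> int:
    """Age in whole years on a given date, accounting for whether the
    birthday has occurred yet that year."""
    return on.year - dob.year - ((on.month, on.day) < (dob.month, dob.day))

def passes_age_gate(dob: date, on: date, minimum_age: int = MINIMUM_AGE) -> bool:
    """True if the user meets the minimum age for account creation."""
    return age_on(dob, on) >= minimum_age

# A user born 1 Jan 2012, checked on 10 Dec 2025, is 13 and would be blocked.
print(passes_age_gate(date(2012, 1, 1), date(2025, 12, 10)))
```

Self-declared dates of birth are trivially falsified, which is exactly why regulators are pushing platforms toward stronger verification methods on top of a check like this one.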

Marketers already follow multiple guardrails, but as AI takes over more messaging, those systems will need to be aware of the risks of addressing a youth audience, with added checks in place.

Similarly, any AI-powered online conversations with minors must be loaded with prime directives to avoid harmful suggestions or advice.