New EU probe into Meta’s ‘addictive’ features & age verification may put brands on edge

By Kendra Barnett, Associate Editor

May 16, 2024 | 13 min read

The European Commission on Thursday launched a formal investigation into the tech titan’s ‘rabbit-hole’-like features, age verification protocols and default privacy settings for minors.

The EU is looking into Meta's age verification policies and its platforms' potentially addictive features / Christian Lue

Meta-owned Facebook and Instagram are facing a new investigation by the European Commission, the EU’s politically independent executive arm, over concerns that the platforms have not adequately addressed risks to children.

In particular, the EU is taking issue with the platforms’ potentially addictive design features. The probe will also assess the adequacy and effectiveness of Meta’s age-verification tools and other measures intended to prevent minors from accessing inappropriate content.

“The Commission is concerned that the systems of both Facebook and Instagram, including their algorithms, may stimulate behavioral addictions in children, as well as create so-called ‘rabbit-hole effects,’” the Commission wrote in a statement published on Thursday. “The Commission is also concerned about age-assurance and verification methods put in place by Meta,” it continued.

The investigation follows Meta’s submission of a risk assessment report last September and is part of broader regulatory efforts under the EU’s sweeping Digital Services Act (DSA), which took effect last August and aims to establish legislative guardrails for online content and enshrine greater accountability for digital platforms.

The Commission said in its statement on Thursday that the DSA obliges Meta to assess and mitigate the risks posed to children by its platforms’ interfaces. It also said that it has concerns about Meta’s default privacy settings for minors and whether Meta’s age verification tools are “reasonable, proportionate and effective.”

“We are not convinced that it has done enough to comply with the DSA obligations to mitigate the risks of negative effects to the physical and mental health of young Europeans on its platforms Facebook and Instagram,” said Thierry Breton, the EU’s commissioner for the internal market. “We are sparing no effort to protect our children,” he added.

Given Facebook’s and Instagram’s classification as very large online platforms (VLOPs) under the DSA, Meta faces heightened scrutiny and a mandate to mitigate systemic risks – particularly those affecting minors’ mental health.

The formal probe will enable the Commission to conduct on-site inspections and apply interim measures. Violations of the DSA can result in substantial penalties – in this case, potentially up to 6% of Meta’s global annual revenue.
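For a sense of scale, here is a rough, back-of-the-envelope sketch of that ceiling in Python. The revenue figure is our assumption, based on Meta’s publicly reported full-year 2023 results, not a number cited by the Commission:

```python
# Back-of-the-envelope sketch of the maximum DSA fine described above.
# The revenue figure is an assumption drawn from Meta's public FY2023
# earnings, not from the Commission's statement.

META_FY2023_REVENUE_USD = 134.9e9  # assumed: Meta's reported 2023 revenue (~$134.9bn)
DSA_MAX_FINE_RATE = 0.06           # DSA ceiling: up to 6% of global annual turnover

max_fine = META_FY2023_REVENUE_USD * DSA_MAX_FINE_RATE
print(f"Theoretical maximum DSA fine: ${max_fine / 1e9:.1f}bn")  # ~$8.1bn
```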

The investigation is part of a broader EU push to enforce online safety and privacy rules. Last month, the bloc initiated a similar probe into TikTok over concerns about addictive design features and the video-sharing app’s impact on users’ mental health. Meanwhile, Meta is already facing scrutiny over election integrity on Facebook and Instagram, with upcoming European Parliament elections imparting a sense of urgency to the matter.

Of course, proving that Meta has breached the DSA may be a tall order. “It is very difficult to prove that design itself is ‘addictive,’ even if minors, for whatever reason, get addicted. It is equally difficult to posit, from a regulatory perspective, that any other design would be less addictive or what such a concept could look like,” says Irina Tsukerman, a US national security lawyer and the president of Scarab Rising, a media and security consultancy.

She adds that concerns about Meta’s age verification regime may be similarly difficult to navigate, considering the variety of ways in which safeguards can be circumvented.

In response to Thursday’s announcement of the investigation, Meta highlighted the efforts it’s made in recent years to strengthen minors’ safety and privacy on its platforms. “We want young people to have safe, age-appropriate experiences online and have spent a decade developing more than 50 tools, features and resources designed to protect them,” a Meta spokesperson told The Drum.

These features include a range of parental supervision tools, ‘Take A Break’ notifications encouraging young users to leave the Instagram app, ‘Quiet Mode’ for silencing notifications during set time periods and ‘Nudges,’ which suggest to minors that it might be time to look at something different if they’ve been engaging with content on a single topic for an extended period of time.

The company has also worked to improve its age verification systems in recent years, amid growing concern among consumers and lawmakers alike about kids’ online safety and age-appropriate experiences. In 2022, Meta introduced new ways to check that users are honest about their age – requiring an image of a government ID, a video selfie or age confirmation provided by friends.
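Purely as an illustration – this is not Meta’s actual implementation, and every name and threshold below is a hypothetical assumption – a multi-route age-assurance check of the kind described might be structured like this minimal Python sketch:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class AgeSignals:
    """Hypothetical bundle of evidence a user might supply to prove their age."""
    id_document_age: Optional[int] = None       # age parsed from a government ID image
    selfie_estimated_age: Optional[int] = None  # age estimated from a video selfie
    vouches_over_18: int = 0                    # friends confirming the user is an adult

def is_verified_adult(signals: AgeSignals, min_vouches: int = 3) -> bool:
    """Accept any one of the three verification routes the article describes.

    The precedence and thresholds here are illustrative assumptions only.
    """
    if signals.id_document_age is not None:
        return signals.id_document_age >= 18       # strongest signal: government ID
    if signals.selfie_estimated_age is not None:
        return signals.selfie_estimated_age >= 18  # model-estimated age from a selfie
    return signals.vouches_over_18 >= min_vouches  # weakest signal: social vouching

# A user with no ID, a selfie estimate of 16 and two vouches fails verification.
print(is_verified_adult(AgeSignals(selfie_estimated_age=16, vouches_over_18=2)))  # False
```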

The tech giant has also begun using AI to detect age misrepresentation and to accurately determine whether a user is a teenager. The technology aims to prevent minors from accessing Facebook Dating, restrict adults from messaging teenagers and ensure that teen users aren’t being reached by restricted advertising content.

The company claims that these measures have been effective: it says it blocked 96% of teens on Instagram who tried to change their birthday from an under-18 date to an over-18 date.

“This is a challenge the whole industry is facing, which is why we’re continuing to advance industry-wide solutions to age-assurance that are applied to all apps teens access,” the Meta spokesperson said. “We look forward to sharing details of our work with the European Commission.”

But for Meta, the pressure is building. The Commission already has another investigation into the company open – launched on April 30, those proceedings are examining deceptive advertising, political content, data access rules and notice-and-action mechanisms on both Facebook and Instagram.

Plus, across the pond, a series of highly publicized US congressional hearings on digital safety, data privacy and content moderation have put Meta – and CEO Mark Zuckerberg – on blast over the last couple of years. In January of this year, during one such hearing, Zuckerberg was pressured into issuing an impromptu apology to families who had lost loved ones in the wake of online harassment or exploitation.

Despite Meta’s assurances about the efforts it has made to mitigate risks to young users, the EU’s investigation into Facebook and Instagram should serve as a cautionary signal to other platforms, some experts say.

“The Commission’s investigation sends a clear signal to companies that fall under the DSA: assurances that they are taking obligations ‘very seriously’ are not going to cut it. The obligations in this regulation, including those related to protecting minors and assessing risks, need to be documented, detailed and transparent,” says Calli Schroeder, senior counsel and global privacy counsel at the Electronic Privacy Information Center (EPIC), a Washington, DC-based digital rights think tank.

Considering the scrutiny that Meta has faced in recent years – especially in regard to children’s safety – Schroeder says, “It would be shocking if Meta has not adequately completed and documented their assessments and policies around the issues raised.”

But in some experts’ estimation, the tech titan has done its due diligence. “I get the sense that the EU wants to be seen as a tough arbiter of digital safety and protections, and hence, it will be looking to make some statement judgments as quickly as possible. However, I believe Meta has accounted for this kind of challenge,” says Bill Fisher, a principal analyst at eMarketer specializing in media and marketing. “Its platforms were always likely to be some of the first in the crosshairs, so it likely has legal arguments in place and financial contingency to account for.”

Even if Meta is in full compliance with the DSA and is thoroughly prepared for a formal investigation, it’s still possible that the publicity of the probe could spook users and advertisers – the latter being, of course, the backbone of the company’s revenue generation.

As Tsukerman puts it: “Even the process of investigations is costly for Meta both in terms of legal expenses and business and reputational risk, which leads to likely opportunity costs … [It] may have a chilling effect even on the existing consumer and advertising base due to the concerns about the unclear impact of these investigations, as well as bad optics of being accused of pushing an addictive, socially harmful product.”

Of course, for many advertisers, “the audience reach that Meta’s platforms offer is just too big to ignore,” says eMarketer’s Fisher.

Nonetheless, he suggests that brands may be on edge about their investments with Meta in light of a major regulatory investigation like this one. “[They’ll] likely take a wait-and-see approach,” he says. “They may begin to think about contingency plans, but then other social platforms aren’t exactly pleasing to the regulators’ eyes. So, it will be business as normal for now.”

EPIC’s Schroeder suggests that Meta will not be the only one to have to contend with how its advertising business fits into broader regulatory concerns around online safety and privacy. “The ad portion of this investigation reflects a much bigger referendum on the online advertising ecosystem and its increasing expansion into surveillance capitalism,” she says.

“The ad industry seems unwilling to even consider moving away from models that essentially harass people with advertisements and, in this case, make them pay for the privilege of escaping that harassment on a single platform. Depending on how this investigation moves forward, they may have no choice.”

But in some experts’ view, regulators’ concerns about online safety and privacy will remain fundamentally at odds with the business models of social platforms like Instagram and Facebook – no matter what.

“Regulators can push for specific limited improvements in some of these areas, but ultimately, an international social media and generally open communication system is always going to be a major risk factor and it is up to the individual users to weigh in the risks of their own content and data output…” says Tsukerman. “Meta could and should be pushed to do better in all areas, but the risks associated with it cannot be eliminated entirely or be customized to satisfy all possible audiences and concerns. The bottom line is Meta’s entire business model is antithetical to privacy, given that it directly profits from the sale of consumer data. And due to the sheer size of the company, the significant issues related to content moderators and security enforcers – and the rudimentary level of the AI mechanisms – security is likewise not a top priority or byproduct of Meta's infrastructure.”

She suggests that perhaps regulators shouldn’t attempt to eliminate all of the potential risks associated with safety and privacy on social platforms, but should instead focus their efforts on more manageable areas of enforcement.

“The most realistic and unifying point of contention is whether Meta is doing enough to eliminate deliberate violations [of regulations] such as accounts engaged in child abuse and other inappropriate or illegal content … as well as systematic and consistent enforcement of its own community standards in an impartial and transparent manner. This is … where both practical external recommendations and substantial and measurable improvements can [likely] be made…”

The Commission’s investigations into Meta and TikTok nonetheless evidence the EU’s commitment to enforcing robust child protection measures across the digital ecosystem. As the DSA reshapes online governance, the focus remains on holding major tech companies accountable for creating a safe online environment.

There is no grace period for compliance with the DSA. Penalties for violations escalate through three possible steps: first, a fine of up to 6% of global turnover for a VLOP; then periodic penalty payments for continued non-compliance; and, as a last resort, a temporary suspension of the service. Meta, says eMarketer’s Fisher, “would absolutely want to avoid the final stage of enforcement, even though it has threatened to pull out of Europe in the past.”
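To make that sequence concrete, here is a minimal Python sketch of the three-step ladder; the step names are ours for illustration, not DSA terminology:

```python
from enum import Enum
from typing import Optional

class DsaEnforcementStep(Enum):
    """Illustrative ladder of the three enforcement stages listed above."""
    FINE = 1                # up to 6% of global annual turnover
    PERIODIC_PENALTIES = 2  # recurring payments for continued non-compliance
    SUSPENSION = 3          # temporary suspension of the service, as a last resort

def escalate(current: Optional[DsaEnforcementStep]) -> Optional[DsaEnforcementStep]:
    """Return the next stage, or None once the ladder is exhausted."""
    steps = list(DsaEnforcementStep)
    if current is None:
        return steps[0]
    idx = steps.index(current)
    return steps[idx + 1] if idx + 1 < len(steps) else None

# Walking the full ladder prints: FINE, PERIODIC_PENALTIES, SUSPENSION
step = None
while (step := escalate(step)) is not None:
    print(step.name)
```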

The Commission has not specified a timeline for completing its investigations into Meta.
