
Two recent investigations have shed light on parent-managed child influencer accounts, suggesting that Meta's content monetization and subscription tools are contributing to child exploitation online.

According to a report from the Wall Street Journal, Meta's safety team flagged instances in which adult account holders were using Facebook and Instagram's paid subscription features to profit from exploitative content involving their own children. Internal records showed numerous "parent-managed minor accounts" on Instagram selling exclusive content via subscriptions. The content often showcased young children in revealing attire, offering videos of children in suggestive poses or dances. The Journal also reported that the parents running these accounts encouraged inappropriate conversations and interactions with followers.

The safety team recommended banning accounts focused on child models, or imposing new registration and monitoring requirements on child-centric accounts. Instead of adopting these suggestions, the company opted to rely on an automated system intended to identify and ban potential predators before they could subscribe. Employees expressed doubts about the system's effectiveness, noting that bans could be easily circumvented.


Around the same time, the New York Times published a report on the profitable business of mother-managed Instagram accounts, documenting accounts that sell exclusive photos of and chat sessions with their children. According to the Times, more suggestive posts garnered higher engagement, and male subscribers were known to flatter, bully, and even blackmail families to obtain more explicit images. Some followers had prior convictions for sex offenses. Child influencer accounts reported earning substantial amounts from monthly subscriptions and follower interactions.

The investigation by the Times also revealed a significant number of adult male accounts engaging with underage content creators. In the case of popular influencers, 75 to 90 percent of followers were men, with millions of male connections traceable among the analyzed child accounts.

Meta spokesperson Andy Stone told the New York Times, "We prevent accounts displaying potentially suspicious behavior from utilizing our monetization tools, and we intend to restrict such accounts from accessing subscription content." Stone also told the Wall Street Journal that the automated system was part of the company's ongoing safety efforts.

Despite these moderation efforts, the accounts and their questionable practices have persisted: banned accounts return, sexually explicit searches and usernames elude detection systems, and Meta content surfaces on external platforms frequented by child predators, according to the Wall Street Journal.

Last year, Meta introduced new verification and subscription features, expanded monetization tools for creators (including incentives for popular content), and added new gifting options. The company has periodically adjusted its content monetization strategies, such as pausing Reels Play, a feature that allowed users to earn money from Reels videos once they reached specific view thresholds.

Meta has faced criticism for its perceived inefficiency in curbing harmful content on its platforms. Amid ongoing government probes into social media's adverse effects on children, the company has also been hit with multiple lawsuits over alleged complicity in child exploitation. A lawsuit filed in December accused Meta of facilitating a "marketplace for predators." Meta formed a child safety task force last June, after an internal investigation in 2020 found 500,000 child Instagram accounts engaging in daily "inappropriate" interactions.

Meta is not the sole social media platform accused of inadequately addressing child sexual abuse material. In November 2022, an investigation by Forbes revealed private TikTok accounts sharing such content and targeting minor users despite the platform’s “zero tolerance” policy.

Instagram’s content monetization policies stipulate that all content on the platform must adhere to the Terms of Use and Community Guidelines, which include rules against sexual, violent, profane, or hateful content. While the policy does not explicitly mention restrictions for minor accounts, Meta has established separate guidelines prohibiting forms of child exploitation in general.

These investigations come amid growing calls from online communities to halt the spread of child sexual abuse material through child modeling accounts and seemingly innocent pages managed by child "influencers." Online activists, including TikTok child safety advocate @mom.uncharted, have documented the proliferation of such accounts across social media platforms and have even confronted their predominantly male followers about their behavior. Exposing the parents behind these accounts has prompted other family vloggers to remove content featuring their children, pushing back against the allure of "sharenting." Meanwhile, policymakers are grappling with how to regulate a multi-billion dollar industry and protect the rights of child influencers.

Despite appeals from parents, activists, and lawmakers for legislative and cultural action, the absence of robust regulations, uncertainty about permissible content, and loopholes in moderation practices have allowed these accounts to flourish across various platforms.



Chase DiBenedetto
Social Good Reporter

Chase joined Mashable’s Social Good team in 2020, covering online stories about digital activism, climate justice, accessibility, and media representation. Her work also touches on how these conversations manifest in politics, popular culture, and fandom. Sometimes she’s very funny.

