Leading tech companies, including Google and Meta (formerly Facebook), have pledged to work together to combat online child sexual abuse and exploitation. The commitment was unveiled on Tuesday with the launch of the “Lantern” program, an industry initiative to address this critical issue.
Child sexual abuse on the internet has long been a significant concern for regulatory authorities and tech companies alike. The new program, Lantern, signifies a proactive step to demonstrate the industry’s dedication to protecting children and young people online.
Under the Lantern program, major technology companies will cooperate by sharing signals that indicate activity violating their respective policies against child exploitation. These shared signals will empower platforms to more rapidly identify, remove, and report harmful content related to child abuse and exploitation.
These signals can include email addresses, specific hashtags, or keywords commonly associated with the buying and selling of child sexual abuse material or with the grooming of vulnerable young people for exploitation.
“The absence of a consistent procedure for companies to collaborate against predatory actors who attempt to evade detection across services has been a significant issue,” noted Sean Litton, Executive Director of the Tech Coalition, which brings these tech companies together to tackle the problem. He further emphasized that “Lantern fills this gap and shines a light on cross-platform attempts at online child sexual exploitation and abuse, helping to make the internet safer for kids.”
Other key platforms participating in the Tech Coalition alongside Google and Meta include Mega, a privacy-focused platform from New Zealand, as well as Snap and Discord.
According to the Tech Coalition, in a testing phase of the program involving data provided by Mega, Meta removed over 10,000 Facebook pages, profiles, and Instagram accounts connected to child sexual abuse. Meta reported these accounts to the National Center for Missing & Exploited Children in the United States and also shared this information with other platforms, enabling them to conduct their independent investigations.
Antigone Davis, the Global Head of Safety at Meta, emphasized the necessity for the tech industry to work collectively to protect children across the various apps and websites they use. She noted, “Predators don’t limit their attempts to harm children to individual platforms.”
The announcement of the Lantern program coincided with testimony before a Senate committee in Washington, D.C., by a former senior engineer at Meta. In his testimony, he alleged that senior officials, including Mark Zuckerberg, disregarded his warnings about the safety of minors using the company’s services.
Arturo Bejar, who conducted an internal poll of 13–15-year-olds on Instagram, informed lawmakers that 13% of the participants reported experiencing unsolicited sexual advances within the previous seven days on the platform. He criticized Meta for failing to adequately address these issues and expressed concerns about the harm children face on the platform.