Microsoft launches tool to identify child sexual predators in online chat rooms

Image: Two boys using a laptop in a dark room (Copyright Cavan Images/Getty Images)
By Olivia Solon with NBC News Tech and Science News

The tool, codenamed Project Artemis, is designed to look for patterns of communication used by predators to target children


Microsoft has developed an automated system to identify when sexual predators are trying to groom children within the chat features of video games and messaging apps, the company announced Wednesday.

The tool, codenamed Project Artemis, is designed to look for patterns of communication used by predators to target children. If these patterns are detected, the system flags the conversation to a content reviewer who can determine whether to contact law enforcement.

Courtney Gregoire, Microsoft's chief digital safety officer, who oversaw the project, said in a blog post that Artemis was a "significant step forward" but "by no means a panacea."

"Child sexual exploitation and abuse online and the detection of online child grooming are weighty problems," she said. "But we are not deterred by the complexity and intricacy of such issues."

Microsoft has been testing Artemis on Xbox Live and the chat feature of Skype. Starting Jan. 10, it will be licensed for free to other companies through the nonprofit Thorn, which builds tools to prevent the sexual exploitation of children.

The tool comes as technology companies are developing artificial intelligence programs to combat a variety of challenges posed by both the scale and the anonymity of the internet. Facebook has worked on AI to stop revenge porn, while Google has used it to find extremism on YouTube.

Games and apps that are popular with minors have become hunting grounds for sexual predators who often pose as children and try to build rapport with young targets. In October, authorities in New Jersey announced the arrest of 19 people on charges of trying to lure children for sex through social media and chat apps following a sting operation.

Microsoft created Artemis in conjunction with the online children's game Roblox, messaging app Kik and the Meet Group, which makes dating and friendship apps including Skout, MeetMe and Lovoo. The collaboration started in November 2018 at a Microsoft hackathon focused on child safety.

Artemis builds on an automated system Microsoft began using in 2015 to identify grooming on Xbox Live. It looks for patterns of keywords and phrases associated with grooming, including sexually explicit exchanges as well as manipulation techniques such as isolating a child from friends and family.

The system analyzes conversations and assigns each an overall score indicating the likelihood that grooming is taking place. If the score is high enough, the conversation is sent to moderators for review. Those employees examine the conversation and decide whether there is an imminent threat that should be referred to law enforcement; if a moderator identifies a request for child sexual exploitation or abuse imagery, the National Center for Missing and Exploited Children is contacted.
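Microsoft has not published Artemis's internals, but the score-then-escalate flow described above can be sketched roughly in Python. Every phrase, weight, threshold and function name below is invented for illustration; the real system's signals and cut-offs are not public.

```python
# Illustrative sketch only: Artemis's actual scoring model is not public.
# All phrases, weights and thresholds here are invented for this example.

REVIEW_THRESHOLD = 0.8  # hypothetical cut-off for escalating to a human moderator

# Hypothetical weighted patterns of the kind the article describes:
# sexual language plus manipulation cues such as isolating the child.
PATTERN_WEIGHTS = {
    "are you alone": 0.3,
    "don't tell your parents": 0.5,
    "our secret": 0.4,
}

def score_conversation(messages: list[str]) -> float:
    """Assign a conversation an overall grooming-likelihood score in [0, 1]."""
    text = " ".join(messages).lower()
    score = sum(w for phrase, w in PATTERN_WEIGHTS.items() if phrase in text)
    return min(score, 1.0)

def route(messages: list[str]) -> str:
    """Send high-scoring conversations to a human reviewer, as the article describes."""
    if score_conversation(messages) >= REVIEW_THRESHOLD:
        return "escalate_to_moderator"
    return "no_action"

# Example: this invented exchange crosses the hypothetical review threshold.
print(route(["hey, are you alone?", "this is our secret, don't tell your parents"]))
```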

The system will also flag cases that may not meet the threshold of an imminent threat or exploitation but violate the company's terms of service. In these cases, a user's account may be deactivated or suspended.

The way Artemis has been developed and licensed is similar to PhotoDNA, a technology developed by Microsoft in partnership with Dartmouth College that helps law enforcement and technology companies find and remove known images of child sexual exploitation. PhotoDNA converts illegal images into a digital signature known as a "hash," which can be used to find copies of the same image when they are uploaded elsewhere. The technology is used by more than 150 companies and organizations, including Google, Facebook, Twitter and Microsoft.
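PhotoDNA itself is proprietary, and its perceptual hash survives resizing and re-encoding. The sketch below illustrates only the general "hash and match against known signatures" workflow, using an ordinary cryptographic hash that catches byte-identical copies only; the names and example data are invented.

```python
import hashlib

# Illustrates the hash-and-match workflow only. PhotoDNA uses a proprietary
# perceptual hash that survives resizing and re-encoding; a cryptographic
# hash like SHA-256 matches byte-identical copies only.

known_signatures: set[str] = set()  # signatures of previously identified images

def signature(image_bytes: bytes) -> str:
    """Reduce an image to a fixed-length digital signature."""
    return hashlib.sha256(image_bytes).hexdigest()

def is_known(image_bytes: bytes) -> bool:
    """Check an uploaded image against the database of known signatures."""
    return signature(image_bytes) in known_signatures

# Example: register one image, then recognize a re-upload of the same bytes.
original = b"\x89PNG...example image bytes..."
known_signatures.add(signature(original))
print(is_known(original))         # True: an exact copy is flagged
print(is_known(original + b"x"))  # False: any byte change defeats this naive hash
```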

For Artemis, developers and engineers from Microsoft and the partners involved fed historical examples of patterns of grooming they had identified on their platforms into a machine learning model to improve its ability to predict potential grooming scenarios, even if the conversation hadn't yet become overtly sexual. It is common for grooming to start on one platform before moving to a different platform or a messaging app.
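The general approach described above, training a classifier on labelled historical conversations, can be sketched with a standard text-classification pipeline. Artemis's actual model, features and training data are undisclosed; the tiny dataset and the scikit-learn pipeline below are assumptions made purely for illustration.

```python
# Sketch of the approach described above: train a model on labelled
# historical conversations. Artemis's real model and data are not public;
# scikit-learn and the toy dataset below are used only for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny invented dataset standing in for the partners' historical examples.
conversations = [
    "what games do you like? me too!",                      # benign
    "gg, want to play another round?",                      # benign
    "how old are you? do your parents check your phone?",   # grooming pattern
    "keep this chat our secret, ok?",                       # grooming pattern
]
labels = [0, 0, 1, 1]  # 1 = grooming pattern, 0 = benign

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(conversations, labels)

# The model outputs a probability that could serve as the kind of overall
# score the article describes, before a conversation turns overtly sexual.
print(model.predict_proba(["this is our secret"])[0][1])
```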

