Most sex crimes against children are committed by people the children know rather than by strangers.

Even companies with state-of-the-art defenses spend far more time trying to stop online bullying and attempts to sneak profanity past automatic word filters than they do fending off sex predators.

Still, as the Skout case showed, several recent trends have heightened the concerns of child-safety experts: the rise of smartphones, which are harder for parents to monitor; location-oriented services, the darling of Net companies seeking more ad revenue from local businesses; and the rapid proliferation of phone and tablet apps, which don't always make clear what data they are using and distributing.
As a result, many threats are eliminated without human intervention, and moderators at the company are notified afterward.
Sites that operate with such software should still have one professional on safety patrol for every 2,000 users online at the same time, according to Metaverse Mod Squad, a Sacramento-based moderation service.
Facebook's software likewise depends on relationship analysis and archives of real chats that preceded sex assaults, Chief Security Officer Joe Sullivan told Reuters in the company's most expansive comments on the subject to date.
Like most of its peers, Facebook generally avoids discussing its safety practices: to discourage scare stories, because it doesn't catch many wrongdoers, and to sidestep privacy concerns.
Metaverse Chief Executive Amy Pritchard said that in five years her staff had intercepted something 'terrifying' only once, about a month ago, when a man on a discussion board for a major media company asked for the email address of a young site user.
Software recognised that the same person had been making similar requests of others and flagged the account for Metaverse moderators.
Users could be unnerved by the extent to which their conversations are reviewed, at least by computer programs.

'We've never wanted to set up an environment where we have employees looking at private communications, so it's really important that we use technology that has a very low false-positive rate,' Sullivan said.
In addition, Facebook doesn't probe deeply into what it thinks are pre-existing relationships.
From a business perspective, however, there are powerful reasons not to be so restrictive, starting with teen expectations of more freedom of expression as they age.
If they don't find it on one site, they will somewhere else.
But instead of looking at just one set of messages, it examines whether a user has asked for contact information from dozens of people or tried to develop multiple deeper and potentially sexual relationships, a process known as grooming.
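The detection logic described above, flagging an account not for a single request but for the same request repeated across many distinct conversations, can be sketched roughly as follows. This is a minimal illustration, not any company's actual system: the `CONTACT_REQUEST` pattern, the `flag_suspicious_senders` function, and the threshold of five recipients are all hypothetical stand-ins for what would in practice be trained classifiers and tuned risk scores.

```python
import re
from collections import defaultdict

# Hypothetical pattern for messages that ask for contact details.
# A production system would use trained classifiers and archives of
# real chat data, not a single keyword regex.
CONTACT_REQUEST = re.compile(
    r"\b(email|e-mail|phone|number|address)\b", re.IGNORECASE
)

def flag_suspicious_senders(messages, threshold=5):
    """Flag senders who ask many *different* users for contact info.

    `messages` is an iterable of (sender, recipient, text) tuples.
    One request inside one conversation is ignored; the signal is the
    same request repeated across many distinct recipients.
    """
    targets = defaultdict(set)  # sender -> set of recipients asked
    for sender, recipient, text in messages:
        if CONTACT_REQUEST.search(text):
            targets[sender].add(recipient)
    # Flag only accounts whose requests fan out widely.
    return {s for s, seen in targets.items() if len(seen) >= threshold}
```

A flagged account would then be routed to human moderators, matching the pattern in the story where software surfaces the case and people make the final call, which also keeps the false-positive rate low for ordinary one-on-one conversations.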