Google And Microsoft Clamp Down On Child Exploitation Search
Google and Microsoft have reworked their search algorithms to make child abuse content harder to find, though some experts argue the changes will make little difference to those who actively seek out the illegal imagery. Announced at a UK internet safety summit, the new search code has apparently already removed around 100,000 Google results related to the sexual abuse of children, with Bing expected to follow suit. However, there are lingering concerns that those interested in such content will simply continue to find it on so-called darknets.
"They don't go on to Google to search for images" former Child Exploitation and Online Protection Centre chief Jim Gamble told the BBC of such users. "They go on to the dark corners of the internet on peer-to-peer websites."
Rather than changing algorithms, Gamble argues, the money would be better spent on hiring new child protection experts who could actively track down pedophiles.
When queried for related terms, Google's eponymous search engine and Microsoft's Bing will each show links to tools for reporting potentially illegal images, as well as sources of support for those affected. There'll also be clear warnings that viewing such content is illegal.
"We're agreed that child sexual imagery is a case apart, it's illegal everywhere in the world, there's a consensus on that" Google communications director Peter Barron said of the new system. "It's absolutely right that we identify this stuff, we remove it and we report it to the authorities."
While neither company is claiming total victory, Barron suggested that the changes would make it "much, much more difficult to find this content online."
The new algorithms will launch in the UK initially. However, both firms say they'll be rolled out to other countries – covering a total of 159 languages – over the next six months.