Google isn't the only company turning evidence of child pornography over to the authorities, with Microsoft tipping off police in Pennsylvania about a OneDrive user's illegal material in the cloud. Two images were spotted by Microsoft's automated content-tracking tool, PhotoDNA, which maintains a specific watch for offensive photos of children, leading to the man's arrest at the end of July.
According to the BBC, the man admitted to trading such images on Kik Messenger. Since it's an open investigation, full details have not been released, but Microsoft's system supposedly caught the content when it was saved to OneDrive, and the man in question then tried to email the photos from a Live.com address.
Google made headlines earlier this month when it revealed it had spotted offensive images in a user's Gmail and reported the illegal content to the authorities. The actions led some to question what degree of privacy users could expect, though Google's terms and conditions are clear that content monitoring goes on.
Similarly, Microsoft says it's clear in its own policies that it uses "automated technologies to detect child pornography or abusive behavior that might harm the system, our customers, or others."
That technology is called PhotoDNA, and it's actually used by both firms. The system converts an image to black & white, splits it into a grid, and establishes a "digital fingerprint" from the tone changes in each section. The process is low-intensity enough to be done quickly, which means OneDrive can run it on all images saved in the cloud.
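PhotoDNA's actual algorithm is proprietary, but the grayscale-grid-fingerprint idea described above can be sketched in a few lines. The following toy Python example (all names here are illustrative, not Microsoft's) splits a grayscale image into a grid, averages the tone of each cell, and turns the direction of tone change between consecutive cells into a bit string; two fingerprints can then be compared by Hamming distance:

```python
# Illustrative sketch only: PhotoDNA's real hashing is proprietary and far
# more robust. This toy version follows the article's outline: grayscale
# image -> grid -> per-cell tone summary -> fingerprint of tone changes.

def grid_fingerprint(pixels, grid=4):
    """pixels: 2D list of grayscale values (0-255).
    Returns a tuple of bits: 1 where a cell's average tone is brighter
    than the previous cell's, else 0 (a crude tone-change signature)."""
    h, w = len(pixels), len(pixels[0])
    cell_h, cell_w = h // grid, w // grid
    averages = []
    for gy in range(grid):
        for gx in range(grid):
            total = 0
            for y in range(gy * cell_h, (gy + 1) * cell_h):
                for x in range(gx * cell_w, (gx + 1) * cell_w):
                    total += pixels[y][x]
            averages.append(total / (cell_h * cell_w))
    # Encode only the direction of tone change between consecutive cells,
    # so uniform brightness shifts don't alter the fingerprint.
    return tuple(1 if b > a else 0 for a, b in zip(averages, averages[1:]))

def hamming(fp_a, fp_b):
    """Count differing bits; a small distance suggests similar images."""
    return sum(a != b for a, b in zip(fp_a, fp_b))

# A uniformly brightened copy of an image produces the same fingerprint,
# since only relative tone changes are recorded.
img = [[x * 10 + y for x in range(8)] for y in range(8)]
brighter = [[min(255, p + 5) for p in row] for row in img]
print(hamming(grid_fingerprint(img), grid_fingerprint(brighter)))  # 0
```

Because only relative tone changes survive into the fingerprint, simple edits like brightening or mild re-compression tend not to change it, which is what lets a service match uploads against a database of known hashes rather than the images themselves.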
If a suspect shot is identified, the companies report it to the National Center for Missing and Exploited Children, which then liaises with law enforcement.
Google is said to have its own homegrown tools in addition to PhotoDNA, and to use them simultaneously. The company previously gave a vague description of the fingerprinting tools it used to identify the offender, light on technical detail.