Google's email-scanning practices are used to fight evil as well as target ads. The company revealed Monday that it's created a digital database of images displaying child sexual abuse, which it compares to images sent via Gmail.
When an emailed image matches that database, Google said late Monday, it is reported to law enforcement.
The BBC and KHOU 11 in Houston reported that John Henry Skillern had been arrested for possession of child pornography, based on pornographic images that Skillern allegedly emailed. Google forwarded the images to the National Center for Missing and Exploited Children (NCMEC), which contacted the police.
That Google will scan your content to show you advertising has been well known; the company's terms of service already notify users that their emails are being analyzed. (To its credit, Google's updates to its TOS are highlighted in green.)
"Our automated systems analyze your content (including emails) to provide you personally relevant product features, such as customized search results, tailored advertising, and spam and malware detection," Google's TOS reads. "This analysis occurs as the content is sent, received, and when it is stored."
Google uses special filters for illegal imagery
Child pornography, however, receives special treatment. When PCWorld asked how Google handles child pornography, the company responded that it is filtered both in Gmail and in search requests.
"Sadly all Internet companies have to deal with child sexual abuse," Google said in a statement. "It's why Google actively removes illegal imagery from our services--including search and Gmail--and immediately reports abuse to NCMEC. This evidence is regularly used to convict criminals. Each child sexual abuse image is given a unique digital fingerprint which enables our systems to identify those pictures, including in Gmail. It is important to remember that we only use this technology to identify child sexual abuse imagery, not other email content that could be associated with criminal activity (for example using email to plot a burglary)."
Federal law requires businesses that become aware of child sexual abuse imagery hosted on their servers to report it to the NCMEC CyberTipline, which works with a number of law-enforcement agencies. In 2013, 10,498 notices of suspected child abuse images were sent to 169 service providers, NCMEC said. "NCMEC makes all CyberTipline reports available to appropriate law-enforcement agencies for review and possible investigation," NCMEC said in a statement.
Google has typically worked with law enforcement and regulatory agencies, developing its Content ID database to help speed take-down requests for copyrighted material on YouTube, for example. With Content ID, rights holders themselves can notify Google and ask for an uploaded copy of Frozen, say, to be removed.
Google has taken a similar approach to the images of child pornography at issue in the Skillern case. Google compiles a database of images of possible abuse that have been brought to the company's attention, sources close to the company said. Those images are then reviewed by a human employee. If an image is confirmed to show abuse, the company assigns it a unique digital fingerprint, which is entered into the database. Images sent via Gmail are then compared against that database, and matches are presumably reported to the NCMEC CyberTipline, as required by law.
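The workflow described above amounts to a lookup of each attachment's fingerprint against a database of confirmed images. Here is a minimal sketch of that idea in Python. It is purely illustrative: the `fingerprint` function, the placeholder data, and the use of a plain SHA-256 digest are all assumptions for demonstration. Production systems use robust perceptual hashes (such as Microsoft's PhotoDNA) that still match after an image is resized or re-encoded, which an exact cryptographic digest cannot do.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Compute a digest serving as the image's unique fingerprint.

    SHA-256 is a stand-in here; real scanning systems use perceptual
    hashing so that minor edits to an image don't defeat the match.
    """
    return hashlib.sha256(image_bytes).hexdigest()

# Fingerprints of images a human reviewer has already confirmed.
# (Placeholder byte strings, not real data.)
known_fingerprints = {
    fingerprint(b"confirmed-image-1"),
    fingerprint(b"confirmed-image-2"),
}

def matches_known_image(attachment_bytes: bytes) -> bool:
    """Return True if the attachment matches the fingerprint database.

    A True result is the point at which, per the article, a report
    would be filed with the NCMEC CyberTipline.
    """
    return fingerprint(attachment_bytes) in known_fingerprints
```

Note the design consequence the Google statement emphasizes: because only fingerprints of previously confirmed images are stored, the system can flag known imagery without interpreting any other email content.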
Google hasn't said whether it scans photos that are uploaded to Google Drive and then shared via email or other means. But we all know that most cloud services are not inviolable. Google is simply taking action to shut down Gmail as a distribution medium for the darkest corners of the Internet.
This story, "How Google handles child pornography in Gmail, search" was originally published by PCWorld.