Most users know that Google routinely scans the contents of emails, including images, to feed its advertising and to identify malware. But many may not be aware that the company is also scanning users’ accounts for illegal activity — namely, matching images in emails against its database of known illegal and pornographic images of children.
That bit of Google policy came to light when a Houston man was arrested on charges of having and promoting child pornography after Google told the National Center for Missing & Exploited Children that he had the images in his Gmail account. The tip-off led to the man’s arrest.
While it is hard to argue with the outcome of this particular case, the news caused alarm among researchers at security firm Sophos, who questioned whether Google was stepping into the role of a pseudo-law enforcement agency.
Chester Wisniewski, a senior security researcher at Sophos, said that Google’s “proactive” decision to tip off law enforcement makes “some of us wonder if they’re crossing the line.”
Many security firms, including Sophos, will at some point come across child pornography among the files they scan for clients, Wisniewski said, and those companies report the images to the police. The difference, he said, is that Sophos and other firms do not actively go looking for such images, whereas Google appears to have done so with the same software it uses for routine scanning for ad keywords and malicious software.
Google declined to comment on this case but pointed to a June 2013 column in a British paper, the Telegraph, that outlined the steps Google and other major tech firms such as Microsoft, Yahoo and Apple take to identify graphic images of children and report users who share those images.
In that column, Google chief legal officer David Drummond said it is up to Google and other firms to ensure that when “people try to share this disgusting content they are reported and prosecuted.”
In many cases, Google has taken a hard line against bowing to government requests for information without due process — but not, Drummond said, in this case.
“Google is in the business of making information widely available,” Drummond wrote in the column. “But there can be no free speech when it comes to images of child sexual abuse.”
Drummond and Google are by no means alone in drawing that line. Laws that expand government power to access data or censor Internet traffic often center on child pornography cases, because those cases are the starkest examples of the line companies must walk in deciding when to share customer data with law enforcement.