Last week, a Houston TV station reported on the arrest of a man on charges of child pornography, based solely on a tip from Google.
Most users know that Google routinely uses software to scan the contents of emails, including images, to target advertising and to identify malware. But many may not have been aware that the company also scans users’ accounts for illegal activity, matching images in emails against a database of known illegal and pornographic images of children.
That bit of Google policy came to light last week, when a Houston man was arrested on charges of having and promoting child pornography after Google told the National Center for Missing and Exploited Children that he had the images in his Gmail account. The tipoff, according to a report from Houston television channel KHOU, led to the man’s arrest.
There’s been a lot of discussion since then on Google’s role in this case. Google is walking a fine line between respecting privacy and using its role as a guardian of your email to ensure that you are not trading in child pornography.
A recent Supreme Court case in Canada centered on a man who was found to be downloading child pornography and addressed whether Internet service providers should hand over identifying Internet Protocol (IP) addresses to law enforcement without a warrant. The court ruled against providing the information, drawing praise from privacy advocates and criticism from law enforcement agencies, who called the decision a setback because of the additional time it will take them to obtain warrants.
The Texas case is triggering a similar debate in the United States over what role these companies, with whom we share our most private thoughts, should play in law enforcement.
Should Google, Microsoft, Apple and others be reading your email?
UPDATE – From Phys.org:
“Each child sexual abuse image is given a unique digital fingerprint which enables our systems to identify those pictures, including in Gmail,” added the spokesperson, who did not disclose technical details about the process. “It is important to remember that we only use this technology to identify child sexual abuse imagery—not other email content that could be associated with criminal activity (for example using email to plot a burglary).”
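Google has not disclosed how its fingerprinting works; industry systems of this kind typically rely on robust perceptual hashes (such as Microsoft's PhotoDNA) that survive resizing and recompression. As an illustration only, and not a description of Google's actual method, the basic idea of matching a fingerprint against a database of known hashes can be sketched with a simple cryptographic hash, where every name below is hypothetical:

```python
import hashlib

# Hypothetical database of known-image fingerprints (hex digests).
# In a real system this would be a large, curated hash list maintained
# by organizations such as NCMEC, and the hashes would be perceptual,
# not cryptographic.
KNOWN_HASHES = {
    # SHA-256 of the empty byte string, used here as a stand-in entry.
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def fingerprint(data: bytes) -> str:
    """Return a SHA-256 hex digest as a simple exact-match fingerprint."""
    return hashlib.sha256(data).hexdigest()

def matches_known(data: bytes) -> bool:
    """Check an attachment's fingerprint against the known-hash database."""
    return fingerprint(data) in KNOWN_HASHES

print(matches_known(b""))       # True  (matches the stand-in entry)
print(matches_known(b"hello"))  # False
```

Note that an exact cryptographic hash, as used in this sketch, would miss any image that has been altered by even one pixel; that is precisely why production systems use perceptual hashing, which maps visually similar images to similar fingerprints.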