
To: Hiroshi Lockheimer, SVP of Android at Google

Petition: Don't Let Child Pornography Thrive on Android!

More than 65 million images of child sexual abuse -- sometimes called child pornography -- were found online last year alone. Android is now the most popular mobile operating system that doesn’t find and report these heinous images. Google must catch up and commit to finding and reporting child sexual abuse material (CSAM) on Android -- just like Apple does.

Why is this important?

Apple recently announced an important new step in the fight against child sexual abuse: they’ll scan all new iPhones for CSAM and report it. This change means abuse can be discovered much sooner -- not days or weeks later, when images are uploaded or shared -- and could save child victims and protect their privacy. These scans are done entirely by machines that look exclusively for the “digital fingerprint” of known abusive material and flag matches as potentially illegal -- a solution that balances the welfare and privacy of kids with that of iPhone users.
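To illustrate how this kind of fingerprint matching works in principle, here is a minimal Python sketch. It is only an illustration, not Apple’s system: the fingerprint database, directory path, and function names below are hypothetical, and real systems such as Apple’s NeuralHash or Microsoft’s PhotoDNA use perceptual hashes that also match re-encoded or slightly altered copies, rather than the plain cryptographic hash used here.

import hashlib
from pathlib import Path

# Hypothetical database of fingerprints of known abusive images
# (in reality maintained by organizations such as NCMEC).
# A plain SHA-256 is used here only to keep the sketch self-contained;
# production systems use perceptual hashes instead.
KNOWN_FINGERPRINTS = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def fingerprint(image_path: Path) -> str:
    """Compute a fingerprint of the image file's bytes."""
    return hashlib.sha256(image_path.read_bytes()).hexdigest()

def scan_device_photos(photo_dir: Path) -> list[Path]:
    """Return the photos whose fingerprint matches the known database."""
    flagged = []
    for path in photo_dir.glob("*.jpg"):
        if fingerprint(path) in KNOWN_FINGERPRINTS:
            flagged.append(path)
    return flagged

if __name__ == "__main__":
    # Hypothetical Android camera-roll location, for illustration only.
    matches = scan_device_photos(Path("/sdcard/DCIM/Camera"))
    print(f"{len(matches)} photo(s) flagged for review")

The privacy property the petition points to follows from this design: matching is done only against fingerprints of already-known material, so photos that match nothing in the database are never flagged or inspected.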

Android phones, whose operating system Google builds and maintains, don’t have the same on-device scanning in place. Users must first upload photos to an online service before abusive images can be detected -- allowing millions of images to be shared stealthily and victims to go unidentified for longer.

Google, stop failing kids and start scanning for CSAM on Android devices.


Updates

2021-08-12 12:51:29 -0400 -- 100 signatures reached

2021-08-12 12:17:30 -0400 -- 50 signatures reached

2021-08-12 12:04:46 -0400 -- 25 signatures reached

2021-08-12 12:00:34 -0400 -- 10 signatures reached