Apple announced plans to scan U.S. iPhones for images of child sexual abuse.
Many people support the idea, but some security researchers believe the system could be misused by governments looking to surveil their citizens.
"NeuralMatch," is the tool designed to detect known images of child sexual abuse, which will scan images before they are uploaded it iCloud. If it finds a match, the image will be reviews by an actual person. If it is confirmed to be child pornography, the user's account will be disabled and the National Center for Missing and Exploited Children will be notified.
Apple also plans to scan users' encrypted messages for sexually explicit content as a child safety measure.
What are your thoughts on this?