Apple To Unveil New Tool To Crack Down On Child Sex Abuse
NEW YORK (CBSNewYork) -- Apple says it plans to crack down on child sex abuse.
The company will now scan U.S. iPhones for photos and videos of concern, CBS2's Kiran Dhillon reported Friday.
Apple says the new tool is called "neuralMatch." It will scan images on a user's iPhone before they're uploaded to iCloud and flag any that match a database of known child sexual abuse images.
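The report describes matching photos against a database of already-identified abuse images. As a rough illustration only, here is a minimal sketch of that kind of fingerprint matching; it is not Apple's actual system, which relies on perceptual hashing ("NeuralHash") and cryptographic protocols not shown here, and every name below is hypothetical.

```python
# Conceptual sketch of flagging images by matching fingerprints against a
# list of known entries. Not Apple's implementation; names are hypothetical.
import hashlib

# Hypothetical set of fingerprints for images already identified by
# child-safety organizations.
KNOWN_ABUSE_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def fingerprint(image_bytes: bytes) -> str:
    """Return a hex digest standing in for an image fingerprint."""
    return hashlib.sha256(image_bytes).hexdigest()

def should_flag(image_bytes: bytes) -> bool:
    """Flag an image only if its fingerprint matches a known entry."""
    return fingerprint(image_bytes) in KNOWN_ABUSE_HASHES

if __name__ == "__main__":
    sample = b"example image data"
    print("flagged" if should_flag(sample) else "no match")
```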
The company also plans to scan encrypted messages for sexually explicit content.
Apple says neither technology will interfere with your security or privacy.
But some tech experts have concerns.
"Everybody is behind Apple's mission here to protect children, but they're worried about that slippery slope concept. What's gonna happen if another government wants to use this technology, or even our own government, in maybe a way that isn't to identify abused children or missing children," said data security expert Matthew Radolec, senior director of incident response at Varonis.
Radolec said the concept is a great idea in theory, but he's worried about hackers and about the technology being misused.
"How could a government use this for mass surveillance? Next it could be national security or terrorism? Or even could this be misused for political gain?" Radolec said.
Meanwhile, New Yorkers have mixed opinions. Many say ending child sex abuse is most important, but others worry their privacy could be compromised.
"It can be used for other purposes and then our privacy gets limited and eroded away more and more like Big Brother," one person said.
"It's the right thing to do because we don't know who has what on their phones. So, it's just a safer thing to have," another said.
"I think if you're storing stuff on Apple's servers, on their database, and if it's criminal stuff, maybe they have an obligation to determine whether they have criminal stuff on their systems or not," another added.
Apple says the change will be implemented later this year as part of updates to its operating system software.
CBS2's Kiran Dhillon contributed to this report.