FAQ on Apple scanning your iCloud photos
Apple has put out an FAQ addressing some of the most common concerns about their new policy of scanning photos uploaded to their iCloud service for known child sexual abuse material.
Below are several of the top answers about their new system to spot and report abusive content.
Q: Is Apple going to scan all the photos on my iPhone?
A: No, not unless your iPhone or other Apple device is set to automatically back up your photos to iCloud. Apple says if you have iCloud disabled, their new tool that scans for child porn does not apply to you.
Q: Does this mean Apple sees all my photos in iCloud?
A: Again, no. The system compares your images against a preset list of digital fingerprints (hashes) of known child porn and exploitation images provided by the National Center for Missing and Exploited Children. Apple says they are only notified when an iCloud image matches one of those known images.
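For readers curious how this kind of matching works in principle, here is a minimal sketch in Python. It is not Apple's actual implementation: Apple uses a perceptual hashing system it calls NeuralHash along with cryptographic techniques so that non-matching photos are never visible to the company. The example below uses a plain cryptographic hash and a made-up fingerprint list purely to illustrate the basic idea of comparing uploads against a fixed list of known images rather than inspecting what is in a photo.

```python
import hashlib
from pathlib import Path

# Hypothetical set of fingerprints (hashes) of known illegal images.
# In Apple's real system these come from the National Center for Missing
# and Exploited Children and are perceptual hashes, not SHA-256 digests.
KNOWN_IMAGE_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def fingerprint(photo_path: Path) -> str:
    """Return a hex digest identifying the photo's exact bytes."""
    return hashlib.sha256(photo_path.read_bytes()).hexdigest()

def should_flag(photo_path: Path) -> bool:
    """Flag a photo only if its fingerprint appears in the preset list.

    The photo's content is never interpreted or viewed; the only question
    asked is whether its fingerprint matches a known image.
    """
    return fingerprint(photo_path) in KNOWN_IMAGE_HASHES

if __name__ == "__main__":
    # Hypothetical folder standing in for photos queued for iCloud upload.
    for path in Path("uploads").glob("*.jpg"):
        if should_flag(path):
            print(f"{path} matches a known image and would go to human review")
```

The key point the sketch illustrates is that a photo of your family, your pets, or anything else not already on the known-image list produces a fingerprint that matches nothing and is simply ignored.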
Q: What do they do if they find a match?
A: After a human review, Apple says they will pass a report to the National Center for Missing and Exploited Children, since having these images is considered a crime in the U.S.
Q: Can the system be used to detect other types of images?
A: Apple claims they won’t comply if the government demands other types of images be added to the system. They claim the government has tried to force them to scan for other types of content but that they’ve always refused.
Q: What if an Apple worker injects a different image into the system to flag people for things other than child porn?
A: Apple says it’s not possible, and that even if someone did, all reports are hand-reviewed before being sent along to law enforcement, so they are able to spot and intercept any malicious effort to abuse the system.
Q: Can Apple's new system cause an innocent person to be in trouble with the law?
A: Apple says the chances are one in 1 trillion per year of someone being falsely flagged by their new process.