
Apple update will check iPhones for images of child sexual abuse

  • Published at 10:13 am August 6th, 2021
New technology will allow iPhones' operating systems to match abusive photos against a database of known child sexual abuse images | AFP

The feature is headed to the latest Macintosh computer operating system, as well as iOS

Apple said Thursday that iPhones and iPads will soon start detecting images containing child sexual abuse and reporting them as they are uploaded to its online storage in the United States, a move privacy advocates say raises concerns.

"We want to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of child sexual abuse material (CSAM)," Apple said in an online post.

New technology will allow software powering Apple mobile devices to match abusive photos on a user's phone against a database of known CSAM images provided by child safety organizations, then flag the images as they are uploaded to Apple's online iCloud storage, according to the company.
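At its core, the scheme is a lookup of image fingerprints against a blocklist. Apple's technical summary describes a perceptual hash it calls NeuralHash, which tolerates resizing and re-encoding; the minimal Python sketch below substitutes an ordinary SHA-256 digest for simplicity (so it would only catch byte-identical files), and its database contents are placeholders, not real data.

import hashlib

def image_hash(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash such as NeuralHash: SHA-256 only
    # matches byte-identical files, unlike a real perceptual hash.
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical database of hashes of known abusive images, as supplied
# by child safety organizations (placeholder bytes for illustration).
known_hashes = {image_hash(b"placeholder-known-image")}

def flag_on_upload(photo_bytes: bytes) -> bool:
    # Flag a photo at upload time if its hash appears in the database.
    return image_hash(photo_bytes) in known_hashes

print(flag_on_upload(b"placeholder-known-image"))  # True: flagged
print(flag_on_upload(b"ordinary-holiday-photo"))   # False: ignored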

However, several digital rights organizations say the tweaks to Apple's operating systems create a potential "backdoor" into gadgets that could be exploited by governments or other groups.

Apple countered that it will not have direct access to the images and stressed the steps it has taken to protect privacy and security.


The Silicon Valley-based tech giant said the matching of photos would be "powered by a cryptographic technology" to determine "if there is a match without revealing the result," unless the image was found to contain depictions of child sexual abuse.
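That phrase describes a private set intersection: a protocol that lets two parties learn which items they have in common without exposing anything else. Apple said its construction pairs private set intersection with threshold secret sharing; the toy Diffie-Hellman-style exchange below, in Python, illustrates only the generic blinding idea, with parameters chosen for readability rather than security.

import hashlib
import secrets

P = 2**255 - 19  # a large prime modulus (illustrative choice)

def hash_to_group(item: str) -> int:
    # Map an item to a group element by hashing, then squaring mod P.
    h = int.from_bytes(hashlib.sha256(item.encode()).digest(), "big") % P
    return pow(h, 2, P)

# Each side keeps a random secret exponent it never shares.
client_secret = secrets.randbelow(P - 3) + 2
server_secret = secrets.randbelow(P - 3) + 2

client_items = {"photo-fingerprint-1", "photo-fingerprint-2"}
server_items = {"photo-fingerprint-2", "photo-fingerprint-3"}  # blocklist

# The client blinds its items with its secret and sends them over.
client_blinded = {pow(hash_to_group(x), client_secret, P) for x in client_items}

# The server re-blinds those values, and blinds its own list once.
double_blinded = {pow(v, server_secret, P) for v in client_blinded}
server_blinded = {pow(hash_to_group(y), server_secret, P) for y in server_items}

# The client raises the server's values to its secret; shared items end
# up identical under both blindings, yet neither side ever saw the
# other's non-matching items in the clear. Because Python sets are
# unordered, this toy reveals only how many items match, not which.
client_double = {pow(v, client_secret, P) for v in server_blinded}
print("items in common:", len(double_blinded & client_double))  # 1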

Apple will report such images to the National Center for Missing and Exploited Children, which works with police, according to a statement by the company.

India McKinney and Erica Portnoy of the digital rights group Electronic Frontier Foundation said in a post that "Apple's compromise on end-to-end encryption may appease government agencies in the United States and abroad, but it is a shocking about-face for users who have relied on the company's leadership in privacy and security."

Minding messages

The new image-monitoring feature is part of a series of tools heading to Apple mobile devices, according to the company.

Apple's texting app, Messages, will use machine learning to recognize sexually explicit photos and to warn children and their parents when such images are received or sent, the company said in the statement.

"When receiving this type of content, the photo will be blurred and the child will be warned," Apple said.

"As an additional precaution, the child can also be told that, to make sure they are safe, their parents will get a message if they do view it."

Similar precautions are triggered if a child tries to send a sexually explicit photo, according to Apple.

Messages will use on-device machine learning to analyze images attached to messages and determine whether they are sexually explicit, according to Apple.
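A minimal sketch of that on-device flow, in Python, assuming a hypothetical classifier (Apple has not published its model) and an assumed decision threshold; every name below is illustrative.

from dataclasses import dataclass

@dataclass
class IncomingPhoto:
    data: bytes
    sender: str

def classify_explicit(photo: IncomingPhoto) -> float:
    # Hypothetical on-device model returning the probability that the
    # image is sexually explicit; placeholder logic stands in for real
    # ML inference, which would run entirely on the device.
    return 0.0

THRESHOLD = 0.9  # assumed cutoff, not a figure published by Apple

def handle_incoming(photo: IncomingPhoto, is_child_account: bool) -> dict:
    # Mirror the behavior the article describes: blur and warn for a
    # child account; parents are notified only if the child views it.
    flagged = is_child_account and classify_explicit(photo) >= THRESHOLD
    return {
        "blurred": flagged,                # photo is shown blurred
        "warn_child": flagged,             # child is warned first
        "notify_parents_on_view": flagged, # parents told only on view
    }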

The feature is headed to the latest Macintosh computer operating system, as well as iOS.

Personal assistant Siri, meanwhile, will be taught to "intervene" when users try to search for topics related to child sexual abuse, according to Apple.

Greg Nojeim of the Center for Democracy and Technology in Washington, DC said that "Apple is replacing its industry-standard end-to-end encrypted messaging system with an infrastructure for surveillance and censorship."

This, he said, would make users "vulnerable to abuse and scope-creep not only in the United States, but around the world."

"Apple should abandon these changes and restore its users' faith in the security and integrity of their data on Apple devices and services."

Apple has built its reputation on defending privacy on its devices and services despite pressure from politicians and police to gain access to people's data in the name of fighting crime or terrorism.

"Child exploitation is a serious problem and Apple isn't the first tech company to bend its privacy-protective stance in an attempt to combat it," McKinney and Portnoy of the EFF said.

"At the end of the day, even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor," they added.
