
Apple criticised for system that detects child abuse

Apple is facing criticism over a new system that finds child sexual abuse material (CSAM) on US users’ devices.

The technology will search for matches of known CSAM before an image is stored in iCloud Photos.
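
For illustration only, the sketch below shows the general shape of that kind of pre-upload check in Swift: an image's hash is compared against a local set of known hashes before the image would be queued for iCloud. Apple's actual system relies on a perceptual "NeuralHash" and a blinded on-device matching protocol rather than a plain cryptographic hash lookup, and every name here (loadKnownHashes, shouldFlagBeforeUpload) is hypothetical.

import Foundation
import CryptoKit

/// Stand-in for the database of hashes of known material; in a real
/// deployment this would ship on the device in an encrypted, blinded form.
func loadKnownHashes() -> Set<String> {
    return []
}

let knownHashes = loadKnownHashes()

/// Hashes an image's bytes and checks them against the known-hash set
/// before the image would be uploaded to iCloud Photos.
func shouldFlagBeforeUpload(_ imageData: Data) -> Bool {
    let digest = SHA256.hash(data: imageData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownHashes.contains(hex)
}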

But there are concerns that the technology could be expanded and used by authoritarian governments to spy on their own citizens.

WhatsApp head Will Cathcart called Apple’s move “very concerning”.

Apple said that new versions of iOS and iPadOS – due to be released later this year – will have “new applications of cryptography to help limit the spread of CSAM online while designing for user privacy”.

When the system finds a match, it will be reviewed by a human. Apple can then disable the user’s account and report the case to law enforcement.

The company says that the new technology offers “significant” privacy benefits over existing techniques, as Apple only learns about users’ photos if they have a collection of known CSAM in their iCloud account.
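
The “collection” wording describes a threshold: a single match is not enough on its own to trigger a report. A minimal sketch of that idea follows, with an assumed type name and threshold value, and without the cryptographic threshold scheme the real system is reported to use.

/// Counts on-device matches and only escalates once a preset threshold
/// is crossed. Illustrative only; the value 30 is an assumption.
struct MatchTracker {
    let reportThreshold = 30
    private(set) var matchCount = 0

    /// Record one match against the known-hash database.
    mutating func recordMatch() {
        matchCount += 1
    }

    /// Only once the threshold is crossed would the account be escalated
    /// for the human review described above.
    var shouldEscalateForReview: Bool {
        matchCount >= reportThreshold
    }
}

// Usage: matches accumulate silently; escalation happens only at the threshold.
var tracker = MatchTracker()
for _ in 0..<5 { tracker.recordMatch() }
print(tracker.shouldEscalateForReview)   // false: still below the threshold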

But WhatsApp’s Mr Cathcart says the system “could very easily be used to scan private content for anything they or a government decides it wants to control. Countries where iPhones are sold will have different definitions on what is acceptable”.

In Other News – Alibaba to fire manager accused of rape

Chinese technology giant Alibaba will sack a manager accused of rape, according to a memo seen by the BBC.

In the letter sent to employees of the firm, chief executive Daniel Zhang said two other bosses who failed to act on the allegation have resigned.

Alibaba is working with police after a female worker said her male boss raped her in a hotel room while she was unconscious after a “drunken night”.

Source – BBC