Apple explains how its photo analysis system works to detect child abuse; it will also scan messages on iDevices

08/10/2022, by myhoneybakedfeedback

In the last 24 hours Apple has made an important announcement about a new photo scanning system debuting in iOS 15. Using machine learning, the system will scan pictures on iPhones and in iCloud to detect images of abused children. Given the wave of concern that appeared online about the violation of users' privacy and the possible expansion of the image database by authoritarian states, Apple has taken a position and explained how the technology works.

First of all, the odds of a false positive are said to be 1 in 1 trillion, meaning it is extremely unlikely that someone who has no pictures of abused children on their phone or in iCloud would be reported to the authorities. The images will be scanned on the device and compared against a database of material provided by the National Center for Missing & Exploited Children (NCMEC) and other child protection organizations.
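To make the matching step concrete, here is a deliberately simplified Swift sketch of what an on-device lookup against a database of known hashes could look like. The hash values and the `isKnownImage` function are purely illustrative; Apple's actual protocol performs the comparison through private set intersection, so the device never learns the result of any individual lookup.

```swift
// Deliberately simplified, non-private sketch of on-device matching: the hash of
// each photo is looked up in a set of known hashes (in reality supplied by NCMEC
// and shipped as part of the OS). Apple's actual protocol uses private set
// intersection, so the device never learns the outcome of a single comparison.
let knownHashes: Set<UInt64> = [0x1A2B_3C4D_5E6F_7081, 0xDEAD_BEEF_CAFE_F00D]  // placeholder values

func isKnownImage(_ imageHash: UInt64) -> Bool {
    knownHashes.contains(imageHash)
}

print(isKnownImage(0x1234))  // false: this hash is not in the database
```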

The company has promised that the whole system is designed with users' confidentiality in mind. Apple will not learn any information about images that do not match those reported in the database. Furthermore, the Cupertino company will not have access to the metadata or any visual derivative of the matched images until a certain matching threshold is reached, at which point the authorities can be notified.
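The "nothing visible below the threshold" property Apple describes corresponds to threshold secret sharing: the key that unlocks the voucher contents is split so it can only be reconstructed once enough matching vouchers exist. Below is a minimal Shamir secret sharing sketch in Swift over a small prime field, purely for illustration; the field size, function names, and parameters are our assumptions, not Apple's implementation.

```swift
// Minimal Shamir t-of-n secret sharing over a small prime field.
// Illustrative only: the field (p = 65521) and API are assumptions.
let p = 65521  // largest prime below 2^16; keeps Int arithmetic overflow-free

func mod(_ x: Int) -> Int { ((x % p) + p) % p }

// Evaluate a polynomial (coefficients in ascending order) at x via Horner's rule.
func eval(_ coeffs: [Int], at x: Int) -> Int {
    coeffs.reversed().reduce(0) { mod($0 * x + $1) }
}

// Split `secret` into n shares; any t reconstruct it, fewer reveal nothing.
func makeShares(secret: Int, threshold t: Int, count n: Int) -> [(x: Int, y: Int)] {
    let coeffs = [secret] + (1..<t).map { _ in Int.random(in: 1..<p) }
    return (1...n).map { x in (x: x, y: eval(coeffs, at: x)) }
}

// Modular inverse via Fermat's little theorem: a^(p-2) mod p.
func inv(_ a: Int) -> Int {
    var result = 1, base = mod(a), e = p - 2
    while e > 0 {
        if e & 1 == 1 { result = mod(result * base) }
        base = mod(base * base)
        e >>= 1
    }
    return result
}

// Lagrange interpolation at x = 0 recovers the secret from at least t shares.
func reconstruct(_ shares: [(x: Int, y: Int)]) -> Int {
    var secret = 0
    for (i, si) in shares.enumerated() {
        var num = 1, den = 1
        for (j, sj) in shares.enumerated() where j != i {
            num = mod(num * (p - sj.x))     // (0 - x_j)
            den = mod(den * (si.x - sj.x))  // (x_i - x_j)
        }
        secret = mod(secret + si.y * mod(num * inv(den)))
    }
    return secret
}

let shares = makeShares(secret: 12345, threshold: 3, count: 5)
print(reconstruct(Array(shares.prefix(3))))  // 12345: threshold met
```

With fewer than the threshold number of shares, the polynomial is underdetermined and every candidate secret remains equally likely, which is the mathematical sense in which nothing is revealed below the threshold.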


The hashing technology used here is NeuralHash, which analyzes an image and converts it into a unique number specific to that image. Only a nearly identical image can produce the same number, which does not change with the size or quality of the picture. Before an iPhone or other iDevice uploads an image to iCloud, a cryptographic safety voucher is created that also encodes the result of the matching.
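NeuralHash itself is a neural-network-based perceptual hash whose internals Apple has not fully published. As a rough stand-in, the Swift sketch below shows a classic "average hash": the image is reduced to an 8x8 grid of brightness values and each bit records whether a pixel is brighter than the mean, so nearly identical images produce hashes that differ in only a few bits. The 8x8 input grid and the function names are our assumptions for illustration.

```swift
// Illustrative stand-in for a perceptual hash (NOT NeuralHash, whose network
// and weights are Apple's): a classic "average hash" over an 8x8 grid of
// grayscale brightness values (0.0...1.0), assumed to come from downscaling.
func averageHash(_ pixels: [[Double]]) -> UInt64 {
    let flat = pixels.flatMap { $0 }
    precondition(flat.count == 64, "expects an 8x8 brightness grid")
    let mean = flat.reduce(0, +) / 64.0
    var hash: UInt64 = 0
    for (i, value) in flat.enumerated() where value > mean {
        hash |= 1 << UInt64(i)  // bit i set when pixel i is brighter than average
    }
    return hash
}

// Nearly identical images yield hashes a small Hamming distance apart:
// recompression or resizing flips few bits, unrelated images flip many.
func hammingDistance(_ a: UInt64, _ b: UInt64) -> Int {
    (a ^ b).nonzeroBitCount
}
```

Against a database like the one in the earlier sketch, matching can then be an exact lookup or a small-Hamming-distance search.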

The pictures are uploaded to iCloud together with their vouchers, and only if a certain threshold set by Apple's system is passed can the content of the pictures be interpreted. Users can appeal to have their account reinstated if they feel it was unfairly blocked. All of these analyses are done with the pictures "blind", using automated systems; a human only steps in late in the process, and only if the system has raised an alarm.
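In workflow terms, the gate before any human sees anything can be pictured as a simple per-account counter, sketched below. `SafetyVoucher`, `matchThreshold`, and `needsHumanReview` are hypothetical names, and the threshold value is invented; Apple did not publish the real number at announcement time.

```swift
// Hypothetical workflow sketch: safety vouchers accumulate per account, and the
// matched content is only surfaced for human review once enough of them encode
// a positive match. All names and the threshold value are invented.
struct SafetyVoucher {
    let photoID: String
    let matched: Bool  // in the real system this stays encrypted below threshold
}

let matchThreshold = 10  // invented value, purely for illustration

func needsHumanReview(_ vouchers: [SafetyVoucher]) -> Bool {
    vouchers.filter { $0.matched }.count >= matchThreshold
}
```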

Security experts and privacy-focused groups have criticized these plans, saying we could wake up to a government surveillance tool built on top of Apple's solution. The Cupertino company will also include features in the Messages application that can warn children and parents when minors receive explicit photos. When a message is flagged in Messages, the picture will appear blurred, and both the child and the parents will receive a warning.
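The described Messages behavior amounts to a simple flag-then-blur flow, sketched below with invented types; Apple has not published the actual API, so everything here is illustrative.

```swift
// Invented types sketching the described flow: a flagged incoming image is
// blurred and a warning is shown to the child (and, for young children, parents).
struct IncomingImage {
    let sender: String
    let flaggedAsExplicit: Bool  // set by an on-device classifier in the real system
}

func display(_ image: IncomingImage) -> String {
    if image.flaggedAsExplicit {
        return "blurred, with a warning shown to the child and parents"
    }
    return "shown normally"
}

print(display(IncomingImage(sender: "unknown", flaggedAsExplicit: true)))
```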

Siri and Search will also be updated with information for parents and children in these situations. Here privacy organizations warn about another violation of privacy: the scanning of private messages, which is practically a backdoor. All of the above practices will be implemented in the US in a first phase.