Following Apple’s announcement that it will introduce a system to fight child pornography by scanning images on users’ devices, the technology giant has come under fire for jeopardizing privacy and opening up opportunities for governments to monitor their citizens, Radio Free Europe (RFE) reports.

The new controversy highlights the fine line between the need to combat criminal activity and the protection of privacy.

Last week, Apple unveiled software that will scan iPhones and other Apple devices in an effort to prevent the storage of child pornography photos, as well as to prevent minors from sharing sexually explicit photos.

The U.S. company’s aggressive plan to stop child predators and pedophiles and ban them from using Apple’s services for illegal activities has, according to The Washington Post, pitted the technology giant against civil liberties activists and appears to contradict its years of claims about privacy.

Apple’s decision also raises new questions about the nature of smartphones and who the devices really belong to, the newspaper writes, explaining that the new software will scan photos on users’ devices without their knowledge or express permission, potentially endangering innocent users.

The software uses a matching technique to compare photos stored on the iPhone with known child pornography photos. Such a system is already used by companies like Facebook, but according to The Washington Post, in previous systems photos were only scanned after they were uploaded to servers, while in Apple’s system they are scanned on the user’s device, which raises concerns about freedom and privacy to a new level.
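The matching technique described above can be sketched as comparing each photo’s fingerprint against a database of fingerprints of known images. The sketch below is a deliberately simplified illustration: it uses a cryptographic SHA-256 digest, which only matches byte-identical files, whereas real systems such as Apple’s NeuralMatch use perceptual hashes designed to survive resizing and re-encoding. All names and digest values here are hypothetical.

```python
import hashlib

# Hypothetical database of fingerprints of known illegal images, as would be
# supplied by a clearinghouse. The single entry is the SHA-256 digest of the
# placeholder bytes b"foo", used purely for demonstration.
KNOWN_HASHES = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def fingerprint(image_bytes: bytes) -> str:
    """Return a hex digest standing in for an image fingerprint.

    A production system would use a perceptual hash here, not SHA-256,
    so that visually identical but re-encoded copies still match.
    """
    return hashlib.sha256(image_bytes).hexdigest()

def is_known_image(image_bytes: bytes) -> bool:
    """Check a photo's fingerprint against the known-image database."""
    return fingerprint(image_bytes) in KNOWN_HASHES

# An on-device scanner would call is_known_image() for each stored photo
# and flag matches for review.
print(is_known_image(b"foo"))  # matches the placeholder entry -> True
print(is_known_image(b"bar"))  # no matching fingerprint -> False
```

The controversy the article describes is not about this matching step itself, which server-side systems already perform, but about where it runs: moving the comparison onto the user’s own device.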

The Electronic Frontier Foundation, an online privacy group, has expressed concern over Apple’s announcement, saying it could lead to further misuse of private data.

According to the Washington Post, the negative reaction of proponents of freedom and privacy on the Internet to the Apple initiative shows that privacy and security often have a complicated relationship.

Apple’s new initiative will not be limited to photos of child pornography: it will also scan messages sent through Apple’s messaging service to locate texts and photos that are inappropriate for minors.

If a minor receives a photo that has been identified as sexually explicit, the image is blurred and the minor is warned that their parents will be notified if they open it.

The announced changes to Apple’s system have raised concerns that the company is installing surveillance technology that could be exploited by governments, but the decision has also met with the approval of parents and the police, while Apple itself cited cybersecurity experts and child protection groups that praised the company’s approach.

Mixed reactions to the new features of Apple’s system show the fine line technology companies walk between assisting public security and ensuring users’ privacy, the newspaper noted. Police services have complained for years that smartphone encryption makes criminal investigations more difficult, while technology companies and cybersecurity experts have argued that such protection is key to data security and privacy.

In Apple’s new decision, the need to suppress criminal activity clashes with the protection of privacy, the Financial Times wrote in an editorial, emphasizing that the technology giant’s move sets a significant precedent.

By announcing that photos will be scanned on Apple devices, the company has taken a step toward opening a back door into its system, the British newspaper estimates, pointing out that although the collaboration of large technology companies with police and judicial services is essential to fighting crime and maintaining security, it carries many dangers.

Encrypted devices and messages benefit organized crime, terrorists, and child abusers.

On the other hand, the British newspaper pointed out, technology companies and privacy activists have argued, with strong justification, that creating any kind of backdoor into a system opens the possibility that it could be abused by hackers, cybercriminals or unscrupulous governments.

Apple’s “NeuralMatch” is not strictly such a backdoor, in the sense that it does not provide direct access to content, but the precedent is that its technology will now proactively scan photos on iPhones and report matches with known child abuse images to law enforcement.

Privacy activists warn that by allowing such scanning, Apple opens the possibility that it and other companies will come under pressure from governments to do the same for other types of content, such as images of opposition protests, the Financial Times points out.

Privacy has been an important part of iPhone marketing, as Apple has emphasized the security architecture of its systems, which prevents the company from accessing the content of messages or other data stored on its servers, according to CNBC.

In 2016, Apple confronted the Federal Bureau of Investigation (FBI) in a US court to protect the integrity of its encryption systems when it came under pressure during a mass shooting investigation.

However, according to CNBC, with the new announcements Apple has jeopardized its reputation for protecting privacy.

Although Apple says the system is an improvement over existing standards because it uses on-device checks and sophisticated mathematics to learn as little as possible about the photos on someone’s phone or cloud account while finding child pornography, privacy activists see the step as the beginning of a policy change that could allow foreign governments to pressure the company to repurpose the system, for example by asking it to flag photos of protests or political satire.

Skeptics, according to CNBC, are not worried about how the system works today, and they do not defend people who collect images of child exploitation, but they are worried about how the system may evolve in the coming years.
