Apple Will Scan iPhones For Child Sexual Abuse Images


    Topline

    Apple will roll out a new system in the U.S. that will help identify images of child sexual abuse in photos on users’ iPhones, even before they are uploaded to iCloud, as technology companies face increased federal pressure to crack down on child sexual exploitation and abuse on their platforms.

    Key Facts

    The software, reportedly called “neuralMatch” according to Business Insider, will compare images on a person’s iPhone against U.S. law enforcement’s database of known child sexual abuse images, and if it flags enough matches, a review will start.

    Law enforcement will be alerted if reviewers find evidence that the photos are illegal.

    The system would check photos stored on the iPhone before they are uploaded to iCloud servers, according to a report by Reuters.
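    The mechanism described above amounts to threshold-based hash matching: each photo is reduced to a fingerprint, compared against a database of known fingerprints, and human review is triggered only after enough matches accumulate. The Python sketch below is purely illustrative and makes several assumptions Apple has not confirmed: the hash set, the threshold value and the use of exact SHA-256 digests are all hypothetical, and a real system like the reported neuralMatch would use perceptual hashes designed to survive resizing and re-encoding rather than exact digests.

```python
import hashlib
from pathlib import Path

# Hypothetical set of hex digests standing in for a law-enforcement
# database of known-image hashes. A real system would use perceptual
# hashes, not cryptographic digests of raw bytes.
KNOWN_ABUSE_HASHES = {
    "0" * 64,  # placeholder entry, not a real digest
}

# Hypothetical threshold: how many matches before a human review starts.
REVIEW_THRESHOLD = 30


def hash_photo(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's raw bytes."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def needs_review(photo_dir: Path) -> bool:
    """Count photos whose digest appears in the known-hash set and
    report whether the count crosses the review threshold."""
    matches = sum(
        1
        for photo in photo_dir.glob("*.jpg")
        if hash_photo(photo) in KNOWN_ABUSE_HASHES
    )
    return matches >= REVIEW_THRESHOLD
```

    The threshold is the key design choice in such a scheme: requiring multiple matches before any review begins is meant to reduce the impact of a single false positive from the matching step.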

    The story was first reported by The Financial Times.

    Forbes has reached out to Apple for comment.

    Key Background

    Last year, the U.S. Department of Justice released a set of “voluntary principles” aimed at getting technology and social media companies to do more to counter child sexual exploitation and abuse. The principles call on companies to put thorough systems in place to identify this illegal content, take immediate action against it and report it to authorities. Microsoft created PhotoDNA to help companies identify child sexual abuse images on the internet, and Facebook and Google already have systems in place to review and flag potentially illegal content. Facebook has said it is also working on new tools to reduce the sharing of child sexual abuse images on its platform.

    Contra

    Matthew Green, a security researcher at Johns Hopkins University, said that Apple’s willingness to build systems that scan iPhone users’ phones for “prohibited content” could “break the dam” and lead to the U.S. government “demanding it from everyone.” He also told the Associated Press there are concerns that other governments could pressure Apple to scan for other kinds of information.

    Big Number

    20 million. That is the number of child sexual abuse images Facebook reported to law enforcement in 2020, according to a report by the National Center for Missing and Exploited Children. The figure covers both Facebook and Instagram and is up from 16 million in 2019.

    Further Reading

    Apple Posts $81 Billion In Revenue During Best Second Quarter Ever—Shattering Wall Street Expectations (Forbes)

    New iPhone WiFi Hack Becomes More Dangerous, Affects All iOS 14 iPhones (Forbes)


