
Apple sued over abandoning CSAM detection in iCloud


Apple is being sued over its decision not to implement a system that would have scanned iCloud photos for child sexual abuse material (CSAM).

The lawsuit argues that by not doing more to prevent the spread of this material, Apple is forcing victims to relive their trauma, according to The New York Times. The suit says Apple announced “best designs intended to protect children” but then failed to “implement those designs or take steps to identify and mitigate” the material.

Apple first announced the system in 2021, saying it would use digital signatures from the National Center for Missing and Exploited Children and other groups to detect known CSAM content in users’ iCloud libraries. However, it appeared to abandon those plans after security and privacy advocates warned that such a system could create a backdoor for government surveillance.

The lawsuit was reportedly filed by a 27-year-old woman who is suing Apple under a pseudonym. She said a relative molested her when she was an infant and shared images of her online, and that she still receives law enforcement notices nearly every day about someone being charged with possessing those images.

Attorney James Marsh, who is involved in the lawsuit, said there is a potential group of 2,680 victims who could be entitled to compensation in the case.

TechCrunch has reached out to Apple for comment. A company spokesperson told The Times that Apple is “urgently and actively innovating to combat these crimes without compromising the security and privacy of all our users.”

In August, a 9-year-old girl and her guardian sued Apple, accusing the company of failing to address CSAM on iCloud.
