Apple sued for failing to curtail child sexual abuse material on iCloud

The abuse began when she was still an infant. A relative molested her, took photographs and swapped the images with others online. He allowed another man to spend time with her, multiplying the abuse.

Nearly every day, the woman, now 27 and living in the Northeast, is reminded of that abuse with a law enforcement notice that someone has been charged with possessing those images. One of those notifications, which she received in late 2021, said the images had been found on a man’s MacBook in Vermont. Her lawyer later confirmed with law enforcement that the images had also been stored in Apple’s iCloud.

The notice arrived months after Apple had unveiled a tool that allowed it to scan for illegal images of sexual abuse. But it quickly abandoned that tool after facing criticism from cybersecurity experts, who said it could pave the way for other government surveillance requests.

Now, the woman, using a pseudonym, is suing Apple because she says it broke its promise to protect victims like her. Instead of using the tools that it had created to identify, remove and report images of her abuse, the lawsuit says, Apple allowed that material to proliferate, forcing victims of child sexual abuse to relive the trauma that has shaped their lives.

The lawsuit was filed late Saturday in U.S. District Court in Northern California. It says Apple’s failures mean it has been selling defective products that harmed a class of customers, namely child sexual abuse victims, because it briefly introduced “a widely touted improved design aimed at protecting children” but “then failed to implement those designs or take any measures to detect and limit” child sexual abuse material.


The suit seeks to change Apple’s practices and compensate a potential group of 2,680 victims who are eligible to be part of the case, said James Marsh, one of the attorneys involved. Under the law, victims of child sexual abuse are entitled to a minimum of $150,000 in damages each, or roughly $400 million across 2,680 claimants; with the typical tripling of damages being sought, the total award could exceed $1.2 billion should a jury find Apple liable.

The lawsuit is the second of its kind against Apple, but its scope and potential financial impact could force the company into a yearslong litigation process over an issue it has sought to put behind it. And it points to increasing concern that the privacy of Apple’s iCloud allows illegal material to be circulated without being as easily spotted as it would be on social media services such as Facebook. For years, Apple has reported less abusive material than its peers, capturing and reporting a small fraction of what is caught by Google and Facebook. It has defended its practice by saying it is protecting user privacy, but child safety groups have criticized it for not doing more to stop the spread of that material.

The case is the latest example of an emerging legal strategy against tech companies. For decades, Section 230 of the Communications Decency Act has shielded companies from legal liability for what users post on their platforms. But recent rulings by the U.S. Court of Appeals for the 9th Circuit have determined that those shields apply only to content moderation and do not provide blanket protection from liability.

The rulings have raised hope among plaintiffs’ attorneys that tech companies could be challenged in court. In August, a 9-year-old girl sued Apple in North Carolina after strangers sent her child sexual abuse videos through iCloud links and encouraged her to film and upload her own nude videos.

Apple filed a motion to dismiss the North Carolina case, saying Section 230 protects it from liability for material posted on iCloud by someone else. It also argued that iCloud couldn’t be subject to a product liability claim because it wasn’t a product, like a defective tire.

In a statement in response to the new suit, Fred Sainz, an Apple spokesperson, said: “Child sexual abuse material is abhorrent and we are committed to fighting the ways predators put children at risk. We are urgently and actively innovating to combat these crimes without compromising the security and privacy of all our users.”

Sainz pointed to safety tools the company has introduced to curtail the spread of newly created illegal images, including features in its Messages app that warn children of nude content and allow people to report harmful material to Apple.

Riana Pfefferkorn, a lawyer and policy fellow at the Stanford Institute for Human-Centered Artificial Intelligence, said there are significant hurdles to any lawsuit over Apple’s policies on child sexual abuse material. She added that a victory for the plaintiffs could backfire because it could raise questions about whether the government is forcing Apple to scan for illegal material in violation of the Fourth Amendment.

The New York Times granted anonymity to the 27-year-old woman suing Apple so she could tell her story. She spoke anonymously because people have been known to seek out victims and search for their child sexual abuse material on the internet.

Her abuse started not long after she was born. An adult male family member would engage in sexual acts with her and photograph them. He was arrested after logging into a chat room and offering to swap photos of the girl with other men. He was found guilty of several felonies and sent to prison.

What she could remember of the abuse came to her in bits and pieces. One night, as her mother watched an episode of “Law & Order: Special Victims Unit” about child sexual abuse, the story seemed eerily familiar to her. She screamed, startling her mother, who realized that her daughter thought the episode was about her.

“It’s not just you,” her mother told her. “There are thousands of other kids.”

As her images were found online, the authorities would notify her mother. They have commonly received a dozen or so notifications daily for more than a decade. What bothered her the most was knowing that pedophiles shared some of her photos with children to normalize abuse, a process called grooming.

“It was hard to believe there were so many out there,” she said. “They were not stopping.”

The internet turbocharged the spread of child sexual abuse material. Physical images that had once been hard to find and share became digital photos and videos that could be stored on computers and servers and shared easily.

In 2009, Microsoft worked with Hany Farid, now a professor at the University of California, Berkeley, to create a software system to recognize photos, even altered ones, and compare them against a database of known illegal images. The system, called PhotoDNA, was adopted by a number of tech companies, including Google and Facebook.
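
PhotoDNA itself is proprietary, but the general idea behind such systems can be illustrated with a much simpler perceptual hash. The sketch below, which assumes the Pillow imaging library and uses a basic average-hash with an arbitrary Hamming-distance threshold, only illustrates how an image, even a slightly altered copy, might be matched against a database of known hashes; it is not PhotoDNA’s actual algorithm.

```python
# Illustrative sketch only: a simple average-hash comparison, NOT PhotoDNA.
# Shows the general idea of matching an image, even a slightly altered copy,
# against a database of hashes of known images.
from PIL import Image


def average_hash(path: str, size: int = 8) -> int:
    """Shrink the image to a tiny grayscale grid and set a bit for each
    pixel that is brighter than the grid's mean brightness."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Count how many bits differ between two hashes."""
    return bin(a ^ b).count("1")


def matches_known_database(path: str, known_hashes: set[int], threshold: int = 5) -> bool:
    """Flag an image whose hash is within `threshold` bits of any hash in the
    database, tolerating small edits such as resizing or recompression."""
    h = average_hash(path)
    return any(hamming_distance(h, k) <= threshold for k in known_hashes)
```

In practice, production systems use far more robust hashes and carefully tuned thresholds to keep false matches rare while still catching altered copies.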

Apple declined to use PhotoDNA or do widespread scanning like its peers. The tech industry filed 36 million reports of photos and videos with the National Center for Missing & Exploited Children, the federal clearinghouse for suspected sexual abuse material. Google and Facebook each filed more than 1 million reports, but Apple made just 267.

In 2019, an investigation by the Times revealed that tech companies had failed to rein in abusive material. A bar graph the paper published detailing public companies’ reporting practices led Eric Friedman, an Apple executive responsible for fraud protection, to message a senior colleague and say he thought the company might be underreporting child sexual abuse material.

“We are the greatest platform for distributing child porn,” said Friedman in the 2020 exchange. He said that was because Apple gave priority to privacy over trust and safety.

A year later, Apple unveiled a system to scan for images of child sexual abuse. It said its iPhones would store a database of distinct digital signatures, known as hashes, associated with known child sexual abuse material identified by groups like the National Center for Missing & Exploited Children, and that it would compare those signatures against photos in a user’s iCloud storage service. The technique, which it called NeuralHash, would flag matches and forward them to the federal clearinghouse for suspected sexual abuse material.
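
The paragraph above describes the matching flow only at a high level. The sketch below is a loose, hypothetical rendering of that flow, not Apple’s actual design, which paired a neural perceptual hash with cryptographic protections; `hash_fn`, `report_fn`, and the `match_threshold` value are placeholders assumed for illustration.

```python
# Hypothetical sketch of the matching flow described above, not Apple's
# actual NeuralHash system (which combined a neural perceptual hash with
# private-set-intersection and threshold cryptography).
from typing import Callable, Iterable, List


def scan_photo_library(
    photo_paths: Iterable[str],
    known_hashes: set[int],
    hash_fn: Callable[[str], int],           # placeholder for a perceptual hash
    report_fn: Callable[[List[str]], None],  # placeholder for forwarding a report
    match_threshold: int = 30,               # assumed for this sketch: report only past a threshold
) -> None:
    """Hash each photo, collect those whose hashes appear in the on-device
    database of known signatures, and forward a report only if the number
    of matches crosses the threshold."""
    matched = [p for p in photo_paths if hash_fn(p) in known_hashes]
    if len(matched) >= match_threshold:
        report_fn(matched)
```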

But after cybersecurity experts warned that it would create a back door to iPhones that could give governments access, the company dropped its plan. It said it was almost impossible to scan iCloud photos without “imperiling the security and privacy of our users.”

Early this year, Sarah Gardner, the founder of a child advocacy group called the Heat Initiative, began searching for law firms with experience representing victims of child sexual abuse.

In March, the Heat team asked Marsh Law, a 17-year-old firm that focuses on representing victims of child sexual abuse, if it could bring a suit against Apple. Heat offered to provide $75,000 to support what could be a costly litigation process. It was a strategy borrowed from other advocacy campaigns against companies.

Margaret Mabie, a partner at Marsh Law, took on the case. The firm has represented thousands of victims of child sexual abuse. Mabie dug through law enforcement reports and other documents to find cases related to her clients’ images and Apple’s products, eventually building a list of more than 80 examples, including one of a Bay Area man whom law enforcement found with more than 2,000 illegal images and videos in iCloud.

The 27-year-old woman from the Northeast, who is represented by Marsh, agreed to sue Apple because, she said, she believes that Apple gave victims of child sexual abuse false hope by introducing and abandoning its NeuralHash system. An iPhone user herself, she said the company chose privacy and profit over people.

Joining the suit was a difficult decision, she said. Because her images have been downloaded by so many people, she lives in constant fear that someone might track her down and recognize her. And being publicly associated with a high-profile case could cause an uptick in trafficking of her images.

But she said she had joined because she thought it was time for Apple to change. She said the company’s inaction was heart-wrenching.


