child pornography


Worst hiding spot ever: /NSFW/Nope/Don’t open/You were Warned/

Last Friday, a Michigan man named David Bartels was sentenced to five years in federal prison for “Possession of Child Pornography by a Person Employed by the Armed Forces Outside of the United States.” The unusual nature of the charge stems from the fact that Bartels bought and viewed the illegal material while working as a military contractor for Maytag Fuels at Naval Station Guantanamo Bay, Cuba.

Bartels had made some cursory efforts to cover his tracks, such as using the Tor Browser. (This may sound simple enough, but according to the US government, only 12.3 percent of people charged with similar offenses used “the Dark Web” at all.) Bartels knew enough about tech to use Discord, Telegram, VLC, and MEGAsync to further his searches. And he had at least eight external USB hard drives or SSDs, plus laptops, an Apple iPad Mini, and a Samsung Galaxy Z Fold 3.

But for all his baseline technical knowledge, Bartels simultaneously showed little security awareness. He bought collections of child sex abuse material (CSAM) using PayPal, for instance. He received CSAM from other people who possessed his actual contact information. And he stored his contraband on a Western Digital 5TB hard drive under the astonishingly guilty-sounding folder hierarchy “/NSFW/Nope/Don’t open/You were Warned/Deeper/.”

Not hard to catch

According to Bartels’ lawyer, authorities found Bartels in January 2023, after “a person he had received child porn from was caught by law enforcement. Apparently they were able to see who this individual had sent material to, one of which was Mr. Bartels.”



Apple hit with $1.2B lawsuit after killing controversial CSAM-detecting tool

When Apple devices are used to spread CSAM, it’s a huge problem for survivors, who allegedly face a range of harms, including “exposure to predators, sexual exploitation, dissociative behavior, withdrawal symptoms, social isolation, damage to body image and self-worth, increased risky behavior, and profound mental health issues, including but not limited to depression, anxiety, suicidal ideation, self-harm, insomnia, eating disorders, death, and other harmful effects.” One survivor told The Times she “lives in constant fear that someone might track her down and recognize her.”

Survivors suing have also incurred medical and other expenses due to Apple’s inaction, the lawsuit alleged. And those expenses will keep piling up if the court battle drags on for years and Apple’s practices remain unchanged.

Apple could win, Riana Pfefferkorn, a lawyer and policy fellow at the Stanford Institute for Human-Centered Artificial Intelligence, told The Times, as survivors face “significant hurdles” in holding Apple liable for its handling of content that Apple says Section 230 shields it from. And a win for survivors could “backfire,” Pfefferkorn suggested, if Apple proves that forced scanning of devices and services violates the Fourth Amendment.

Survivors, some of whom own iPhones, think that Apple has a responsibility to protect them. In a press release, Margaret E. Mabie, a lawyer representing survivors, praised survivors for raising “a call for justice and a demand for Apple to finally take responsibility and protect these victims.”

“Thousands of brave survivors are coming forward to demand accountability from one of the most successful technology companies on the planet,” Mabie said. “Apple has not only rejected helping these victims, it has advertised the fact that it does not detect child sex abuse material on its platform or devices thereby exponentially increasing the ongoing harm caused to these victims.”
