
Vast pedophile network shut down in Europol’s largest CSAM operation

Europol has shut down one of the largest dark web pedophile networks in the world, prompting dozens of arrests worldwide, and warns that more will follow.

Launched in 2021, KidFlix allowed users to join for free to preview low-quality videos depicting child sexual abuse material (CSAM). To see higher-resolution videos, users had to earn credits by sending cryptocurrency payments, uploading CSAM, or “verifying video titles and descriptions and assigning categories to videos.”

Europol seized the servers and found a total of 91,000 unique videos depicting child abuse, “many of which were previously unknown to law enforcement,” the agency said in a press release.

KidFlix going dark was the result of the biggest child sexual exploitation operation in Europol’s history, the agency said. Operation Stream, as it was dubbed, was supported by law enforcement in more than 35 countries, including the United States.

Nearly 1,400 suspected consumers of CSAM have been identified among 1.8 million global KidFlix users, and 79 have been arrested so far. According to Europol, 39 child victims were protected as a result of the sting, and more than 3,000 devices were seized.

Police identified suspects through payment data after seizing the server. Despite cryptocurrencies offering a veneer of anonymity, investigators were apparently able to use sophisticated methods to trace transactions to bank details, and in some cases they defeated user attempts to hide their identities, such as a man in Spain who made payments under his mother’s name, the local news outlet Todo Alicante reported. It likely helped that most suspects were already known offenders, Europol noted.


Apple hit with $1.2B lawsuit after killing controversial CSAM-detecting tool

When Apple devices are used to spread CSAM, it’s a huge problem for survivors, who allegedly face a range of harms, including “exposure to predators, sexual exploitation, dissociative behavior, withdrawal symptoms, social isolation, damage to body image and self-worth, increased risky behavior, and profound mental health issues, including but not limited to depression, anxiety, suicidal ideation, self-harm, insomnia, eating disorders, death, and other harmful effects.” One survivor told The Times she “lives in constant fear that someone might track her down and recognize her.”

Survivors suing have also incurred medical and other expenses due to Apple’s inaction, the lawsuit alleged. And those expenses will keep piling up if the court battle drags on for years and Apple’s practices remain unchanged.

Apple could win, a lawyer and policy fellow at the Stanford Institute for Human-Centered Artificial Intelligence, Riana Pfefferkorn, told The Times, as survivors face “significant hurdles” seeking liability for mishandling content that Apple says Section 230 shields. And a win for survivors could “backfire,” Pfefferkorn suggested, if Apple proves that forced scanning of devices and services violates the Fourth Amendment.

Survivors, some of whom own iPhones, think that Apple has a responsibility to protect them. In a press release, Margaret E. Mabie, a lawyer representing survivors, praised survivors for raising “a call for justice and a demand for Apple to finally take responsibility and protect these victims.”

“Thousands of brave survivors are coming forward to demand accountability from one of the most successful technology companies on the planet,” Mabie said. “Apple has not only rejected helping these victims, it has advertised the fact that it does not detect child sex abuse material on its platform or devices, thereby exponentially increasing the ongoing harm caused to these victims.”


US: Alaska man busted with 10,000+ child sex abuse images despite his many encrypted apps

Encryption alone won’t save you from the feds.

The rise in child sexual abuse material (CSAM) has been one of the darkest Internet trends, but after years of covering CSAM cases, I’ve found that few of those arrested show deep technical sophistication. (Perhaps this is simply because the technically sophisticated are better at avoiding arrest.)

Most understand that what they are doing is illegal and that password protection is required, both for their devices and online communities. Some can also use tools like TOR (The Onion Router). And, increasingly, encrypted (or at least encrypted-capable) chat apps might be in play.

But I’ve never seen anyone who, when arrested, had three Samsung Galaxy phones filled with “tens of thousands of videos and images” depicting CSAM, all of it hidden behind a secrecy-focused, password-protected app called “Calculator Photo Vault.” Nor have I seen anyone arrested for CSAM who had used all of the following:

  • Potato Chat (“Use the most advanced encryption technology to ensure information security.”)
  • Enigma (“The server only stores the encrypted message, and only the user’s client can decrypt it.”)
  • nandbox [presumably the Messenger app] (“Free Secured Calls & Messages”)
  • Telegram (“To this day, we have disclosed 0 bytes of user data to third parties, including governments.”)
  • TOR (“Browse Privately. Explore Freely.”)
  • Mega NZ (“We use zero-knowledge encryption.”)
  • Web-based generative AI tools/chatbots

That’s what made this week’s indictment in Alaska of a heavy vehicle driver for the US military so unusual.

According to the government, Seth Herrera not only used all of these tools to store and download CSAM, but he also created his own—and in two disturbing varieties. First, he allegedly recorded nude minor children himself and later “zoomed in on and enhanced those images using AI-powered technology.”

Second, he took this imagery he had created and then “turned to AI chatbots to ensure these minor victims would be depicted as if they had engaged in the type of sexual contact he wanted to see.” In other words, he created fake AI CSAM—but using imagery of real kids.

The material was allegedly stored behind password protection on his phone(s) but also on Mega and on Telegram, where Herrera is said to have “created his own public Telegram group to store his CSAM.” He also joined “multiple CSAM-related Enigma groups” and frequented dark websites with taglines like “The Only Child Porn Site you need!”

Despite all the precautions, Herrera’s home was searched and his phones were seized by Homeland Security Investigations; he was eventually arrested on August 23. In a court filing that day, a government attorney noted that Herrera “was arrested this morning with another smartphone—the same make and model as one of his previously seized devices.”

Caught anyway

The government is cagey about how, exactly, this criminal activity was unearthed, noting only that Herrera “tried to access a link containing apparent CSAM.” Presumably, this “apparent” CSAM was a government honeypot file or web-based redirect that logged the IP address and any other relevant information of anyone who clicked on it.

In the end, given that fatal click, none of the “I’ll hide it behind an encrypted app that looks like a calculator!” technical sophistication accomplished much. Forensic reviews of Herrera’s three phones now form the primary basis for the charges against him, and Herrera himself allegedly “admitted to seeing CSAM online for the past year and a half” in an interview with the feds.

Since Herrera himself has a young daughter, and since there are “six children living within his fourplex alone” on Joint Base Elmendorf-Richardson, the government has asked a judge not to release Herrera on bail before his trial.
