
Possessing CSAM Is a Felony Crime and Probably Already in Your Pocket


There’s been a lot of talk lately about Child Sexual Abuse Material (CSAM), also known as child pornography, and about child trafficking. It’s a horrific subject, and one most people regard with prurient interest; much like watching porn itself, it’s quickly forgotten once they’re done. But the fact is that access to CSAM is so pervasive on social media apps like KIK, WhatsApp, Discord, and Telegram that to talk about it any other way ignores how easily virtually anyone on these apps could find themselves guilty of possession. It also perpetuates a problem that should not be the sole purview of law enforcement but is the responsibility of Big Tech, whose technology has made both production and rapid distribution only a few apps away on your smartphone.

RELATED: Gay Couple Arrested for Using Adopted Boys in Production of Child Porn

Prior to the COVID-19 pandemic I spent little “social” time online. Having lived primarily in New York City and Los Angeles my entire adult life, I never used a dating app and had been in three consecutive (fairly) monogamous relationships from the age of 18 until 44. I had my first experiences using Grindr and KIK while executing a campaign for a series of PSAs I wrote and produced for Gilead Sciences’ then-new PrEP drug Truvada.

RELATED: Why Aren’t We Talking About the Child Sexual Predator Backing Florida’s Republicans?

After that campaign I tried both apps personally and found them problematic, mostly because of what struck me as odd behavior: people more than willing to meet for an intimate sexual experience who would look at you as if you had three heads if you asked them the most cursory of personal questions.

RELATED: Guys Doing Meth on Social Messaging App Kik is a Dark View Into Gay Lives: WATCH

One day a friend told me that KIK had groups where you could watch guys smoking crystal meth and having sex.

I literally snorted, thinking it was a joke.

Lo and behold, a few minutes later I witnessed it myself. That led me to look more into KIK, which I learned had a much bigger issue: child pornography. And not just a little. I would come to find out that the problem was well known to law enforcement, who left child predators on the app so they could get more people charged with CP possession. They called it a honeypot. Virtually all of the CSAM I saw on KIK consisted of naked or partially clad children; none of it depicted actual sexual intercourse.

RELATED: Criminals, Crystal Meth, Pedophiles, and Pornography: How Did the Teen Social Messaging App Kik Become a Haven for Depravity?

That was in 2019, and the problem is still going strong, worse than ever.

A recent episode of the podcast Darknet Diaries is dedicated to “KIK culture.”

KIK is marketed to and primarily used by teenagers 13 years old and up. In May 2016, KIK Messenger announced that it had approximately 300 million registered users and was used by approximately 40% of U.S. teenagers.

Cut to 2020: as the mandated shutdowns, self-isolation, and quarantines began, more and more people found themselves on these apps and jerking off. OnlyFans stars became household names, and sites like Pornhub and xHamster saw exponential increases in users. So too did Zoom.

Zoom, which became ubiquitous in households everywhere as the main vector for teachers reaching kids at home, also became the space where nightclubs went to survive.

Early on, a friend of mine who was a gay club promoter begged me to cover her new virtual event on Zoom. Without being allowed to open she was hemorrhaging money. I was skeptical that a virtual event could supplant an IRL one, and for me, anyway, I was right. But more than that was the discovery of something much more disturbing: virtually every gay Zoom party (and they proliferated at a rapid pace, soon including parties involving crystal meth, porn, and jerking off) had a very prominent warning: NO CP.

I had no idea what CP was and soon learned that it stood for child pornography. This warning was everywhere. I found it disturbing. The warning alone was troubling; the ubiquity of the warning more so, since with warnings come expectations, which in my mind normalizes things.

It got worse.

Shortly thereafter, the CEO of one of the retail clients I create adverts for asked how a retailer could create a virtual experience online that would in any way compare to an IRL one.

They can’t, I thought, create the same experience, and I still think so.

And they certainly can’t on Facebook or Twitter, but it occurred to me that you could create something more personal and intimate with the group features on Telegram and KIK, so I began experimenting with creating different ones.

 

Above: Screenshot of a PnP party group on Telegram.

In many ways these rooms and group features function much like early AOL chat rooms, which made the UX fairly easy for many people. Two things were vastly different: the default settings for all of them allowed almost anyone else on the app to add you to a group without your knowledge or consent, and the defaults also saved images you viewed directly to your camera roll. It’s a feature most people are familiar with from WhatsApp. It’s also a feature that can be a horror show when an app is awash in CSAM.

I found the problem horrific: from day one there was almost always a disturbing image left in a room or group overnight. They weren’t all CSAM; many were “zoo,” or bestiality, content. And the CSAM was not what I expected. I had never seen child pornography before, and as far as I had ever thought about it, my reference point was “The Bicycle Man” episode of Diff’rent Strokes when I was a kid.

In my head I imagined it as barely legal teenagers.

I never saw that. What I did see was baby rape. Violent, screaming, bloody baby rape.

The first one I saw came after I was added to a group called 666Satan. I logged in and was immediately aware something was off from all the “hail Satans” the members were posting as the feed scrolled. Then an image appeared that looked like glitch art and began playing as I scrolled past. It was a hot young red-headed guy playing with his baby brother. The members of the channel got very active, asking where the “Estes” came from, posting “Hail Satan,” and making proclamations about him being an American hero: “NGL he’s my hero.”

Above: Pedophiles and baby rapists often use this screenshot of Estes and, allegedly, his baby brother as their avatar to let you know what they’re into. Apparently law enforcement does as well, in an attempt to infiltrate.

The redhead goes from holding his baby brother to violently raping him. There is screaming and there is blood. I thought he had killed the kid. I hit the report button as many times as possible, screenshotted the room, and reported it. It was traumatic.

I searched the web for information on Matthew Estes. There wasn’t much and it was difficult to find what I did.

The most prominent evidence I found was a few screenshots and a warning on Twitter from a handle called @midgardsormurir (above), a vigilante hacker who hunts pedophiles online: “If anyone tweets or comments that they “worship” or “love” this guy, they’re a #pedophile. His name is Matthew Estes. Offense/Statute: Especially Aggravated Sexual Exploitation Of A Minor Offense/Statute: Rape Of A Child.”

There are no news reports about Estes or court records that I could find on the web or LexisNexis.

 

Twitter and other social media sites remove references to him. What I was able to learn, I learned from asking people in online groups and from the vestiges of social media posts that still exist.

I determined that he was a student at Bearden High School in Knoxville, Tennessee until at least December 1, 2015. Estes would have been 17 years old at the time of his apprehension, which must have occurred prior to December 1; that, and the fact that both he and the victim were minors, accounts for the court documents being sealed.

Estes is thanked in the Knoxville Neighborhood Advisory (Vol. 9, No. 10, Tuesday, March 8, 2015) “for his invaluable help leading up to the Luncheon and helping staff the Office of Neighborhoods booth.”

I found him listed on the Tennessee sex offender registry, along with a phone number for the Northeast Correctional Complex in Mountain City, Tennessee, which confirmed he is an inmate.

His two offenses are below.

12/01/2015: ESPECIALLY AGGRAVATED SEXUAL EXPLOITATION OF A MINOR (39-17-1005)

12/01/2015: RAPE OF A CHILD (39-13-522)

The first means he recorded/produced and distributed child pornography. The second is self-explanatory, but without court documents I know few other facts. What I was able to learn came from anonymous usernames who were familiar with Estes and/or had corresponded with him on the social media site Tumblr since he was 9 years old.

Estes, they say, had been making videos of himself masturbating since even younger than that, often at the request of “older pervs.”

He would often poll his followers for “challenges,” including using a dildo on himself. At some point he began filming himself and other kids in what one self-proclaimed pedophile referred to as the complete Matthew Estes library: seven videos, the last featuring the aforementioned rape.

Above: a screenshot of an Instagram “worship” page to Matthew Estes.

Estes’ activity on Tumblr may explain his virtually scrubbed social media footprint, as the site unceremoniously banned all adult content at the end of 2018 while under the ownership of Verizon, which had acquired it along with Yahoo!

Tumblr’s lack of age verification was evident in the overwhelming number of self-produced videos by what appeared to be teenagers (Estes would have been among them), which would have left its owners with a surfeit of legal exposure and bad publicity. Yahoo! Finance said in 2018:

Tumblr, one of the myriad digital properties now owned by Verizon, announced today that starting Dec. 17, “adult content will not be allowed on Tumblr regardless of how old you are.” That’s according to a company blog post today, which spells out in detail what’s no longer allowed. That includes “photos, videos, or GIFs that show real-life human genitals or female-presenting nipples, and any content — including photos, videos, GIFs and illustrations — that depicts sex acts.”

This follows Tumblr’s removal from the iOS App Store just a few days ago over concerns around the presence of child pornography. Indeed, Tumblr has had a bit of a colorful relationship over the years with porn, which has always constituted a significant part of its user-generated content.

After the site was bought by Yahoo five years ago, for example, Tumblr started hiding blogs that had been flagged as NSFW. They eventually disappeared from Tumblr search listings and were only able to be accessed by users logged into their accounts. And it’s been up and down from there, with adult content estimated at one point as comprising 10 to 20 percent of the site’s traffic, according to Ars Technica.

***

One of the horrible things you learn when researching sexual assault and CSAM is how many cases there are.

And then how many there are in just one city.

Or, in this case, how many Matthew Esteses were arrested for child sexual abuse in Knoxville, Tennessee alone: at least 17 that I found in the last six years. One is from last summer; that particular Matthew Estes is 34 years old and raped his family’s teenage babysitter over a two-year period. SMM: “According to a criminal complaint, the victim and Estes met at church, where her mother worked with Estes, who was a janitor. The girl was 13 years old at the time. Court documents say Estes’ wife was pregnant and the victim offered to babysit for the family.”

Each new level of understanding of the myriad overlapping issues involved begat new questions. Why, for example, has the proliferation of CSAM been allowed to happen at all? Why aren’t these apps being taken down or sued?

That’s when I learned about Section 230 of the Communications Decency Act, also known as the Safe Harbor Act. Social media sites, unlike publishers, are protected by it.

Section 230 says that “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider” (47 U.S.C. § 230).

Online intermediaries that host or republish speech are protected against a range of laws that might otherwise be used to hold them legally responsible for what others say and do. The protected intermediaries include not only regular Internet Service Providers (ISPs), but also a range of “interactive computer service providers,” including basically any online service that publishes third-party content. Though there are important exceptions for certain criminal and intellectual property-based claims, CDA 230 creates a broad protection that has allowed innovation and free speech online to flourish.

Billionaire Facebook investor Peter Thiel is one of the law’s main supporters; in fact, it is crucial to the survival of Meta now.

This is, in fact, a ludicrous and very broad reading of the law that makes no sense.

When Safe Harbor laws were initially conceived they actually involved a harbor and the ships that docked there.

An outspoken opponent of the broad application of Section 230 is Salesforce CEO Marc Benioff. The Verge: “[Benioff,] an outspoken tech industry figure who is apparently not afraid to criticize other business leaders, spent all week bashing Facebook over the company’s refusal to moderate certain content on its platform, like political ads that contain lies. He’s on tour promoting his book, Trailblazer, and is understandably generating headlines and free promotion as a result. But one peculiar call to action that Benioff voiced two days ago on Twitter, in favor of abolishing the oft-misunderstood Section 230, may not have been all that well thought out.”

Backpage was a classified advertising site found by the federal government to be a haven for thinly veiled sex work advertisements. Its co-founder and CEO, Carl Ferrer, pleaded guilty to money laundering and aiding prostitution, with part of his plea deal requiring him to help keep the site offline for good after the government seized it in 2018.

Section 230 is currently at the heart of a rather vicious partisan debate over the extent to which platform-owning tech companies like Facebook and Twitter should moderate their platforms, with Republican politicians saying Silicon Valley has a liberal bias and unfairly punishes conservatives. Some of those lawmakers, like Sen. Josh Hawley (R-MO), have called for changes to Section 230 to force tech companies like Facebook to be politically neutral.

Repealing Section 230’s protections is vital.

To say that CSAM is pervasive on social media is an understatement. We are entering Web3, a phase in which the metaverse will make virtual spaces feel real. Companies like Marvel Entertainment and celebrities like Snoop Dogg are buying real estate there. And the emergence of NFTs as everyday commodities has opened up questions relevant to CSAM.

A September ruling rejecting Pornhub’s ability to invoke Section 230 was seen by many as a victory, yet as more than one lawyer pointed out: “For every 1 case involving a rape tape on Pornhub, I have 50 involving rape and CSAM being disseminated on Insta and FB. Pornhub is far from perfect. But mainstream big tech is far worse and have a built-in mechanism for harassing victims directly.”

More importantly, social media and messaging apps like KIK saw an unprecedented increase in users during the pandemic, many of whom were exposed to CSAM for the first time. If Section 230 absolves any and all of these platforms of culpability, then it de facto places the onus on users, who may unwittingly stumble into a vast array of criminal worlds they may not even know exist. With default settings, that means they could end up with numerous CSAM images on their smartphones. Under federal law, possession of even a single CSAM image can carry a sentence of up to 10 years in prison, or up to 20 if the material depicts a prepubescent child.

And users send images to other users unsolicited. One commenter on the Darknet Diaries KIK episode wrote: “Kik should be shutdown. Even outside of this fairly horrific story, there was a local group which was supposedly sharing “stolen” nudes of local women. Even if you just visit any “adult” site these days there are users directing you to Kik. I literally see nobody talking about using it unless its for dodgy purposes.”

The other problem is that law enforcement allows the KIK “honeypot” to continue, which means a whole generation of kids who came of age using KIK (or, like Estes, Tumblr) see CSAM as de rigueur.

These are kids who were handed a smartphone at 5 years old.

We already have a porn addiction problem that has reached epidemic proportions, especially among boys.

According to a 2018 report in The Chicago Tribune:

“Eleven is the average age a child is first exposed to porn. And 94 percent of kids will see porn by age 14. The younger boys get to porn, the less capacity they have for empathy for girls and women, the more likely they are to become sexual offenders, the less capacity they have to actually put the building blocks of adult life into place,” she said. “They are more interested in hookups than actual dating, so the question becomes: If you’re socializing a whole generation into porn sex, which is what we’re doing because porn is the major form of sex education today, then what kinds of fathers, partners, lawyers, judges, policemen are they going to be when they’ve had their capacity for intimacy, connections and relationships hijacked by the porn culture?”

And now these kids, who have seen all conceivable legal porn by 14, are so desensitized that nothing gets them off anymore.

Then they learn about this other kind of porn. It’s so taboo that it’s illegal. How tantalizing that must be.

What do we think will happen?

Post hoc ergo propter hoc.*

Case after case of CP possession involves young men 17 to 19 years old.

Raised on and addicted to porn, they are essentially being arrested for being chronic masturbators.

More than one user has told me they began watching CSAM as young as 9 years old. Watching other 9-year-olds. On KIK.

And that is still what gets them off.

A Dallas “sting” operation from 2017 sounds the all-too-common beats. Dallas News:

Police found a computer in Nagle-Perkins’ home that contained several images of boys and girls between the ages of 8 and 12 in sexually explicit poses. The man’s wife told police Nagle-Perkins is the only person to use the computer that had the images, an arrest warrant affidavit says.

His wife also told police that she would be shocked to learn her husband was looking at images of toddlers but said she “would not be shocked to learn that her husband was viewing 13-year-old children,” the warrant says.

Baby rape is easy to relegate to the extreme and subhuman, but there is also a whole category of “self-directed” j/o videos made by boys themselves. Certainly not child abuse images, the thinking goes.

We live in a rapey culture.

A rapey, barely legal culture.

And the digital proliferation of CSAM raises some of the same issues that NFTs do about what digital ownership even means. McKenzie Wark, talking about NFTs, says: “To think about digital objects as collectable, it may help to start by asking what it is that is actually collected. We tend to think that what is collected is a rare object. But what makes it rare? Perhaps there is more than one way to make an object rare. To make a digital object rare, it can be ‘locked’ in various ways.”

Traditional collectors of CSAM, like retired BBC producer Victor Melleney, 76, who appeared in a London courtroom in January, “stored 832 indecent images of children across his laptop, desktop, two hard drives and a WIFI dongle between May 2011 and October 2018.”

The Daily Mail:

Melleney previously told the court that his porn addiction began after he retired from working with the BBC in 1996/1997 while his children were at school.

‘In the evenings if my wife was out of the country and I was alone it was something interesting to do.’

But he denied ever having any interest in indecent images of children.

‘Absolutely not,’ he said. ‘No, no sexual interest in children at all. Horrible.’

The court was told the digital content was discovered when NCA officers found two IP addresses had received indecent images of children during 2018.

Melleney was identified as the subscriber to those two IP addresses, in the boroughs of Kensington and Hammersmith and Fulham.

Jurors were told how peer to peer (P2P) technology, which facilitates file sharing, was used to download the content.

One such example of P2P technology found on Melleney’s devices was an application called ‘Vuze’.

Vuze had been installed on a Mac Pro desktop as well as a MacBook laptop and across both devices contained file names ‘indicative of child sexual abuse exploitation’.

An officer searched for the term ‘PTHC’, which stands for ‘pre-teen hard core’, and found more than 40 items. Other file names were read out in court: ‘Preteen young’, ‘underage sex’, ‘pedo’.

The prosecutor also named search terms found in Melleney’s Vuze history: ’11yo PTHC’, ‘PTHC’ and ‘pre-teen’.

Melleney explained that one of the side effects of his addiction to viewing pornography was a corresponding addiction to downloading it.

He said he would simply click a ‘select all’ button on applications like Vuze, which he used to search for and download material, and simply ‘download the lot’.

On Friday, Ruona Iguyovwe, of the Crown Prosecution Service, described the distribution of such images as an ‘utterly sickening practice’.

She added: ‘Victor Melleney knew he was in possession of indecent images of children.

‘The search history of devices recovered during searches of Melleney’s home revealed downloads of images depicting sexual activity involving children.

‘Examination of the electronic devices also revealed software which can be used by users in sharing indecent images of children.

‘The hard drives, recovered from his home, his basement, and hidden in the pocket of a dressing gown he was wearing when he was arrested, contained sickening videos, part of a collection downloaded over years.

‘The sexual exploitation of children and the distribution of images like this is an utterly sickening practice and one which the CPS and its partners will continue to work to root out.’

But this is not the behavior of many of the younger viewers on KIK or Discord or Telegram.

The EARN IT Act, recently cleared for floor consideration by the Senate Judiciary Committee, remains a contentious bill, primarily over concerns that it might dissuade tech providers from using encryption. But amid ongoing debate about Section 230 and the role of tech platforms in our public discourse, legislation like EARN IT could, if paired with carefully crafted procedural protections, offer a model for how Congress can address bipartisan concerns about child sexual abuse material (CSAM) and other illegal content online. Opponents argue that it’s a thinly veiled attack on privacy and free speech, and they may be correct to a certain extent. But the bill is “calibrated to really stop the most detestable and despicable kinds of child abuse involving really horrific pornographic images that follow these kids all their lives,” said Sen. Richard Blumenthal (D-Conn.), who co-sponsored the legislation with Sen. Lindsey O. Graham (R-S.C.). The bill has been backed by lawmakers on both sides of the political spectrum, as well as groups representing law enforcement and sexual exploitation survivors.

But unlike some recent antitrust bills that have won the backing of some technologists, the bill’s revival has reignited a battle over the future of Internet regulation and online speech. Prominent technologists, industry groups, civil liberties advocates and LGBTQ interest groups have aggressively campaigned against it, warning that the proposal threatens to erode consumers’ privacy and could have a chilling effect on free expression online.

Gizmodo: “The bill, as advertised, would remove some long-held Section 230 protections. In particular, it would mean that a company can be found legally liable for child sexual abuse material (or CSAM) that’s uploaded to their systems by third parties. Those in favor of the bill claim removing these protections will force internet companies to take a more active role in identifying and reporting the real problem of CSAM material. If passed by both houses of Congress, the bill could potentially open these companies up to a flurry of lawsuits. Democratic Senator Richard Blumenthal, who co-sponsored the bill with Republican Senator and Trump lackey Lindsey Graham, claimed the bill “is very simply about whether tech companies should be held responsible for their complicity in the sexual abuse & exploitation of children.”

The concerns are valid, but the CSAM problem is a Big Tech problem, a problem only they can solve. And they should be responsible for the content they distribute. It’s imperative to stanch the flow and rethink our entire approach to CSAM.

EARN IT is also urgent because of Apple’s planned CSAM detection feature, which was controversial when it was announced because it involves taking hashes of iCloud Photos and comparing them to a database of hashes of known child sexual abuse imagery. Apple claims this approach allows it to report users to the authorities if they’re known to be uploading child abuse imagery, without compromising the privacy of its customers more generally. It also says the encryption of user data is not affected and that the analysis is run on-device.
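To make the hash-matching idea concrete, here is a deliberately simplified sketch in Python. It uses ordinary SHA-256 file hashes, an invented placeholder hash list, and a hypothetical folder name and threshold value; Apple’s actual system relies on a perceptual NeuralHash, a hash database supplied by child-safety organizations, and cryptographic threshold techniques, none of which are reproduced here.

import hashlib
from pathlib import Path

# Illustrative placeholder set standing in for a database of hashes of known abuse imagery.
KNOWN_BAD_HASHES = {
    "placeholder_hash_value_1",
    "placeholder_hash_value_2",
}

# Hypothetical threshold that must be crossed before anything is escalated for human review.
REPORT_THRESHOLD = 30

def sha256_of(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def count_matches(upload_dir: Path) -> int:
    """Count files in a folder whose hashes appear in the known-bad set."""
    return sum(
        1 for p in upload_dir.iterdir()
        if p.is_file() and sha256_of(p) in KNOWN_BAD_HASHES
    )

if __name__ == "__main__":
    matches = count_matches(Path("pending_uploads"))
    if matches >= REPORT_THRESHOLD:
        print("Match threshold exceeded: escalate for human review")
    else:
        print(f"{matches} match(es); below threshold, nothing is reported")

The point of the threshold, as Apple described it, is that a single match never triggers a report; only an accumulation of matches does.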

But critics argue that Apple’s system risks undermining Apple’s end-to-end encryption. Some referred to the system as a “backdoor” that governments around the world might strong-arm Apple into expanding to include content beyond CSAM. For its part, Apple has said that it will “not accede to any government’s request to expand it” beyond CSAM.

While the CSAM detection feature has yet to receive a new launch date, Apple has gone on to release two of the other child-protection features it announced in August. One is designed to warn children when they receive images containing nudity in Messages, while the second provides additional information when searching for terms related to child exploitation through Siri, Spotlight, or Safari Search. Both rolled out with iOS 15.2, which was released earlier this week and which appears to have prompted Apple to update its webpage.

Again, this approach seems Pyrrhic.

They are solutions that raise more questions than they answer and demand a harm-reduction approach.

In their “Myths and Facts” sheet, the bill’s supporters have said the quiet part out loud. Some of the document’s falsehoods are breathtaking, such as the statement that internet businesses are provided “blanket and unqualified immunity for sexual crimes against children.” It (falsely) reassures small business owners who dare to have websites that the government-ordered scanning they will be subject to will come “without hindering their operations or creating significant costs.” And it says that using automated tools that submit images and videos to law enforcement databases is “not at odds with preserving online privacy.”

***

Why do some grown men want to rape or molest little kids? Or even look at images of such acts? You might answer that it’s because they’re sick perverts, but “sick pervert” is neither a medical diagnosis nor a psychiatric designation. Believing that the world is simply pocked with sick perverts who are destined to rape and molest children is, in a way, to give in to the inevitability of their crimes with our fingers crossed that they’ll be caught.

Most are not.

Slate:

Researchers are increasingly studying child sexual abuse as a public health issue, with a focus on identifying risk factors that may lead to abuse and protective factors that may prevent it. But compared to the many millions of dollars we spend on civil commitment, trials, imprisonment, sex offender registration, and the like, we spend almost nothing on prevention.

“We’re investing all of our money in a very small number of people,” Joan Tabachnick, a co-chair of the Prevention Committee of the Association for the Treatment of Sexual Abusers, told me. “The primary prevention part, before any child is harmed—that’s where we need to ratchet it back to. But the way we invest is completely reactive and doesn’t look at most situations of sexual abuse.”

Elizabeth Letourneau, a child sexual abuse expert at the Johns Hopkins Bloomberg School of Public Health, agrees. “We keep waiting for bad things to happen and then reacting to them,” she said. “To prevent risk we have to know what’s causing the issue, and that has to do with basic science. And there’s almost no basic science in our country that targets this problem.”

Letourneau thinks a big step would come from the creation of a single federal agency mandated to prevent childhood victimization, because different types of victimization—physical and sexual assault, for instance—often go hand in hand. “The bang for your buck is very high if you find risk factors for any kind of child neglect,” she said.

And now we must also wrestle with the “monsters” wrought by exposure, like Matthew Estes.

***

With self-generated CSAM, a new question arises: Is every child in every image experiencing the same harms?

According to Thorn Research, “The short answer is no. A young person sharing nudes with their partner has not had the same experience as a child being groomed for exploitation, and the interventions to safeguard both of these children are distinct.”

Thorn Research: Understanding sexually explicit images, self-produced by children

Thorn’s research is centered in the question: Where does harm occur at the intersection of child sexual abuse and technology, and how can that harm be prevented, combatted, and eliminated?

However, as we sought to better understand the rise in these reports, we struggled to find research that represented up-to-date attitudes in current digital ecosystems. Given the pace at which technology and platforms change, this is a critical component in delivering strategic interventions.

Consider how things have changed in recent history: 10 years ago, there was no TikTok, Instagram was about to launch, and smartphones were still only owned by about 30% of U.S. adults, compared to 81% today.

The technology is outpacing the research, but we do know a few things: talking to kids in real terms works. Restricting access to devices works. Sexually educating children works. Teaching kids and adults what CSAM really is, and what the legal consequences are, works.

As one guy told me about being arrested for his activity online, “I always thought of it as fantasy. Not real.”

It’s time to change the narrative.

Photo credit: Grzegorz Walczak on Unsplash.

Appendices:

  •  Post hoc ergo propter hoc is Latin for “after this, therefore because of this” and is a common informal fallacy that states: “Since event Y followed event X, event Y must have been caused by event X.”

It is often shortened simply to post hoc fallacy. A logical fallacy of the questionable cause variety, it is subtly different from the fallacy cum hoc ergo propter hoc (‘with this, therefore because of this’), in which two events occur simultaneously or the chronological ordering is insignificant or unknown. Post hoc is a logical fallacy in which an event is said to be the cause of a later event simply because it occurred earlier. Post hoc is a particularly tempting error because correlation sometimes appears to suggest causality. The fallacy lies in drawing a conclusion based solely on the order of events, rather than taking into account other factors potentially responsible for the result that might rule out the connection.

  • The Supreme Court confirmation hearings for Judge Ketanji Brown Jackson, which took place over two weeks in March, were a rare occasion to see child pornography, the laws around it, and how courts approach such cases discussed on the news.

ABC News: 

Supreme Court confirmation hearings for Judge Ketanji Brown Jackson open this week amid a flurry of misleading allegations by Republican Sen. Josh Hawley that the nominee has a “long record” of letting child porn offenders “off the hook” during sentencing.

“In every single child porn case for which we can find records, Judge Jackson deviated from the federal sentencing guidelines in favor of child porn offenders,” Hawley tweeted Thursday, highlighting nine cases from her time as a federal District Court judge.

While court records show that Jackson did impose lighter sentences than federal guidelines suggested, Hawley’s insinuation neglects critical context, including the fact that the senator himself has voted to confirm at least three federal judges who also engaged in the same practice.

Federal appeals court Judges Joseph Bianco of the Second Circuit and Andrew Brasher of the Eleventh Circuit, both Trump appointees, had each previously sentenced defendants convicted of possessing child pornography to prison terms well below federal guidelines at the time they were confirmed with Hawley’s support, an ABC review of court records found.

“If and when we properly contextualize Judge Jackson’s sentencing record in federal child porn cases, it looks pretty mainstream,” wrote Doug Berman, a leading expert on sentencing law and policy at The Ohio State University School of Law.

“Federal judges nationwide typically sentence below the [child porn] guideline in roughly 2 out of 3 cases,” Berman noted on his blog, and “when deciding to go below the [child porn] guideline, typically impose sentences around 54 months below the calculated guideline minimum.”

“The guidelines are now purely advisory, and many judges of all stripes routinely find that within-guidelines sentences are unduly harsh, in particular when it comes to first-time offenders,” said Cardozo Law professor and ABC News legal analyst Kate Shaw.

The U.S. Sentencing Commission, the bipartisan body created by Congress to set federal sentencing rules, explained in its 2021 report that suggested prison terms for defendants convicted of possessing child pornography – as opposed to producing the materials – have “been subject to longstanding criticism from stakeholders and has one of the lowest rates of within-guideline range sentences each year.”

“Less than one-third (30.0%) of non-production child pornography offenders received a sentence within the guideline range in fiscal year 2019,” the report said.

“Judge Jackson’s record in these [child porn] cases does show she is quite skeptical of the ranges set by the [child porn] guidelines, but so too were prosecutors in the majority of her cases and so too are district judges nationwide (appointed by presidents of both parties),” Berman wrote.

  • Calls to reform Section 230 have become more vocal.

Harvard Business Review:

A quarter of a century ago, in Section 230 of the 1996 Communications Decency Act, Congress implemented “safe harbor” protections against legal liability for any content users post on social-media platforms. These platforms provide lots of benefits, of course, but since 1996 we’ve learned just how much social devastation they can also bring about. What we’ve learned, the authors write, makes it clear that Section 230 is desperately out of date and needs updating, to hold social-media platforms responsible for how their sites are designed and implemented.

Read the full story.

  • KIK is still kicking.

From a press release from the United States Department of Justice dated Friday, August 19, 2022:

Abel Garcia-DeLeon was sentenced to 10 years in prison for facilitating the distribution of child sex abuse materials in connection with serving as the administrator for several chatrooms designated for sharing illegal child exploitation materials, announced United States Attorney Stephanie M. Hinds and U.S. Homeland Security Investigations (HSI) San Francisco Special Agent in Charge Tatum King. The sentence was handed down by the Hon. Yvonne Gonzalez Rogers, U.S. District Court Judge.

Garcia-DeLeon pleaded guilty to the charge on June 9, 2022.  According to the plea agreement, Garcia-DeLeon admitted that he agreed with another person to use the Kik social media application to receive and distribute images depicting minors engaged in sexually explicit conduct.  Specifically, the plea agreement states that Garcia DeLeon, along with his co-defendant, administered a private chat group on Kik.  The co-defendants organized a “Rage Bot” that instructed anyone intending to join the chat group to first send three videos depicting child pornography.  Intended users were instructed to send the videos via private messenger to “verify” themselves. After the intended user delivered to the defendants the three videos, the person would be invited to enter a second private chat group. While in the second chat group, intended users would encounter another “Rage Bot” that instructed members to post videos depicting children aged ten or younger who were engaged in sexually explicit conduct.  Intended users who did not post three videos within six minutes of being admitted to the group were removed from the group. Between May 8, 2020, and May 20, 2020, Garcia-DeLeon and his co-defendant were joined by at least ten other members of the chat group.

In his plea agreement, Garcia-DeLeon also admitted that he distributed to the chat group images of minors engaged in sexually explicit conduct.  Further, Garcia-DeLeon acknowledged he administered several other chat groups dedicated to the exchange of child pornography.  Garcia-DeLeon acknowledged that law enforcement officers searched his home on August 19, 2020, and that at that time, Garcia-DeLeon possessed on his phone at least 684 images and one video depicting minors engaged in sexually explicit conduct.

On February 9, 2021, a federal grand jury indicted Garcia-DeLeon charging him with conspiracy to receive and distribute child pornography, in violation of 18 U.S.C. §§ 2252(a)(2) and (b), and one count of distribution of child pornography, in violation of 18 U.S.C. § 2252(a)(2) and (b).  Pursuant to his plea agreement, Garcia-DeLeon pleaded guilty to the conspiracy count and the substantive distribution count was dismissed at sentencing.

 

 

 
