United States v Steven Anderegg
#1
Defendant Steven Anderegg is charged with four counts relating to what the court will refer to as obscene virtual child pornography.
Anderegg moves to dismiss each of the four counts, to suppress the evidence gathered from two search warrants, for a Franks hearing, and for a few other items of miscellaneous relief.  Anderegg isn’t charged with the production, distribution, or possession of child pornography as that term is used under federal law, because the charged images at issue aren’t of real children. Rather, they were generated through Stable Diffusion, AI software that generates images in response to text prompts. The charges here rely on the theory that the images constitute obscenity, which generally lies beyond the scope of the First Amendment. Roth v. United States, 354 U.S. 476, 492 (1957). But the First Amendment generally protects the right to possess obscene material in the home, Stanley v. Georgia, 394 U.S. 557 (1969), so long as it isn’t actual child pornography, Osborne v. Ohio, 495 U.S. 103, 111 (1990). These basic principles guide much of the analysis that follows. 

The court will deny Anderegg’s motions to suppress evidence from the search warrants, because the warrant application described the images and the context in which they were found with enough detail to establish probable cause for obscenity offenses. The court will dismiss the possession-of-obscenity charge following Stanley, but it will deny the motions to dismiss the other counts for reasons explained below.

Through the National Center for Missing and Exploited Children’s CyberTipline, law enforcement received tips from Instagram about a user sharing two AI-generated images of nude boys with a 15- or 16-year-old boy. The tips included an Instagram chat conversation between the user and the boy in which the user described how he created images using Stable Diffusion, software that generates images from text-based prompts. Review of the Instagram accounts revealed a third apparently AI-generated image of naked boys, as well as sexually explicit chats with minors. Review of Anderegg’s electronic devices revealed more than 13,000 AI-generated images, many of them alleged to be of children engaged in sexually explicit conduct.

2. Counts 1 and 4


Anderegg moves to dismiss Count 1, production of an obscene image of a minor engaging in sexually explicit conduct, 18 U.S.C. § 1466A(a)(1), and Count 4, possession of an obscene image of a minor engaging in sexually explicit conduct, 18 U.S.C. § 1466A(b)(1). With this motion, Anderegg makes a constitutional challenge to § 1466A, which was enacted as part of the Prosecutorial Remedies and Tools Against the Exploitation of Children Today Act of 2003 (“PROTECT Act”), Pub. L. No. 108–21, 117 Stat. 650 (2003). Anderegg contends that under Stanley, § 1466A is unconstitutional as applied to him because he has the right to possess and produce obscene material in his own home. 394 U.S. at 568 (“We hold that the First and Fourteenth Amendments prohibit making mere private possession of obscene material a crime.”). Stanley is an exception to the general rule that obscene material isn’t protected by the First Amendment:

"Whatever may be the justifications for other statutes regulating obscenity, we do not think they reach into the privacy of one’s own home. If the First Amendment means anything, it means that a State has no business telling a man, sitting alone in his own house, what books he may read or what films he may watch. Our whole constitutional heritage rebels at the thought of giving government the power to control men’s minds."

394 U.S. at 565. Despite Stanley’s sweeping language, it doesn’t extend to every type of obscene material in the home. Under Osborne, the possession of child pornography—obscene and non-obscene alike—can be criminalized. The government contends that the Stanley exception doesn’t apply to the possession and production counts in this case because Stanley involved obscene material involving adults. It argues that “the paramount importance of protecting minors from sexual exploitation entitles the government ‘to greater leeway in the regulation of pornographic depictions of children.’” Dkt. 64, at 13 (quoting New York v. Ferber, 458 U.S. 747, 756 (1982)). Possession of child pornography in one’s own home can be criminalized despite Stanley because of the government’s compelling interest “in ‘safeguarding the physical and psychological well-being of a minor.’” Osborne, 495 U.S. at 109 (quoting Ferber, 458 U.S. at 756–57). The government argues that § 1466A is constitutional as it applies to production and possession of obscene virtual child pornography because Stanley’s exception is limited to obscene materials involving adults and because Congress has other compelling interests for banning production and possession of this material. The government gives those rationales as follows:

“concern that offenders will use AI-generated obscene material depicting children in an effort to ‘groom’ actual minors into engaging in sexual acts.” 

“concern that an offender’s engagement with AI-generated obscene material depicting children will normalize the behavior and will, in turn, create increasing risk for actual children.” 

“eradicating the market for materials showing the sexual exploitation of children.” 

“rapid developments in AI technology are making it increasingly difficult . . . to know whether an obscene image depicts a real child or an AI-generated one . . . . creating the very real, very present risk that offenders engaged with material showing actual children will be immune from prosecution . . . . the photorealistic nature of GenAI material will adversely affect critical child-rescue efforts, as rescuers search for children who do not, in fact, exist.” 

Dkt. 64, at 15, 17–20. The government’s attempt to limit Stanley is inconsistent with Ashcroft v. Free Speech Coalition, 535 U.S. 234 (2002). In Free Speech Coalition, the Supreme Court considered an overbreadth challenge to a precursor to the PROTECT Act, the Child Pornography Prevention Act of 1996, which prohibited child pornography that did not depict an actual child. The Court noted that Ferber “distinguished child pornography from other sexually explicit speech because of the State’s interest in protecting the children exploited by the production process.” The production of virtual child pornography doesn’t directly harm children, but Congress “decided the materials threaten children in other, less direct, ways.” Id. at 240–41. The Court concluded that the ban on virtual pornography violated the First Amendment, for two reasons: (1) it prohibited non-obscene expression, including material potentially having significant artistic value; and (2) the government’s proffered reasons (similar to the rationales the government gives above in support of § 1466A) for restricting all virtual child pornography were unpersuasive.

The Court rejected the rationale that offenders might groom children with virtual pornography by stating, “There are many things innocent in themselves, however, such as cartoons, video games, and candy, that might be used for immoral purposes, yet we would not expect those to be prohibited because they can be misused.” Id. at 251. The government’s proffered rationale here that an offender’s use of AI-generated material would “normalize” that behavior and increase risk to children is substantially similar to the government’s argument in Free Speech Coalition that virtual child pornography might “whet the appetite” of offenders. The Court considered that rationale and stated, “The mere tendency of speech to encourage unlawful acts is not a sufficient reason for banning it.” Id. at 253. The Court also quoted Stanley for the proposition that Congress “‘cannot constitutionally premise legislation on the desirability of controlling a person’s private thoughts.’” Id. (quoting 394 U.S. at 566). It concluded that “The Government has shown no more than a remote connection between speech that might encourage thoughts or impulses and any resulting child abuse.” Id. 

Free Speech Coalition also rejected the rationale that criminalizing virtual child pornography was necessary to achieve the objective of eradicating the market for all child pornography. Id. at 254 (“We need not consider where to strike the balance [in suppressing speech related to a crime] in this case, because here, there is no underlying crime at all.”). This rationale dovetails with the government’s final rationale here, about modern-day virtual child pornography becoming so photorealistic that its existence hampers the ability of law enforcement to tell whether children in the images are real and to obtain child pornography convictions, which are predicated on the children being real. The government made similar arguments in Free Speech Coalition, and the Court made two points in response. First, that “The Government may not suppress lawful speech as the means to suppress unlawful speech. Protected speech does not become unprotected merely because it resembles the latter.” Id. at 255. Second, it rejected the government’s argument about the indistinguishability of virtual child pornography from that involving real children, reasoning, “If virtual images were identical to illegal child pornography, the illegal images would be driven from the market by the indistinguishable substitutes. Few pornographers would risk prosecution by abusing real children if fictional, computerized images would suffice.” Id. at 254.

The government urges the court to limit Stanley and Free Speech Coalition to their specific contexts, and to apply Osborne, which approved the criminal prohibition of the possession of child pornography. The government argues that “the in-home possession and production of obscene material involving children is much more like Osborne than it is Stanley.” Dkt. 64, at 16. But Osborne isn’t on point because the images here aren’t of real children. Current § 1466A narrowed the scope of prohibited material to obscene virtual child pornography. That avoids the overbreadth problem identified in Free Speech Coalition. But it doesn’t address the reasoning of Stanley, which relies on the importance of freedom of thought and the sanctity of the home. The government attempts to overcome Stanley’s reasoning with the additional element of the federal charge that the images “have been shipped or transported in interstate or foreign commerce or were produced using materials that had been shipped or transported in interstate or foreign commerce.” Dkt. 64, at 22. It states that “the law here reflects Congress’s assessment that the defendant’s possession of obscene images produced on his foreign-made laptop has a substantial effect on interstate commerce.” Id. 

The jurisdictional element doesn’t meaningfully distinguish this case from Stanley. The upshot of the government’s argument is that the First and Fourteenth Amendments protect private possession of obscene materials, but only so long as they weren’t produced using materials moved in interstate or foreign commerce. But the obscene materials in Stanley (reels of eight-millimeter film) almost certainly moved in interstate commerce too. (The jurisdictional element was not at issue in Stanley, because the case originated in state court and involved a state obscenity charge.) If the jurisdictional element were enough to overcome Stanley, Stanley would be a dead letter. See also United States v. Ostrander, 114 F.4th 1348, 1361 (11th Cir. 2024) (“But the reasoning of Osborne does not apply to virtual child pornography because there are no children victimized by these images. Free Speech Coal., 535 U.S. at 250. . . . Therefore, the First Amendment protects the private possession in one’s own home of obscene material depicting virtual minors, so long as no real children are victimized. Id. at 256.”).

The court concludes that, following Stanley and Free Speech Coalition, the court must dismiss Count 4, the possession charge under §1466A(b), because it is unconstitutional as applied to Anderegg’s private possession of obscene virtual child pornography. That leaves Count 1, the production charge under § 1466A(a). The Stanley decision explicitly discussed in-home possession only, but the Court’s discussion of the First Amendment right to privacy in one’s home could readily be extended to homemade production of obscene material too. Criminalizing the drawing of an obscene picture in one’s own home arguably isn’t compatible with Stanley’s privacy analysis. “Stanley is asserting the right to read or observe what he pleases—the right to satisfy his intellectual and emotional needs in the privacy of his own home. He is asserting the right to be free from state inquiry into the contents of his library.” Stanley, 394 U.S. at 565; see also Ostrander, 114 F.4th at 1362–63 (“[Defendant] has not demonstrated any realistic indication that the statute would actually be used to prosecute someone simply for making an obscene doodle in the confines of his own bedroom, in his home. . . . The legitimate sweep of this statute is not targeting obscene doodles and legitimate works of art in progress.”).

But the court will decline to extend Stanley to the production of obscene virtual child pornography charged here. Stanley was a narrow holding written against the backdrop of the longstanding general rule that obscenity is unprotected speech that the government may regulate or prohibit. See Roth, 354 U.S. at 492. Stanley doesn’t mention production, but focuses solely on possession. And even that right isn’t absolute after cases such as Osborne, allowing the restriction of obscene material if it is child pornography. If broader protection for production of obscenity is implicit in Stanley, the Supreme Court hasn’t recognized it over the subsequent decades. The court concludes that the private production of obscenity does not fall within the zone protected by Stanley. The motion to dismiss Count 1 is denied.
Reply
#2
I hope he appeals it. How is one able to possess obscene material in the privacy of their own home, but 1. can't get it through interstate commerce and 2. can't make it in the privacy of their own home?
Reply
#3
Illinois Constitution Article I, Section 6 states:

"The people shall have the right to be secure in their persons, houses, papers and other possessions against unreasonable searches, seizures, invasions of privacy or interceptions of communications by eavesdropping devices or other means. No warrant shall issue without probable cause, supported by affidavit particularly describing the place to be searched and the persons or things to be seized."

The Illinois AI law is unconstitutional because it violates Article I, Section 6: "The people shall have the right to be secure in their persons, houses, papers and other possessions against unreasonable searches, seizures, invasions of privacy." As I stated before, the US Supreme Court has ruled you have the right to look at obscene material in the privacy of your home.

BTW, the case United States v. Steven Anderegg is on appeal.
Reply

