Nathaniel Burney may well have something to say about the required culpable mental states—intent for the posting, but strict liability for the lack of express consent.

In the First Amendment arena, we may run into the problem of the statute overturned by the Court: the State is not permitted to select, based on its content, some unprotected speech to forbid and some to permit.

The evil that we’re trying to eliminate is people posting nude or explicit images of former lovers online without the former lovers’ consent. At the very least, when Lucrezia shares a nude selfie with Giovanni, we want Giovanni to risk conviction if he posts the image on a revenge porn website with Lucrezia’s address and phone number to humiliate her.

The Supreme Court hasn’t given any explicit guidance on which meaning “patently” has. On the one hand, the Supreme Court, being crowded with law geeks, generally uses terms in their legal sense; on the other, the Supreme Court has approved laws that allow juries of laypeople to decide what is “patently offensive” without defining “patently.” The Supreme Court has described the thing that must be patently offensive as “a work” (rather than “an act of publication”), but it has made clear that circumstances extraneous to the work (context and time of broadcast) are relevant to the determination.

This is a risk that we have to take—for our statute to be upheld under anything resembling current obscenity law, we have to be willing to bow to the standards of the community, which means making the image’s violation of those standards an element of the offense.

So our proposed statute might have a basic framework something like this:

[The definition of sexual conduct in (A) and (B), I’ve lifted from Texas’s obscenity statute. It could be better written, but that’s not necessary for our purposes.]
The test refers to “a work,” so you might assume that the test for obscenity relates to inherent qualities of the work. But since obscenity is context-sensitive, an image that is not obscene when Lucrezia publishes it to Giovanni might well be part of an obscene publication when Giovanni distributes it in a different context.

The First Amendment problem we face is that “posting nude or explicit images of former lovers online” is speech; a statute focused on such posting is a content-based regulation of speech; content-based regulations of speech are presumed to be invalid (that is, speech is presumed to be protected); and the Supreme Court has expressly rejected a balancing test for content-based criminal laws, applying instead a categorical test.

While UH law prof Josh Blackman has said that “Invariably, the court will balance interests in First Amendment jurisprudence” and UCLA law prof Eugene Volokh has suggested that the current definition of obscenity might be expanded to encompass revenge porn, we want our statute to be constitutional here and now, rather than in some speculative world in which the Supreme Court retreats from its current doctrine. The categories of unprotected speech that the Supreme Court has recognized are narrowly drawn. If a criminal statute arguably forbids both fighting words and obscenity, then it likely forbids a great deal of speech that is neither, and therefore fails to pass constitutional muster.

While the proposed statute that I was analyzing here would not survive a First Amendment challenge, and its author’s justifications for it are undeveloped and petulant, my analysis of the idea that sexual or nude images published nonconsensually could ipso facto be obscenity was incomplete and, I suspect, ultimately wrong. We want our statute to cover only speech that fits in one of the already-recognized categories of unprotected speech. At the heart of obscenity are community standards, and a community might well find a particular revenge-porn publication obscene.