Sec 2. Definitions

Section Concerns
Summary: This section defines "Commission" to mean the body described in section 3, and defines "interactive computer service" (ICS) by reference to the definition in section 230, which broadly covers all digital, interactive spaces, including websites, apps, internet service providers, and many others. A case summarily affirmed by the 4th Circuit gives a sense of the scope: "It has been uniformly held that Internet service providers are 'interactive computer service' providers. Courts have concluded that a Web site operator, search engine, or other entity was or was not a provider of an 'interactive computer service' depending on whether there was a sufficient indication before the court that it 'provided or enabled computer access by multiple users to a computer server' within the meaning of the definition found at § 230(f)(2)." (Noah v. AOL Time Warner, Inc., 261 F. Supp. 2d 532 (E.D. Va. 2003), summarily aff'd, 2004 WL 602711 (4th Cir. 2004).)

No one has ever tried to narrow the definition of ICS in court because, prior to FOSTA, the government never used 230 to criminalize conduct or increase liability; rather, 230 provided defendants with immunity. Because of this, companies have always argued in court to expand the definition of who is an ICS, since everyone wanted to be included in order to access that immunity. ICS now includes listservs, apps, chatrooms, websites, internet providers, intranet providers, and anything else giving multiple people access to a single server, including tech we haven't even envisioned yet. In short, this is very broad.

Sec 3. National Commission on Online Child Sexual Exploitation Prevention

Section Concerns
Summary: The Commission will have 18 months to develop and submit to the Attorney General recommended best practices for websites addressing all stages of trafficking of minors into commercial sex, all aspects of minors engaged in commercial sex, child sexual abuse, and the distribution of CSAM; to whom these standards would apply; and alternative practices tailored to the particulars of different digital spaces.

The mandate of this commission covers all aspects of prevention, identification, disruption, and reporting, including working with outside entities, retention of metadata and content, training of websites, age ratings and gating for content, parental controls, practices between websites and third parties, and data retention and reporting on these aspects.

The commission is asked to consider cost and technical limitations; the impact on competition, product quality, and privacy; and the impact on law enforcement.

The best practices shall be updated and resubmitted every five years.

The commission lacks anyone from the sex industry or service providers to sex workers, including internet/cam/porn performers, in-person sex workers who use the internet to advertise, or community groups who use online platforms to organize.

The Commission also lacks anyone who understands harm reduction, LGBTQ communities, or public health, including people who do outreach using the internet and who understand harm reduction in terms of the internet's particular vulnerabilities.

AG Barr, the Chairperson, has made it very clear he would like to ban encryption and guarantee law enforcement "legal access" to any digital message.

Sec 4. Duties of the Commission

Section Concerns
Summary: The Commission will have 18 months to develop and submit to the Attorney General recommended best practices for websites addressing all stages of trafficking of minors into commercial sex, child sexual abuse, and the distribution of CSAM, and to whom these standards would apply.

The mandate of this commission covers all aspects of prevention, identification, disruption, and reporting, including working with outside entities, retention of metadata and content, training of websites, age ratings and gating for content, parental controls, and practices between websites and third parties.

The commission is asked to consider cost and the impact on competition, technology, and law enforcement.

The best practices shall be updated and resubmitted every five years.

The mandate is incredibly broad and means the commission has the ability to basically regulate anything it wants without explanation.

The inclusion of “age rating and age gating systems” suggests that this could cover pretty much any content.

Consider, for example, the Department of the Treasury's FinCEN guidelines, which look for red flags for "trafficking" but clearly target the sex trade more generally and have resulted in sex workers losing access to financial platforms. This mandate similarly contains not even a consideration for sex workers, and there isn't anyone on the commission who understands, let alone cares about, the impact on them. The overbreadth of these guidelines, married to a complete lack of transparency and accountability, means that sex on the internet will be regulated by a law enforcement-heavy body whose primary purposes are surveillance, punishment, and policing.


EARN IT also offers zero accountability. There are no metrics for success, no metrics for whether there will be a disproportionate impact on any group, and no transparency measures that would let Congress even review the impact.

SEC. 5. Protecting Victims of Online Child Sexual Abuse.

Section Concerns
Summary: Similar to FOSTA/SESTA, this section creates a carve-out of Section 230 such that a website user's violations of federal CSAM law will create civil liability for that website's owner. Lawsuits can be brought using the federal CSAM law, and lawsuits and criminal prosecutions can be brought using state law "regarding" conduct related to CSAM (as that term is defined in federal law).

Federal criminal prosecutions have never been barred by 230, so federal criminal liability is unchanged. 

Use of encryption on a platform does not create inherent liability.

Similar to FOSTA/SESTA, this would turn people who are being victimized or at risk of being victimized into liabilities and high-risk groups. This incentivizes kicking people off of platforms to limit liability.

The language creating state-level liability is actually much broader than the 230 carve-out created by FOSTA. EARN IT likely creates liability under all state-level laws which use the federal definition of CSAM (whereas under FOSTA the conduct underlying the violation had to match the conduct prohibited by federal anti-trafficking law). EARN IT would give nearly unrestrained power to states to have an impact across the globe. Websites will need to be aware of all CSAM-related statutes in all fifty states, and will likely need to adhere to the strictest of them, meaning the most conservative states will control access to online material. FOSTA had a similar global impact, even in countries where trading sex is legal. This is likely to have a dramatic impact on young people's access to sex education and health materials, regardless of where they live or what their parents want.

SEC. 6. Use of the Term “Child Sexual Abuse Material”.

Section Concerns
Summary: Replaces "child porn" with "child sexual abuse material" in existing statutes. The reason for this change, or how it might impact court interpretations of the phrase, is unclear.

There is no support for sex workers when these companies are forced to look for elements of transactional sex: sex workers are routinely kicked off platforms, lose access to basic tools of communication, etc.

Sec 7. Modernizing the CyberTipline

Section Concerns
Summary: Section 7 expands the information requested to be voluntarily turned over to the National Center for Missing and Exploited Children (NCMEC) when reporting potential instances of CSAM. This includes location and identifying information about "the involved minor". Additionally, Sec 7 pushes for standardization of the formatting of this information.

It’s likely that companies are already collecting this kind of information about their users, and the requests to include it in reports to NCMEC are voluntary. When combined with section 8, tech companies would be permitted to indefinitely keep reports containing CSAM depictions and identifying information about the people depicted.
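
To make the stakes concrete, here is a purely illustrative Python sketch of what a standardized report record might bundle together. The field names are hypothetical assumptions, not NCMEC's actual schema; they only reflect the categories of data described in this section (location and identifying information about the involved minor) and in section 8 (long-term retention).

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class CyberTiplineReport:
    """Hypothetical standardized report record. Field names are
    illustrative only, not NCMEC's actual schema; the point is that
    a single record bundles highly sensitive, identifying data."""
    reporting_platform: str                # which company filed the tip
    report_timestamp: str                  # when the tip was filed
    content_fingerprint: str               # hash identifying the flagged material
    uploader_identifiers: Optional[str]    # e.g. account or IP information
    minor_identifying_info: Optional[str]  # identifying info about "the involved minor"
    minor_location: Optional[str]          # location info the bill asks platforms to include
    retained_until: Optional[str] = None   # under Sec 8, 180+ days, possibly indefinitely
```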

We have concerns about the security of such information and the harm to survivors if that information is not secure.

Private companies are not required to provide the public with the same level of transparency in their protocols that public agencies are. We worry about the lack of accountability in how companies are holding and using this information.

SEC. 8. Eliminating Network Distribution of Child Exploitation.

Section Concerns
Summary: After a tip is made to NCMEC, the data must be retained by the platform for at least 180 days, up from its current 90.

The provider of the tip may also retain the content of their report for longer than 180 days, including the CSAM itself.

This allows tech companies to retain the CSAM materials that they've reported indefinitely "for the purpose of reducing [their] proliferation." This exempts tech companies from laws against possession of CSAM, laws they are deputized to help enforce.

Our concerns about the security of these reports and the transparency in their use by private companies are outlined in section 7 above.

Further, considering that tech companies already have the capability to use PhotoDNA and/or hashing algorithms (compact fingerprints derived from an image, rather than the image itself, which can be used to identify it) to ensure images they've deleted from their sites don't get reposted, keeping the actual images seems unnecessary.
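
A minimal sketch illustrates why. The code below assumes a simple exact-match hash (SHA-256), whereas real systems like PhotoDNA use perceptual hashes that survive resizing and re-encoding, but the principle is the same: the platform only needs to store fingerprints, not the images themselves, to block re-uploads.

```python
import hashlib

# Hypothetical blocklist of fingerprints of previously removed images.
# Real systems (e.g. PhotoDNA) use perceptual hashes robust to resizing
# and re-encoding; SHA-256 here only catches byte-identical re-uploads.
removed_image_hashes: set[str] = set()

def fingerprint(image_bytes: bytes) -> str:
    """Return a fixed-size fingerprint of the image, not the image itself."""
    return hashlib.sha256(image_bytes).hexdigest()

def record_removal(image_bytes: bytes) -> None:
    """When an image is removed, store only its fingerprint; the image
    does not need to be retained to block future re-uploads."""
    removed_image_hashes.add(fingerprint(image_bytes))

def is_known_removed(image_bytes: bytes) -> bool:
    """Check an incoming upload against the blocklist of fingerprints."""
    return fingerprint(image_bytes) in removed_image_hashes
```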

SEC. 9. IT Solutions Relating to Combating Online Child Exploitation.

Section Concerns
Summary: The Office of Juvenile Justice and Delinquency Prevention will work with the FBI, DHS, HSI, ICE, Internet Crimes Against Children Task Forces, and the US Marshals to create technology solutions and tools to provide "consistent, actionable information" to law enforcement. This has a budget of at least a million dollars. It extends funding to law enforcement for internet surveillance with no explicit consideration of privacy, surveillance implications, or oversight.

The current OJJDP Administrator, a Trump appointee, has made clear that she will not prioritize racial justice within law enforcement the way other Administrators have in the past. Recently, she said that she wants to "maintain[] public safety," as though over-policing of minority communities is not itself a harm to public safety. Giving greater surveillance tools to law enforcement officials who disregard racial justice is a mistake.


Sec 10. Authorization of appropriations

Section Concerns
Summary: There are authorized to be appropriated such sums as may be necessary to carry out this Act. Standard.

Sec 11. Severability

Section Concerns
Summary: If one section of the Act is found unconstitutional, the remaining sections stay in effect. Standard.