Sec 2. Definitions

Section Concerns
Summary: This section defines that when the bill says “Commission,” it means the body described in Section 3, and that its definition of “interactive computer service” (ICS) relies on the one given in Section 230, which broadly covers all digital, interactive spaces, including websites, apps, internet service providers, and many others. A Fourth Circuit case offers a little more: “It has been uniformly held that Internet service providers are ‘interactive computer service’ providers. Courts have concluded that a Web site operator, search engine, or other entity was or was not a provider of an ‘interactive computer service’ depending on whether there was a sufficient indication before the court that it ‘provided or enabled computer access by multiple users to a computer server’ within the meaning of the definition found at § 230(f)(2).” (Noah v. AOL Time Warner, Inc., 261 F. Supp. 2d 532 (E.D. Va. 2003), summarily aff’d, 2004 WL 602711 (4th Cir. 2004).)

No one has ever tried to narrow the definition of ICS in court, because prior to FOSTA the government never used Section 230 to criminalize conduct or increase liability; rather, Section 230 provided defendants with immunity. Because of this, companies have always tried in court to expand the definition of who is an ICS, since everyone wanted to be included in order to access that immunity. ICS now includes listservs, apps, chatrooms, websites, internet providers, intranet providers – anything giving multiple people access to a single server, including technology we haven’t even envisioned yet. In short, this definition is very broad.

Sec 3. National Commission on Online Child Sexual Exploitation Prevention

Section Concerns

Summary: This section establishes a Commission tasked with developing “best practices” for websites meant to address all stages of trafficking of minors into commercial sex, all aspects of minors engaged in commercial sex, child sexual abuse, and the distribution of CSAM; with determining to whom these standards would apply; and with developing alternatives tailored to the characteristics of different digital spaces.

The Commission would be chaired by the Attorney General and would include 19 members. 11 of these members would be drawn from federal agencies, law enforcement, and tech companies, but 4 members would be survivors of child sexual exploitation or those supporting survivors, and 4 would have a civil rights, civil liberties, privacy, or cybersecurity background.

This commission does exactly what anti-violence advocates, victims of violence, and service providers have asked us to stop doing: centering law enforcement in our response to harm. It centers the needs of law enforcement and tech companies and creates a law enforcement-led response. This commission is about liability and prosecutions – not prevention and healing.

The commission also lacks anyone with expertise in sex work, harm reduction, LGBTQ communities, or even public health. Instead, the majority of the commission members are chosen by Congress – this is a political body from start to finish.

There is no expectation that any recommendations be evidence-based, no transparency around the commission, and no requirement to report on the efficacy or impact of these recommendations.

Sec 4. Duties of the Commission

Section Concerns
Summary:

The Commission will have 18 months to develop its recommended best practices, which must be updated and resubmitted every five years. 

The mandate of this Commission covers all aspects of prevention, identification, disruption, and reporting, including working with outside entities, retention of metadata and content, training of websites, age ratings and age gating for content, parental controls, practices between websites and third parties, and data retention and reporting on all of these aspects.


The Commission is asked to consider cost and technical limitations; the impact on competition, product quality, and privacy; and the impact on law enforcement. 

The Commission’s recommended best practices are published online and in the Federal Register, a journal where government rules, regulations, and notices are posted. These best practices are not binding.

The mandate is incredibly broad and means the commission can effectively regulate anything it wants without explanation.

The inclusion of “age rating and age gating systems” suggests that this could cover pretty much any content. We know that parental controls regularly block access to LGBTQ content and women’s health information. A commission led by law enforcement and tech companies is the wrong voice to center in these life and death conversations.

EARN IT also offers zero accountability. There are no metrics for success, no way to measure whether there is a disproportionate impact on any group, and no transparency measures that would allow Congress to even review the impact.

Sec 5. Protecting Victims of Online Child Sexual Abuse

Section Concerns
Summary:

Similar to FOSTA/SESTA, this section creates a carve-out from Section 230 that makes websites liable when one of their users violates federal CSAM law. Websites would be subject to both civil and criminal cases under the federal CSAM law, as well as civil and criminal cases under state laws “regarding” conduct related to CSAM (as that term is defined in federal law).


Under this section, websites cannot be sued specifically for their use of encryption or their efforts to protect encrypted data. However, if a website is sued under this section, its encryption practices can still be reviewed by a court if the court thinks they are relevant.

Federal criminal prosecutions have never been barred by Section 230, so federal criminal liability is unchanged. However, like FOSTA/SESTA, this section expands federal civil liability and state civil and criminal liability for websites, turning people who are being victimized, or who are at risk of being victimized, into liabilities and high-risk groups. This incentivizes kicking people off of platforms to limit liability.

This section’s state liability language is also much broader than the Section 230 carve-out created by FOSTA. EARN IT would create liability under many state laws that use the federal definition of CSAM (whereas under FOSTA, the conduct underlying the violation had to match the conduct prohibited by federal anti-trafficking law). EARN IT gives nearly unrestrained power to states to have an impact across the globe, both now and through state laws enacted in the future. Websites will need to be aware of all CSAM-related statutes in all fifty states, and will likely need to adhere to the strictest of them – meaning the most conservative states will control access to online material. FOSTA had a similar global impact, even in countries where trading sex is legal. This is likely to have a dramatic impact on young people’s access to sex education and health materials, regardless of where they live or what they want.

The encryption language in this section is a red herring: it looks like it provides strong encryption protections, which would preserve the online privacy of sex workers, vulnerable groups, and community organizers. However, these protections won’t apply to most of the cases brought against websites under this section, because most cases are not solely focused on a website’s encryption practices. The exception allowing courts to review websites’ encryption practices swallows the protection, meaning websites may still choose to stop encrypting their services to avoid legal liability.

Sec 6. Use of the Term “Child Sexual Abuse Material”

Section Concerns
Summary: Replaces “child pornography” with “child sexual abuse material” in existing statutes. The definition is unchanged; the switch seems intended to change how people think about this material involving minors.

Sec 7. Modernizing the CyberTipline

Section Concerns
Summary:

This section expands the types of information that parties are expected to send to the National Center on Missing and Exploited Children (NCMEC) when reporting potential instances of CSAM. This includes location data, email addresses, and identifying information about any “involved minor”. Additionally, websites are given broader legal immunity when they send CSAM and related information to NCMEC. 

This section also includes language to standardize the format of reports sent to NCMEC.
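
For a concrete sense of what such a standardized report might bundle together, here is a hypothetical sketch in Python. The class and field names are invented for illustration and mirror only the categories of information listed in the summary above; the actual CyberTipline report format is defined by NCMEC and is not reproduced here.

```python
from dataclasses import dataclass, field

@dataclass
class CyberTipReport:
    """Hypothetical report structure; field names are invented and
    cover only the categories of information named above."""
    reporting_service: str                   # website or app filing the tip
    user_email_addresses: list[str] = field(default_factory=list)
    location_data: str | None = None         # e.g., IP-derived location
    involved_minor_info: str | None = None   # identifying details, if any
    content_reference: str | None = None     # pointer to retained material
```

Even in this stripped-down form, a single report ties together contact details, location, and identifying information about a minor – which is why the security concerns below matter.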

Companies are likely already collecting this kind of information about their users. When combined with Section 8, tech companies would be permitted to indefinitely keep reports containing CSAM depictions and identifying information about the people depicted and provide a broad range of information and data about users to NCMEC. 

We have concerns about the security of such information and the harm to survivors if that information is not secure. Private companies are not required to provide the public with the same level of transparency in their protocols that public agencies are, meaning the data and materials they collect and preserve could be leaked, stolen, or otherwise mishandled. Companies need to be held accountable for how they collect and hold this information, and EARN IT doesn’t provide any relevant protections.

Sec 8. Eliminating Network Distribution of Child Exploitation

Section Concerns
Summary:

After a tip is made to NCMEC, the website must retain all the data it provided for at least 180 days, up from the current 90 days. 

The tip provider may also retain the content of their report, including the CSAM itself, indefinitely in order to “reduce the proliferation of CSAM.”

This section exempts tech companies from laws against possession of CSAM so long as they are working to “reduce the proliferation of CSAM.” In effect, tech companies are being deputized to enforce CSAM laws. For example, websites may use CSAM they find to train algorithms that will monitor the material that users post and capture material that the company thinks could violate CSAM laws. 

Our concerns about the security of the information that private companies retain and the transparency in their use are outlined in Section 7 above. 

Further, considering that tech companies already have the capability to use PhotoDNA and/or hashing algorithms (compact digital fingerprints computed from an image that can later be used to re-identify it) to ensure images they’ve deleted from their sites don’t get reposted, keeping the actual images seems unnecessary.
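
As a minimal sketch of how hash-based matching can work without retaining the images themselves (all names here are hypothetical, and SHA-256 stands in for proprietary perceptual hashes such as Microsoft’s PhotoDNA, which can also match resized or re-encoded copies):

```python
import hashlib

# Hypothetical store holding only fingerprints of removed images.
# Real systems use perceptual hashes (e.g., PhotoDNA), which also
# match altered copies; SHA-256 is used here for simplicity and
# only catches byte-for-byte identical re-uploads.
removed_image_hashes: set[str] = set()

def fingerprint(image_bytes: bytes) -> str:
    """Return a hex digest identifying this exact file."""
    return hashlib.sha256(image_bytes).hexdigest()

def record_removal(image_bytes: bytes) -> None:
    """Keep the hash of a removed image, not the image itself."""
    removed_image_hashes.add(fingerprint(image_bytes))

def is_reupload(image_bytes: bytes) -> bool:
    """Check an incoming upload against stored fingerprints."""
    return fingerprint(image_bytes) in removed_image_hashes
```

The point of the sketch is that only the fingerprints need to be retained; the underlying material can be destroyed once it has been hashed and reported.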

Sec 9. IT Solutions Relating to Combating Online Child Exploitation

Section Concerns
Summary: The Office of Juvenile Justice and Delinquency Prevention will work with the FBI, DHS, HSI, ICE, Internet Crimes Against Children Task Forces, and the US Marshals to create technology solutions and tools that provide “consistent, actionable information” to law enforcement, with a budget of at least $1 million. This extends funding to law enforcement for internet surveillance with no explicit consideration of privacy, surveillance implications, or oversight.

Sec 10. Authorization of Appropriations

Section Concerns
Summary: There are authorized to be appropriated such sums as may be necessary to carry out this Act. This is standard policy language.

Sec 11. Severability

Section Concerns
Summary: If one section of the act is found unconstitutional, the rest are not necessarily unconstitutional. This is standard policy language.