EARN IT Act Community Call

 
This community call took place on March 25th at 6:30 p.m. EST. Hacking//Hustling was joined by sex workers, lawyers, and digital security experts to talk about the EARN IT Bill.

Transcript

KATE: Sure. So the EARN IT Act is a bill that was just introduced a couple of weeks ago in the Senate. And what it does overall is kind of build on a lot of the conversations that have been happening around what oversight of online platforms looks like. And because so much of this conversation has really been focused not on the myriad types of abuse that are happening, it is really narrowly focused on things like child pornography and trafficking in the sex trade exclusively of minors. This bill is really geared towards expanding liability as a way to attack what are much more widespread, complicated and nuanced issues.

Much like FOSTA/SESTA, we're as a culture not really able to tackle a lot of the root causes, a lot of the issues, a lot of the reasons why these things happen. And instead, what this bill does, and it was drafted by some of the same people that brought you FOSTA/SESTA, is to build on that idea that if you just expand liability for platforms, if you just ask private actors to become police and judge and jury, that's the way to tackle these problems.

We know that it’s ineffective, we know that it captures a lot more people than the people we’re after, and we know that it actually doesn’t prevent it.

And so we are going to break down a little bit of how it does it, because it's a little more nuanced and complicated than FOSTA/SESTA was, but that's really what this bill is trying to do. It's taking a really serious and complicated issue that has a lot of pieces to it, but has not been tackled effectively in the 20 years since the passage of the Trafficking Victims Protection Act, and instead simply expanding liability, expanding the number of police actors, to say, well, we'll just capture everyone and try to root out the people that we're really going for, and really not caring about the collateral damage that we leave in our wake.

So, Kendra, do you want to break it down a little more?

KENDRA: So one of our initial conversations about this bill was actually that it's often quite difficult to explain. And that's actually sort of a feature of the bill and not a bug, because a lot of what it's actually trying to do is a little bit unclear and it's left up to an unelected commission of 19 people.

So the first thing to know is that Attorney General Barr, William Barr, who has expressed a public interest in getting rid of end-to-end encryption, would be the head of a 19-person committee or commission, predominantly law enforcement, tech, and members of the FTC and FCC, the Federal Trade Commission and Federal Communications Commission, those sorts of groups. And that 19-person commission is given this super broad mandate of all the things that they're expected to consider when coming up with best practices for eliminating child exploitation material.

And those best practices are supposed to cover, I don't know, it honestly kind of amounts to basically everything: prevention, identification, reporting of and retention of metadata and content, training for websites, age rating, and sort of parental controls.

And a lot of this stuff, you know, that list of topics, is geared at sexual material about minors, or child porn. However, it's not specific to that, and it doesn't restrict the commission's oversight to only those things.

So when you hear people talking about EARN IT and potential threats to encryption, even though the word encryption is not actually in the bill text, the reason people are worried about the potential effects of EARN IT on encryption is because the mandate of the commission to create these best practices includes a lot of things where folks may think that undermining encryption will serve those purposes.

Those best practices, in the newest draft of the bill, are then turned into law. So that's done through this fast-tracking procedure. So this is not, you know, if you remember your Schoolhouse Rock, this is not that. It's not just a bill. It actually, I think, if I recall correctly, follows the path that's usually reserved for things like ratifications of treaties. So it's a fast-track procedure with no oversight and no debate. So this bill basically takes these best practices from the commission headed by the attorney general, and then moves them along so Congress passes them really quickly.

And then, okay, you have these best practices for theoretically ending the availability of child exploitation material online. How do you get online platforms to follow them?

Well, that's where the EARN IT Act invokes Section 230, which is the same law that was amended by FOSTA, and it applies to ICSes, which is interactive computer services, which basically covers pretty much everything on the Internet. If it's not a static HTML page, like a flashback to, you know, 1995, it's probably an interactive computer service. Can you interact with anyone on it? Then it counts. So we're talking about stuff like chat rooms, Facebook, Jira, web tickets, and pretty much everything that you do online.

Those platforms are put in a position where they get a choice. They can either follow the best practices that the commission sets up and retain their Section 230 protections for all kinds of materials, including child exploitation material, or not follow those and sort of take their chances with much more significant civil liability for all kinds of online sexual materials.

So in effect, because these best practices can have a very, very broad mandate, the bill puts online service providers in a position where they're going to either have to significantly overpolice or comply with the best practices, which may include removing end-to-end encryption.

KATE: So essentially, just so folks know, we're going to have a slide after this for questions, so if we don't hit your question, totally feel free to ask again. We promise we're not ignoring it.

And Blunt is writing down all the questions. Thank you for that. So essentially what we're talking about is this 19-person commission, and it is basically — thank you.

KENDRA: Sorry. I got rid of the weird yellow line.

KATE: So this 19-person commission is headed by the attorney general, who is someone who has expressly said they're interested in ending encryption, not just for this but for a variety of different things; they just in general do not want end-to-end encryption.

They're going to head this 19-person commission, and this 19-person commission is going to come up with a list of best practices that are going to be the standard for websites. Websites, apps, listservs, etc.: anybody that they define this as applying to is going to have to certify themselves against it, or else they're going to vastly open themselves up to liability for civil suits. And we know how this is going to go.

And we know how this is going to go because, just like FOSTA/SESTA, we've seen this tactic before; anti-trafficking is not that creative, and so we already know what this is going to look like. This is going to look like red flag indicators, this is going to look like requirements for the information that they have to retain, where they put that information, and who gets to see that information. And the reason why we keep going back to end-to-end encryption is not just because Attorney General Barr has said he wants to end encryption, but also because a lot of the things that they're already talking about are not possible to retain or to access or to view or to scan if there is an encrypted service. So even if it doesn't expressly say you have to remove end-to-end encryption, platforms are going to have to do that in order to meet these standards and be able to do this.
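To make that scanning point concrete, here is a toy sketch (a one-time-pad XOR demo in Python, not any real messaging protocol; the names are made up for illustration) of why a platform that only relays end-to-end encrypted traffic has nothing readable to scan or retain:

```python
# Toy sketch (one-time-pad XOR, not real cryptography): the platform
# relays only ciphertext, so there is nothing readable to scan or retain.
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # XOR each byte with the key; applying it twice restores the original.
    return bytes(d ^ k for d, k in zip(data, key))

# Sender and recipient share a key; the relaying platform never holds it.
message = b"meet at the usual spot"
key = secrets.token_bytes(len(message))

ciphertext = xor_cipher(message, key)    # all the platform ever sees
recovered = xor_cipher(ciphertext, key)  # only a key-holder can do this

assert recovered == message
```

The point of the sketch is structural, not cryptographic: any "scan and report" best practice requires someone in the middle to hold plaintext or keys, which is exactly what end-to-end encryption is designed to prevent.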

And a lot of this came out of conversations where, you know, there is an organization that says, we want tips and we want red flags, and we are concerned that platforms like Facebook have this end-to-end encryption so we're going to lose a lot of the tips. So what we want you to do is scan things and then send red flags over, and we're going to tell you how much information is there. So that's kind of the background of where this bill came from.

And where have we seen this before? In a couple of places. So in 2014 there was a group that came together that is called FinCEN, that's the short acronym, and what they did in 2014 was release guidelines about red flag indicators for cyber crimes in finance and in banking. And of those red flag indicators, a number of them are very broad, but a number of them are really specific to commercial sex. Indicators include things like how much you're depositing in cash, but also just the fact that you're advertising on a site that's known for adult advertisements.

And what have we seen happen since then? Sex workers are constantly kicked off cash payment apps, losing their bank accounts, even just straight-up independent debit accounts. We know that cam sites and porn sites frequently lose access to those platforms, sometimes making it impossible for them to even pay their employees. So we already know that these kinds of indicators don't do well for folks in the sex trade, because it's this idea of, if you capture everyone, then we will just dig in. But the thing about the sex trade, and the thing about a lot of these, is once you have these private actors making decisions about their liability, not just for trafficking but also for things like money laundering, or whatever else they might deem people who operate in the adult industry to be participating in, they just might say, we don't want the liability in general, and kick them off.

Can you hit the next slide, Kendra? Thank you.

Also, we see the exact same thing when we're talking about transportation and hotels. Everyone has been seeing that Marriott is training its frontline staff to look for trafficking. And what are those red flag indicators that groups that say they are staunchly anti-trafficking came up with? It's privacy signs on the door, it's having two cell phones, it's asking for extra towels. And so they were using this, saying we are screening for trafficking, and we're using these indicators that you have determined to decide what we want to do with people, so again allowing private actors to really control and damage the lives of people in the sex trade.

Next slide, or next thing.

Then of course SESTA/FOSTA, and just like SESTA/FOSTA, this is about expanding liability and saying: now you're responsible, you get to decide what to do. And so because we're going to be talking about red flag indicators, yes, they say it's about youth and child sexual abuse material and trafficking, but think about a private company monitoring all of those. What do you think those red flag indicators are going to be? They're going to be focused on youth, and that means that young people trying to access sexual health information are going to be more at risk of getting red-flagged. Think about what it means to be underage and trying to access information about abortion. Think about what it means to be young and LGBTQ, trying to talk about what gender identity means with someone who is older, when you're clearly not going to get that kind of information in a lot of different places and maybe you want to do it anonymously.

So when we're talking about this, we've got to think through: what have these folks already done? What are the patterns that they have already perpetuated, who has that harmed, and who else do we think it's going to harm? And so what you're talking about is Internet platforms, but also listservs, websites, apps. It even covers intranets, but I don't know who uses intranets.

Reviewing all images, reviewing communications, looking for what can easily be automated, because this is so much information that you're not going to have individuals going through it. And so when we're talking about all of this, it's going to be bots red-flagging things and pulling them out. Do you really want private companies to be able to make the decision on what to do with that information? About who gets access to that information, about how they understand their liability for that information, or, if you're someone like PayPal, if you just want to be an asshole about that information? That's what we have to think about when we're talking about this.

And, you know, while the bill is really broad, while we might not necessarily know some of the details, we know the patterns that have been happening for the last several years, and we also know who is disproportionately impacted by all of this.

Next.

BLUNT: So Daly are you here, the next slide.

DALY: Hi everyone, sorry I came in a little late. I'm Daly. I work for the Electronic Frontier Foundation; I'm a staff technologist there. (Inaudible) I am here of my own volition though, not on EFF's authority.

So the EFF, in case you don't know it, is the Electronic Frontier Foundation. It's a nonprofit law firm that focuses on digital civil liberties. We spearheaded many different cases that are related to CDA 230, and an unofficial tagline whenever we take on any of these campaigns is that when you go online, we make sure your rights go with you. So as you can imagine, CDA 230 is very important to us, and any bill that weakens it, we will fight, both on activism and on legislation.

Could you go back? Thank you.

KENDRA: Technology lawyer means I can’t operate this.

DALY: Sorry. I can barely zoom either. Whatever this is, Google slides.

So as Kate laid out earlier, we've seen similar legislation come through to weaken 230, and that was SESTA/FOSTA. EFF has fought against that; we've sued to try to delegitimize it in the face of the law as a First Amendment issue, and we're still working through that. We are doing the same thing with this particular one.

And so through filing our own litigation and amicus briefs on behalf of others, we are continuously highlighting encryption as a necessary utility for digital citizens; "to a free and open Internet" is a phrase we often use.

But specifically how encryption is intrinsically tied to First Amendment issues for everyone online.

Next slide, please.

So recently, a few of the attorneys that I work with (and I helped proofread a little bit, at least the tech side of it) filed a letter to the Senate Judiciary Committee describing how and why the EARN IT Act violates the First Amendment. And I think Kendra mentioned this earlier: although the EARN IT Act never actually uses the word encryption, and Senator Blumenthal is very hard on that, reiterating, why are you bringing up encryption, it's not about encryption, I haven't said encryption, we've used the opportunity to describe exactly how this legislation undermines encryption, because it's demanding, like, clear-text authority, a clear-text view of every sort of transmission that wants to be read, by the commission that gets chosen.

There are some links there I recommend you look into, specifically the letter to the Senate Judiciary Committee, and then another article on our blog describing specifically how this is an encryption and First Amendment issue and why it is just another attack on 230 in general.

KENDRA: So you get me again, now that I'm unmuted. I'm going to talk more about why this is a problem for speech online generally, with a particular eye towards how it will affect sex workers, but we'll talk more specifically about how it affects sex workers on the next slide.

So, you know, one background thing to note as we talk about the EARN IT Act is that online platforms can already be prosecuted under federal criminal law for child sexual abuse materials. So, you know, Attorney General Barr, if he wanted to, could bring prosecutions against any of the platforms that he thinks aren't handling child sexual abuse material appropriately. He hasn't.

What this bill actually changes is making these platforms potentially liable under state criminal law and state civil law, as well as federal civil law.

So what it means to be liable under civil law is that individuals can bring lawsuits, which sounds great until you realize that often the individuals who are bringing the lawsuits against these companies are not necessarily thinking about a holistic picture of enforcement and prevention, but maybe thinking about trying to get a settlement in order to finance a lawyer's practice.

So what that means is that in effect platforms have no choice: they have to comply with the best practices, which could mean disabling encryption or backdooring it, or face significant liability, including those potentially frivolous lawsuits. We don't know what the best practices are going to be yet. As Daly said, we have some guesses just based on the way this bill has been suggested, the things that the best-practices commission is supposed to consider, but we can't say for sure exactly what they're going to contain.

They could, and probably are likely to, restrict constitutionally protected speech or disallow or discourage end-to-end encryption, but we don't quite know yet. That's why the fact that the word encryption isn't in the bill is kind of a red herring.

Even if platforms don't comply with the best practices, and they choose not to disable end-to-end encryption or make restrictions on constitutionally protected speech, they're likely going to remove content in order to reduce their potential liability. This is exactly what we saw with FOSTA/SESTA: large Internet platforms, and even smaller online platforms, will make drastic changes to their own internal moderation policies in order to reduce their risk of litigation, or to avoid being seen as not doing something.

So, you know, what's happened in that context is widespread removal of all sexual content, including stuff that's not sexual content or adult content, written by sex workers or people who are profiled as sex workers, even if it has nothing to do with sex whatsoever.

And we also suspect that what we'll see is major online platforms increasing scanning for sexual content, as well as, as Kate mentioned, you know, the red flags, trying to detect particular patterns that may be associated with sex workers or child sex trafficking in order to crack down on them, and then profiling or excluding sex workers however they use the platforms.

KENDRA: So CDA 230, or Section 230, is a law that means that online platforms can't be held liable for the speech of their users. And what that means in practice, just as an example: if someone posted a defamatory review about a local restaurant on Yelp saying they got food poisoning, it was terrible, they saw rats everywhere, and they didn't in fact see those things, you can sue the person who wrote that review, but you can't sue Yelp, because we don't think Yelp has any particular way of knowing whether that review is accurate.

The same thing applies generally to speech online. And it's a sort of cornerstone of free speech on the Internet, because it keeps platforms from being held liable for things they have no reason to know about, or speech that they have no way of knowing is true or accurate, or could get them in trouble.

So what FOSTA did, and what the EARN IT Act does, is amend Section 230 to remove some of that immunity, which results in platforms often overcompensating by removing all kinds of speech that might possibly trigger liability, which is much, much broader than what the government might be able to remove constitutionally.

BLUNT: So now we're going to talk a little bit about why this is a problem for sex workers. What this act would do is create a new threat to anyone who would be a target of the Department of Justice, under the guise of protecting children. FOSTA/SESTA was signed into law under the guise of stopping human trafficking, and we see a very similar tactic happening here with the EARN IT bill: saying that it's to protect children without actually doing anything to provide meaningful support to children who might be at risk of exploitation.

But these rules could easily be used to limit speech about abortion, queer people, harm reduction methods, criticism of law enforcement, trading sex, and so on. There are so many ways that this could be applied overly broadly.

SX: Great. So without encryption with clients and community, we are at higher risk from law enforcement. We were just kind of going over that, but pretty much, sex workers need end-to-end encryption. We need to be able to have privacy and be able to communicate privately, whether it just be our own personal thoughts or logistical planning, things like that. So to not have end-to-end encryption will not only jeopardize sex workers, but, something that I always bring up since I work at the intersection of sex work and sex tech, this would be a detriment to sex technology as a whole.

Simply not being able to talk about our sexuality and these specific topics without encryption is honestly a violation of our human rights when it comes to digital space, right? So yeah, Blunt, do you want to go to the next one?

BLUNT: Sure. So these communications will be more closely monitored by private companies. And I was seeing in the chat someone was saying that Facebook is already moderating the chats and you can't send ManyVids links. So the way that we will be monitored and policed will be detrimental to our access to these online spaces, our ability to make money, our ability to stay on these platforms that everyone else uses.

And we can talk a little bit more about the way that this would impact the sex worker community when we open it up to questions because I know that there’s a lot of community members on the call as well.

BLUNT: I'm going to read Hamid's question. Hamid from LAPD Stop Spying says, "Besides the attack on sex workers and encryption, there is clearly another assault on and criminalization of youth under the guise of youth and community safety. Historically, youth have been criminalized through the creation of gang databases and gang injunctions, and increasingly the language of extremism and radicalization is being normalized. Behavioral surveillance is central to this level of criminalization. And to add see something, say something into the mix, there may be so many do-gooders who will be snitching."

KENDRA: Just to add briefly to that: you know, one of the problems that always happens in this space is the "if you build it, they will come" idea. A lot of these surveillance apparatuses, as I know folks like LAPD Stop Spying have written about, are developed for one purpose and then used more broadly and more expansively against other marginalized communities. What might first be deployed for child exploitation material gets deployed more generally against sex workers, if that isn't already happening, and then gets deployed more broadly after that.

So there is a question about its impacts on community organizing.

KATE: I think it's also important to remember that under FOSTA/SESTA, there was an expansion of criminal law for all owners and managers of interactive computer services, so all of these platforms, for facilitation of prostitution. So even if this doesn't explicitly talk about prostitution, what it is built upon is the fact that if they are aware that they are facilitating prostitution, it could get owners and managers of those listservs, websites, and apps up to 25 years in prison. So it not only has the liability around 230, it also has the liability that was expanded two years ago under FOSTA/SESTA. So you're asking these platforms to scan this information, to scan for information that would be related to commercial sex, because trafficking is defined as both coerced engagement in commercial sex and any engagement in commercial sex by anyone under 18, so they're going to be monitoring commercial sex more broadly. So even if it is an adult, you're asking them to scan for —

(Inaudible)

BLUNT: Kate, if you can still hear us, you froze. I think we're going to move on to another question because Kate has frozen.

So someone in the comments mentioned that this would also affect journalists. I don't know if someone on the call wants to talk briefly about that. We have questions about how it affects sex tech explicitly and how it would affect journalists communicating with sources.

ERIN: Obviously it's a huge — I mean, I personally feel like it's going to be an issue if you are a journalist who, for example, uses ProtonMail or Signal or things like that to communicate with sources. I know a lot of people who, even if they aren't covering, for example, marginalized communities of various types, do that just as a point of having that safety there, and that's probably going to be gone; you can't actually guarantee that anymore. And that would be a huge issue. I mean, personally I would prefer to know that I could speak completely confidentially with my sources, especially if you're covering communities who, for example, are sex workers, or who are going in and out of carceral systems, or just in general. It's pretty concerning, and I feel like anybody who cares about privacy on any level, whether they be journalists, sex workers, civilians, people who organize, they should be concerned about what effect this could have on their efforts to do that in a way that isn't surveilled to the umpteenth degree, even more so than we already are.

I hope that explains a little bit.

SX: Yeah, thank you. So as far as the impact it can have on sex and technology, there are two things, right. So first of all, privacy is not a privilege, it is a right, so fundamentally understand that in how all technology is created. For example, Facebook actively encourages us to share personal information. That is not the way that all tech has to be created, or is created, right? So when we have things like the EARN IT Act to kind of have law enforcement come in on the back end of our conversations, that is a privacy violation that does not have to happen.

And on the other end, we have to talk about the hindrance of innovation. So when you're telling someone that if I have a private conversation around my sexuality, or sex, period, law enforcement could come in and see it, that's a really big problem. You know, I've been working with different organizations and also just kind of creating my own products, and a major thing we use is end-to-end encryption. So if you have laws in America that hinder end-to-end encryption, people are not going to create products in America; they are not going to want to come here and do any kind of things around technology. And also, when we talk about the implication for sex workers, it's just actually dangerous, you know, because if you go in and you're looking for a 16-year-old sex trafficking victim, you think they're not going to check to see all the other conversations you're having? So this is what happened after FOSTA/SESTA when we moved to ProtonMail, right, and people were still using Google accounts and things like that. Google can still look at your conversations and look at your information, but now with EARN IT, they will have active reason to, and they will have active reason to kick you off, look at your stuff, comb through the information that you're sharing on their platform. So two things: one, privacy is not a privilege, it is a right; and two, the innovation around sex tech will be extremely impacted. It already has been by FOSTA/SESTA, and the EARN IT Act will just decimate the industry here in the U.S.

BLUNT: Thank you very much for that, SX. Kate, did you want to finish the thought from when you were frozen?

KATE: I just want everyone to remember that FOSTA/SESTA expanded facilitation of prostitution federally, and none of the websites that have come down have been charged with trafficking, nor have any of the individuals associated with them. It's all been promotion of prostitution through the Travel Act: promotion of prostitution is a state law, but the Travel Act says because you're using the Internet, it crosses state lines and it becomes federal. So this is also built on an incredible expansion of criminalization of the sex trade, and it has never used or utilized trafficking laws.

KENDRA: So I saw a bunch of questions from folks about how this would affect particular technologies, like, you know, VPNs or Signal or ProtonMail, and I think there are two answers. One is sort of what Daly flagged, which is that everything that we know about what the attorney general wants to do and the commission wants to do suggests that they're going to be pushing for access to encrypted communications. But the other answer, and I realize this is unsatisfying, is that we really don't know. The way that the bill is structured is meant to move these conversations out of the democratic lawmaking process and instead put them in a commission that isn't accountable to people.

So I think, you know, with regards to particular technologies, we can make our best guess, but I can't tell you, and no one can really tell you, firmly exactly what would happen under EARN IT. So I think it's important that we be talking about how that removes people's ability to participate in a democratic decision-making process.

The other thing I'll flag is that, you know, I know there are a lot of non-sex-workers on this call, and I want to just make sure that we're highlighting the concerns of sex workers. And if you're like, well, I use a VPN or I use Signal as a day-to-day thing because it's important to me, but not to keep myself safe, let's make sure that we're highlighting and centering the folks that need these technologies to keep safe, or who are going to be primarily the ones harmed by overly restrictive crackdowns.

BLUNT: Totally. Thank you so much, Kendra. And I also want to say, after our question segment, we're going to talk about where the bill is at, as well as what our next steps are. So we will brainstorm all of that collaboratively together after the questions.

But I have three questions that I think sort of go together that I pulled from the chat; four, actually. How will this affect sex workers outside of the U.S.?

If it passes, how will we be able to communicate with clients? Will we be able to communicate with clients? What is the interaction with interpersonal violence and cyberstalking? And I think all of those can sort of be answered with a similar answer, if anyone wants to hop on that.

SX: I'll just do the sex worker one real quick. How can it impact international sex workers? For one, we know that anything that passes in America, for example FOSTA/SESTA, has a huge international reach, because we have a lot of sex workers, a lot of people, that are based in America, right.

Two, how does it impact communication with clients? We as sex workers are always survivors, so we will find a way. But think about FOSTA/SESTA, how much that drove a wedge into communications between providers and clients. Think about that on steroids, right? Think about all the complications that came from having to switch to Signal, having to switch outside of Google to go to Proton. Think about that times a million.

ERIN: And in general, sex workers know that when interacting with clients, we already have to use, you know, different names for certain things — we know we’re illegal and have to work around things like that. But basically this takes away a whole other level of protection in interacting with clients that we currently still have even after FOSTA/SESTA. And whatever happens in the United States will inevitably impact everyone globally, especially sex workers, so it’s kind of inevitable that this will have a huge — (inaudible) interacting with clients directly. And even more so, it will probably have a similar effect to how sex worker community spaces were impacted online after FOSTA/SESTA, where a lot of us were pushed off because of tech platforms reacting and going the next level, and it will just be another level of that.

So in general, whether it be interacting with clients on client-facing accounts, for example, or even just directly trying to talk to them, it will impact your ability to do that.

SX: Just to make one small little point, not to divert here, but please understand — for example, think about our political climate right now with COVID happening. Everyone and their fucking mom is being pushed to go on to these online sex working, adult sharing websites, right. So when we have any kind of attack on sex technology, these things have impact, because they give these technology companies more control over our livelihood, over how we make money and how we have access to upward mobility. So just think about that for a second as far as how this bill can start impacting sex workers in digital and physical space.

BLUNT: Thank you, SX.

RED: Hey there, Blunt, I just wanted to hop on for a second if that’s okay, since we did have a couple of questions around domestic violence related issues and the impact there, and also just gender-based violence and interpersonal violence — I saw a question around DV and IPV. Like Blunt had mentioned, we’re going to have some community-based information specifically trying to galvanize our organized communities. So if you are a part of an organization, if you’re part of support networks or doing advocacy work and this is a passion point for you, it absolutely is going to be impacting the folks that you’re working with and amongst. And I just want to say — I can’t state this any more clearly than that — imagine not being able to safely disclose to advocates, right, or to care providers who are using alternative methods of communication that aren’t necessarily under HIPAA, right?

We already talked about journalists needing to talk to folks who are perhaps survivors themselves, who may have survived acts of violence by law enforcement, right. So I think this has a lot of implications around interpersonal violence, but also, if you’ve experienced abuse from police and police terror in your communities and you’re trying to disclose that, you’re trying to get word out around that — and if you’re surviving at the hands of someone who has this kind of status or access, right — you know, a lot of us, I think, who are impassioned by this see the state as an abuser itself. So putting that into this context, and thinking about who’s accessing this information when you are disclosing what your survivor status is, is not to be lost in this conversation.

KATE: And I think the other thing that I want to throw on top of that is, especially with everyone — including, you know, caseworkers, social workers, lawyers — right now everyone is using these platforms that are encrypted. But think about what it means for your lawyer to have to get in contact with you and to have to disclose that information. Right now there are a lot of lawyers that talk to their clients through Signal because it’s encrypted. Let’s say that moves to an unencrypted space where that is red-flagged and goes somewhere else. That means that your communication with a lawyer is now owned by a private company like Facebook and possibly being sent somewhere else. So with your ability to talk to someone like a lawyer or a counselor using that platform, all of a sudden you lose a lot of the ability to communicate back and forth in a safe way with someone to whom you have to disclose that information, or else it’s really going to compromise your ability to communicate.

BLUNT: I have this question: if encryption is gone, who can see nude photos that people may send? And I think that this sort of applies to all these forms of communication that we’re talking about, if what we think is going to happen with this bill is that it creates a backdoor for law enforcement to read encrypted messages.

And what that means is that if a backdoor is created, any bad actor can then access that door. So your information is less safe. It doesn’t mean that just anyone would be able to see it; what it means is that it’s more vulnerable.

Does anyone else want to add on to that?

Are there any other questions? I think I’ve read most on the list. If I’ve missed your question, feel free to enter it again.

How will this impact the battle against revenge porn?

I actually — I think that it also makes non-sex-working folks’ personal intimate content easier to hack and easier to gain access to. So I actually think — I believe some of the revenge porn activists were for FOSTA/SESTA, but I think we’re going to see a lot more opposition to EARN IT because of the broadness and the vagueness of this bill and the way that it impacts the way that we communicate, like using WhatsApp and Facebook and state phone.

SX: So to answer that question: I got to ask Norma Buster, who is the client relations manager for the Carrie Goldberg law firm — they specialize in stalking, pervs, and kind of revenge porn. What they had to say is that Section 230 doesn’t provide immunity from federal crimes, and revenge porn is a federal crime already. So I think, like Blunt said, we will see more support from these revenge porn organizations in being against EARN IT, because it’s a bit different from previous things that have been presented. But yeah, it’s kind of confusing what it has to do with revenge porn, because it’s already very illegal. So I guess that’s from —

KENDRA: SX, just one quick tag onto that: I don’t think there is a federal revenge porn statute, but you’re 100 percent right that there have been cases where federal prosecutors have successfully used federal criminal law in order to prosecute people using revenge porn websites. Your point totally stands — I’m just a lawyer and I can’t resist the urge to nitpick, so forgive me this one.

KATE: And there’s also an addition of different civil laws, but all of them really relate to sexually explicit material that involves minors, so not necessarily revenge porn — so it wouldn’t expand civil liability in a way that reaches revenge porn specifically.

BLUNT: And I’m seeing a question here about how this act could possibly be used by law enforcement, not just in sting operations but to criminalize more citizens that otherwise wouldn’t be considered criminals.

Yeah, and I think just everything would be — would have the potential to be monitored. The government wants to see what all of us are talking about, and who opposes the government, and what activists are doing. And also, a lot of forms of sex work aren’t criminalized but would still be policed and surveilled with this legislation.

KATE: Yeah, I think one thing that’s really important is that when we talk about criminalization, we also have to talk about kind of the expanse around criminalization. So it’s not just about you send a text and you’re going to jail. It’s also about being kicked off platforms, it’s also about losing your account, it’s also about the ways that we marginalize and surveil and control and chill the behaviors of a lot of different folks. It is one thing to say, all right, well, you’re still going to be allowed to share information that is sexual health related. It’s a very different thing to say, but it’s going to be owned by a private company who is going to have decision-making power over that, who is going to be able to control a lot of the information, like metadata and geolocation information, and record everyone that sends that and possibly send that to a third agency.

You know, when we — yes, we can talk about these things in terms of like very specific criminalization, but as everyone who works in the sex trades or around sex workers knows, criminalization begins long before anyone gets handcuffs put on them and extends long after anyone gets a charge.

BLUNT: Thank you, Kate.

I’m reading a question that says: I’m a former sex worker that did work privately once I got into my career. Should I be worried about online accounts that I utilized for talking to clients — could those be accessed, and what steps should be taken to protect ourselves? So I think this question is about past content online.

KENDRA: Sorry, I was dodging that one in chat because it’s a really good question and I don’t know the answer. Which is to say that it’s somewhat difficult to predict exactly what you should worry about under this bill. So I don’t feel like I have a great answer for you, other than that, generally, efforts to pass bills like this do increase the chances that you may have to worry about past messaging.

I see a question, Blunt, do you mind if I take the one about libraries and fast-tracking?

So I see a question about libraries, and about whether fast-tracking will sort of circumvent legal challenges. I think that that is likely — like, the fast-tracking procedure is part of this in order to circumvent legal challenges, or sort of appeals from people who might be able to put the brakes on things. I also think, you know — so the fast-tracking procedure is actually a new addition to the bill. As Riana Pfefferkorn from Stanford — sorry, I think I just totally butchered your name — has written about, the previous draft of the bill didn’t actually include the fast-tracking provisions, but they were added when civil society pushed back and said, hey, you’re not even passing these best practices into law. So I think that, you know, it is both an effort to limit debate, but also an effort to make the bill slightly more constitutional — which I think I appreciate, at the same time that I’d rather it not get passed.

BLUNT: I’m just taking a second to look at questions. We’re going to be talking about how to get people mobilized outside of just the sex worker community as well and the way that we can sort of extend our organizing tactics in a little bit, so we’ll get to those questions in a minute. I’m just going to look for other questions.

Someone asks: is there anything in the bill that will make the concept of decentralizing a platform null and void, given the methods being used for censorship?

KENDRA: So, you’re rapidly going to become tired of my “I don’t think we know” answer, but I do have a slightly better answer for that one, which is that, you know, the newer suggestions for what to consider as part of the best practices include the development of reasonable measures that take into account the type of platform. So, you know, decentralized platforms may be under slightly different obligations with regards to content than more centralized platforms.

On the other hand, it may be possible that the best practices that are promulgated won’t actually be possible for decentralized platforms to comply with, just because they don’t necessarily have the ability to block content in the same way. So, you know, it’s difficult to predict exactly what the consequences would be, although it is heartening in some ways that the more recent draft of the bill does sort of spare a second to think that platforms are different and the same thing won’t work for everybody.

BLUNT: Great. So another question: what about sites like OnlyFans, Patreon, Chaturbate — how will EARN IT change or affect those or similar sites?

So I think what we’ve seen in the last few months is a bunch of leaks from websites like this, happening now without the EARN IT Act. When sex workers are unable to use the traditional payment processors or website hosts that folks in other industries are able to use, it means that we’re pushed sort of to the margins, to use these servers, these platforms, that don’t have as good security practices — and that’s what I think is behind a lot of the leaks that we’ve seen. And I think that the EARN IT bill would only make this significantly worse, especially since sex workers and activists are some of the primary targets of attacks and doxing.

DALY: Can I comment on that, if possible, just from a security point of view? Like, yeah, off the top of my head I can think of breaches and leaks and hacks from different sex platforms over the last couple of months. As we’ve said throughout this, we don’t know yet how — we keep saying backdoor, or unencrypted versions — we don’t know yet what that will look like, just that it needs to be provided to these special committees. And opening that up in some way is just going to provide a huge security concern. So for platforms like Chaturbate or, you know, etc. — put on your hacker hat for a second and try to imagine the almost limitless possibilities of attack vectors there are in this situation where we don’t know yet how the information has to be presented. So if you are a performer on there, or maybe just a registered user of any kind, perhaps your registration information is suddenly readily available in clear text to everyone, all the way down to like banking information, all the way down to actual communications between you and other users on those platforms.

BLUNT: Thank you, Daly. Sorry SX, go ahead.

SX: I don’t — I’ll go ahead. I just wanted to take a moment and talk about kind of the ethical, kind of philosophical impact this could have on the sex technology forums and the way sex tech is created. So as we know, Chaturbate and many others — all these platforms are owned by cis white tech dudes, these big powerhouses in the technology realm, and they hold a lot of resources.

So when you pass bills like EARN IT, all it does is encourage them to comply and work with law enforcement. And what that can also do is have them not actually respect the performers and the work that they’re doing, and not actually give them rights.

So when we think about how the interfaces and websites and navigation of these sites are built, we have to talk about the overall sex tech industry and the financial technology industry. When you have bills like EARN IT, it does not encourage these platform founders to actually treat people the way they should be treated ethically — and the same goes for the money exchange and the power dynamics that happen. So this is a privacy and kind of political issue, but it’s also a psychology and philosophy issue as well, with how we ethically create technology.

BLUNT: Thank you, SX.

We had one clarifying question. Daly, would you mind talking very briefly about what a decentralized platform is, just so we can make sure we’re defining it since we’re talking about it?

DALY: Sure. So I guess let me first touch on what a non-decentralized platform is, or a centralized one. So Instagram, for instance, is just an application, or a piece of software, that everyone engages with and uses, but it’s all contained and owned by a private company, one of the big ones. And then a decentralized platform — they’re often open source pieces of code, or at least an open sort of protocol that gets used, meaning you can take an instance of it, where you get to use like a clone of the application or software, and you get to host it yourself and use it in any way you like, for whoever you like, wherever you like, however you like, etc. — I mean, however within boundaries, right?

So think of — I think someone in the chat earlier mentioned Mastodon. Imagine Mastodon as like a version of Instagram: Instagram is this huge monolith you have to use, and then there is Mastodon, which can be like a clone that anyone owns and uses. And often, depending on the type of protocol or whatever it’s using, maybe those different clones or nodes or instances can communicate with each other, or they’re completely closed and it’s just a closed community. Yeah. I hope that makes sense.

BLUNT: I think that made sense.

So I think we’re going to move on to the next slide so we can stay on schedule. Could someone actually first summarize the EARN IT Act in like one to three sentences, which I think is a great idea before we move into how the EARN IT Act will affect you.

KATE: Sure. So right now we’re going to ask some folks in the chat — we’re going to give you some prompt questions one by one, and we’re going to save this chat. It’s not going to be viewable on the video, but we’re going to save it, and, especially as we start talking about and thinking through next steps, really pull from people’s experience about what your fears are when it comes to the EARN IT Act.

So the EARN IT Act establishes a federal commission to create best practices around child pornography and trafficking that websites and platforms will have to certify against, and by doing that, surveil all communication on those sites for possible red flags.

So that is the EARN IT Act, soup to nuts. And our concerns are really that their mandate is incredibly broad, that there is zero oversight or accountability, and that the things that they have to consider are not related to harm reduction, impact on users, or collateral consequences — literally, it says you have to consider cost and you have to consider its impact on competition. So it’s not actually thinking about people. And there are no metrics for success, there are no metrics for failure. So what the bill does have written into it, by implication, is that no one cares what the unintended consequences are.

So we’re going to start asking some questions that we would love for you to respond to in the chat, and over the next five or six minutes we’ll read some out loud so it’s not just empty space for you to stare at our beautiful faces. We would love for you to think about and talk a little bit about how you think this would impact you and your life.

So Kendra.

So first and foremost: do you intentionally use encrypted platforms? Why, what for, why is that important? We’ll give folks a minute to just type that out. So folks are saying they use it for sex work, for organizing, for social justice. Oh, it’s all coming in at once. To chat, to communicate for work, for organizing. It’s mandatory for talking to sources — I’m guessing that one is a journalist. It protects conversations with clients, with fellow sex workers, with loved ones; confidential communications.

Yes, to buy drugs — we definitely have to talk about that. This is going to affect a lot of people who really utilize these encrypted platforms.

Next question.

So what would it mean for you to lose these platforms and how would this legislation affect your community?

Someone said: I use encrypted platforms to keep myself safe, not just from criminalization but from abusers; to organize harm reduction; to organize groups; to do anti-ICE work.

KENDRA: Fuck ICE.

KATE: Yes, defund ICE.

BLUNT: And I see people saying that it would make them more scared to organize and to engage in even like calls like this.

KATE: Also seeing people saying the risk of deportation, people getting even increased charges, being come after because of increased surveillance and policing, the impact of having your sex working life and your straight or civilian life blur together, not being able to have those kinds of boundaries as well.

Worried about travel and being questioned at the border, being outed and profiled.

BLUNT: Someone said stress and fear, which I think are very real. Stress definitely — and fear definitely chills speech, so it would just change the way that we talk and the way that we communicate.

KATE: Not being able to protect my more vulnerable friends, losing community, not being able to communicate securely with survivors.

I would speak more in code with my clients which breaks down my ability to uphold personal boundaries…

Absolutely.

Loss of access to the online sex worker community I rely on for safety and support. Chilling speech about boundaries. We have someone writing from a mutual care organization, saying that, like, this would compromise the integrity of people being able to ask for funds with them.

Uncomfortable speaking with my therapist online.

Let’s go to the next question: have you had problems with websites taking or holding your personal information? This isn’t just about end-to-end encryption; this is also probably going to increase the amount of information that websites ask of you, where they hold it, how they hold it, who they transmit it to.

Wow someone here disclosing that they’ve been doxed on popular stalking platforms, people naming Eros, people naming a number of platforms that have already come for them.

Yeah uploading an ID is absolutely going to be on that list.

More people that have been doxed and outed, more people who have been doxed and outed.

BLUNT: People who have their content leaked from adult companies. Social media websites requiring verification.

KATE: Organizers being doxed.

BLUNT: Being identified as a revenge porn victim publicly. And leaks being a huge concern.

KATE: Let’s pull up the next question. So: what concerns do you have about the bill? And I know this is really broad. We’re also going to send out these questions afterwards and give you some different ways to share more of this information. So if something comes to mind, if you’re still thinking about it, if you don’t necessarily want to put it in the chat and have it only go to organizers — all of that is going to be available to folks.

RED: The political implications of this bill, yes. What is this going to mean for our ability to dissent, our ability to organize? Right? The political power that we all hold as folks who are working or hustling, right — this is SX. The concern that this is being pushed kind of quietly — yeah, absolutely, there is not a lot of talk about it right now, especially given the COVID-19 crisis and pandemic.

The concerns around already being shadowbanned: how can we get this information out to others if we’re already experiencing these kinds of platform banishments?

BLUNT: I’m seeing concern about the bill being snuck in amidst a pandemic, fear of not being able to have a big enough community to successfully oppose the bill.

KATE: Well, that is a beautiful way to segue into the next slide. So where is the bill right now? So EARN IT is Senate bill — oh, sorry.

KENDRA: Sorry. Can I just pause for one second.

KATE: Yeah, of course.

KENDRA: I just want to take a second to thank everyone who shared in chat, because there were a lot of hard, really scary, really different things people shared — concerns about being doxed, being outed. I think I speak on behalf of all the organizers when I say that we’re grateful that you shared that with us in this context, and we’re grateful you chose to spend time telling us about it before we sort of move to the next part.

KATE: Thank you so much for that, Kendra. Really appreciate you taking a moment for that. Yes, thank you so much to everyone. We have a lot of gratitude, not just for this community, but for folks coming together, really wanting to push back against this, and being willing to put that out there. So thank you, folks. We have so much gratitude for you.

So first and foremost I want to say that on this slide, we will fix it, that is the absolute wrong bill number. It is 3398 and I apologize for that.

So EARN IT, was introduced — thanks guys, sorry, EARN IT was introduced into the Senate only a couple weeks —

KENDRA: Kate, do you want —

KATE: Yes, please. So it was introduced into the Senate a couple weeks ago. It has been introduced and put into committee, which is the first step after a bill is introduced. It had a hearing in that committee only a couple weeks ago. And it is looking to be pushed very hard by — thank you — by its sponsors, who are Senator Graham and Senator Blumenthal. Senator Blumenthal might be familiar to you if you followed SESTA/FOSTA, because his staff wrote SESTA.

So it has been introduced.

The next steps in the legislative process are that it would have to be voted out of committee; that has not happened yet. I have not heard anything about it going to that stage. But it has been introduced, it is sitting in committee — sorry, we’re fixing it again.

It is sitting in committee right now, which means we know who the targets are — we know the very short list of folks who are responsible for voting on that bill next and talking about that bill next. So we know exactly who we need to call.

It is — so a lot of our next steps are going to be wrapped around those targets in particular in the Senate. For a bill to pass through Congress, it has to pass through both the Senate and the House. It has not been introduced into the House right now, and they are shopping it around. So what’s really important to know is that we still have time to contact folks, to contact staffers, to have an impact on this. And, oh — it is in the Senate Judiciary Committee.

In the House, it might go to two different committees; those are details that I promise you we will put into our follow-up about who the targets are in the House. Because even though they don’t yet have it, and even though we are in the middle of a pandemic right now and possibly losing our government, they are still individually calling House reps to try to get them onto this bill.

So that’s what our next steps are going to be built around. So yes, they are shopping it around in the House, which means they are trying to identify the right people to introduce it in the House to get it to move in the way that they want. Even in the middle of a crisis, we know that staffers are making personal phone calls to try to get this in front of people. So we know that they are trying to, if not move it, at least build as much support as they can.

And one of the challenges that happened with FOSTA/SESTA was that by the time the staffers got phone calls, they were like, we signed on to this bill like six months ago, we can’t vote against a bill that we’re a co-sponsor on — come back to us next time, please keep talking to us. And so — if you click to the next slide — we’re going to talk about next steps. Our next steps are that we’re going to be sending you a bunch of stuff as a follow-up to this call, and trying to make it available to more people. We’re going to be sending a survey asking a little bit more about how this is going to affect you — and Blunt, if you want to share a little bit more about that.

BLUNT: Yes. We will be sending a survey so that folks who feel comfortable sharing more publicly, in a quotable form — it can be anonymous, pseudonymous, I cannot say that word — can answer basically the questions that we just covered. We’re sort of hoping to gather information, as well as community-source information, on the way that folks want to see this movement that we’re building around the EARN IT bill progress, where we think we can build power, and what communities we can have access to. So I will be dropping that link right now. We have a social media power hour planned for April 2nd, so stay tuned to Hacking Hustling on Instagram and Twitter for more information about that. And we’ll be sharing, in the next few days, scripts for suggested tweets and posts, and a few memes as well.

And I also just want to say again that we’re so thankful for everyone sharing in this chat and taking the time to be here. We were just totally blown away by how quickly we got 300 RSVPs, and I think that speaks to how much fear there is in the community after FOSTA/SESTA was signed into law, and how little information there is out there about the EARN IT Act. I think that we can really take this time and opportunity to build power together and oppose this legislation. So I’ll now drop the link for the survey into the chat, if you just give me one moment.

KATE: Yeah, with that information we’re also going to be doing the social media power hour. We’re going to ask everyone to just take ten minutes — here is our big ask. We would love for people to engage in the social media power hour. We would love for people to call their senators and call their reps; we have a short list of target senators, and a slightly longer list of target reps in the House. And it’s a little tricky, because it’s not very typical to make phone calls on a bill that hasn’t been introduced yet, but we know they’re trying to get folks onto it, so we’re going to be sending out that list.

We’re going to put out information about where they’re from — for senators, there are two senators that cover each state; for House reps, you have one rep for a more localized area. So we’re going to put out information on those senators and on those reps. If you are in a state where you are totally solid — your senator, your rep is going to be fantastic on this — awesome. We would love for you to find one person to ask who is from one of those target locations and say, I really, really need you to make this phone call. So definitely make your three phone calls — each of them takes all of like two minutes — and then we would love for you to say, hey, you have a really important rep, and we really need you to call and let them know that this is an important bill that is probably coming through their staff right now. And yeah, it says trafficking, yes, it says child porn, yes, it says law enforcement, and a lot of people are going to sign on based on that. But please just remember that it’s really important for those conversations to happen, for people to stop and pause before they sign on, and to make sure that this doesn’t move forward with a lot of those keywords being the thing that people focus on.

BLUNT: Awesome. Thank you so much for that, Kate.

So I am going to send out — yes, so answering the questions: we’re going to post the video recording as well as the transcript for this, so folks who are hard of hearing or deaf can also learn from this call, and we will tweet that out. It will be on our website at the link that I sent. I’m sending out the SurveyMonkey again. It really will take like ten minutes; if any of you feel empowered to share more publicly what you shared in this group chat, that would be totally amazing and we would be super grateful. We ask for suggestions for tweets for a Twitter storm, suggestions for hashtags, as well as for you to share a little bit more about how you use encrypted services and how the EARN IT bill would affect you. And, yeah, that would take ten minutes, and we will e-mail everyone on the call when we have our social media power hour scripts, to help organize that. And thank you again. Thank you so much.

KENDRA: Thank you everyone for joining.

BLUNT: We’re two minutes early. I’m very impressed with us.

KENDRA: I literally did not believe that was possible.

BLUNT: People showed up half an hour early, and we’re finishing two minutes before time. If anyone has any questions — the Hacking Hustling website is going to be updating the EARN IT page which I shared, and which I’ll share again as we continue to mobilize around this bill.

SX: Yes, SX Noir here thank you all for joining, and I am available for any journalistic quotes or any kind of conversations you want to navigate outside of this conversation. And as always, be thoughtful.

KENDRA: I’m going to stop the recording now.

ERASED: The Impacts of FOSTA-SESTA and the Removal of Backpage

Ariel Wolf and Danielle Blunt present their findings from their research, ERASED: FOSTA-SESTA and the Removal of Backpage, at Harvard Law School.

Legal Literacy Training

Yves, Lorelei Lee, Kendra Albert, and Korica Simon present on the First Amendment, Section 230, Patriot Act, and the ways in which fear creates a push for state surveillance and the impact that has on our community. We contextualize this history in the context of FOSTA-SESTA and the EARN IT Act.

Transcript

DANIELLE BLUNT: Hi, everyone. I am Danielle Blunt. I use she/her pronouns. I’m with Hacking//Hustling, a collective of sex workers and allies working at the intersection of technology and social justice to interrupt state surveillance and violence facilitated by technology. Do we want to go around and do introductions first?

KENDRA ALBERT: I can go. Hi, everybody. My name is Kendra Albert. I’m an attorney, and obliged to say that none of this is legal advice. I work at the Harvard Cyberlaw Clinic as a clinical instructor there, and also do some work with Hacking//Hustling, and my pronouns are they/them. I’m super excited to be here with y’all.

YVES: Hi, I’m Yves. I use she and they pronouns. I’m an organizer with Survived & Punished New York, which is an abolition group. And with Red Canary Song, which is around supporting migrant sex workers.

KORICA SIMON: Hi. I’m Korica Simon. My pronouns are she/her. I’m a law student in third year at Cornell. I got involved with sex worker legal rights when I was in Seattle, and I worked for a nonprofit called Legal Voice. And I also got involved through a clinic at Cornell called the Gender Justice Clinic. So I’m excited to be here today and talk to you all more about the subject.

LORELEI LEE: Hi, everyone. My name’s Lorelei. I’m an activist and I work with Red Canary Song, as well as Hacking//Hustling, and have worked with these folks on issues of surveillance and tech‑related violence that’s impacting people in the sex trades for the last few years. So I’m really excited to be in conversation as well!

DANIELLE BLUNT: Awesome. And Yves, do you want to kick it off with a little community report back? That’d be a great place to start.

YVES: Hi. So, yeah! I’m just going to talk about a little bit about how we got here, and then a little bit how it’s affected mainly the sex working community, but also in general.

So what we’re mainly talking about, or what I’ll mainly be talking about, is around contact tracing and public health, like, uses for surveillance. And then also SESTA/FOSTA and the EARN IT bills.

So these have all increased policing, and I don’t just mean the police department, but also through citizen surveillance and deputizing not only just like people, but also deputizing nonpolice department agencies and companies. Right? This includes a lot of different people, and this really like increases the scope of policing and criminalization in a way that it wasn’t previously. And the way that this has happened is, like, structural violence that has led to so much harm to already marginalized and stigmatized communities.

So kind of the reason why this is happening the way that it is is because a lot of the conversation around surveillance belies a powerful moralism that resists evidence and logic. We see this happening post‑9/11, where you get this idea of, if you see something, say something. Which sort of creates these hypervigilant crusaders for both antiterrorism and anti‑trafficking, which is an issue that is often tied up with sex work.

So when we see this happening ‑‑ the surveillance has increased in such a way that we’ve seen this happen before? Like, this predates all of these sorts of things. Right? They’ve used contact tracing before to criminalize… different marginalized communities and sex working communities for a really long time. With HIV/AIDS we see as an example, right? They ask you, when you get tested for HIV, they’ve also criminalized the spread of HIV period, right? But when you go into a clinic and you are asked, you’re gonna be asked if you test positive, or if you think that you have any STI, right, but especially HIV/AIDS, they’re going to say, who have you had sex with? Like, name those people. In what timeframe did you have sex with those people? We want to contact those people; how do we get in contact with them. Right? Whether that’s from someone directly, or from you yourself when you go in to get tested. They’re getting that information, and the truth is that information doesn’t just stay at the health clinic, doesn’t just stay where you think it stays. Right? That information gets passed on to the police, to these different agencies like CPS, right? And that leads to the criminalization of a lot of people, not just sex working people. Right? Also people who are profiled as sex workers. They use that information to be like, oh, you’re selling sex. So then the cops are going to come knocking on your door. And this similarly happens with COVID, right?

So we’re seeing what’s happening with COVID is they’re using contact tracing in a very similar way. And also, if you’ve seen the public rhetoric around this, right, the way public health officials and government officials are talking about it, they say that sex workers are high‑risk people, right, to have COVID. So if you are working in a massage parlor or on the street and you come in to get tested for COVID, they’re going to be like, oh, how did you get contact ‑‑ how did you get this? Who were you in contact with? Or if someone you know has got COVID and goes to get tested, they ask who were you in contact with, and you tell them, oh, I went to this massage parlor or met them on the street, that’s a way of criminalization. The truth is they are not just out to treat you, right? They are going to turn over that information to the police. This happens to spread the scope of policing in so many different ways, not just in the sex working community. Right? In a lot of different communities, they use these exact same surveillance techniques. We should think of surveillance as a strategy that exists within the larger frame of policing.

Also contact tracing, like, these different methods of policing are used to marginalize other communities, who are not sex workers but are targeted as if they were anyway? We see this targeting protesters, recently. We see them surveilling a lot of communities in this way. How did you get this? Oh, you were at a protest last week? Who was there? This all happens in the same scope, and in all of these different agencies that have then been deputized. Right.

What happens when you have these situations ‑‑ with SESTA and FOSTA and EARN IT, we wouldn’t directly think of them as surveillance? Oh, they’re censoring and taking down these sites; that’s not directly surveillance. But that’s not exactly true. A lot of these bills are also collecting information, right? Because they’re putting these sorts of laws in place to go against sex trafficking? So they tell you, these are the indicators of sex trafficking; look for these indicators. But really, those indicators aren’t indicators of sex trafficking most of the time, and they also get applied to sex workers and other people, and people who are profiled as sex workers. So it’s also a lot of data collection that is happening in order for them to censor, shadowban, and do all of these things anyway. And that data is also being turned over to the police to be used.

But also, we generally see the kind of attack that happens to sex work coming in from all ends. Where like, so SESTA and FOSTA, we saw this. Right? This is just what happened already. People are pushed to work in more dangerous ways because you can’t go on Backpage, you can’t go on Craigslist ‑‑

DANIELLE BLUNT: Can I stop you for one sec? That is amazing. Thank you so much for that. I just want to take a moment to ask for definitions on what is contact tracing and just, like, anyone who wants to jump in a one‑sentence summary of FOSTA‑SESTA and EARN IT, just so folks are on the same page from the forefront.

YVES: Well, I can like expand a little bit on contact tracing, right? So when an epidemic occurs, right, which I brought up HIV/AIDS, how they kind of figure out who might have it so that they can “treat them,” right, in the best case scenario, this information would not be used to police people. Right? But that’s not what happens. But, like, contact tracing is when they try to figure out who has had it, or who gave it to you, and like ‑‑ or who you could have possibly given it to, right, in order to stop the spread, so you can get those people into treatment centers or in treatment.

So if I, for example, was to go into a clinic, and I was like, I have chlamydia. Right. And they’re like, okay, so you have chlamydia. Who did you have sex with before this? Who did you have sex with after you thought you might have shown symptoms? And then you like list them off, like, okay, I saw Blunt! I saw Kendra! I saw Lorelei! I saw all of these people! And then they’re like, oh, did that person tell you that they have chlamydia? Did that person tell you that they have HIV/AIDS, da da da da da. We see the criminalization of HIV/AIDS like similarly with the criminalization that’s happened with COVID, where they’re criminalizing directly the spread of COVID and HIV, but also generally, right? They use this information to be like, oh, who did you get this from? And ideally, they wouldn’t police people, but what ends up happening is they ask you all of this information, and then they kind of pick out those indicators to be like, you’re a sex worker. Like, you’re selling sex. Right? So we’re gonna like show up, and we’re going to arrest you. Right.

I hope that that makes sense.

DANIELLE BLUNT: Yeah. And I also think, too, with the protests that are going around, I think that there’s a lot of contact tracing that’s being done with like Stingrays and cell phone tracing, which I hope some other folks can talk about in a little bit. And I would just love like a one or two‑sentence summary of FOSTA‑SESTA and EARN IT before we go into them in more detail.

KENDRA ALBERT: Um. I will do my best. Lorelei is watching me with an amused look on their face.

So, FOSTA and SESTA are laws that were passed in 2018 that greatly increased the incentive ‑‑ F‑O‑S‑T‑A and S‑E‑S‑T‑A. Thank you, Blunt. That greatly increased the incentive for online service providers, folks like Facebook or Twitter or Craigslist, to remove any content related to sex work or possibly attributable in any way to sex trafficking or related to sex trafficking. And we can talk a little bit more about how specifically they did that, but that’s like the one‑line top‑level summary.

EARN IT is a pending bill in front of Congress right now that is meant to do some similar ‑‑ basically, engages in some similar legal stuff around child sexual abuse material, making it ‑‑ incentivizing companies and online platforms to be more potentially invasive in their searches for child sexual abuse material by creating more liability if they’re found hosting it.

Lorelei, how was that?

LORELEI LEE: Great. I think that was great. It’s very ‑‑ I mean, FOSTA‑SESTA has a lot of parts, and so it’s… But I think what you are talking about is the most important part, which is the impact of it and what it incentivizes.

And the one thing I would add about EARN IT is I think what EARN IT will do that is similar to FOSTA‑SESTA is that it will incentivize companies to remove all information related to sexual health and anything that teaches youth about sexuality.

DANIELLE BLUNT: Thank you.

KENDRA ALBERT: Very upbeat.

DANIELLE BLUNT: And ‑‑ yeah. (Laughs) We’ll be getting into those a little bit more. I have one more question for Yves: Is turning health info over to the police doable via legal loopholes, like HIPAA, or is that happening in the shadows?

KENDRA ALBERT: I can also take that one, if you prefer. So HIPAA ‑‑ HIPAA, which is the U.S. health care privacy law, federal health care privacy law, explicitly has a carve‑out for so‑called “covered entities,” health care providers, turning over information to law enforcement. So it specifically says, you don’t need to get consent from people to turn their information over to law enforcement. So HIPAA doesn’t prevent that.

You know, another thing ‑‑ and actually I think this ties in really nicely to some of the stuff we want to talk about a little bit, like the Patriot Act, which expanded the surveillance powers of the U.S. government and was passed in 2001, right after 9/11 ‑‑ is a lot of times, even if there isn’t an explicit request from a law enforcement agent or even a public health entity for information, say, one that went through appropriate legal process, there are often legal regimes that encourage what’s called “information sharing.” Which just basically means that they try to eliminate privacy or other reasons that information might be siloed between different parts of the government or different governments, like federal, state, local. So even if, you know, you don’t have law enforcement knocking on the health provider’s door with a request for information, like a subpoena or whatever, there’s often these efforts to kind of standardize and collect and centralize these forms of information.

Yves, do you have anything ‑‑ do you want to add to that? Did that feel like an adequate summary?

YVES: I feel like that was very clear. Yeah.

KENDRA ALBERT: So I think ‑‑ Blunt, do you mind if I keep going to talk about Patriot Act stuff for a second? So I think that, you know, one of the things that I think is worth noting is you see a couple general trends, in surveillance. And I’ll also let Korica talk more about some particular ways this plays out in particular communities. But just on a super high level. We can see this sort of movement from the Patriot Act to now, where A, we see more requirements around information sharing. One big critique of the U.S. government’s… I hesitate to say intelligence‑gathering apparatus with a straight face, but! And what I mean by that is the CIA, the NSA, the sort of intelligence agencies, as opposed to more traditional law enforcement agencies like the FBI or police. Was that they were gathering all this data, but they weren’t sharing it in ways that were actionable across multiple agencies. So when the Patriot Act was passed after 9/11, one of the goals was to make it easier for agencies to share information. I think that’s a general trend that’s happened, post‑Patriot Act to the current moment, where we see things like fusion centers and other ways to collect surveillance data from multiple ‑‑ and Palantir’s databases, and ICE’s data collection… Data that’s collected across multiple methods of surveillance and putting it together to gain more information about the lives of individual people.

Obviously, this has dramatic effects on sex working populations, often because, A, often specifically criminalized and over‑surveilled? But also, you know, if… Often, information that is innocuous, sort of not raising a red flag on its own, when combined with other information can suggest more specifically what kind of activities folks are engaged in or who they’re spending time with.

The other thing I want to highlight about the Patriot Act is surveillance after ‑‑ well. Two more things. I’m trying to be brief, but the lawyer thing is everything comes in threes, so I have to have three things I just want to highlight about the Patriot Act. Okay! So number two is that you see these particular surveillance tools originally deployed for what’s considered very, very important law enforcement activity. So originally, a lot of this stuff was talked about in the context of antiterrorism work, and that’s what the Patriot Act was about. But over time, these law enforcement tools sort of “trickle down,” for lack of a better term, into more day‑to‑day enforcement activities. So we’ve seen this a lot with something called the sneak‑and‑peek warrant. Which is actually a term that, at least Wikipedia tells me, the FBI coined, not anti‑surveillance activists? It’s kind of funny that the FBI thinks that’s a good description of what this thing is. But basically, traditionally, if someone gets a warrant for searching your property, it’s basically a document where you go before a judge and you say, here’s why we want to search, and the judge says, okay. You said where, you said why you’re allowed to. I’m going to sign off on this, and you police can go search that person’s house, for example.

So traditionally, you know, if you’re at the house, and police show up, you can ask them to see the warrant. And say, hey, I want to confirm that this is the warrant that allows you to search my house. What a sneak‑and‑peek warrant does ‑‑ this kind of sounds like bullshit, but this is what really happens, right? ‑‑ is actually allow the police to set up a ruse? Like, to get you out of the house, go into the house, and sort of search your stuff. And actually, one of the contexts in which we’ve seen this really recently, and one of the reasons I connect this back to the Patriot Act, is the surveillance of massage parlors in the Robert Kraft case in Florida. What the police actually did is get a sneak‑and‑peek warrant and claim there was a bomb threat or a suspicious package at a nearby building. All of the folks who worked in the massage parlors had to be away from the building for their own safety, and the police went in and put cameras in.

And we know this because they tried to criminally prosecute Robert Kraft, and Robert Kraft had enough money to hire a legal team that was able to challenge the validity of the sneak‑and‑peek warrant that they used to surveil the massage parlors and the people working there.

So, when those warrants were included in the Patriot Act, there’s nothing in there about human trafficking investigations, let alone the stuff that happened in Florida, which actually ‑‑ there’s no human trafficking prosecution coming out of it. Right. From what I read, and others may know better than I do, so I’ll cede whatever claim I have to the truth there, but. It doesn’t look like human trafficking was involved.

So sneak‑and‑peek warrants weren’t written into the law for those sort of investigations, let alone surveillance of prostitution, but these information technologies ‑‑ and here, I include technologies in the computer sense but also in the ways governments do surveillance. Once they get written into the law, their use often gets broadly expanded to new populations, new circumstances. It’s sort of like, you might as well use it.

I had a third thing about the Patriot Act. But… I guess the third thing I’ll say quickly, before I stop, is that I think one thing you’ll see a lot of in discussions about surveillance reform, and especially how it sort of fits into conversations like we might have here about sex work, is sort of an inherent trust that like procedures are going to save people. (chuckles) Which, I’m really skeptical of, just sort of personally? But you know, if you look at sort of what happened, you know, between the Patriot Act and now… So, there was ‑‑ there’s a thing called Section 215 of the Patriot Act, which basically functionally allowed the National Security ‑‑ the NSA to search people’s call logs to see who was calling who. And this was a ‑‑ there was like a legitimately robust debate about this. But one of the sort of reform methods that was actually put on the table and passed as part of the… USA Freedom Act, in I want to say 2016? Don’t quote me on the date. Was actually, they were like, okay, great. Well, the U.S. government can’t hold this giant database of call data anymore. They can’t see who you called and when. But they can go to the phone company and ask them.

And like, yes, that is better. I’m not gonna ‑‑ I would rather they have to go and ask Verizon nicely before they get the call data. But functionally, I’m like, that’s not ‑‑ that’s not safety. Right? And I think that, you know, when we see a lot of the surveillance ‑‑ some types of surveillance reform activity, especially post‑Patriot Act, we’re not even getting close to back where we were pre‑Patriot Act. We’re sort of, like, trying to kind of tinker around the margins, slash maybe add a teeny bit more process. That’s not going to help the folks who are most criminalized and most surveilled.

Anyway. That was a lot from me, so I’m going to stop. Blunt, do you have another question you want to tee up, or folks want to react to any of that?

LORELEI LEE: I have a question, actually. What is a fusion center?

KENDRA ALBERT: Um. Well, so I’m ‑‑ I’m gonna do my best. It’s been a little while. But… Ha. Despite the weird, kind of futuristic name, it’s basically where all the cops get together. (Laughing) Yeah. Different types of law enforcement often have different beats. So one of the goals of fusion centers is to combine information and share policing and surveillance information from different law enforcement agencies. The one I’m actually most familiar with is the one outside of San Francisco. And there’s been a lot of ‑‑ I don’t want to erase this ‑‑ there’s been a lot of really amazing research and activism against fusion centers, often and primarily by communities of color. But usually, it’s where the San Francisco PD and the Marin County PD, which is the county north of San Francisco, and the BART police ‑‑ BART is one of the public transit organizations ‑‑ would all share information and sort of share tips. And they were a result of the attempt after 9/11 to deal with what people saw as this problem of all this information being siloed.

DANIELLE BLUNT: Awesome. Thank you so much, Kendra. Korica, I would love to hear from you if you feel like now’s a good time to chime in.

KORICA SIMON: Yeah. So I can speak a little bit on the history of surveillance. So, as Kendra stated earlier, marginalized communities have historically been affected, have been victims, of government surveillance. And surveillance does have roots in slavery. So in 1713, New York City passed lantern laws. These laws were used to regulate Black and Indian slaves at night. So they ‑‑ if you were over the age of 14, you could not appear in the streets without some kind of lighted candle so that the police could identify you.

And we’ve seen this same thing recently with the NYPD, where they are shining surveillance floodlights in Black communities. And we saw that increase after they received a lot of criticism over stop and frisk. And people in those neighborhoods were reporting that they could not sleep at night. Like, the lights were just blinding them. And Simone Browne has written a lot about this subject and the ways in which light has been used to surveil people. And I believe they also write about technology as well, and how we’ve moved to that side of things.

So in regards to technology surveillance, one of the most well‑known abuses of surveillance by the government is COINTELPRO. I think it stands for counterintelligence program. It was basically a series of illegal counterintelligence projects conducted by the FBI aimed at surveilling and discrediting, disrupting, political organizations. So the FBI in particular targeted feminist organizations, communist organizations, and civil rights organizations. And the government’s job was basically to do whatever they could to just like disband them and get rid of them by any means necessary. And they mostly did this through, like, wiretaps, listening in on people’s phone calls, tracking them down, as well as having informants involved as well.

And as a result of this, quite a few people were murdered or put into prison. Some Black members of the Black Panther party are still in prison. And… Two of the most talked about people who are victims of this are Martin Luther King, Jr., as well as Fred Hampton, who was drugged by an FBI informant and then murdered by Chicago police. But also Angela Davis has been a victim of this as well. And again, we know that these practices are still continuing today. So we kinda got into the protesters and how they’re being surveilled. And I think it came out in 2018, 2019, that Black Lives Matter activists were being watched. Their activity was being watched on the internet. And now we have seen recent reports that protesters today are being watched, as well, either through body cameras, cell site simulators, license plate readers, social media, drones, as well as just cameras in that area that may use facial recognition technology that could help the police identify who the protester is and get access to your social media accounts.

So these are all, like, issues that are happening today as technology increases. We’ve only seen it get worse. And we know that marginalized communities are the most affected by this. If they use this on Black, Native, Latinx, immigrant communities, they’re also going to use this on others as well. Sex workers, and sex workers mostly fall into marginalized communities. So.

I don’t know if I should talk about the third part right now, or if I should wait? ‘Cause it’s a little bit different, but… Okay. I’ll just go ahead. (Laughs)

So, kind of transitioning a little bit. The Third Party Doctrine is a doctrine that comes out of two Supreme Court cases, United States v. Miller and Smith v. Maryland. And what they state is if you voluntarily give your information to third parties, then you have no reasonable expectation of privacy. So third parties include your phone company, Verizon, Sprint; e‑mail servers, if you use Gmail; internet service providers; as well as banks. And so that means that the government can obtain your information from these companies without having a warrant. So they don’t have to have, like, probable cause that you’re doing something in order to get access to this information.

And the Supreme Court’s logic behind this decision was that, well, if you tell someone something, then you’re giving up your privacy, and like you can’t expect that that will stay private forever. What ‑‑ I should also back up and say that these cases were decided in the 70s? So. Not today, where like our whole life is on the internet, and we are constantly giving third parties our information. And actually Justice Sotomayor, she has suggested that she would like the Court to rethink the third party doctrine, because it’s just a completely different time today. A lot of us use our GPS, and we wouldn’t think that ‑‑ I don’t know. That they could just share all of our information without us knowing.

And I will say that if you’re ever curious about like how often the government is requesting access to this information, some companies, like Google, I think Facebook, and Sprint, they do report this. I know Google reports it under transparency reports. And you can see how often the government has asked them ‑‑

KORICA SIMON: Oh. Well, hopefully, they’re still doing it, and you can see. I think it’s roughly a hundred thousand people a year. But we don’t know, like, what the result of that is. It’s honestly probably a lot of people who aren’t doing anything at all.

And so we’ve also seen, like, some people starting to move their e‑mail accounts from using Gmail to e‑mail servers that care a little bit more about privacy and that are more willing to fight these requests from the government.

And then I’ll also say the last thing is that the government can also request that these companies, like, not tell you at all that they’ve requested this information. So… This could be done completely in secret, as well. So.

KENDRA ALBERT: So. I think ‑‑ I want to just sort of flag some stuff that Korica said and sort of highlight certain parts of it, and I want to contextualize a little bit of this. I think, you know, often ‑‑ we’re sort of talking here about privacy from law enforcement, and the primary source of privacy from law enforcement in the U.S. is the Fourth Amendment, which is so obvious Korica didn’t say it, but I’m going to say it just in case it isn’t obvious for other folks. And, you know… One thing worth noting about the Fourth Amendment, for folks who are concerned about the relationship between all of these legal doctrines and their actual lives: I want to contextualize for folks that having Fourth Amendment protection, or saying, oh, the U.S. government violated the Fourth Amendment, only gets you so far. Because if what you want is the government not to have access to that information, the horse has already left the barn, to use the right metaphor. Which is to say that most of the remedies that come from, you know, unconstitutional searches and seizures, or unconstitutional requests for information, are just about that information not being able to be used against you in court. Which is of very limited value if what you’re concerned about is the safety of yourself or your community, of not getting folks arrested, or if you don’t have access to the kinds of representation and resources that would allow you to go through a legal battle and you’re going to plea out the second that you get arrested.

So, you know, I always want to caution any story I tell, or any story we tell, about the importance of constitutional rights in this area with a little sort of real politic about what does it mean, or real talk about what does it mean, to have access to these kinds of rights.

The other thing I want to flag is ‑‑ what Korica said is 100% correct as a matter of doctrinal law. There is a weird thing that has happened with regards to government access to data, which is that a lot of the larger online service providers, and in this I include Google and Twitter and Facebook, actively will not provide certain types of information absent appropriate legal process. So it’s actually legally contestable whether the government can get access to your e‑mail that’s stored on a server without a warrant. It has to do, in certain contexts, with how old the e‑mail is, and whether it looks like it’s been abandoned on the server? The statute that covers that, the Electronic Communications Privacy Act, was passed in the ’80s, and boy does it read like that! Like, good luck!

But point being, Gmail will require a warrant to get access to your e‑mail content. That’s great, except that, you know… If your e‑mail content gets turned over and then you then want to challenge it, you’re still in the same place you were previously, which is that the government has access to the e‑mail, and that could mean serious consequences for you independent of whether it’s later admissible in court.

So part of what we’re talking about with legal literacy in this space, what I want to encourage folks to think about is, okay, how do I keep me and my community safe independent of the legal remedies? Because oftentimes, those legal remedies aren’t acceptable to everybody. Just realistically and very obviously. And/or will sort of help you after the fact? Maybe it means you recover money. Or maybe it means the evidence isn’t used against you in court. That doesn’t help very much if what you’re concerned about is the safety of you and your people.

So making sure that, like, we don’t pretend these remedies will make people whole for the harms they experience from surveillance or from the government. But rather that, you know, some of these protective measures that folks can take are about sort of trying to prevent the types of harm that the surveillance might cause in the first place.

Oh, one more quick note is that a relatively recent Supreme Court decision has suggested that the government does need a warrant to access your cell phone location. So that was like one little bit of good in a sea of terribleness that is the third party doctrine.

LORELEI LEE: I think ‑‑ so I think I’ll respond to a little bit of that to say that I… you know, I’m thinking about the connections that are between what Yves has been talking about, and then what you folks are talking about, in terms of… the way that information gets used against you that isn’t really cognizable in the law, but once they have your information and have you on their radar, they use that information to get more information, to follow you, to trace your contacts, and happening in multiple different contexts. And… Something that I think is really important that people don’t think about all the time is that everyone breaks the law. And… (Laughs) And it is people who are criminalized. It is ‑‑ so, we think of criminalization as being about behavior. But it is really about people in communities.

Because everyone breaks the law, and the only folks who are targeted by police ‑‑ and I mean state police, federal police, et cetera ‑‑ like, those are the folks who get punished for breaking the law. And that punishment can be… you know, because they have followed you for a certain period of time in order to collect enough information in order to make something cognizable in the law.

So I’m thinking about how one of the themes of what we’ve been talking about is the deputizing of folks who are not thought of as law enforcement agencies, but whose collection of your information becomes a way of enforcing behavioral norms. And that happening in a way that is ‑‑ goes beyond what criminal law can even do. And so… Thinking about sort of the modern history of how this has happened in law that ‑‑ in some of the laws that we’ve been talking about, some of the federal laws that we’ve been talking about. So, the Patriot Act, one thing to think about in terms of the Patriot Act is how prior to 9/11, Congress had been considering passing some form of regulation for internet companies and data ‑‑ regarding data collection. And Kendra, please jump in if I am messing this up. Or anyone, obviously. (Laughs) But… When 9/11 happened, that regulation sort of was pushed to the side. And it is during this time period that we have this sort of ‑‑ we have the increase in government surveillance, but we also have this sort of recognition by private corporations that data collection is something that can be monetized. And they are unregulated in doing this, and there is this idea that if you have given over your information voluntarily, it belongs to those people, regardless of whether you did it knowingly or not.

So we have this sort of rise of data collection that is a creation of surveillance tools by corporations, and there’s sort of a monetary incentive to keep creating stronger and stronger data collection tools that can be more and more invasive and do this kind of contact tracing that does the thing that Yves has been talking about, that you folks have been talking about, where you don’t ‑‑ each piece of information looks innocuous, but when you put it together you have a map of who you are, and that’s especially concerning for sex working people because they identify sex working people based on this collection of seemingly innocuous information. And you have the incentive to create tools that are more and more effective at collecting that information. And then you have the partnership between government and private companies that then allows that information to be used in order to enforce norms that are… expected by the state, that are thought of as beneficial to the state. And, obviously, targeting people in the sex trades at a high rate. And especially people in the sex trades who are parts of ‑‑ part of other marginalized communities.

And so FOSTA and EARN IT are just sort of… I think we talk about FOSTA a lot as though it is a, like, a revolutionary law that was passed, as though it made huge changes in the law. And it, you know, it did make a change to one specific law that I think people thought of as being sort of dramatic. That’s Section 230. However, it is ‑‑ it really was just an evolution out of stuff that had already been happening. So FOSTA’s purpose, and EARN IT’s purpose, as well, one of the main purposes of both of these laws is to decrease limits on civil liability for internet companies. And you can think of that as being Congress sort of taking out, taking themselves out, taking their responsibility away from themselves in terms of regulating these companies and putting, deputizing civilians to do that regulating for them, and deputizing corporate ‑‑ also deputizing corporations to create… rules and collect data that is thought of… (Sighs) Or is publicized! As being some, having some impact on trafficking and sexual exploitation and sexual violence? But all of that being simply… a show. And actually increasing sex workers’ vulnerability to exploitation. And when you decrease our ability to communicate with each other, when you decrease our ability to be visible online, when you decrease our ability to share information, health and safety information, you increase folks’ liability ‑‑ sorry, folks’ vulnerability, to violence and exploitation.

And I notice that someone asked earlier whether EARN IT had a piece about not prohibiting sexual health information for youth. And it doesn’t, at all. But what it does is increase civil liability so that it incentivizes companies to draw the line further than the law specifies. And that is the same thing that FOSTA does. So, these laws ‑‑ because civil liability can be ‑‑ right, anybody can sue. You know… So, think about ‑‑ in terms of criminal law enforcement, that ‑‑ you need specific resources. Like, the police and the FBI ‑‑ policing happens on all of these different levels, state and federal. I mean, they do have ‑‑ obviously, this has been, you know, this has been in public conversation for ‑‑ especially recently, is how many resources these folks do have. And it is an obscene amount of resources. However! It is still less likely that you will be subject to criminal prosecution than that as a company you will be subject to a lawsuit. And the lawsuits also have a lower requirement in court in order to have liability. Like, for civil liability, you have to show less in court than you do to prove criminal liability.

And so when you increase civil liability, you incentivize companies to draw the line much further than the law specifies… because they want to get rid of even the appearance of liability, and even, you know ‑‑ because also, lawsuits are expensive, regardless of whether the claims can be proven or not! So, that’s ‑‑ I’ll stop there.

DANIELLE BLUNT: I wanted to make sure that we take a few minutes to sort of talk about what FOSTA‑SESTA and what EARN IT are amending. So Kendra, I would love just like a two‑minute summary of Section 230, and then Lorelei, if you wanted to sort of continue with what ‑‑ like, what EARN IT is, and what EARN IT’s proposing, and why the over‑‑‑ how ‑‑ and the ways in which it’s so overly broad that things like queer sex ed could get caught up in it.

KENDRA ALBERT: Yeah. And I think actually I want to sort of tag on to the end of what Lorelei was saying, ’cause I think it ties perfectly into a discussion about Section 230, which is to say the sort of what we lawyers would call “commandeering,” but the use of private companies to do things that the government… isn’t sure that it has the political capital or will to push forward? It’s not just because they, like, can make it happen using civil liability. It also is much harder to bring a First Amendment challenge to, like, companies deciding “voluntarily” to over‑enforce their own rules. Which, they’re not bound by the First Amendment. Versus the government making a particular rule, which would be susceptible to a First Amendment challenge.

So I can talk a little bit more about that, but I just want to make that point about what Lorelei is saying. Which is delegating these responsibilities to private companies is not just better from, oh, you can kind of throw up your hands and claim no moral responsibility for what happens, but also it limits the ability of individuals who are harmed by these sort of changes to legal regimes to effectively challenge them.

So let me talk about 230, which I think will help us conceptualize the stuff, and then we can jump back to SESTA and FOSTA and EARN IT.

So Section 230 of the Communications Decency Act was passed in 1996…? I’m really bad with years. Anyways! Passed in 1996. And it was originally part of an omnibus anti‑porn bill, that had everything ‑‑ that was supposed to restrict minors from seeing porn on the internet. Everybody in the 90s was real worried about porn on the internet. And… It turned out that most of that bill was unconstitutional. It was struck down by the Supreme Court in a case called ACLU versus Reno. But what was left was this one provision that hadn’t gotten a ton of attention when it passed called Section 230. And what Section 230 does is it says that no online service provider can be held liable as the sort of ‑‑ or, treated as the publisher of content that someone else, like, sort of spoke online.

Okay. What the fuck does that mean? So, let’s take a Yelp ‑‑ let’s take Yelp, for example. On Yelp, there are Yelp reviews, posted by third parties. So if I post a Yelp review of my local carpet cleaner. I always use them, because there’s a funny case involving carpet cleaners. Um. Anyway! I post a review of a carpet cleaner who I have not used. They have cleaned zero of my carpets. I say, these people are scum bags. They ripped me off. They told me it would cost $100 to clean all my carpets, and it cost me $3,000, and I got bedbugs. So that’s inflammatory. They could potentially sue me, because it’s not true and it harms their business.

What Section 230 says is the carpet company can come after me, Kendra, for posting that lie, but they can’t sue Yelp. Or if they do, they’re going to lose. Because Yelp has no way of knowing if my claim is true or false.

So that’s the original meaning of Section 230. It’s gotten broadly interpreted, for lots of good reasons.

So right now, there are something like 20 lawsuits against Twitter for facilitating terrorism, all of them thrown out on Section 230 grounds. The one that is most relevant to our conversation right now is a case out of Massachusetts called Doe v. Backpage, which was brought by a number of folks ‑‑ survivors of trafficking against Backpage.com, for what they claimed was complicity and sort of knowledge of the ads that were placed on Backpage that they were harmed as a result of. And the First Circuit, which is a sort of… one step below the Supreme Court, in terms of courts, said: That’s all very well and good, but Backpage isn’t the speaker of any of those ads. They didn’t write the ads. They don’t know anything about the ads. We’re throwing out this case. And in the aftermath of that, Congress was like, this shall not stand! And passed FOSTA and SESTA. And I’ll turn it over to Lorelei to talk more about that.

LORELEI LEE: I think I’m curious what the audience’s questions are about FOSTA‑SESTA, because I think there’s been quite a bit of information written about them, about that law, and I wouldn’t want to just talk on and on about it when it’s not focused on what people want to hear. Or should I talk about EARN IT? Or ‑‑ I don’t know, someone tell me ‑‑

DANIELLE BLUNT: I think we did a summary of FOSTA‑SESTA. I would like another one‑sentence summary of FOSTA‑SESTA, the impact. And then what the fuck EARN IT is and where it’s at, would be helpful.

LORELEI LEE: Yeah, so we can talk about why Section 230 matters to these laws.

DANIELLE BLUNT: Yeah.

LORELEI LEE: So FOSTA‑SESTA does several things, does like six things in federal law, including create a new crime under the Mann Act, which was originally the White‑Slave Traffic Act of 1910, and it’s been renamed, but it’s not better…? (Laughing) Oh, boy. So it creates new crimes. That’s one thing that it does. But it also changes Section 230 so that it no longer protects against lawsuits under federal law regarding specifically the federal anti‑trafficking law, 1591 ‑‑ 18 USC 1591, in case anyone wants that number. Um. (Laughs)

And so that, what that does… There ‑‑ it’s not clear that Section 230 was actually preventing people from suing companies, specifically Backpage. Backpage was the center of congressional conversations and the center of media attention. And… The clamor was that Backpage was ‑‑ that they were not able to be held accountable for trafficking that was happening on their website. But actually, the First Circuit case maybe… just didn’t have enough evidence yet to show how actually Backpage could have been held responsible regardless of Section 230, because Backpage was doing things like editing ads and that kind of thing that would have made them liable in a way that’s unlike Yelp.

And so… So, okay. But! People started talking about Section 230 because there was a documentary made about the first circuit case, and it was very well‑publicized, and that documentary was shown in Congress. And people started talking about Section 230 as though that was the thing preventing lawsuits.

I mean, another important piece to remember about making civil liability the place where we enforce anti‑trafficking law and anti‑exploitation law is that it puts the onus on victims of trafficking and victims of exploitation to bring those lawsuits that are very expensive, that ‑‑ lawyers for those claims are inaccessible. You have to spend years talking about your trauma. And! You know, it takes such a long time to get ‑‑ if you are going to even get compensation ‑‑ and then, at the end, you get monetary compensation if you win your lawsuit. But ‑‑ it doesn’t prevent trafficking! And it doesn’t prevent exploitation. And we know that there are other methods of doing that that are much more effective. And many of those methods involve the investment of resources, I think ‑‑ I think this is one of the reasons that this is happening, is that many of those solutions involve the investment of resources in marginalized communities. And instead, Congress wants to pass bills that don’t require the redistribution of wealth in this country.

So EARN IT does something similar to Section 230. In the same way that FOSTA makes a carve‑out in Section 230 around the federal anti‑trafficking laws, EARN IT makes a carve‑out in Section 230 around the child exploitation laws, specifically child sexual abuse material laws. And, similarly, when Kendra and I did this research, there haven’t been a lot of examples of Section 230 preventing those claims being brought. So, again, it feels a little bit more like this is for show than anything else. But we can predict that the impact that EARN IT will have will be very similar to the impact that FOSTA had in terms of the removal of information online and the censoring of people online. And the ‑‑ not just the removal of information, but the prohibition on specific people talking.

And we think that, based on our, like, analysis of EARN IT, that that impact will be really on queer youth. So, because that’s a specific community for whom there’s a lot of fear around sexual information being shared, and it’s also a specific community who is seeking that information out! Because, I mean, just being ‑‑ having been a queer youth once myself! I know that, like, you just don’t ‑‑ you just don’t necessarily have access to folks when you’re a kid who can tell you that you’re okay and you’re normal.

DANIELLE BLUNT: Yeah. Thank you for that, Lorelei. And I wonder, too, if Yves and Korica, if you have anything that you would like to add before we open it up to the Q&A.

YVES: I mean, I think that… you know, y’all covered it pretty well, right? Like, I think that like everybody covered a lot of stuff. I mean, I’ve been looking at the questions, and I also only really have, like, a little bit to say in terms of… you know, the way that we see a lot of this happen, right? We’ve obviously talked about criminalization; we’ve talked about censorship, and kind of what happens. But specifically, in talking on this panel around the impact on sex workers and marginalized communities, like the ways that we really see a lot of this happen, and the push for this, right? Like, whenever there’s an increase in surveillance, like that increases the scope, it’s going to increase the scope of policing, and also the general stigmatization. Right?

So we don’t ‑‑ like, I think everybody kind of knows that I’m, like, most knowledgeable in terms of the impact on in‑person sex work. But when we also look at the impact these things have on like digital sex work, that has gotten so much more popular during these times, right, we also see that all of these groups, and like ‑‑ the groups that are behind these bills to begin with, right? Are pushing for other forms of censorship or limitations being put on not just sex workers, but other marginalized communities, but also, like, you know, we know that these intersect. We know that there are intersections here. They ask, like, credit card companies to not accept payments for sex workers. We’ve seen that happen, right? And like, in terms of like in‑person sex work ‑‑ and Lorelei talked about this, right? People get pushed into the most dangerous forms of sex work, making them more vulnerable. In fact, making them more vulnerable to the human trafficking that these people are so against, and make everybody so much more vulnerable to all of these things, which we like kind of talked about. And I think it’s kind of important in talking about the questions that I’m kind of seeing about, you know, what do we do? Because I feel like a part of our conversation kind of scares everybody into being like, oh, my gosh, I should just not use social media! I shouldn’t even text! (Laughs) Which, I don’t want people to think that that is the case? Right? I think that, like, all of these different encryption methods, and these things, right ‑‑ although! Right? I do not think that they’re foolproof, which they’re not! Like, there are still many ways in which the police and like other agencies can get access to this information, one of those ways being like whomever you’re sending the information to, and wherever that information kind of ends up. If you like, you know, sync it to your laptop, sync it to your phone. All of these different ways, right?
But these are tools that can protect you.

So I think that if you want to use these encryption methods, these like ‑‑ Proton Mail to encrypt your mail, iMessage, that’s a good thing. Take what you have. But know they’re not foolproof.

I also want to talk about ‑‑ Kendra kind of talked about this and the reforms and what they look like, and how we think we’ve solved this, or people think they’ve been solving it? Obviously, I came into this conversation, I told everybody I’m an abolitionist. Right? I work with abolitionist groups. I think at the end of the day, surveillance is a strategy that they used in policing before this technology existed. Before they started doing this stuff, they always surveilled. It’s just an arm of policing. At the end of the day, the problem is policing, policing that has always been used and meant to be used to disappear communities. Right?

So the bigger fight that we’re fighting is policing. So I don’t want people to think that the fear is, oh, you shouldn’t do anything. The truth is, if you’re a marginalized person, if you’re a sex working person, they are going to want to police you, no matter what, and we’re fighting against that. (silent applause)

DANIELLE BLUNT: Thank you so much for that. Korica, did you have anything that you wanted to add?

KORICA SIMON: I will say I went to a conference. It was a Law for… Black Lives? I think is what it’s called? And they do things around, like, the law and liberation of Black people. And the speaker talked about how, like, have you ever noticed if you lived in a Black neighborhood, or a person of color, people of color neighborhood, police are everywhere? But if you live in white neighborhoods, police are not there. And it’s not because there is more crime in one area over another. In fact, like, police just make the crime worse. And I thought about that a lot, ’cause I’ve lived in Black neighborhoods. I’ve lived in white neighborhoods. And there is a stark difference. And there is still, like, “crime” happening in the white neighborhood, but nothing ‑‑ like, police weren’t there. So I think it is important to think about how policing is the problem.

And then one other thing I forgot to bring up is that before this talk, I was doing like research on what’s been happening lately, ’cause I feel like there’s just always so much happening. And something that I missed was that some police departments, like the NYPD and the Chicago Police Department, have been putting like ads… sex ads on websites, and people will text that number looking for services. And they will ‑‑ the NYPD will send them a message saying ‑‑ I have it pulled up. It’ll say, like, this is the New York Police Department. Your response to an online ad for prostitution has been logged. Offering to pay someone for sexual conduct is a crime and is punishable by up to seven years of incarceration. And… Yeah! So, people in that article were talking about how the police have access to their name and their phone number, and they don’t know like what’s gonna happen to them. Like, are they just logged in some database? And I think it’s safe to assume that they probably are logged into some kind of database. And I think, as we think about how ‑‑ as Yves said, how sex work is becoming even more digital with the time that we’re in, like the impact of this on sex workers, I’m guessing it’s gonna be really large. So, yeah. I just wanted to bring up that extra way that surveillance is happening.

LORELEI LEE: I actually wanna add one more thing that I intended to say and forgot to say, which is just that in terms of this question of what we do, I do think that one other point to make is, like, when they’re passing these laws, something else they’re doing is deputizing us to police ourselves and to chill our own speech and to prevent us from organizing and to prevent us from using any tools at all to communicate with each other and to talk about these issues. So, I don’t know. I do think that the… It is a mistake for us to use this as a reason not to… speak to each other! I mean, that is like really what we’re talking about, when we talk about not using online tools and other electronic tools of communication that are…

DANIELLE BLUNT: Yeah, it’s really interesting, too. And right now, some of Hacking//Hustling’s researchers are wrapping up the survey that we were doing on content moderation and how it impacts sex workers and activists who are talking about Black Lives Matter over the last few months. And like, we’re definitely noticing themes of speech being chilled, just like we did with our research on FOSTA‑SESTA, as well as the impact of, like, platform policing on both social media and financial technologies has just about doubled for people who do both sex work and are vocally protesting or identify as activists. And… The numbers are just… very intense.

So like, I think… Being mindful about how we communicate, rather than not communicating, is a form of harm reduction and community care. And I also see this ‑‑ this panel as a form of harm reduction and community care, and this in partnership with our digital literacy training. Because I do believe that the more that we know, the more we’re able to engage meaningfully when legislation like this comes up. And… Like… A lot of these laws aren’t meant to be ‑‑ aren’t written to be read by the communities that they impact, and they’re often intentionally written to be unintelligible to the communities that they impact, and they think that they can just, like, get them signed into law without ‑‑ without having to check in with the communities that are harmed by this legislation.

So I think that anything that we can do to better understand this and decrease that gap between the people who are writing this legislation, or the, like, tech lawyers who are opposing this legislation, and like bringing in our own lived experiences? Is incredibly important work.

KENDRA ALBERT: I also ‑‑ well, I know we want to ‑‑ well, I’ll stop. Blunt, you want to do Q&A?

DANIELLE BLUNT: Sure, we can do Q&A. If you had one thing to add, that’s fine.

KENDRA ALBERT: So one thought I had there is, one, it’s totally right? But it can feel like oh, my god, there’s so much? That’s one of the hard things with talking about surveillance. It’s like, yeah! You know, police and law enforcement have so many tools at their disposal, and… You know? But at the same time, like, our ‑‑ we care for each other by, like, creating space to talk about what makes us feel safer, and how can we make ‑‑ take risks that we all agree to be taking? Right? Risk‑aware consensual non‑encrypted information.

DANIELLE BLUNT: I love that! (Laughing)

KENDRA ALBERT: It rolls just right off the tongue.

But I think I want to highlight what Yves was saying, in terms of the problem is policing? And I think one of the ‑‑ and Korica also said the same thing, so, you know. What we’re all saying, in terms of the problem being policing, and the solution not just being like finding more ways to like slightly narrow the surveillance tools? I think one of the real problems around surveillance, sort of surveillance debates generally, and I say this as somebody who comes out of a technology law tradition, is that they are ‑‑ the folks who are doing work on like sort of high‑level surveillance tools, like things like the sneak‑and‑peek warrants or Section 215 of the Patriot Act, are often deeply disconnected from the communities who are most likely to be harmed once these surveillance tools are widely used. Right? Like, just like with the technology laws, right, there is this way in which, you know, the conversation around like mass surveillance is up here, and we’re supposed to be afraid of mass surveillance, because mass surveillance means surveillance of white folks like me and not communities of color. Right? But at the same time, like, that… So much of the rhetoric relies on like the idea that it’s okay to surveil some folks, but it’s not okay to surveil others. And part of how we fight back is by deconstructing the notion that it’s okay to do this ‑‑ to use these tools on anybody. That like, you know, it doesn’t ‑‑ it’s not actually like, oh, there’s a bad enough set of crimes to make this okay. Right? And that’s part of ‑‑ part of it is not getting sucked into the sort of like whirlpool of like, well, you know, is terrorism worse than human trafficking? Well, if terrorism isn’t worse than ‑‑ if they’re both equally bad, then we need to have the same tools to prosecute human trafficking as we do to terrorism. And here we are where they’re getting a sneak‑and‑peek warrant to go into a massage parlor in Florida.

And I don’t say that flippantly, because those are real folks’ lives, just like there are real folks’ lives impacted by surveillance of supposed terrorist communities. Looking at all of the mosques in New York and Detroit, where folks were under persistent surveillance after 9/11.

So I think part of what we do is we resist the idea that it’s okay if this happens to other people. Because, you know, that’s how… That’s how the tools get built that will, like, eventually be used against all of us. That was what I wanted to say. I’ll stop there.

DANIELLE BLUNT: Thank you, Kendra. Okay. We’re going to open it up for Q&A. Someone asked if we could touch on the recent encryption legislation and how protected we are using services like Signal, WhatsApp, and Telegram.

KENDRA ALBERT: I can take that, and then if Lorelei and Blunt, if you want to jump in if I screw it up.

So, you know, EARN IT was one of the sort of pieces of legislation that was kind of proposed to… make it sort of ‑‑ I don’t want to say “end” encryption, but would have had the practical effect of making encrypted services more difficult to sort of produce. The other is the Lawful Access to Encrypted Data Act ‑‑ “LAED,” I think? That’s probably not how people have been pronouncing it. But. (Laughter)

The ‑‑ that bill is way worse. I do not think it’s going to pass. It sort of all‑out tries to ban encryption.

EARN IT actually, sort of between the initial proposal and the version of the bill we’re currently on, got much better on encryption? So now it specifically says that, you know, using encryption won’t ‑‑ like, isn’t supposed to be able to be used against a service. Like, for purposes of figuring out whether they’re liable for child exploitation material. It’s really ‑‑ it turns out that that construction is not just complicated when I say it, but very complicated in the bill, and might do less.

In terms of what the impact is gonna be on like Signal, WhatsApp, and Telegram ‑‑ you know, what I’m hearing in this question is sort of end‑to‑end encrypted services, where the service provider doesn’t have access to your communications? You know, I think that it would be unlikely ‑‑ if ‑‑ if, God forbid, EARN IT as currently existing passes, I think it would be unlikely to sort of result in Signal or WhatsApp going away. In fact, actually, some advocates ‑‑ like Mike Masnick in particular ‑‑ are currently arguing that the current construction of EARN IT protects encryption? I’m a little more skeptical about that than he is. Happy to sort of ‑‑ you know, at me on Twitter if you want to talk about that.

But I don’t ‑‑ those services are not going away under the current version of EARN IT. However, the Justice Department has been trying to sort of get back doors into encrypted services for a long time, and they’re not going to stop. So it’s sort of… nothing to watch for right now, but stay vigilant on that front.

YVES: I just wanted to generally say, right, like I think… Like, if the question’s kind of getting at like in your personal life, like, how ‑‑ what’s the danger, or like if you’re doing some kind of criminalized work, or something that you are afraid of like the police getting information of, right? Like, it’s not gonna do you harm to use an end‑to‑end encryption service, like iMessage, like WhatsApp, like Signal, like Telegram. Right? But it’s not something that’s gonna protect you wholly? But I also should note that, you know, like, the person asking this question is like an organizer or a whore, like, you know. So like, when ‑‑ most of the time, when this information gets in the hands of police from like your texts or things like that, it’s not because they’ve like hacked the system. It’s not like something like that. It’s usually because someone you’ve talked to has like ‑‑ the police have gotten ahold of them, and they’ve given that information to them. And like, the ways like ‑‑ I kind of talked about this in the beginning, right? When they deputize civilians, we’re not just speaking generally ‑‑ I literally mean there are also people who are just going to be, like, I think that there’s a sex worker at my hotel! Like, da da da da! I think there’s a sex worker in my Uber! Right? And like handing over that information.

So I don’t want people to be like, oh, I’m just like not safe anywhere. Because that’s not really what the scenario looks like in real life, when you’re like on the street and like working. Right? But they’re not, like, fully safe. It’s not like, oh, you can type anything into Signal, and it’s like Gucci.

DANIELLE BLUNT: Right. And I think, too, people can take screenshots. Oftentimes, that’s how information is shared even when you’re using encrypted channels. So I think also just being mindful about what you say, and who you’re saying it to. If you’re using Zoom, knowing that this is going to be a public‑facing document, and we’re not currently planning any political uprisings in this meeting? So it feels okay and comfortable to be using Zoom as the platform. But like… Personally, in my work, even if I’m using an encrypted platform, I do my best to avoid saying things that would, like, hold up in court as evidence, in the way that I use language.

KENDRA ALBERT: Yeah. I think, in the immortal words of Naomi Lauren of Whose Corner Is It Anyway, people need to learn how to stop talking. Which it turns out is both solid advice, and what my advice is if the police want to talk to you. So, solid on many different fronts.

LORELEI LEE: I think it’s really important ‑‑ like, I think several people have said this already, but just to really emphasize that when we’re talking about this stuff, the intention is to have… you know, informed consent, for lack of a better word, around using these tools. And that… You know, especially if we’re talking about sex working people, we’re talking about ‑‑ Aaa! Caty Simon! (Laughing) I’m sorry, I had to interrupt myself to get excited that Caty Simon is here. Another expert on all of this stuff.

The thing I was going to say is, I do think that sex workers and criminalized folks from many marginalized communities are already really good at risk assessment: understanding what level of risk you are comfortable with, and using these tools with that in mind. And knowing that ‑‑ there are no answers! Right? There’s no system except abolition that is going to prevent these kinds of harms from happening. Abolition of policing and capitalism, perhaps! So.

Oh, and the thing I was gonna say, which is maybe not that important, but the question I had for Kendra, is whether you think EARN IT still poses a danger to encryption in terms of the best practices, and how that might inform future corporations. I’m not sure if that’s too far in the weeds?

KENDRA ALBERT: I think it could be. So one of the things we’ve been saying internally about EARN IT, like in Hacking//Hustling, that I want to emphasize here, is that the lack of clarity about what the bill is going to do is a feature by the creators, not a problem? It’s not that we’re failing at interpretation? ‘Cause we’re not. You know. I can say all I want about what I think EARN IT means with regards to Signal and Telegram, but as Lorelei pointed out, one of the things EARN IT does is create this commission that creates best practices, and who the hell knows what’s going to be in there. And it’s really unclear even how the liability bits are going to shake out.

So even with a specific amendment to the current version of EARN IT that’s supposed to protect encrypted services, we don’t really know what’s going to happen. So really good point, Lorelei. Thank you.

DANIELLE BLUNT: “Other than being educated or somehow not using any technology, what can we do?” I feel like we touched on that a little bit, but if anyone wants to give a quick summary.

KENDRA ALBERT: Yeah, I mean, I think just to echo what Lorelei and Yves have already said, right: engage in thoughtful conversations around how you’re using the technology, and be thoughtful about who you’re sharing what with. I think for me ‑‑ and actually, maybe this gets to the next question, which is sort of, like… “It doesn’t matter if you don’t break the law”? Or, “But I don’t have anything to hide!” Right? You know, the way I think about this is that, like, everyone has something that law enforcement could use to make your life miserable. That’s just reality. Some folks have many more things! Like. But everyone has something. And so… my goal, like, our goal here I think is not to suggest, like, paranoia, they could be listening to everything. Although, you know, yes! I’m pretty sure that there isn’t legal authority to do most of the things that law enforcement wants to do, and, like, I’m not under any illusions about that. But part of how we think about this is, you know, how do we take care of the folks around us, be thoughtful about the risks that we’re taking, and make sure that we’re taking risks that are aligned with our values and the things we need to do? Right? And those are gonna look different for everybody. But I welcome thoughts from other folks on the panel, ’cause I think I’ve said enough.

LORELEI LEE: I think I’d like to add something, which is that oftentimes when we talk about this stuff, we talk about it in terms of personal risk, as though risk belongs to us alone. I think it’s really important to recognize the communities and the people that you’re interacting with, and to understand that even if you feel as though law enforcement won’t do anything to you, that you’re not a likely target, it’s highly likely that there are people in your life who are likelier targets. Your refusal to talk to law enforcement, or your care around how you communicate with folks, is protective of the people around you and the people that you care about, whose levels of risk you might not even know.

And then the other thing I want to add in terms of things to do is that to think about how ‑‑ what, what actions you’re capable of to oppose the passage of anti‑encryption laws, to oppose the passage of laws that target sex working people and people of color and people in marginalized communities, and to think about if you feel as though you have a lower level of risk of being targeted by law enforcement, that means that you have a greater capability for maybe going out and protesting! I have to tell you, I know a lot of criminalized people who do not feel safe protesting on the street, do not feel safe talking on un‑encrypted platforms, don’t feel safe talking on panels like this. And if you feel safe doing those things, then it’s your responsibility to do those things. So.

YVES: I mean, I don’t know if you were going to ask the next question, but Lorelei and Kendra kind of talked about it a bit. I kind of just want to say, if someone asks you ‑‑ or is like, it doesn’t matter if you don’t break the law ‑‑ that’s not really the issue, right? Like, laws, crimes, the things that we define as crimes are entirely arbitrary. Right? So who gets arrested, who gets criminalized, all of these things are… just simply based on who the system is against. Which we know means Black, Indigenous, people of color, especially trans people, gender nonconforming people, sex working people, anything that is outside the scope of what white supremacist culture would consider to be a good and appropriate person! Right?

So it’s not really about breaking laws. Or, you shouldn’t be afraid of anything if you haven’t done anything. Because it doesn’t matter. They’re going to criminalize people regardless of that. Right? They’re going to incarcerate people regardless of that. Like, all of these things are a death sentence to marginalized folks, which is why we kind of talk about it in this way. It’s not about, like ‑‑ well, I mean, it is about like surveillance is bad? It’s infringing on rights of people, right? But it’s also about the fact that surveillance is just like a tool that is used for policing, for incarceration, in order to just disappear whomever. Right?

So, when talking about that, surveillance is bad for that reason. For the reason, like, I talked about a little bit with contact tracing, right? That, in theory, should be a good thing. Should mean that we are keeping people safe. Should mean that people aren’t getting COVID, or are getting treated for COVID, are getting treated for HIV/AIDS. But we know that in a world where we have policing, that is simply not what happens! Right? It’s not a case of, will they use it? They will. They will use it, they will criminalize it, they will arrest people. So we want to get rid of it. Wholesale.

DANIELLE BLUNT: Yeah. And I think, too, when you talk about contact tracing in that capacity, I can’t help but think about the ways that data is scraped from escort ads to facilitate the de‑platforming across social media and financial technologies of sex workers and other marginalized communities, as well as activists. So I think both on the streets and on the platforms, like… This is not being used for good? And that it needs to end.

Okay. I’m gonna try and get one or two more questions in. Someone asked, do you think that EARN IT is going to be passed?

(all shrugging)

I think that’s our official comment!

KENDRA ALBERT: Yeah, for anyone that’s not watching the video or is not able to watch the video, there is just a lot of shrugging.

DANIELLE BLUNT: Yeah. But if you keep following Hacking//Hustling, we’ll keep talking about EARN IT and updates when they come, so. If you want to follow @hackinghustling on Twitter, that’s usually where our most up to date shit is.

Someone said, hypothetically, if someone wants to be a lawyer and is studying the LSAT and hoping to apply in the fall, should they not post publicly about these things or attend protests where you could be arrested?

KENDRA ALBERT: I can take that one. So, you can absolutely post publicly about these things. The thing to worry about here, for lawyers, is this thing called character and fitness, which is basically: if you want to practice as a lawyer after you go to law school, you have to get admitted to one of the state bars, and state bars have particular requirements. I actually don’t know a ton about how those interact with, like, a past history of sex work? But the sort of watchword in terms of thinking about character and fitness is honesty, generally speaking. Pretty much most things can be overcome through character and fitness if you explain sort of what happened. So getting arrested at a protest ‑‑ you can totally still pass the bar and become a lawyer after that. And you can absolutely post publicly about, like, abolition or sex work or, you know, those kinds of things.

You know, where I would start to think about whether you want to talk to someone who has more experience with this than me is, um, if you have felonies on your record, or if you’re worried that you have any behavior that folks might use, or might believe makes you less honest. So things like fraud convictions often come up. But I’ll stop there.

DANIELLE BLUNT: Awesome. And then I think this will be our last question, as we’re just at time. Any books recommendations along with Dark Matters: On The Surveillance of Blackness by Simone Browne? So sounds like folks are interested and want to learn more.

KENDRA ALBERT: This isn’t a book, but Alvaro Bedoya recently wrote a piece on The Color of Surveillance. It’s really amazing. So I recommend that.

DANIELLE BLUNT: Will you tweet that out? If folks say things, will you tweet them?

KENDRA ALBERT: Yeah.

KORICA SIMON: I have a few books that I’ve ordered and need to read before the summer is over? Black Software: The Internet and Racial Justice, from the AfroNet to Black Lives Matter. It’s talking about how technology can… Oh ‑‑ how digital racial justice activism is the new civil rights movement. There’s Automating Inequality: How High‑Tech Tools Profile, Police, and Punish the Poor. Have you read that?

KENDRA ALBERT: It’s really good. I really recommend Automating Inequality.

KORICA SIMON: The last one is Race After Technology: Abolitionist Tools for the New Jim Code.

LORELEI LEE: I think I would add The Age of Surveillance Capitalism, which talks a little bit about the history of the development of some of these data collection tools.

DANIELLE BLUNT: Yves, did you have one you were saying or typing?

YVES: I mean, I would recommend The Trials of Nina McCall, which is about sex work surveillance ‑‑ I think the subtitle is something like the decades‑long government plan to imprison promiscuous women. I would also recommend, if you’re interested in learning more about how public health is weaponized as surveillance against marginalized communities, Dorothy Roberts ‑‑ she writes a lot about this. So, yeah.

DANIELLE BLUNT: And ‑‑ that’s a beautiful place to end. Thank you, Lorelei, for sharing that. I feel like ‑‑ (Laughs) We’re all ‑‑ everyone’s crying. I’m crying. Speaking for everyone. (Laughs) If people want to be found online, or if you want to like lift up the work of the organizations that you work with, can you just shout out the @?

KENDRA ALBERT: @hackinghustling. It’s really great!

YVES: You’re fine. @redcanarysong, and @SurvivePunishNY.

DANIELLE BLUNT: Well, we are slightly overtime. Thank you so much to our panelists, and Cory, our transcriber, for sticking with us. I’m going to stop the livestream now, and stop the recording.

Sex Worker Lobby Day

Sex Worker Lobby Day, which was originally presented on June 1st, 2018.