THIS ARTICLE APPEARED IN THE PRINT EDITION OF PETIT MORT MAGAZINE, VOL 2 (SUMMER 2021).
1.
“Never ascribe to malice that which is adequately explained by ignorance.”
The last time Instagram changed their terms of use, I muttered this quote to myself repeatedly. These words have become a mantra, a thin salve over a wound that reopens each time I or a fellow sex worker gets blipped out of digital existence.
It’s the antidote to that brief surge of fear I felt when I watched my Twitter get disabled for a slightly-too-racy header photo. It’s the way I soothe myself when a client tells me their Venmo transaction was blocked. It’s the lullaby I sing to cover the sound of notifications, those short, robotic warning shots that remind me that the tools and services I once viewed as standard are now luxuries, and that no amount of caution will ever be quite enough to protect my work.
When we explain this game of cat and mouse to our non-industry friends, they struggle to understand. Your PayPal was deleted? Just switch to Venmo, right? Just make a new Twitter. Just get another bank. Just stop posting photos of your feet or whatever it is you did to piss off Instagram. Just be more careful next time.
The problem with these questions is that they presume we have the same options everyone else does. That we can simply make new accounts, switch platforms, apologize and try again.
When you’re privileged, it’s hard to imagine the restrictions you’d face if you weren’t. If you’re an immigrant, if you’re a person of color, if you have a criminal record, if you’re a survival worker, if you live in a more rural area, if you’re not extremely technologically literate, or if you’ve simply been deplatformed before, your options are cut in half over and over again until you’re staring at nothing but the dregs of what digital society has to offer.
After SESTA/FOSTA passed, online work became extremely risky: one study by Hacking//Hustling found that 80% of workers face increased difficulty advertising online as a result. Sex workers are at higher risk for suspension and deletion than the general public. We’re forced to use alternative spellings and euphemisms to maintain an online presence, directly threatening our livelihood and search optimization. We exist under the thin cover of Section 230, and even that protection is being threatened by proposed laws like EARN-IT and SAFE TECH.
When you’re a sex worker, everything is suddenly at risk. It doesn’t matter if your form of sex work is technically legal. We all know friends who couldn’t get apartments because of their OnlyFans pay stubs, whose banks shut down their accounts & withheld their strip club income because they “don’t allow proceeds from any form of adult work, including legal work”. Yesterday’s vindictive client might decide to report your Squarespace site, and the next day it could be gone, along with your ability to pay rent.
This anxiety starts to seep into everything. It creeps up in ways you can never predict.
You try to protect yourself by using a VPN; Google reacts by locking your account. When something doesn’t load, most people assume they’re on bad wifi; you assume your account was finally deleted. The panic may be short-lived, but the subsequent burnout is not.
Every glitch turns something commonplace in civilian space into a potential catastrophe.
I often tell myself to calm down, that I’m overreacting. I’ve become trigger-shy and imagine monsters where there are only shadows. But how can this be an irrational response when even a routine Instagram deletion triggers a harmful ripple effect of lost clients, lost revenue, and lost community?
When we get deplatformed on social media, we don’t just lose a vital outlet for advertising. We lose our ability to connect with one another, our community, our support systems. We can’t back up our followers; this is intentional. They don’t just pluck us out like weeds, they rip us away from each other, then salt the earth with IP bans and real name policies to ensure we’ll never grow back.
2.
Who do you call when your account gets deleted?
There’s no formal process, no judge, no jury. Recourse is nearly impossible; you’re pleading with the forces who seek to push you out of online spaces for existing as a sex worker.
Tech companies are at the mercy of payment processors, banks are at the mercy of US law, politicians are influenced by public opinion (which is shaped by the media), and public discourse is controlled by what tech companies will allow. Religion, capitalism, and deeply-held patriarchal standards help grease the wheels.
When you’re deplatformed from any service, it feels personal. Someone, somewhere, looked into your soul, weighed it against a feather, and found you unworthy. They pushed a button and ignored the petitions your friends submitted asking for help. For those of us who grew up in the United States, this feels particularly harsh after being so heavily conditioned to believe in due process (among other endearing myths).
Trying to place blame is an exercise in recursion. You end up trapped in an ouroboros of tech, media, finance, and politics, snapping at each other’s tails in a never-ending cycle of culpability and absolution.
I wanted to pour all my rage and fear and frustration onto this page. I see friends who feel implicitly targeted, who funnel so much passion into trying to change one immovable piece of this puzzle.
But instead, I found myself writing about it from the perspective of the Bad Guys.
The truth is uncomfortably banal. The decision to delete you was likely informed by a set of algorithms. These algorithms do not have souls, but they do have biases. The people who wrote them are mostly white and mostly men, and to them, you’re a thought experiment, a fantasy they can actively disregard for the sake of higher pay and a promotion. They struggle to appease their bosses, advertisers, and lawyers. Pushing out sex workers is always easy to justify—when we’re mentioned at all, we’re often lumped into the same category as drug dealers, scammers, and spammers. If the people who design these algorithms could fully comprehend the impact of their work on real humans, I don’t believe they’d be able to do it.
These algorithms are most visible on social media. What we think of as a simple “if boobs, then delete content” equation is never that simple. Engineers spend years fine-tuning their machine learning (ML) algorithms, but these artificial intelligence systems that sound like Skynet are, in reality, closer to the intelligence level of a bratty toddler. When something goes wrong (e.g. when the system deletes a post that didn’t actually contain nudity), you can’t just crack them open and peek inside for an apt explanation.
In short, these algorithms are only as accurate as the data they learn from, and the data they learn from is as biased as their creators.
ML systems are excellent at spotting objective things, like reading text in images or alerting you if a sentence contains any mention of COVID-19. These are practical applications that should be lauded; they help combat misinformation, prevent spam, and improve accessibility. The problem comes when they start to evaluate more subjective cases. When machine learning systems are trained to recognize nudity by watching porn, their definitions of nudity are heavily based on what femme-presenting cis women look like. They’re not going to pick up on topless men or children playing in a bathtub; they’re going to aggressively flag all the breasts with visible nipples.
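To make that concrete, here’s a toy sketch of how a subjective judgment collapses into a single numeric cutoff. The scores and threshold are invented for illustration; no platform publishes theirs, and real systems are far more elaborate, but the failure mode is the same:

```python
# Toy illustration only: scores, threshold, and examples are invented.
def moderate(nudity_score: float, threshold: float = 0.7) -> str:
    """nudity_score: a model's 0-1 guess that an image contains 'nudity',
    learned from whatever biased training data it was shown."""
    return "remove" if nudity_score >= threshold else "allow"

# A model trained mostly on images of femme-presenting cis women:
print(moderate(0.91))  # lingerie ad, nothing exposed -> "remove" (false positive)
print(moderate(0.12))  # shirtless man at the beach   -> "allow" (never flagged)
```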
How do you decide what counts as nudity, when men are allowed to show their nipples? How do you decide what a “male” nipple looks like?
Human moderators at third-party companies are hired to make these judgment calls.
The person who deletes your photo, Tweet, or account (if a human was even involved) spends all day evaluating an endless parade of content against secretive and ever-changing checklists. Yet even this isn’t a simple, cut-and-dried process.
On most social media platforms, guidelines vary by location and each platform’s willingness to bend to political pressure in the countries where they operate. Even within the same country, ‘regular’ accounts are held to different standards than verified or protected accounts. Instagram didn’t allow semi-visible nipples, but then Rihanna posted a photo in a mesh bra. Instead of inviting a press fiasco, Instagram quietly changed their internal guidelines for content moderators to allow nipples-through-mesh.
Unless you’re (essentially) a celebrity, content moderators aren’t allowed to care about whether or not you tried to play by the rules. They have a brief window of time (usually a minute or two) to make each call, and they’re pushed to err on the side of being too aggressive, or they risk losing their job. The decision is simplified: do they protect our income, or their own?
So they erase us. And we adapt.
A new social media platform or payment processor or communication tool comes out, and everyone races to participate. It gains momentum. It may not be explicitly SW-friendly, but it doesn’t ban us, so it shines like a beacon of hope. We flock to it, our clients flock to us, and sex does what it does best: sell.
Most of these platforms are pipe dreams funded by venture capitalists. These tech companies amass a highly skilled workforce funded on a simple bet – that in the end, the company will go public, and they’ll entice the general public to buy their stock. Once they go public, they become beholden to a Board of Directors, which advocates on behalf of shareholders united by the single-minded goal of making a profit. This is when the pressure comes.
The tech “startup” starts to clean up their act. Lawyers go on the defensive and new terms of service come into play, banning hate speech, copyright infringement, spam, violence, and—inevitably—nudity. Sex might have built up this platform, but what happens if teenagers start posting nudes? What happens if kids stumble across porn? Worst of all, what happens if a middle-aged conservative housewife in Kentucky sees a nipple and decides not to make a purchase after all?
Ban it all. Protect the profit, no matter the personal cost.
Social media platforms are rarely designed to turn a profit, so under the pressure to monetize, they scramble to find new revenue streams. Instagram added product shops. Pornhub and Fetlife launched premium subscriptions. Even Twitter is testing “tip jars” and is secretly working on Twitter Blue, a paid subscription model. As soon as a large platform starts selling directly to its users, it stops making the rules. It’s now at the mercy of payment processors, and payment processors are terrified to “dirty” their hands with our money.
If cleanliness is next to godliness, Visa is a deity.
In the world of banking, there’s a deeply held commitment to the doctrine of “Cover Your Ass”, because even gods can be fined by the government. Unlike tech companies, which still hide in the tenuous shade of Section 230, banks don’t have the luxury of willful ignorance.
The US Department of the Treasury’s Financial Crimes Enforcement Network (FinCEN) requires banks to abide by Know Your Customer laws: they have to know who their customers are, where their money is coming from, where it’s going, and why it’s being exchanged. These requirements have been around since the 1970 Bank Secrecy Act, but were significantly enhanced with the Patriot Act.
Banks are terrified of being accused of facilitating money laundering or trafficking. Not just because of significant fines (to the tune of $10.4 billion in 2020), but because even before those fines get levied, the government requires them to develop and implement plans to make sure it never happens again—regardless of how much that will cost. Since the legality of our work is constantly in flux, most sex work (along with other industries like marijuana) is written off as “too high risk”.
With banks, the issue isn’t so much a puritanical hatred of sex workers as it is about endangering profits. Banks view the risks of breaking the law and pissing off regulators as greater than the benefits they would get from facilitating transactions for any kind of sex work, even legal sex work.
So banks force ‘adult’ content into a high-risk category, and the ones that are comfortable with that risk exploit their position by requiring payment processors to charge upwards of 15% of each transaction. Apple and Google aren’t willing to accept that kind of cut, so they ban nudity, porn, and ‘adult content’ from their app stores, further pressuring tech platforms to comply. Behind the scenes, lawyers and approval teams engage in continual negotiations to ensure compliance.
And then come the reporters.
This is where things have a tendency to go south very quickly.
If someone chooses to approach sex work looking for dirt, they’re going to find a mountain of it—just like they would with nearly any other industry. The lack of public knowledge makes it easy to spin a laughably tenuous association into a news story.
For example, MyFreeCams allows international models, including those who live in Colombia and Ukraine. Banks are required to view these countries as high-risk for trafficking, so they filed dozens of Suspicious Activity Reports about MyFreeCams payouts to models who reside there. An inexperienced reporter spotted this, assumed the worst, and wrote an article framing MFC’s entire business model as suspicious.
Reporters rarely consider the impact their work will have on us. The harm happens not just on an industry-wide scale, but on a personal one as well. When the Daily News doxxed an EMT because she had an OnlyFans, she lost her job.
When sex work is mentioned in the media, it’s often conflated with trafficking. Evangelical groups like Exodus Cry push their harmful agendas by funding politicians and press that paint all sex work as exploitative. Legislation like SESTA-FOSTA, which passed in 2018, uses the guise of preventing sex trafficking to instead chip away at Section 230 protections (which give tech companies the freedom to moderate content without being liable for what they miss). The Freedom Network, the largest network of anti-trafficking organizations in the US, has publicly rebuked SESTA-FOSTA for “squandering limited federal resources” and putting “sex workers at risk of prosecution for the very strategies that keep them safe.”
Despite this, US politicians continue to use sex trafficking as their Trojan horse to pass laws that, in reality, make the problem of trafficking even worse.
The US government also has a long history of using technology to clamp down on activists and marginalized groups. With so many major tech providers based in the US, new bills like EARN-IT and SAFE TECH would have a global impact on the ability of marginalized communities to organize and share safety information.
If the general public can’t distinguish between consensual sex work and trafficking, how can we be surprised when sympathetic politicians face aggressive opposition for trying to block laws that would harm us? Legislation like SESTA-FOSTA had blatant, predictably harmful effects that went far beyond the loss of Backpage. But as we’ve seen with the Nordic model, even well-intentioned changes can have a devastating impact when sex workers aren’t consulted in the lawmaking process.
The media leverages moral outrage over anything sex-related to hype the general public up into a frenzy. Politicians are pressured to pass new laws. Payment processors react by preemptively tightening the reins. And a new wave of deletions begins.
3.
How do we change this?
We know that we need to build awareness of the systems and institutions that hold us back. Meaningful change happens alongside a cultural shift, and we, as sex workers, are the catalyst. We are more powerful in numbers, and the internet has allowed us to organize in ways we previously couldn’t.
But before we can make our voices heard, we need to protect our ability to speak.
Let’s start with our most vulnerable communication platform: social media.
How do you walk that line to avoid getting deleted?
One way to get ahead of Instagram and Twitter’s automated censorship is to use the same technologies that they do. Google Cloud Vision and Microsoft Computer Vision [ed note: Microsoft has deleted their demo tool since publishing] are two tools you can use to pre-screen your images by uploading them and reviewing their auto-generated analysis. They’ll catch text you didn’t notice (you already know the words to avoid), tell you the rough likelihood a computer will think your image is porn, and flag categories like “fetish” or “sex toys”. You can also use sentiment analysis tools to pre-screen your Tweets and photo captions to avoid content with a heavily “negative” or “hateful” sentiment. Facebook has a publicly-accessible Sharing Debugger tool that you can use to re-scan (and often un-ban) links that were automatically blocked on Instagram and Facebook. While none of these techniques are foolproof, they can help prevent automated forms of censorship and outwit current algorithms.
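For the tech-inclined, here’s a minimal sketch of that image pre-screening step using Google Cloud Vision’s SafeSearch detection. It assumes you’ve set up a Google Cloud account and credentials; the filename is just an example:

```python
# Minimal SafeSearch pre-screen; assumes Google Cloud credentials are
# configured and the client library is installed:
#   pip install google-cloud-vision
from google.cloud import vision

def prescreen(path: str) -> None:
    """Print Vision's guess at how 'adult' or 'racy' an image looks."""
    client = vision.ImageAnnotatorClient()
    with open(path, "rb") as image_file:
        image = vision.Image(content=image_file.read())
    safe = client.safe_search_detection(image=image).safe_search_annotation
    likelihood = ("UNKNOWN", "VERY_UNLIKELY", "UNLIKELY",
                  "POSSIBLE", "LIKELY", "VERY_LIKELY")
    # 'adult' and 'racy' are the ratings most moderation filters care about
    print(f"adult: {likelihood[safe.adult]}, racy: {likelihood[safe.racy]}")

prescreen("header_photo.jpg")  # swap in whatever you're about to post
```

If it comes back LIKELY or VERY_LIKELY on “racy”, it’s safe to assume a platform’s own filters will flag it too.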
Thanks to a law passed in the EU called GDPR, tech companies are required to provide EU users with all the personal information they collect about them. To comply with this law, most tech companies launched globally-available features that allow you to download all the personal information a company stores about you. By regularly downloading your data dumps from Twitter, Instagram, Facebook, and Google, you can retain content you’ve posted that would otherwise be lost.
Go through this data. If they’re collecting information you’re not comfortable with—like your exact geolocation—there are usually settings to delete or stop tracking it. These data dumps also contain the information they can use to ban you from opening new accounts, so pay attention to what they know about you (like your device IDs, cell service providers, and IP addresses).
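Combing through these dumps by hand is tedious. If you’re comfortable with a little Python, a rough sketch like this can surface the tracking identifiers buried in a JSON export (key names vary by platform, so the list below is illustrative, not exact):

```python
# Skim a downloaded data dump for tracking identifiers. Every platform
# names these fields differently, so treat TRACKED as a starting point.
import json

TRACKED = {"ip_address", "device_id", "carrier", "latitude", "longitude"}

def find_tracked(obj, path=""):
    """Recursively walk the dump and print any keys that match TRACKED."""
    if isinstance(obj, dict):
        for key, value in obj.items():
            if key.lower() in TRACKED:
                print(f"{path}/{key}: {value}")
            find_tracked(value, f"{path}/{key}")
    elif isinstance(obj, list):
        for i, item in enumerate(obj):
            find_tracked(item, f"{path}[{i}]")

with open("account_data.json") as f:  # e.g. a file from your data export
    find_tracked(json.load(f))
```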
The one piece of information that social media platforms consistently refuse to let you download is the list of people you follow. For the tech-inclined, there are some hacky ways to do this (like opening web inspector, setting your browser to a mobile device, then saving your page of IG followers), but the more reasonable approach is to create & regularly follow people from a clean, private backup account. Normalize communicating with people you regularly talk to off-platform, to minimize the damage of losing access to your DMs.
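If you do go the hacky route, a sketch like this can pull handles out of that saved page. The link-matching heuristic is a guess (platforms change their markup constantly), so inspect the HTML yourself and adapt it:

```python
# Hypothetical sketch: extract usernames from a saved following-list page.
#   pip install beautifulsoup4
from bs4 import BeautifulSoup

with open("following.html", encoding="utf-8") as f:
    soup = BeautifulSoup(f.read(), "html.parser")

# Profile links tend to look like <a href="/username/">; dedupe and print.
handles = sorted({
    a["href"].strip("/")
    for a in soup.find_all("a", href=True)
    if a["href"].startswith("/") and a["href"].count("/") == 2
})
print("\n".join(handles))
```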
To avoid shooting yourself in the foot, it helps to remember that every tech platform rewards “normal human behavior”. If you use a VPN that routes your traffic through another country and then log into your CashApp, you might be locked out due to suspected fraud. Why would your phone suddenly be located in Switzerland, when 20 minutes ago you were in California? Any time you use a new technology that promises better privacy, test it out in low-stakes situations first before trying it for things you rely on.
The biggest way that we can continue to preserve our ability to communicate is to push for the adoption and normalization of privacy-centric technologies, requiring them in order to access us and our content. Despite gateway services (like the App Store, Google Search, and social media) being against us, we still have power here: look at how porn was the deciding factor in Blu-ray vs HD-DVD, pushed the adoption of cable TV, and heavily contributed to the normalization of torrenting.
We can incentivize clients to adopt technology like Bitcoin by giving discounted deposits in crypto. Require them to call & text through Signal; point out how useful that disappearing messages feature is for exchanging sexy photos. We can mention on our websites that we recommend emailing us from a free ProtonMail account, so that both sides of our conversation can stay encrypted (and reduce the likelihood that they’ll lose our messages in Gmail’s spam folder). We can even push them to alternative social media platforms like Mastodon (Switter) instead of Twitter, although I expect Mastodon apps are likely to be banned from the App Store if it takes off. [ed note: Switter was later shut down]
Encrypted communication platforms give us freedoms that can’t be easily stripped away. Google can spy on your Gmail conversations; ProtonMail cannot. Apple could ban Signal from the App Store, but there’s no way for them to see the conversation you just had with a client. This helps protect not just us, but the people we organize & communicate with.
Even if you don’t go all-out (does anyone REALLY need a burner e-SIM number that they only use to verify social media accounts?), give yourself credit for the things that you’re already doing.
Start with the easy stuff: enable 2-factor authentication on your accounts to prevent getting hacked, download Signal and use it for SW-related convos, and focus on protecting the tools you use to communicate and organize. Remember that we’re playing a game of chess against opponents who made the board, the pieces, and the rules.
The next part of causing a cultural shift?
Continually speaking out, and organizing to amplify our voices.
One of the most important things we can do right now is to start, or continue, talking to everyone we know about how anti-trafficking organizations share inflammatory propaganda that conflates consensual sex work with trafficking. Call out reporters who fall for this rhetoric: email them and their editors, leave comments on their stories, reply to their tweets and hold them accountable.
Having our viewpoint represented in the media is crucial. It doesn’t just strip away the stereotypes; it normalizes our inclusion in conversations, especially those that concern us, and de-stigmatizes our work.
But if you’re approached by a reporter, interview them first. Ask them if they’ve reported on sex workers in the past, and get references. What angle are they going for? How have they framed sex work previously? Can you request their interview questions in advance? Can you trust them to protect not just your identity, but your identifying characteristics as well? If you’re not comfortable speaking to them, can you refer them to a sex-worker-led organization that is? And best of all: would they be open to letting you (or another sex worker) directly contribute a piece instead?
If you’re in a position to do so, research your local politicians and representatives to see where they stand on sex work. If you can’t find out, ask, and share what you learn. Don’t just ask them for lip service. Push them to pass laws to fully decriminalize sex work—both buying and selling. Explain why the Nordic model is harmful, why legalization would further criminalize us by creating specific loopholes only accessible to rich business owners. We need laws to protect us from housing and job discrimination, to protect our ability to advertise online, to prevent our occupation from being used against us in family lawsuits.
There are many political groups who are friendly to sex workers, but even these groups need pressure from us to advocate for legislation that will help us. Push the ACLU to advocate for our civil rights, the DSA to endorse politicians who will speak up for us, and the EFF to challenge laws & technologies that deplatform and silence us. You can’t challenge an existing law without a plaintiff, and when we’re the ones being hurt, we are the ones who need to be willing to stand up and oppose it.
While our allies are crucial to making progress, getting involved with niche groups is both powerful and uniquely validating. Politicians are often afraid to meet with sex workers individually due to the stigma, so sex worker advocacy organizations have better chances at being taken seriously. By arming advocacy organizations with numbers (members), funding, and data, we’re sometimes able to get meetings with politicians that would otherwise be untouchable.
The Free Speech Coalition is an adult industry trade association that has helped block harmful legislation for porn actors. Hacking//Hustling researches the widespread impact of SESTA/FOSTA and shadowbanning, turning our stories into tangible data points. And the Adult Performing Actors Guild (APAG) helps organize members who have been deplatformed, and uses their lines of communication with social media companies and payment processors to help. Sex worker activism has real, tangible rewards.
If you don’t have the capacity to be involved in volunteering or the resources to donate, staying aware of what’s happening can still help you communicate our concerns to people in your life. Following SWers with an activist presence on social media—like Danielle Blunt or Mistress Matisse—can help you stay in the loop on news and research about the challenges we, as an industry, are facing. If you don’t have the emotional bandwidth to deal with often-depressing news, I know people who add SW activists & SW-friendly reporters to lists on Twitter and check them when they have the energy. Find what works for you. No act of activism is too small.
So you do all of these things.
You pre-screen your images. You use privacy-centric tech. You call your politicians. You fill out surveys, sign petitions, and send articles to your friends. You get involved and do your best to make meaningful change. But new laws will be announced. New terms of service will come into play. Your content gets shadowbanned. And you get deleted, yet again. All that effort feels like screaming into the void.
If you’re a sex worker, I want you to know that deplatforming is not something you can predict. There is no amount of caution that will prevent these things. You are being discriminated against and you don’t have a myriad of choices, no matter what you’re told by people outside the industry.
Your rage, frustration, and helplessness are valid.
And I hope you share it. Tell your stories.
I don’t say this to sound trite. I say this because—and I cannot emphasize this enough—this is how change actually fucking happens.
At the top levels of most tech companies, media organizations, and many banks, meaningful changes rarely start with data. They start with a quote, a provocative opinion, a moving account about someone’s niece who found something frustrating. The CEO hears an endearing tale in the news, makes an off-handed comment, and the next day thousands of employees alter their course and build something different. These decisions are justified retroactively.
I spent a decade in my former career sitting in rooms with high level executives watching this happen over and over, across dozens of companies and industries. I’ve talked with friends who work in high-level roles in banking, tech, and media about the most actionable ways to drive change, and they’ve all shared the same insight. As someone who loves research and values data-driven decision making, I find this deeply depressing. But as sex workers, we’re excellent at communicating and driving empathy, and leveraging this is part of how we win.
If you’re in a position to do so, strategically share your stories with sympathetic clients about the challenges you face—especially if you have clients who are in decision-making roles at their companies. Ask them what they would do in your shoes, because adopting a problem-solving mindset forces them to empathize.
Will your client stand up in a meeting and suddenly start advocating for sex workers? No. Sex work is still considered too taboo, and it will take a while to turn our quiet allies into vocal advocates. But he might mention how his “friend” really suffered when her account was deleted with no recourse, because he’s still thinking about your conversation last night.
These stories hold more weight than soulless numbers and statistics. These stories are what cause people to look for the data to back up their decisions.
Even though so much of the systemic discrimination we face is malicious, we can continue to make strides against the problem of ignorance. It’s deeply unfair that we have to shoulder the burden of humanizing ourselves, of pushing to make our stories visible in ways that don’t turn around and bite us. It’s doubly unfair that we have to do this in an environment where we risk losing access to basic resources.
But the more we do this, the more we matter.