When Google plays judge and jury: account gone, zero due process


Mark Russo


I built an app, Punge, to help people take control of their private data. Then Google banned me, labeled me a CSAM risk, and cut me off from the tools I need to survive as a developer.

It’s hard to describe the shock of being accused of something as serious — and stigmatizing — as CSAM (child sexual abuse material). There’s no trial. No opportunity to explain. Just silence and permanent suspension. Your account is gone. Your business is crippled. Your name may be reported to government agencies. And there’s almost no path back.

I experienced this firsthand as an independent app developer. I had recently completed a benchmark study showing that my on-device NSFW detection model, Punge, outperformed commercially available solutions — including Google Cloud Vision. Excited by the results, I wanted to expand the study using a larger dataset.
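A benchmark like the one described above ultimately comes down to scoring each model's predictions against ground-truth labels on the same image set. Below is a minimal, hypothetical sketch of that kind of evaluation; the metric function, labels, and model outputs are illustrative placeholders, not Punge's actual benchmark code or data:

```python
# Hypothetical sketch of an NSFW-classifier benchmark: score two models'
# binary predictions (1 = NSFW, 0 = safe) against the same ground truth.
# All data below is illustrative placeholder data, not real results.

def precision_recall_f1(y_true, y_pred):
    """Compute precision, recall, and F1 for binary labels (1 = NSFW)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if (precision + recall) else 0.0)
    return precision, recall, f1

# Ground-truth labels for a small evaluation set (placeholder data).
y_true = [1, 1, 0, 0, 1, 0, 1, 0]

# Predictions from two hypothetical models on the same images.
on_device_model = [1, 1, 0, 0, 1, 0, 0, 0]  # e.g. a local model
cloud_api_model = [1, 0, 0, 1, 1, 0, 1, 1]  # e.g. a cloud vision API

for name, preds in [("on-device", on_device_model),
                    ("cloud API", cloud_api_model)]:
    p, r, f = precision_recall_f1(y_true, preds)
    print(f"{name}: precision={p:.2f} recall={r:.2f} f1={f:.2f}")
```

Comparing precision and recall separately matters here: for a filtering app, false negatives (missed NSFW images) and false positives (safe images flagged) have very different costs.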

I turned to Academic Torrents, a platform widely used by researchers to share datasets. There, I downloaded a large NSFW dataset often cited in AI research. I unzipped the file in my Google Drive to begin preprocessing. A few days later, my entire Google account was banned — without warning, without explanation, and without any clear path to appeal.

I wrote about the full ordeal here, including how it’s left me cut off from Gmail, Firebase, Google Cloud, AdMob — everything my apps rely on to function.

I developed Punge, a privacy-first app that helps people detect and manage NSFW content on their own devices. Everything runs locally, never uploading to the cloud, so people can keep their personal content secure.

The illusion of justice

Google publicly claims to have reported hundreds of thousands of CSAM cases to authorities. But what they don’t say is how many led to convictions — or how many turned out to be false positives. What happens to the people falsely flagged? Where are their stories?

There’s a deeper problem here: there is no due process. In the U.S., if a private platform flags you, your account is erased. You are guilty until proven innocent — except you’re given no way to prove anything. As Eric Goldman points out in this blog post, Google isn’t even required to explain why they shut you down. You can sue them — but you’re one person taking on a trillion-dollar company. Good luck with that.

We speak about due process for immigrants, and Americans overwhelmingly support it. But what happens when tech companies operate as judge, jury, and executioner, without any check on their power?

Europe has stronger protections — why not us?

The European Union has implemented laws that prevent companies like Google from taking such extreme action without oversight. But in the U.S., there is no meaningful recourse. Your livelihood can be destroyed with one AI-generated decision, and the burden is on you to fight back — without tools, support, or even a human contact.

We have the Consumer Financial Protection Bureau (CFPB) for banking complaints, but even that system is weak. Earlier this year, my phone and wallet were stolen, and $700 in Apple Cash, issued by GreenDot Bank, was taken from my account. I had the recipient's phone number, timestamps, and device information. I filed complaints with the CFPB and FDIC; they forwarded them to GreenDot, which again reviewed and denied the claim. I gave up.

That experience showed me: without a strong, independent agency to advocate for individuals, big companies will always win — and people will lose.

What happens when the system fails you? You speak up.

My story isn’t just about a technical violation or dataset confusion. It’s about what happens when individuals — especially small developers — run afoul of automated systems built to protect society, but with no accountability.

I didn’t break the law. I didn’t distribute anything. I built something to help people. And yet my name may now be associated with something horrible, my business is under threat, and my voice is the only thing I have left.

So I’m using it.

If this concerns you — whether you’re a developer, researcher, policymaker, or just someone who cares about fairness — please share this story. And if you’ve been through something similar, know that you’re not alone.

We can’t fix what we don’t talk about.

A Necessary Fight — But Accountability Matters

Let me be clear: I deeply appreciate Google’s efforts to detect and report CSAM. These systems are vital for protecting children and stopping abuse, and I fully support that mission.

But good intentions are not a substitute for fairness. Right now, individuals who are falsely flagged have no way to defend themselves, no way to clear their name, and no meaningful path to restore their livelihood.

If a system is powerful enough to destroy someone’s life, it should also be strong enough to offer transparency, review, and correction. That’s all I — and others like me — are asking for: a fair process to protect the innocent while continuing the fight against real harm.

Share this story with friends. Share it on social media. And most importantly, share it with your lawmakers.

Ask them:

  • Why can a company terminate someone’s account without explanation?
  • Where is the oversight when reputations and livelihoods are destroyed by automated decisions?
  • Why doesn’t the U.S. have clear protections like the EU does?

These systems were designed to protect people — but when they operate without transparency or accountability, they hurt innocent people too.

We need laws that protect users as well as platforms.
We need due process.
We need reform.

🗳️ Find and contact your members of U.S. Congress using tools like:

  • Democracy.io — enter your address to identify your Congressional representatives and email them directly.
  • USA.gov — find elected officials and contact forms for local, state, and federal offices.

Suggested message template:

Subject: Please support digital due process and platform oversight

Dear [Representative/Senator Name],

My name is [Your Name], and I live in [City, State]. I’m writing because I was recently suspended by Google’s automated systems and accused of involvement with CSAM — despite never interacting with illegal material. My account was terminated without explanation or meaningful appeal, wiping out access to essential developer tools like Firebase and Google Cloud.

If a private company can determine someone’s guilt — and destroy their livelihood — based only on automated detection, that’s a fundamental fairness problem. I urge you to support legislation that ensures:

  • Transparency and notice before account termination
  • Access to appeal and independent review
  • Accountability for platforms using automated content detection

Why this matters:

  • Google claims to report hundreds of thousands of CSAM cases, yet doesn’t disclose how many are false positives or result in actual legal convictions — leaving many individuals without recourse.
  • In Europe, digital due process protections prevent platforms from permanently stripping accounts without oversight — protections that don’t exist here in the U.S.
  • Just as we expect individuals to have access to financial dispute resolution systems, there should be equitable mechanisms for digital disputes — especially when livelihoods and reputations are at stake.