by dklsf on 8/21/22, 10:15 AM with 559 comments
by Traubenfuchs on 8/21/22, 10:35 AM
by elif on 8/21/22, 2:22 PM
Compared to beaches in Europe, where nearly half of children under 2 run naked, there seems to be no grey area or minimum level of acceptability in the US.
It makes me wonder if our hypersexualized treatment of child nudity /actively contributes/ to the sexualization of children in our culture.
by ugjka on 8/21/22, 11:24 AM
> “The more eggs you have in one basket, the more likely the basket is to break,” he said.
I only have a google account to have access to Play store. Everything else, mail, calendar, photos and storage, has been moved off Google
by cletus on 8/21/22, 12:47 PM
During Google+, Google instituted a "real name" policy. This was controversial and driven entirely by Vic Gundotra, who would get up at company meetings when asked about it (back when he still answered questions, because believe me, that ended) and say "we don't want people named 'dog fart'".
Legitimate concerns that people might have for their safety were completely brushed aside.
Anyway, enforcement of this was of course automatic and resulted in, I'm sure, many false positives. But what happened when your account was flagged? You lost access to everything.
This too was criticized at the time and people asked "well if it's a policy violation for Google+, why do people lose their Gmail?". These questions too were completely brushed off.
At this time I decided I simply couldn't and wouldn't use any other Google service because my Gmail is too important to risk by an automatic ban from a false positive on a completely unrelated product.
And here we are, a decade later, with the exact same nonsense happening.
Now of course the CSAM ban itself is ridiculous. Refusing to reverse it is ridiculous. All of that is true. But don't tie your email or your phone to a company's other services when they do blanket bans like this.
Disclaimer: Xoogler.
by zerof1l on 8/21/22, 11:30 AM
by mikece on 8/21/22, 10:33 AM
by MrDresden on 8/21/22, 11:43 AM
Imagine having all your memories in the form of images and videos taken away because of sloppy review work at a tech company.
by Havoc on 8/21/22, 11:52 AM
1) AI bot
2) Beg HN/twitter for insider help
3) Lawyers
That's absolutely insane for a product set that has become central to one's life
by sgdesign on 8/21/22, 10:34 AM
by SergeAx on 8/21/22, 12:10 PM
> “I decided it was probably not worth $7,000,” he said.
I believe it is one of the roots of the problem. How is it even possible that getting justice in court in such a trivial case costs about three months of median income?
by googlryas on 8/21/22, 3:29 PM
If not, you obviously aren't as committed as Google is to ending CSAM.
by LorenPechtel on 8/22/22, 4:48 AM
The instant a company has evidence of a possible crime being committed, it should be required to hand the evidence over to the police and then take no action other than preventing its distribution or the like.
This is not just Google's AI goofing up on what constitutes CSAM (and, given the witch hunt around such things, it sounds like Google was being reasonable in informing the police), but also colleges expelling "rapists" without evidence, etc. The accused never gets anything resembling a fair trial, but since it's not the government doing it, that doesn't matter: there are no repercussions for messing up lives based on completely incompetent investigations.
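The base-rate problem behind these automated accusations can be made concrete. A minimal sketch (all numbers here are illustrative assumptions, not Google's actual figures) of why even a very accurate scanner produces mostly false flags when the material it looks for is vanishingly rare:

```python
def flagged_counts(photos, prevalence, tpr, fpr):
    """Return (true_flags, false_flags) for a scanner run over `photos` images.

    prevalence: fraction of images that actually are abusive material
    tpr: true-positive rate (fraction of real material the scanner catches)
    fpr: false-positive rate (fraction of innocent images wrongly flagged)
    """
    bad = photos * prevalence
    innocent = photos - bad
    return bad * tpr, innocent * fpr

# Hypothetical numbers: a billion uploads, 1-in-a-million true prevalence,
# a scanner that catches 99% of real material and wrongly flags only 0.01%
# of innocent images.
true_flags, false_flags = flagged_counts(
    photos=1_000_000_000,
    prevalence=1e-6,
    tpr=0.99,
    fpr=1e-4,
)
precision = true_flags / (true_flags + false_flags)
print(f"true flags: {true_flags:.0f}, false flags: {false_flags:.0f}")
print(f"share of flags that are real: {precision:.1%}")
```

Even with these generous assumptions, false flags outnumber real ones by roughly a hundred to one, which is exactly why a human review step (and police involvement before punishment) matters.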
by tomohawk on 8/21/22, 1:18 PM
In the olden days, if the AT&T monopoly had simply cut off phone service to a (convicted in court) pedo, it would have gotten in severe trouble. We the people imposed limits on powerful companies. Even today, with the monopoly split up, this would not be legal, let alone just deciding on their own initiative to do it.
In this case, a utility provider is cutting off service based on a digital rumor. They are judge, jury, and executioner.
The laws governing telcos were made over a period of 150 years, but most particularly in the 1920s and 1930s.
Google does not fit these laws because it does not charge for these services (perhaps that should be made illegal?) and monetizes them differently. Also, the services obviously go far beyond simple voice or fax. And yet, they are definitely utilities.
Utility companies must not be politically partisan or active. Mixing those two things is toxic and bad for society. It is also too much of a temptation for politicians to use the implied power of utilities over the people to silence or suppress opposition.
If Google wants to be an activist company, then it will need to shed its utilities. If Google wants to provide utilities, then it needs to shut down its activism.
by mgpc on 8/21/22, 9:09 PM
I've been an Android user for a long time, but I think this might finally push me to switch to Apple. I'm just disgusted by this.
by SergeAx on 8/21/22, 12:13 PM
by superchroma on 8/21/22, 10:20 AM
That sounds like a permanent stain on his records.
by nilleo on 8/22/22, 5:44 AM
What's recommended for a domain registrar to move my domains over to?
I figure I can probably self-host photo/file backups, move 2FA to Bitwarden, and migrate mail over to a paid Protonmail plan, but who can I trust for domain names? Mostly just for email aliases, but a couple for some hobby websites. GoDaddy can take a hike, and I've used namecheap before but what other options are good/trusted?
by bborud on 8/21/22, 2:01 PM
You are always only one algorithmic fuck-up away from losing access and having to spend days and weeks dealing with the consequences.
I think the only way to deal with this is through regulation. Make it as inconvenient for Google to ignore customers as it is inconvenient for customers to be ignored by their service provider when something goes wrong.
Systemic mistreatment of customers ought to have consequences of existential proportions to a company. There is no societal benefit to companies like Google getting away with behaving this poorly.
by RcouF1uZ4gsC on 8/21/22, 12:33 PM
Isn’t that also a description of breastfeeding?
by lizardactivist on 8/21/22, 10:51 AM
by Tiddles-the2nd on 8/22/22, 12:30 AM
This is a big part of the problem: technically you have recourse, but the cost to individuals is a barrier to justice. Organisations have a lot of freedom to act, shielded by the cost of litigation.
by neurostimulant on 8/21/22, 1:04 PM
by ramesh31 on 8/21/22, 1:47 PM
by scarface74 on 8/21/22, 10:47 AM
by scarface74 on 8/21/22, 11:00 AM
by JulianMorrison on 8/21/22, 1:09 PM
Perma-deleting his account on an automated accusation is bad. That should hinge on, at minimum, law enforcement's decision to charge a crime. [Edit: unless the criminality of the images is obvious - again, a human needs to be in the loop.]
by shadowgovt on 8/21/22, 3:23 PM
* Criminalizing possession and creation of child pornography equivalently
* Conscripting the tech sector into detecting it via SESTA/FOSTA.
Look forward to reading stories like this ad nauseam.
by nullc on 8/21/22, 11:32 AM
lol. Missing 4 zeros there.
Part of the reason for the brazen actions of companies like Google is that their financial means and the size of their legal departments grant them a substantial degree of immunity to judicial review.
by S_A_P on 8/21/22, 1:56 PM
by peoplefromibiza on 8/21/22, 10:57 AM
Why?
Because they are not in contact with our authorities and, frankly, the chances that my private files will be of any interest to the Chinese authorities are close to zero.
Not that I have anything in particular to hide, but as this example proves once again, if life-damaging mistakes can happen, they will happen.
by koala_man on 8/22/22, 12:58 AM
It never occurred to me that this might get my account banned.
by TheRealDunkirk on 8/22/22, 3:20 AM
by benj111 on 8/21/22, 11:49 AM
by sinuhe69 on 8/22/22, 1:32 AM
by soraminazuki on 8/21/22, 10:56 AM
Wow. Just wow. This is worse than Google's usual automated screw-ups. In this case, Google was notified of the issue by the NYT. Yet they actively chose to continue to screw over their victims just because they can.
> In a statement, Google said, “Child sexual abuse material is abhorrent and we’re committed to preventing the spread of it on our platforms.”
Just how tone-deaf can Google be, continuing to treat these innocent folks as criminals in this passive-aggressive statement even after being proven wrong? Do these people have no empathy at all?
by gregwebs on 8/21/22, 12:49 PM
by quantified on 8/21/22, 9:40 PM
Is this not a huge breach of HIPAA?
Telehealth is sort of done for if randos at big tech can find their way into your sexual health records.
by Overtonwindow on 8/21/22, 1:33 PM
Because why on earth would you oppose protecting children? /s
by engineer_dude on 8/23/22, 1:33 PM
I am extremely fortunate that the account that was deleted without recourse only contained data I had copies of on my hard drive, and to my knowledge law enforcement isn't involved.
The article fails to mention the stress and trauma of being accused of possessing CSAM. That remains to this day ... I'm posting from an alt because even a false accusation carries a potentially career- and family-destroying stigma.
by aaomidi on 8/21/22, 1:04 PM
by barneygale on 8/21/22, 12:07 PM
by welder on 8/21/22, 1:20 PM
by Animats on 8/21/22, 11:53 PM
by tomxor on 8/21/22, 3:43 PM
Well... here we are, normal people don't think it's possible to transfer an image over the internet without a megacorp being in the middle of it. Pretty strong sign something has gone wrong.
by samat on 8/21/22, 11:02 AM
by bigtex on 8/21/22, 2:01 PM
by abetancort on 8/22/22, 9:40 PM
by xbar on 8/21/22, 2:14 PM
by throwaway4good on 8/22/22, 1:33 AM
by georgia_peach on 8/21/22, 11:03 AM
by robotburrito on 8/21/22, 6:01 PM
by hk1337 on 8/21/22, 1:52 PM
by makach on 8/21/22, 12:54 PM
Also, I am sorry this happened. It is very human to respond to a person in authority, but we need to be better and start asking questions. It is our privacy at stake.
Hopefully everyone learns from this. And, to be fair, Google was trying to do the right thing.