from Hacker News

A dad took photos of his toddler for a doctor – Google flagged him as a criminal

by dklsf on 8/21/22, 10:15 AM with 559 comments

  • by elif on 8/21/22, 2:22 PM

    From my travels, my impression has been that America in particular treats child nudity as categorically obscene, beyond even adult nudity.

    Compared to a beach in Europe, where nearly half of children under 2 run naked, there seems to be no grey area or minimum threshold of acceptability in the US.

    It makes me wonder if our hypersexualized treatment of child nudity /actively contributes/ to the sexualization of children in our culture.

  • by ugjka on 8/21/22, 11:24 AM

    > Not only did he lose emails, contact information for friends and former colleagues, and documentation of his son’s first years of life, his Google Fi account shut down, meaning he had to get a new phone number with another carrier. Without access to his old phone number and email address, he couldn’t get the security codes he needed to sign in to other internet accounts, locking him out of much of his digital life.

    > “The more eggs you have in one basket, the more likely the basket is to break,” he said.

    I only have a Google account to have access to the Play Store. Everything else (mail, calendar, photos, and storage) has been moved off Google.

  • by cletus on 8/21/22, 12:47 PM

    The biggest failure here is Google's continued refusal to segment their services, and this has been a problem since at least the Google+ days, more than a decade ago at this point.

    During Google+, Google instituted a controversial "real name" policy, driven entirely by Vic Gundotra, who, when asked about it at company meetings (back when he still answered questions; believe me, that ended), would get up and say "we don't want people named 'dog fart'".

    Legitimate concerns that people might have for their safety were completely brushed aside.

    Anyway, enforcement of this was of course automatic and resulted in, I'm sure, many false positives. But what happened when your account was flagged? You lost access to everything.

    This too was criticized at the time, and people asked "well, if it's a policy violation for Google+, why do people lose their Gmail?" These questions too were completely brushed off.

    At that time I decided I simply couldn't and wouldn't use any other Google service, because my Gmail is too important to risk on an automatic ban from a false positive in a completely unrelated product.

    And here we are, a decade later, with the exact same nonsense happening.

    Now of course the CSAM ban itself is ridiculous. Refusing to reverse it is ridiculous. All of that is true. But don't tie your email or your phone to a company's other services when they do blanket bans like this.

    Disclaimer: Xoogler.

  • by zerof1l on 8/21/22, 11:30 AM

    This story should serve as a warning and a motivation to move away from iCloud, Google Cloud, and the like for photo and video storage. The easiest option is to just purchase a drive or NAS and store your media on it. If you're a bit tech-savvy, you can run your own NextCloud. I run NextCloud on Hetzner, a 1TB storage box plus a web server, for a total of $7 a month. Or you can have Hetzner run NextCloud for you for about $4.70 a month for 1TB of storage, but then you don't have full control over it. There are also quite nice NextCloud apps for Android and iOS that you can configure to sync your photos and videos into your cloud.
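
    That sync is nothing exotic under the hood: NextCloud exposes a standard WebDAV endpoint. Here is a minimal sketch in Python of pushing one photo up (the hostname, username, app password, and folder are hypothetical placeholders, not details from the comment above):

      # Push a photo to a self-hosted NextCloud over its standard WebDAV API.
      import pathlib
      import requests

      BASE = "https://nextcloud.example.com/remote.php/dav/files/alice"
      AUTH = ("alice", "app-password")  # app passwords: Settings > Security

      def upload_photo(local_path: str, remote_dir: str = "Photos") -> None:
          """PUT a local file into the given NextCloud folder."""
          name = pathlib.Path(local_path).name
          with open(local_path, "rb") as f:
              resp = requests.put(f"{BASE}/{remote_dir}/{name}", data=f, auth=AUTH)
          resp.raise_for_status()  # 201 = created, 204 = overwritten

      upload_photo("IMG_0001.jpg")
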
  • by mikece on 8/21/22, 10:33 AM

    I worked in a 1 Hour Photo when I was in college. The standing rule was that photos of naked adults were not printed (the customer was given their negatives, though). In the case of naked children, the typical "kid taking their first bath" shot was fine, but if there was any doubt we had a manager review it. I think we had to call the cops a couple of times (more to pass the buck than to make an official call like that ourselves), but there is/was a policy for things like that.

  • by MrDresden on 8/21/22, 11:43 AM

    I keep telling anyone who will listen that they need to move every aspect of their online identity away from the big tech giants, and make sure that each type of service (email, web hosting, photo and video storage, document storage, backups, contacts, etc.) is compartmentalized in such a way that if one gets removed, it won't affect the others in any way.

    Imagine having all your memories in the form of images and videos taken away because of sloppy review work at a tech company.

  • by Havoc on 8/21/22, 11:52 AM

    It's completely ridiculous that Google's customer support process is basically:

    1) AI bot

    2) Beg HN/twitter for insider help

    3) Lawyers

    That's absolutely insane for a product set that has become central to people's lives.

  • by sgdesign on 8/21/22, 10:34 AM

    I’ve heard great things about Google Domains, but this kind of story is exactly why I probably won’t be using that service. The risk of losing everything at once is just too high.

  • by SergeAx on 8/21/22, 12:10 PM

    > Mark spoke with a lawyer about suing Google and how much it might cost.

    > “I decided it was probably not worth $7,000,” he said.

    I believe it is one of the roots of the problem. How is it even possible that getting justice in court in such a trivial case costs about three months of median income?

  • by googlryas on 8/21/22, 3:29 PM

    Have you ever hated child porn so much that you sent private medical photos of someone's naked children to multiple strangers?

    If not, you obviously aren't as committed as Google about ending CSAM.

  • by LorenPechtel on 8/22/22, 4:48 AM

    The real problem here is that companies are not cops, and they should quit acting like cops.

    The instant a company has evidence of a possible crime, it should be required to hand the evidence over to the police and then take no action beyond preventing further distribution or the like.

    This is not just Google's AI goofing up on what constitutes CSAM (and, given the witch hunt around such things, it sounds like Google was being reasonable in informing the police); it's also colleges expelling "rapists" without evidence, etc. The accused never gets anything resembling a fair trial, but since it's not the government doing it, that doesn't matter: there are no repercussions for messing up lives based on completely incompetent investigations.

  • by tomohawk on 8/21/22, 1:18 PM

    It's time for the people to decide how they want companies that provide utilities to behave, and time for utility companies to stop telling the people how to behave.

    In the olden days, if the AT&T monopoly had just cut off phone service to a (convicted in court) pedo, it would have gotten in severe trouble. We the people imposed limits on powerful companies. Even today, with the monopoly split up, this would not be legal, let alone deciding to do it on their own initiative.

    In this case, a utility provider is cutting off service based on a digital rumor. They are judge, jury, and executioner.

    The laws governing telcos were made over a period of 150 years, but most particularly in the 1920s and 1930s.

    Google does not fit these laws because it does not charge for these services (perhaps that should be made illegal?) and monetizes them differently. Also, the services are obviously far beyond simple voice or fax. And yet they are definitely utilities.

    Utility companies must not be politically partisan or active. Mixing the two is toxic and bad for society. It is also too much of a temptation for politicians to use the implied power of utilities over the people to silence or suppress opposition.

    If Google wants to be an activist company, then it will need to shed its utilities. If Google wants to provide utilities, then it needs to shut down its activism.

  • by mgpc on 8/21/22, 9:09 PM

    Google's arrogance here is astonishing. The police found no fault, it's the subject of an NYT investigation, and they still won't restore the account. What hope do the rest of us have?

    I've been an Android user for a long time, but I think this might finally push me to switch to Apple. I'm just disgusted by this.

  • by SergeAx on 8/21/22, 12:13 PM

    Also, this looks like a powerful attack vector. Just slip questionable content onto a victim's phone and voilà: a lot of trouble is under way, probably irreversible.

  • by superchroma on 8/21/22, 10:20 AM

    "Google’s review team [then] flagged a video he made and the San Francisco Police Department had already started to investigate him."

    That sounds like a permanent stain on his records.

  • by nilleo on 8/22/22, 5:44 AM

    This might be the final push I needed to migrate off of Google services. It's been all too convenient to have a one-stop shop for everything, but I couldn't imagine my rage if I lost all of my child's pictures because Google decided that the picture of their first bath (no genitals or face in frame) was too risky.

    What domain registrar is recommended for moving my domains over to?

    I figure I can probably self-host photo/file backups, move 2FA to Bitwarden, and migrate mail over to a paid Protonmail plan, but who can I trust for domain names? Mostly just for email aliases, plus a couple for some hobby websites. GoDaddy can take a hike, and I've used Namecheap before, but what other options are good/trusted?

  • by bborud on 8/21/22, 2:01 PM

    This highlights the enormous risk of depending on Google, and similar service providers, for email, messaging, and other important services. It isn't so much the policies as the fact that Google will never do anything to help you when they get it wrong.

    You are always only one algorithmic fuck-up away from losing access and having to spend days and weeks dealing with the consequences.

    I think the only way to deal with this is through regulation. Make it as inconvenient for Google to ignore customers as it is inconvenient for customers to be ignored by their service provider when something goes wrong.

    Systemic mistreatment of customers ought to have consequences of existential proportions to a company. There is no societal benefit to companies like Google getting away with behaving this poorly.

  • by RcouF1uZ4gsC on 8/21/22, 12:33 PM

    > subsequent review of his account turned up a video from six months earlier that Google also considered problematic, of a young child lying in bed with an unclothed woman.

    Isn’t that also a description of breastfeeding?

  • by lizardactivist on 8/21/22, 10:51 AM

    But they keep insisting that their services and cloud storage are private and secure? Odd.

  • by Tiddles-the2nd on 8/22/22, 12:30 AM

    > “I decided it was probably not worth $7,000 [to sue Google],” he said.

    This is a big part of the problem: technically you have recourse, but for individuals the cost is a barrier to justice. Organisations have a lot of freedom to act behind the cost of litigation.

  • by neurostimulant on 8/21/22, 1:04 PM

    My wife is a doctor and receives these kinds of images over WhatsApp from her patients from time to time. Should she be concerned?

  • by ramesh31 on 8/21/22, 1:47 PM

    What's actually scary here is that these were newly taken photos, not existing CSAM flagged by hash value. That means Google is doing real-time image recognition on all of your photos. And that means Google has an ML model somewhere trained on millions of pictures of... yeah, this is fucked up.
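
    To make the distinction concrete: hash matching only ever catches known images. Here is a minimal sketch of that approach, using the open-source imagehash library as a stand-in for proprietary matchers like PhotoDNA (the blocklist entry and threshold below are made up for illustration):

      # Perceptual-hash matching: compare a photo's hash against a blocklist
      # of hashes of known images. A newly taken photo can't be on any such
      # list, so flagging it implies content classification, not matching.
      from PIL import Image
      import imagehash

      # Hypothetical 64-bit blocklist entry; real systems hold hashes of known material.
      KNOWN_HASHES = [imagehash.hex_to_hash("d1c4b2a3e5f60798")]

      def matches_known(path: str, max_distance: int = 5) -> bool:
          """True if the image is within Hamming distance of a blocklisted hash."""
          h = imagehash.phash(Image.open(path))
          return any(h - known <= max_distance for known in KNOWN_HASHES)

      # A brand-new photo will almost surely return False here; catching it
      # anyway requires running a classifier over the image content itself.
      print(matches_known("new_photo.jpg"))
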
  • by scarface74 on 8/21/22, 10:47 AM

    I have a larger issue that no one has addressed. There has to be some type of special software for the medical profession that lets you take a picture on your phone, send it to the doctor, and never have it stored on the phone.

  • by scarface74 on 8/21/22, 11:00 AM

    I’ve heard too many stories about how non-existent Google’s customer service is to ever trust them with anything critical. You hardly ever hear stories like this about any of the other big tech companies. I’m excluding FB, because I don’t care about FB enough.

  • by JulianMorrison on 8/21/22, 1:09 PM

    Reporting the images to law enforcement is good. There should be a human in the loop to separate medical images from exploitative ones.

    Perma-deleting his account on an automated accusation is bad. That should hinge on, at minimum, law enforcement's decision to charge a crime. [Edit: unless the criminality of the images is obvious - again, a human needs to be in the loop.]

  • by shadowgovt on 8/21/22, 3:23 PM

    This was a predictable outcome of the following factors:

    * Criminalizing possession and creation of child pornography equivalently

    * Conscripting the tech sector into detecting it via SESTA/FOSTA.

    Look forward to reading stories like this ad nauseam.

  • by nullc on 8/21/22, 11:32 AM

    > “I decided it was probably not worth $7,000,” he said.

    lol. Missing 4 zeros there.

    Part of the reason for the brazen actions of companies like Google is that their substantial financial means and legal department sizes grant them a substantial degree of immunity to judicial review.

  • by S_A_P on 8/21/22, 1:56 PM

    Time to break Google up into smaller parts that may allow a competitive market again. Google, FB, and Apple could all use a bit of competition.

  • by peoplefromibiza on 8/21/22, 10:57 AM

    Controversial opinion: as much as everybody knows that China isn't exactly championing the western way of life and western democratic standards, I keep my private files in a Chinese cloud (backups are kept private on a NAS in my house).

    Why?

    Because they are not in contact with our authorities and, frankly, the chances that my private files will be of any interest to Chinese authorities are close to zero.

    Not that I have anything in particular to hide, but as this example proves once again, if life-damaging mistakes can happen, they will happen.

  • by koala_man on 8/22/22, 12:58 AM

    My parents gave me an old photo album I was going to digitize. It includes a photo of me at age 1 having a bath in a wash tub.

    It never occurred to me that this might get my account banned.

  • by TheRealDunkirk on 8/22/22, 3:20 AM

    Given this story, you’d expect Google to make a yearly report saying that they successfully threw X number of pedophiles off their services, and the FBI convicted Y% of them. You’d think it would be something they and the government would love to crow about. But they don’t. Why?

  • by benj111 on 8/21/22, 11:49 AM

    I wonder what happens if I share the Nevermind album cover?

  • by sinuhe69 on 8/22/22, 1:32 AM

    Did Google actively scan and classify users’ photos? It seems unlikely that these particular photos would match anything in the CSAM hash database.

  • by soraminazuki on 8/21/22, 10:56 AM

    > A Google spokeswoman said the company stands by its decisions, even though law enforcement cleared the two men.

    Wow. Just wow. This is worse than Google's usual automated screw-ups. In this case, Google was notified of the issue by the NYT, yet it actively chose to continue to screw over its victims, just because it can.

    > In a statement, Google said, “Child sexual abuse material is abhorrent and we’re committed to preventing the spread of it on our platforms.”

    Just how tone-deaf can Google be, continuing to treat these innocent folks as criminals in this passive-aggressive statement even after being proven wrong? Do these people have no empathy at all?

  • by gregwebs on 8/21/22, 12:49 PM

    What are the ways to mitigate these issues for someone who wants most of the features of Google Photos? Amazon Photos seems to be basically copying Google Photos and has a lot of the features. I wouldn't care if my Amazon account were closed down. And it's free if you're already a Prime member.

  • by quantified on 8/21/22, 9:40 PM

    What the article doesn't explain is how the telehealth service's photos came into contact with Google's image scanning in the first place.

    Is this not a huge breach of HIPAA?

    Telehealth is sort of done for if randos at big tech can find their way into your sexual health records.

  • by Overtonwindow on 8/21/22, 1:33 PM

    ...and with CSAM scanning, Apple is going to get into the same business. With this latest security update, I would not be surprised if scanning has already been deployed.

    Because why on earth would you oppose protecting children? /s

  • by engineer_dude on 8/23/22, 1:33 PM

    This is why I don't have a Dropbox account anymore.

    I am extremely fortunate that the account that was deleted without recourse only contained data I had copies of on my hard drive, and to my knowledge law enforcement isn't involved.

    The article fails to mention the stress and trauma of being accused of possessing CSAM. That remains to this day ... I'm posting from an alt because even a false accusation carries a potentially career- and family-destroying stigma.

  • by aaomidi on 8/21/22, 1:04 PM

    I wonder what Google would do if it realized there are millions of people who live as nudists.

  • by barneygale on 8/21/22, 12:07 PM

    GOOGLE EMPLOYEES: Quit your jobs and do something positive for the world for a change.

  • by welder on 8/21/22, 1:20 PM

    What if someone emailed Google execs stock photos that are known to trigger Google's child-abuse algorithm? They would have to build a way to reactivate banned accounts to get their own accounts back.

  • by Animats on 8/21/22, 11:53 PM

    Google needs to be a regulated common carrier for many of its services. Then you would have a right to service.

  • by tomxor on 8/21/22, 3:43 PM

    > Mark and his wife gave no thought to the tech giants that made this quick capture and exchange of digital data possible

    Well... here we are: normal people don't think it's possible to transfer an image over the internet without a megacorp in the middle of it. That's a pretty strong sign something has gone wrong.

  • by samat on 8/21/22, 11:02 AM

    Is this the case with Apple iCloud Photo Library, too?

  • by bigtex on 8/21/22, 2:01 PM

    We have a direct primary care doctor for our children and would never send a photo with genitalia in it. Either she comes for a house call or we go to her. This article confirms my fear.

  • by abetancort on 8/22/22, 9:40 PM

    Sue Google.

  • by xbar on 8/21/22, 2:14 PM

    Apple Messages encryption is looking pretty good right now. Let's hope they hold the line on CSAM scanning.

  • by throwaway4good on 8/22/22, 1:33 AM

    Sounds like a great tool. Maybe it can be used for things … like preventing the next 9/11.

  • by georgia_peach on 8/21/22, 11:03 AM

    They all embed themselves deeply into our communications for not-so-altruistic purposes: allegedly to "serve us better", realistically to train the shit out of their AIs in the hope of growing (or at least maintaining) market share. If people weren't such cattle, a hard line would have already been drawn. If...

  • by robotburrito on 8/21/22, 6:01 PM

    Richard was right.

  • by hk1337 on 8/21/22, 1:52 PM

    It doesn’t seem right for the doctor to ask the parents to take pictures and then send them over SMS, email, or whatever they were asked to use. Why wouldn’t this just be done within the privacy of the doctor’s office?

  • by makach on 8/21/22, 12:54 PM

    It is beyond me that someone would use email to submit sensitive information. Pandemic aside, you should know better.

    Also, I am sorry this happened. It is very human to respond to a person in authority, but we need to be better and start asking questions. It is our privacy at stake.

    Hopefully everyone learns from this. Also, Google was doing the right things.