from Hacker News

"white woman with white man" on Google Image Search

by 2-718-281-828 on 2/24/24, 8:21 AM with 20 comments

  • by jiggawatts on 2/24/24, 8:39 AM

    Bing, Yandex, and DuckDuckGo all have similar results.

    Before everyone gets up in arms about this, consider that this search is (for now) using simple keyword matching and isn't an AI that can understand sentences.

    If you search for "woman with man" you'll get mostly white couples and the like.

    Adding "white" anywhere prompts the search to look for the words "man", "woman", and "white". Images of random white couples aren't labelled with "white" in the alt-text or metadata. Images of mixed-race couples often mention "white", hence... the counter-intuitive behaviour.

    This is a bit like adding "Earth" to a search term. Practically speaking no pictures taken on Earth are labelled as such, so the matches will tend to have a space theme to them.
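    The keyword-matching behaviour described above can be sketched in a few lines. This is a toy illustration, not any engine's actual ranker, and the alt-text strings are made-up examples:

    ```python
    # Toy sketch of naive keyword matching over image alt-text.
    # Hypothetical data; illustrates why the literal word "white"
    # favours images whose metadata mentions race explicitly.

    def keyword_score(query, alt_text):
        """Count how many query keywords appear in the alt-text."""
        words = set(alt_text.lower().split())
        return sum(1 for term in query.lower().split() if term in words)

    images = {
        "couple1.jpg": "smiling man and woman at the beach",
        "couple2.jpg": "black man with white woman on their wedding day",
    }

    query = "white woman with white man"
    ranked = sorted(images, key=lambda img: keyword_score(query, images[img]),
                    reverse=True)
    # couple2.jpg ranks first: its alt-text is the only one that
    # contains the literal word "white".
    print(ranked)
    ```

    A photo of an actual white couple rarely says "white" anywhere in its metadata, so under this kind of matching it scores lower than a photo whose caption spells the word out.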

  • by NhanH on 2/24/24, 8:37 AM

    This happens on all search engines. I tried Bing, DDG, Yandex. Baidu shows the fewest interracial images (using the exact English phrase, not Chinese), but there are still a number of them.

    It is probably something about the content being produced itself, rather than bias within the search engines. But then, given the way these things feed into each other, it might make no sense to separate the two.

  • by dash2 on 2/24/24, 8:28 AM

    As everyone knows, the correct way to search for this is "woman without white man".[1]

    [1] https://github.com/elsamuko/Shirt-without-Stripes

  • by dageshi on 2/24/24, 8:53 AM

    If you think about it, GPT is kinda the first time Google has ever really been seriously challenged in its core business, and I'm honestly wondering if it's causing the company to go a bit insane.

    Feels like the pressure to maintain the stock price and not rock the boat too much is meeting the kind of existential dread that GPT is going to eat their lunch and maybe sooner than later.

    Maybe this is some of that madness leaking out? Because I've never seen them fuck up search like this before; it feels inexplicable to me that Google would let this reach production.

  • by andsoitis on 2/24/24, 8:50 AM

    Image Search (not just google, try this on yandex.com for instance) does not look for images that match what you query.

    It returns the images from web pages that are good matches for your query. There's no guarantee that the images on those top results will represent your query.
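    That page-level behaviour can be sketched as follows. This is an assumed mechanism for illustration, not any engine's real pipeline, and the page data is invented:

    ```python
    # Toy illustration: pages are ranked by how well their *text*
    # matches the query, and the images embedded on the winning page
    # are returned, whether or not each image itself matches.

    def page_score(query, page_text):
        """Count how many query keywords appear in the page text."""
        words = set(page_text.lower().split())
        return sum(1 for term in query.lower().split() if term in words)

    pages = [
        {"text": "essay about white woman and white man stereotypes",
         "images": ["stock_photo.jpg", "unrelated_chart.png"]},
        {"text": "photo gallery of couples",
         "images": ["couple.jpg"]},
    ]

    query = "white woman with white man"
    best = max(pages, key=lambda p: page_score(query, p["text"]))
    # Returns every image on the best-matching page, including ones
    # that have nothing to do with the query.
    print(best["images"])
    ```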

  • by mongol on 2/24/24, 8:39 AM

    For me, "black woman with black man" gives almost identical results. I am at a loss as to what is happening here.

  • by Trias11 on 2/24/24, 8:35 AM

    Add correct pronouns :-P

  • by theGeatZhopa on 2/24/24, 8:57 AM

    Reading this. Me -> Instant horny

    Google has stopped being very useful without precise searches. They give you what they think you should read, not what you ask for. I use Copilot more and more for searching the web. A few questions and a request for a list let me quickly decide where to go.