by 2-718-281-828 on 2/24/24, 8:21 AM with 20 comments
by jiggawatts on 2/24/24, 8:39 AM
Before everyone gets up in arms about this, consider that this search is (for now) using simple keyword matching and isn't an AI that can understand sentences.
If you search for "woman with man" you'll get mostly white couples and the like.
Adding "white" anywhere prompts the search to look for the words "man", "woman", and "white". Images of random white couples aren't labelled with "white" in the alt-text or metadata. Images of mixed-race couples often mention "white", hence... the counter-intuitive behaviour.
This is a bit like adding "Earth" to a search term. Practically speaking no pictures taken on Earth are labelled as such, so the matches will tend to have a space theme to them.
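To make the mechanism concrete, here is a toy sketch of that kind of bag-of-words matching. The filenames and alt-text strings are invented for illustration; this is not Google's actual pipeline:

    # Toy keyword matcher: score each image by how many query terms
    # appear verbatim in its alt-text. All data is made up.
    images = {
        "couple-on-beach.jpg": "man and woman walking on the beach",
        "wedding-photo.jpg": "bride and groom at their wedding",
        "stock-photo-123.jpg": "white man and black woman holding hands",
    }

    def score(query_terms, alt_text):
        words = set(alt_text.lower().split())
        return sum(term in words for term in query_terms)

    query = ["white", "woman", "man"]
    for name in sorted(images, key=lambda n: score(query, images[n]), reverse=True):
        print(name, score(query, images[name]))

Only the mixed-race caption literally contains "white", so it scores 3 and ranks first, while the plain couple photo scores 2: exactly the counter-intuitive ordering described above.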
by NhanH on 2/24/24, 8:37 AM
It is probably something in the content being produced itself, rather than bias within the search engine. Then again, given the way things work, it might make no sense to separate the two.
by dageshi on 2/24/24, 8:53 AM
Feels like the pressure to maintain the stock price and not rock the boat too much is meeting the kind of existential dread that GPT is going to eat their lunch, maybe sooner rather than later.
Maybe this is some of that madness leaking out? I've never seen them fuck up search like this before; it feels inexplicable to me that Google would let this reach production.
by andsoitis on 2/24/24, 8:50 AM
It returns the images from web pages that are good matches for your query. There's no guarantee that the images on those top results will represent your query.
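A rough sketch of why that gap exists, with made-up page data: the query is scored against the page's text, and whatever images happen to sit on the winning page come along for the ride, whether or not they depict the query:

    # Toy image search: rank pages by query-term overlap with their
    # text, then return every image on the best page. Data is invented.
    pages = [
        {"text": "article discussing white couples in advertising",
         "images": ["ad-agency-logo.png", "unrelated-stock-photo.jpg"]},
        {"text": "gallery of couples on vacation",
         "images": ["beach-couple.jpg"]},
    ]

    def page_score(query_terms, page):
        words = set(page["text"].lower().split())
        return sum(term in words for term in query_terms)

    best = max(pages, key=lambda p: page_score(["white", "couple"], p))
    print(best["images"])  # the page won the match, not the images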
by theGeatZhopa on 2/24/24, 8:57 AM
Google has stopped being very useful without precise searches. They give you what they think you should read, not what you ask for. More and more, I use Copilot for searching the web: a few questions and a request for a list let me quickly decide where to go.