by mola on 5/30/25, 5:05 AM
I have this pet theory that this wave of AI won't end with amazing productivity gains.
I think it'll end in mass confusion. A dumbing down of the leadership and elites who can now cosplay as smart and eloquent.
Countries that don't embrace AI will have a massive edge over other countries, because their populations will be smarter and more capable.
by aorloff on 5/30/25, 4:58 AM
"I assure you, Mrs. Buttle, the Ministry is very scrupulous about following up and eradicating any error. If you have any complaints which you'd like to make, I'd be more than happy to send you the appropriate forms."
Braaaaazil......
[where my mind goes with every piece of news from this admin]
by Animats on 5/30/25, 5:11 AM
Someone should put in a FOIA request for the prompt. Or get a congressional committee to ask for it.
by llacb47 on 5/30/25, 5:27 AM
Technically, this is title editorializing, since the WH denied using AI… but they are probably lying.
by esalman on 5/30/25, 5:11 AM
First the bogus tariff equation, now this. Certainly not the last.
Jokes aside, it's really sad to see that seemingly no competent people want to work with this administration.
by mikestew on 5/30/25, 4:51 AM
by duxup on 5/30/25, 4:44 AM
These people even struggle to do a bad job well enough.
by platevoltage on 5/30/25, 4:48 AM
Next time, they'll just add "don't include citations" in the prompt. Problem solved.
by caseysoftware on 5/30/25, 5:18 AM
Can we all agree to not cite any study that a) doesn't exist, b) cannot be reproduced, or c) includes fake data?
For bonus points, those responsible for b) and c) should be forced to pay back whoever funded their research.
by Eddy_Viscosity2 on 5/31/25, 11:44 AM
This is a great example of why AI is the perfect tool for politicians who want to do something and then look for a rationale to justify it. AI will always give you the answer you want, even if it has to make it up.
by gloosx on 5/30/25, 1:08 PM
At this point we can fully replace talking heads with AI. Let them generate reports, write fake citations, review each other’s work, sign it, worship it – all in imitation of what we once called a state. A huge win for humanity.
by kcaseg on 5/30/25, 5:07 AM
Reminds me of this “AI study” on climate change
https://factcheck.afp.com/doc.afp.com.39798G2
It is not the "minor citation errors" that are the most worrying, but rather the fact that LLMs aim to please you, even if you really believe your prompt was neutral (which I doubt when it comes to Kennedy and vaccines, for example).
by anal_reactor on 5/30/25, 7:26 AM
The goal here is to normalize the US government doing random shit, in order to reduce transparency. The average US citizen won't be able to tell whether another political scandal is just Trump being Trump or an actual power grab.
by bananapub on 5/30/25, 8:11 AM
This is obviously horrific, but they are succeeding at their broader goal: making the entire US federal apparatus untrustworthy and run entirely by selfish idiots who do whatever they want at any given time, with the only binding force being the idiotic whims of the president. Almost all of Congress (AOC, Bernie, and a few others are notable exceptions) has already conceded this and continues to do so by not impeaching the president.
to me, it is extremely hard to imagine how this deliberate destruction can be undone in less than decades.
by aristofun on 5/30/25, 1:12 PM
Not really sure which is better: a hallucinating LLM or hallucinating politicians…
by staplung on 5/30/25, 5:05 AM
I think the title here is a little misleading. The citations do indeed appear to be hallucinations, but it is not known whether the report was written partially or entirely with an LLM. I would have no trouble believing that it was, but at this point there isn't even a leak at MAHA claiming that it is.