by jonthepirate on 5/31/23, 7:07 PM with 18 comments
It feels like it understands what I'm asking for and it provides good answers, but so does Google.
I think calling this "Artificial Intelligence" creates a misunderstanding of what's going on because it's pattern matching.
Sure, the input and output are way better than Google's, but if it can't reason, where's the intelligence? The whole thing seems like a hype train that I'm evidently not on.
by PaulHoule on 5/31/23, 7:16 PM
In particular, any kind of "A.I." is always considered by some to not be "A.I.": a chess-playing program is just searching moves, an expert system is just applying rules, software that lays out microchips is just solving an optimization problem, etc.
That is moving the goalposts, and it is a form of ignorance that leaves the field wide open to the likes of Eliezer Yudkowsky. In particular, like the aphorism that "an LLM can't create anything new", it distracts people from the serious task of figuring out what specific things these systems can and cannot do.
by proc0 on 5/31/23, 7:35 PM
Also, and maybe more importantly, it can be argued that reasoning is a form of pattern matching. All brains do is pattern match; they just do it in a complicated way we have no clue about yet, so the side effects and intricacies of the brain's architecture are not seen in the relatively simpler algorithms we have now with artificial neural networks.
That said, maybe a better term would be Algorithmic Cognitive Tools, or something similar, to point out that they just extend our own intelligence. However, I think most agree that eventually we will have proper AI, and machines will be doing some form of reasoning, whether human-like or not. I just don't think that "cognitive architectures" (another misnomer abusing biological terms) are there yet.
by wsgeorge on 5/31/23, 7:34 PM
Is there a name for when someone says "X is not really Y, but it is rather <mentions lower-level mechanism employed by X to achieve Y>"?
Because that's what I'm seeing in this post, and I don't think it makes your argument strong.
by effed3 on 5/31/23, 7:51 PM
Building these AI systems, now and in past years, has proven useful in narrow and specialized fields: chess playing, chemical classification, planning, scientific data analysis, etc. But when the field is human language per se and its significance, I feel something big and deep is still missing.
As for "intelligence", so far we have no good idea of what it really is, how it works, or how it relates to mind and brain.
by mindcrime on 5/31/23, 11:09 PM
For the n'thousandth time: AI does not (necessarily) mean "perfect fidelity with human intelligence". It's many things including a field of research, a body of knowledge, a suite of technologies that display behavior which could be classified as "intelligent" in some sense, and an aspiration, among others. Current "AI" systems absolutely fall under this rubric, even if they aren't functionally equivalent to human intelligence.
Never mind that we don't know for sure that human intelligence doesn't ultimately reduce to "just pattern matching" at some level. And never mind the "AI Effect"[1] where the public at large quits considering anything "AI" once it works. Usually by saying things like "it's just computation" or "it's just pattern matching." :-)
by jstx1 on 5/31/23, 7:58 PM
For most things there's a generally accepted mapping <thing> : <term for the thing> which evolves naturally over time as part of language and culture.
That's what AI is - a name for a thing, not a promise or a contractual obligation to perfectly match the preexisting dictionary meaning of the words that compose it.
by theonemind on 5/31/23, 8:03 PM
Using grossly generic terms without distinctions benefits those who want to misrepresent things for hype or manipulation: they apply concepts that don't hold across the whole generic category to their self-interested corner of it.
Currently, people want GPT to take on the luster of a mythical AGI by using the categorical term "AI", so I just call it GPT/LLM. I'll consider "AI" a field of research, not an adjective or noun suitable for products based on research from the field.