Comments

Comment by Rallo Vulpus (rallo-vulpus) on 2022 was the year AGI arrived (Just don't call it that) · 2023-01-09T22:00:33.313Z · LW · GW

This is just silly. "AI" is not intelligent; it's a bunch of nodes with weights that have studied human behavior, or other patterns, long enough to replicate them, to a degree that often leaves obvious tells that it's nothing more than a replication of previously fed data. While it's impressive that it can perform well on standardized tests, that's simply because it's been fed enough related data to fake its way to an answer. It's not thinking about its answers like a human would; it's mashing previously seen responses together well enough to spit out something that has a chance of being right. If it were intelligent, we'd call that a "guess" rather than knowledge.
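To make concrete what I mean by "mashing previously seen responses together": here's a toy Python sketch (my own invented example, nowhere near the scale of a real network of weights) that just records which word followed which in its training text and replays those pairs. It can only ever recombine what it was fed:

```python
import random
from collections import defaultdict

# Toy "language model": record which word follows which in the training text,
# then generate by replaying those observed pairs. Nothing here understands anything.
training_text = "the cat sat on the mat and the cat slept on the mat"

follows = defaultdict(list)
words = training_text.split()
for current, nxt in zip(words, words[1:]):
    follows[current].append(nxt)

def generate(start, length=8):
    out = [start]
    for _ in range(length):
        options = follows.get(out[-1])
        if not options:
            break
        out.append(random.choice(options))  # pick something it has already seen
    return " ".join(out)

print(generate("the"))
# e.g. "the cat slept on the mat and the cat" -- recombined training data, nothing new
```

A real model swaps the word counts for billions of learned weights, but the point stands: it's pattern replay, not thought.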

ChatGPT cannot code; its answers are almost always riddled with bugs even as it stays hyper-confident in its implementation, because it isn't actually thinking about or understanding what it's doing. It's simply replicating what it was fed: humans writing code, and humans being confident in their implementations, bugs and all. The difference is that the bugs ChatGPT creates are hilariously silly, not anything an intelligent creator would have made. A network of weights and numbers, however...
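To illustrate (this is a hypothetical snippet I wrote myself, not an actual ChatGPT transcript), the kind of confidently presented, silly bug I mean looks like this:

```python
# Hypothetical example of confidently presented, subtly broken code of the kind
# I'm describing; written by me as an illustration, not actual ChatGPT output.
def average(numbers):
    """Return the average of a list of numbers."""
    total = 0
    for n in numbers:
        total += n
    return total / (len(numbers) - 1)  # bug: off-by-one divisor; also crashes on a one-element list

print(average([2, 4, 6]))  # prints 6.0 instead of the correct 4.0
```

The docstring is confident, the output looks plausible, and the answer is wrong in a way no careful human author would ship.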

An AI cannot, and never will, drive a car on its own; driving is so reliant on human sight, and on how human brains are wired to their eyes, that no algorithm can hope to replicate it without, at the very least, replicating how eyesight works: nerve endings, rods and cones, and signals feeding into a specific part of an organic brain. A camera and an algorithm simply cannot compete there, and any "news" of a self-driving car is overblown and probably false.

StableDiffusion and other art algorithms can't create art either; the "art" they produce is portions of works by artists (stolen without their consent), mashed up so granularly that it's hard to tell, and spat back out in a pattern keyed to words from the art pieces it's been fed. It's smoke and mirrors with a whole lot of copyright theft from real artists. One could make the (poor) argument that this simply replicates humans being inspired by other art, but that's a stretch, considering that humans can at least create something wholly new without using chopped-up pieces of the original. Ask an AI art generator to draw a human hand to see its lack of intelligent understanding. It's a simple algorithm performing art theft.

I could go on, but all in all, the issue here is that what we call "Artificial Intelligence" is being conflated with the science-fiction concept of the same name. The author heard about neural networks producing okay-ish responses, because they were trained on human responses and told to replicate them, and went, "Wow, this is just like that sci-fi novel I read that uses the same words," when in reality they are two entirely different concepts.