
The AI dooooooom thread

Started by Hamilcar, April 06, 2023, 12:44:43 PM


DGuller

Quote from: Jacob on May 30, 2023, 09:46:54 PM
Why would you want to strip away the heuristics specific to humans?
Because heuristics are the opposite of principled thinking and thus not helpful in understanding the concepts.  In fact, they often muddy the concepts.

DGuller

Quote from: The Minsky Moment on May 30, 2023, 10:33:17 PM
Quote from: DGuller on May 30, 2023, 09:17:29 PM
Seriously, though, I think what Minsky described is exactly what intelligence is, when you strip away the heuristics specific to humans.  Intelligence is the ability to generalize from prior experience and education in order to understand new situations that you haven't experienced before.

Generative AI doesn't understand new situations (or indeed anything).  It doesn't have experiences and it doesn't recognize new situations.
Depends on what you mean by understanding situations.  To me a definition of understanding a situation is being able to anticipate what would happen in the future.  You've never put a hand on a hot stove, but you've seen your brother do that and get burned.  You've never experienced putting your hand on a hot stove, but you anticipate getting burned in a hypothetical situation where you put your hand on a hot stove, because you generalized from observing your brother's mishap.  You don't have a datapoint, but you're still capable of generating a hypothetical one because of your ability to generalize.

ChatGPT can already write computer code for you.  To me that's already intelligence.  The code it's generating for you is most likely brand new and nothing it's ever seen before, but it can still generate it because it's able to generalize from all the code and the narrative it was exposed to during its training.

As for AI not having experiences, it does.  For AI models experience is the data on which they're trained (and education is transfer learning).
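To make that "experience = training data, education = transfer learning" analogy concrete, here's a toy sketch. This is a made-up one-parameter model, nothing like a real neural network, but it shows the mechanic: a weight learned on one task gives a head start on a related one.

```python
# Toy sketch: "experience" as training, "education" as transfer learning.
# We fit a one-parameter model y = w * x by gradient descent on task A,
# then reuse the learned weight as the starting point for a related
# task B. All tasks and numbers here are invented for illustration.

def train(data, w=0.0, lr=0.01, steps=200):
    for _ in range(steps):
        # gradient of mean squared error with respect to w
        grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
        w -= lr * grad
    return w

task_a = [(x, 2.0 * x) for x in range(1, 6)]   # underlying rule: y = 2.0x
task_b = [(x, 2.2 * x) for x in range(1, 6)]   # related rule:    y = 2.2x

w_pretrained = train(task_a)                              # "experience"
w_transferred = train(task_b, w=w_pretrained, steps=20)   # brief "education"
w_scratch = train(task_b, w=0.0, steps=20)                # same budget, no experience

print(w_pretrained, w_transferred, w_scratch)
```

With the same small training budget on task B, the weight that starts from task A's solution ends up closer to the true answer than the one starting from scratch, which is the whole point of transfer learning.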

Syt

Call me crazy, but can't we have both?

I am, somehow, less interested in the weight and convolutions of Einstein's brain than in the near certainty that people of equal talent have lived and died in cotton fields and sweatshops.
—Stephen Jay Gould

Proud owner of 42 Zoupa Points.

HVC

Being lazy is bad; unless you still get what you want, then it's called "patience".
Hubris must be punished. Severely.

The Minsky Moment

#94
Quote from: DGuller on May 30, 2023, 11:07:40 PM
Depends on what you mean by understanding situations.

At a minimum it would involve an ability to recognize a situation.  Current AI systems can't do that beyond recognizing that an inquiry has been made.

Quote
To me a definition of understanding a situation is being able to anticipate what would happen in the future.

My understanding of current generative AI systems is that they don't do that.  They don't anticipate and don't recognize a past, present or future. 

Quote
ChatGPT can already write computer code for you.  To me that's already intelligence.

OK.

Quote
As for AI not having experiences, it does.  For AI models experience is the data on which they're trained (and education is transfer learning).

Again, it becomes a definitional question.  If experience means nothing more than some sort of interaction with facts or data, then you are correct.  If it means anything more than that, then you are not.
The purpose of studying economics is not to acquire a set of ready-made answers to economic questions, but to learn how to avoid being deceived by economists.
--Joan Robinson

crazy canuck

btw ChatGPT does not "write" code.  It finds code that was already written, contained within its database, that corresponds to the inquiry that has been made.

Josquius

I've just tried experimenting with ChatGPT by asking it for some website code. I phrased the instructions vaguely and not very well, and... you know, it's actually quite impressive, and didn't need me to have much knowledge to implement. If ChatGPT had something like Midjourney...

crazy canuck

Yes, it is very good at responding to an inquiry and finding stuff in its database that relates to it.  But you had better know how to read code to make sure it is what you actually want.

DGuller

Quote from: crazy canuck on May 31, 2023, 11:00:42 AM
btw ChatGPT does not "write" code.  It finds code that was already written that is contained within its database that corresponds to the inquiry that has been made.

That's not correct: it most certainly does write novel code, and it would be a statistical impossibility for a database to always contain exactly the code you need.  The database was used to train the generative function so that the code it generates is relevant and valid.  Sometimes it fails at that, but often the mistakes it makes are of the "intelligent guess" variety, like using argument names that have never existed but that it seems logical to think would exist.

crazy canuck

Dude, it's just predicting the next word or symbol if the code is not in its database. It is not "writing" anything.

DGuller

It's a neural network, it has no database.  It's always predicting the next word, that's how it writes all answers.
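For the curious, the "predicting the next word" mechanic can be sketched with a toy model. This is a bigram counter over a made-up corpus, not a transformer, so it vastly understates what a real model does, but it shows the key point: nothing is looked up from a store of canned answers; every word is generated from statistics distilled out of the training text.

```python
from collections import defaultdict, Counter

# Invented stand-in for a training corpus.
corpus = "the cat sat on the mat the cat ate the fish".split()

# "Training": count how often each word follows each other word.
counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    counts[prev][nxt] += 1

def next_word(word):
    """Predict the most likely continuation from the learned counts."""
    if word not in counts:
        return None
    return counts[word].most_common(1)[0][0]

def generate(start, length=5):
    """Generate text one predicted word at a time."""
    out = [start]
    for _ in range(length):
        w = next_word(out[-1])
        if w is None:
            break
        out.append(w)
    return " ".join(out)

print(generate("the"))
```

The generated sentence need not appear anywhere in the corpus; it is assembled word by word from the learned statistics, which is the (much simplified) sense in which a language model "writes" rather than retrieves.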

Hamilcar

So this is what mansplaining feels like.  :D

DGuller

Quote from: Hamilcar on May 31, 2023, 01:22:59 PM
So this is what mansplaining feels like.  :D
Come on, don't scare him off, let him share his insights...  :)


Tamas

One thing is sure: journalists around the world are worried they have automated competition now.

I realise the revolutionary possibilities and what a big leap this ChatGPT level of "AI" is/can be, but I find the endless articles about a civilisation-level existential threat ridiculous.