The Technological Singularity and superintelligence revolution

Started by Siege, February 23, 2016, 08:42:05 AM


alfred russel

Quote from: 11B4V on February 24, 2016, 10:11:58 PM


Eh, I'd rather be climate aware. Much more important issue.

The singularity will be able to solve our climate issues.  :)
They who can give up essential liberty to obtain a little temporary safety, deserve neither liberty nor safety.

There's a fine line between salvation and drinking poison in the jungle.

I'm embarrassed. I've been making the mistake of associating with you. It won't happen again. :)
-garbon, February 23, 2014

Razgovory

I think you should be skeptical of futurists in general, and extremely suspicious of ones that make absurd claims like "we will abolish death" or "we will create anything from nothing".
I've given it serious thought. I must scorn the ways of my family, and seek a Japanese woman to yield me my progeny. He shall live in the lands of the east, and be well tutored in his sacred trust to weave the best traditions of Japan and the Sacred South together, until such time as he (or, indeed his house, which will periodically require infusion of both Southern and Japanese bloodlines of note) can deliver to the South it's independence, either in this world or in space.  -Lettow April of 2011

Raz is right. -MadImmortalMan March of 2017

crazy canuck

Quote from: Razgovory on February 25, 2016, 06:51:49 PM
I think you should be skeptical of futurists in general, and extremely suspicious of ones that make absurd claims like "we will abolish death" or "we will create anything from nothing".

Or you can simply conclude they watch too many Star Trek re-runs.

11B4V

Quote from: alfred russel on February 25, 2016, 06:15:34 PM
Quote from: 11B4V on February 24, 2016, 10:11:58 PM


Eh, I'd rather be climate aware. Much more important issue.

The singularity will be able to solve our climate issues.  :)

Yeah, by killing all the humans.
"there's a long tradition of insulting people we disagree with here, and I'll be damned if I listen to your entreaties otherwise."-OVB

"Obviously not a Berkut-commanded armored column.  They're not all brewing."- CdM

"We've reached one of our phase lines after the firefight and it smells bad—meaning it's a little bit suspicious... Could be an amb—".

Iormlund

Quote from: Monoriu on February 23, 2016, 09:36:13 PM
Let's assume for a moment that this technological singularity will happen.  One of the implications is that computers and robots will be able to self-improve at an increasing rate.  So they will become self-aware, and will no doubt become much more capable than humans.  This sounds suspiciously like Skynet.  What I am trying to say is, is the technological singularity something that we should look forward to, or is it something that we need to guard against?  :ph34r:

The technological singularity will turn us into Skynet.

We will start by simply augmenting our bodies: for example, with nanobot immune systems that can defend us from anything with a simple firmware update, emergency shut-off valves for our bloodstream in case of accident, bypassable pain receptors, enhanced senses, datalinks, and auxiliary co-processors and memory banks.

DontSayBanana

Gah, futurists.  I wish I got paid that kind of money to speculate wildly, incorrectly describe processes, make up whatever pseudo-science I feel will back up my claim, and generally scare the crap out of people.

The plain fact of the matter, speaking as someone who's trying to specialize in artificial general intelligence, is that we don't have a path toward making one.  It's not a question of numbers, like the futurists claim; it's the fact that we simply can't make a computer convincingly show preference.  Also, you don't know what you don't know.  The best futurists in the world can only make guesses when it comes to AGI: when it will happen, what it will look like, how it will react (but you better believe we're going to keep it isolated until we're damn sure it's safe), etc.

Oh, and you can take that "die progress unit" and file it right next to a Scientologist's "thetans"; that's how meaningful a measurement it is.  The author made it up, and it's total bullshit.  If culture shock due to technology could kill an early human, then a chimpanzee would die upon seeing a can opener "magically transform" a can into food.  I'm not about to put too much stock in an author who uses an easily disprovable urban myth as a "unit of measurement."
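
To illustrate what I mean by "show preference," here's a toy Python sketch (the options and the utility numbers are made up for the example): any "preference" we can build today is just a hard-coded scoring function the programmer chose.  Swap the numbers and the machine "prefers" something else.  It ranks; it doesn't want.

```python
# Toy sketch only: a machine that appears to "prefer" coffee.
# Its "taste" is nothing but a lookup table a human wrote; there is
# no wanting here, just arithmetic over someone else's values.

OPTIONS = ["coffee", "tea", "water"]

# Programmer-supplied "preferences" (hypothetical numbers).
UTILITY = {"coffee": 0.9, "tea": 0.6, "water": 0.3}

def choose(options):
    """Return whichever option has the highest hard-coded utility."""
    return max(options, key=lambda option: UTILITY[option])

print(choose(OPTIONS))  # -> coffee, because a human put 0.9 next to it
```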
Experience bij!

The Brain

Women want me. Men want to be with me.

Tonitrus

Quote from: 11B4V on February 25, 2016, 07:40:53 PM
Quote from: alfred russel on February 25, 2016, 06:15:34 PM
Quote from: 11B4V on February 24, 2016, 10:11:58 PM


Eh, I'd rather be climate aware. Much more important issue.

The singularity will be able to solve our climate issues.  :)

Yeah, by killing all the humans.

That won't work...we'll still have a climate.  :mad:

Razgovory

Quote from: DontSayBanana on February 26, 2016, 03:08:12 PM
Gah, futurists.  I wish I got paid that kind of money to speculate wildly, incorrectly describe processes, make up whatever pseudo-science I feel will back up my claim, and generally scare the crap out of people.

The plain fact of the matter, speaking as someone who's trying to specialize in artificial general intelligence, is that we don't have a path toward making one.  It's not a question of numbers, like the futurists claim; it's the fact that we simply can't make a computer convincingly show preference.  Also, you don't know what you don't know.  The best futurists in the world can only make guesses when it comes to AGI: when it will happen, what it will look like, how it will react (but you better believe we're going to keep it isolated until we're damn sure it's safe), etc.

Oh, and you can take that "die progress unit" and file it right next to a Scientologist's "thetans"; that's how meaningful a measurement it is.  The author made it up, and it's total bullshit.  If culture shock due to technology could kill an early human, then a chimpanzee would die upon seeing a can opener "magically transform" a can into food.  I'm not about to put too much stock in an author who uses an easily disprovable urban myth as a "unit of measurement."

It would appear to be a problem of kind rather than degree.  If you sped up a dog's brain you wouldn't have a dog who thinks like a man, you would just have a dog that's particularly quick on his feet.  To build this "god computer" you would need to program it to be able to think in ways that human beings can't imagine.  That would appear to be impossible by definition.

It seems to me that "The singularity" is to computer science as the philosopher's stone was to chemistry.
I've given it serious thought. I must scorn the ways of my family, and seek a Japanese woman to yield me my progeny. He shall live in the lands of the east, and be well tutored in his sacred trust to weave the best traditions of Japan and the Sacred South together, until such time as he (or, indeed his house, which will periodically require infusion of both Southern and Japanese bloodlines of note) can deliver to the South it's independence, either in this world or in space.  -Lettow April of 2011

Raz is right. -MadImmortalMan March of 2017

frunk

It's possible that humanity will create a self-aware intelligence smarter than us.

The difficulties in doing that are wildly underestimated by those who have never done any AI work.

What happens at that point is pure speculation; the futurists seem to assume that just because something is smarter than us, human motivation and the potential physical limitations of the new intelligence become completely irrelevant.

Siege

You guys really didn't read the two parts of the opening link.
Tim Urban actually fleshes out most pro and con points, including all the stuff you guys mention.

Yes, ASI might be impossible to achieve, with it being a simulation of intelligence rather than something actually self-aware. That's ok; there are many other paths in front of us.

The bigger picture though, and the argument that really has me thinking, is that ALL intelligence, as it develops in a societal species, leads to a technological singularity and a post-scarcity civilization. Unless it self-destructs in the process.

In other words, if you took all technology from humankind right now, humanity would just start from zero and in a few thousand years it would be right back where we are now: on the brink of a technological singularity. Because intelligence by definition accumulates knowledge in search of happiness and a better way of life, eventually leading to the search for immortality, the ultimate goal of every self-aware organism.


"All men are created equal, then some become infantry."

"Those who beat their swords into plowshares will plow for those who don't."

"Laissez faire et laissez passer, le monde va de lui même!"


Monoriu

I like to think it is humanity's destiny to create a robotic race that is more capable than us.  Humans will be destroyed in the process and the robots will move on to become masters. 

Eddie Teach

Quote from: Siege on February 27, 2016, 08:21:41 PM
In other words, if you took all technology from humankind right now, humanity would just start from zero and in a few thousand years it would be right back where we are now: on the brink of a technological singularity. Because intelligence by definition accumulates knowledge in search of happiness and a better way of life, eventually leading to the search for immortality, the ultimate goal of every self-aware organism.

Battlestar Galactica figured it would take us 150,000 years to get back, because apparently their people forgot what farming and writing are, along with the urge to congregate in cities.
To sleep, perchance to dream. But in that sleep of death, what dreams may come?

Siege

Quote from: Monoriu on February 27, 2016, 08:25:41 PM
I like to think it is humanity's destiny to create a robotic race that is more capable than us.  Humans will be destroyed in the process and the robots will move on to become masters. 

Oh pleez, you watch too many movies.

The technological singularity is the merging of humans and technology. Un-enhanced humans will not be able to keep up with the exponential advances.

We will be the droids, the ASIs, the immortal explorers of the universe, eventually ascending to a higher plane of existence as some form of software, retreating from the physical world, interacting with it only through our avatars.

One day, when you go to visit a friend, you will not be visiting just his home, but his own world, probably his own universe, with the rules he sees fit, and we will judge each other ethically by the way we treat the NPCs living in our virtual worlds.

Maybe.

There are way too many paths to a post-singularity, post-scarcity civilization.


"All men are created equal, then some become infantry."

"Those who beat their swords into plowshares will plow for those who don't."

"Laissez faire et laissez passer, le monde va de lui même!"


Razgovory

Quote from: Siege on February 27, 2016, 08:21:41 PM
You guys really didn't read the two parts of the opening link.
Tim Urban actually fleshes out most pro and con points, including all the stuff you guys mention.

Yes, ASI might be impossible to achieve, with it being a simulation of intelligence rather than something actually self-aware. That's ok; there are many other paths in front of us.

The bigger picture though, and the argument that really has me thinking, is that ALL intelligence, as it develops in a societal species, leads to a technological singularity and a post-scarcity civilization. Unless it self-destructs in the process.

In other words, if you took all technology from humankind right now, humanity would just start from zero and in a few thousand years it would be right back where we are now: on the brink of a technological singularity. Because intelligence by definition accumulates knowledge in search of happiness and a better way of life, eventually leading to the search for immortality, the ultimate goal of every self-aware organism.

I read it; it's simply bullshit.  It's not like Mr. Urban is neutral in this.  He is very much a booster.  Maybe you should read some stuff debunking the whole concept.
I've given it serious thought. I must scorn the ways of my family, and seek a Japanese woman to yield me my progeny. He shall live in the lands of the east, and be well tutored in his sacred trust to weave the best traditions of Japan and the Sacred South together, until such time as he (or, indeed his house, which will periodically require infusion of both Southern and Japanese bloodlines of note) can deliver to the South it's independence, either in this world or in space.  -Lettow April of 2011

Raz is right. -MadImmortalMan March of 2017