Sam Harris Ted Talk on the danger of AI

Started by Berkut, September 29, 2016, 02:02:14 PM

The Brain

Quote from: celedhring on September 29, 2016, 03:14:18 PM
Yes, like he says, a computer can out-think a whole room of MIT nerds, but this computer can only think about things these nerds tell it about.

Why?
Women want me. Men want to be with me.

celedhring

Quote from: Berkut on September 29, 2016, 03:17:31 PM
And she is kind of hot as well. Red hair, ponytail, boobs falling out of a top that leaves nothing to the imagination...

Ava?  :lol:

Berkut

Quote from: celedhring on September 29, 2016, 03:14:18 PM
Quote from: Berkut on September 29, 2016, 02:44:07 PM
Quote from: celedhring on September 29, 2016, 02:23:29 PM
Simplifying: Intelligent AIs are unavoidable. Since all technology gets better with the passing of time, they will naturally get so much better that we'll become irrelevant.

It seems to me this falls under the "in the long run we're all dead" file.

He also talks about how robotics and such will throw off the economic balance, creating lots of unemployment and inequality. This has been discussed previously in here, and I believe that it won't happen. Previous technology revolutions have indeed made lots of jobs redundant, but at the end of the day we've put all that manpower toward more productive work and lifted the level of prosperity for everybody.


This is a fallacy. Just because new technology has so far allowed people to do other things, it doesn't follow that there is an inexhaustible list of other things that need to be done, while there is certainly a finite list of human needs that must be met.

Basically, history has been a case where humans have only been able to do a small subset of the useful possible activities. So as we created technology to handle some activities, it has freed up humans to engage in other, more useful activities that they previously did not have time for. But it would be false to conclude that this supply is infinite - indeed, it seems pretty clear it is not.

And given AI that can literally do anything a human can do (which is in itself a difference in kind from what we have seen before, so previous lessons are not necessarily relevant), including designing other AIs to do any conceivable task, the value of human work almost immediately becomes zero, other than work that is uniquely human in nature... and I don't know that any such work actually exists. Perhaps it might.

But in any case, it seems certain that a human social, political, and economic foundation built on the basis of the value of human labor is going to need some significant adjustment.

I disagree. Technology, in fact, has enabled us to do many new things. The same computers that have made so many jobs redundant have created millions of new jobs related to them, from IT technicians to people conceptualizing, writing, and programming all kinds of computer entertainment.

But why would you assume that a general purpose AI technology cannot do all that better than a human as well?

Quote

There's a pretty strong historical trend where labor is being pushed out of agriculture and manufacturing towards the various service sectors. And a myriad of new activities have been created there, and are created every passing year, by every new revolution that comes along. I can't see a near future where human labor is no longer needed or wanted. If, say, we live in a future where musicians are no longer needed because people like music created by computers, then we'll have a whole sector of people devoted to creating the best algorithms to make that.

But what if the best "people" to write those algorithms is a computer?

Quote

Yes, you can do like Mr. Harris and look so far into the future that you come up with machines that can take care of absolutely everything and learn to do everything. But imho, that's a sci-fi utopia (or dystopia, according to him). With the same rigor I can predict that we'll all go through thermonuclear war before that happens and be back to square one. All Doomsday scenarios become certain if you look far enough.

I don't see how this is responsive.

Quote
And for the "Singularity will happen in 2050!!!!" crowd out there: Moore's law is about brute processing power, but there's still a very hard ceiling on what computers can do. Yes, like he says, a computer can out-think a whole room of MIT nerds, but this computer can only think about things these nerds tell it about. Software is still very limited by the capabilities of the human mind.

Right now it is. There is nothing "magic" about the human mind, though. It is just a bunch of atoms shoved into a small area in a particular configuration. If you accept that there is nothing "special" about human intelligence, then there is no reason to assume that artificial intelligence cannot do what human intelligence can do, and given that it does not share many of the human limitations, no reason to think it won't be able to do it much, much better.

A human brain, as amazing as it is, is already badly beaten by artificial technology in many ways. It stores information poorly, recalls that information imperfectly, cannot access data except via pretty terrible interfaces, etc., etc.

It still thinks in a way that no artificial intelligence has been able to think. But I don't see any reason to believe that there is anything special about biological circuitry that makes it intrinsically superior to artificial, or any reason to think that creating something better is not simply a matter of time.
"If you think this has a happy ending, then you haven't been paying attention."

select * from users where clue > 0
0 rows returned

Berkut

Quote from: The Minsky Moment on September 29, 2016, 03:18:04 PM
There is an AI problem.  The problem is that it kind of sucks.  It's very good at highly stylized scenarios with limited options like chess, not so good on more complex tasks, especially with social implications.  Practically, I think we should be more focused and concerned about the present limitations of AI and the likelihood and speed of overcoming those limitations, than as yet hypothetical speculations about AIs that can replicate and exceed human intelligence across a broad range of activities.

If the super AI scenario did really occur, then Berkut may well be right that our past experience with the impact of technological development may be misleading.  Once you can replicate or exceed human intelligence, there really isn't any human function that can't be replaced other than simply the status of being a human and not a machine.  So our present labor-based economic system would be replaced by a new one.  But that assumes the super AI scenario and being able to fully replicate human intelligence.  Not clear that is feasible at least on any time scale of direct concern to anyone here.  Partial replacement of some tasks by automation, even really big ones with lots of employment, is not a development that differs in nature from similar episodes in the past.

Your objection basically amounts to "Yeah, might be an issue, but not any time soon".

OK. But people who are experts in this field don't agree with you - general AI is a matter of time, and not that long of a time. The dangerous part is that, if it is a real problem, the growth in intelligence will happen incredibly quickly once the threshold is reached. Is that 10 years away? 50? 100?

It is really hard to say, and even Harris would agree with that - but there is little reasonable argument to be made that it is 500 years away, for example.

So it is something we should be thinking very carefully about. Given the potential risk, we should be thinking about it a lot. And right now, it seems like we are not really thinking about it at all. Mostly people just dismiss it as crazy talk.
"If you think this has a happy ending, then you haven't been paying attention."

select * from users where clue > 0
0 rows returned

The Brain

And after we think about it, what do we do?

If there is no solution then there is no problem.
Women want me. Men want to be with me.

Ideologue

#20
Quote from: celedhring on September 29, 2016, 03:14:18 PM
There's a pretty strong historical trend where labor is being pushed out of agriculture and manufacturing towards the various service sectors. I can't see a near future where humans are no longer needed or wanted there. If, say, we live in a future where musicians are no longer needed because people like music created by computers, then we'll have a whole sector of people devoted to creating the best algorithms to make that.

Firstly, the axiom that any human can learn to code is not proven.

Secondly, the near-zero-cost reproducibility of intellectual property means that it won't be a million (let alone a billion) mediocre programmers trying to code musician AIs.  It will be a far smaller number of elites who have the rarefied skills (at least, as determined by the market, which is deeply imperfect), and everyone will just use theirs.

I mean, everyone sure loves movies, for example.  And yet backyard feature films made by millions of people and starring their stupid ugly families have somehow not become commercially viable.

We will not become artists.  We will not become coders.  We will not become medieval bards.  But as long as people think that this is, somehow, the likeliest outcome of the ongoing automation revolution--that all of us get new, shiny white-collar jobs--the more likely result is that society will begin to break down ever more rapidly, with the real kick coming in at around the same time that enough people stop being able to afford to eat.

Quote from: Joan
Partial replacement of some tasks by automation, even really big ones with lots of employment, is not a development that differs in nature from similar episodes in the past.

Neither would a Mongol invasion, but most people wouldn't treat the prospect of it so blithely.

Also, almost all human jobs are highly stylized scenarios, including many of those with "social implications," whatever that means.  Take, for example, this very thread: a robot could do just as good a job, if not better, and save us the trouble.

That said, I generally agree that fearing the actual rise of Skynet, rather than the increasing marginalization of most forms of human labor, is largely fanciful.  But that's only part of the concern.
Kinemalogue
Current reviews: The 'Burbs (9/10); Gremlins 2: The New Batch (9/10); John Wick: Chapter 2 (9/10); A Cure For Wellness (4/10)

celedhring

#21
Quote from: Berkut on September 29, 2016, 03:25:11 PM
But what if the best "people" to write those algorithms is a computer?

And what if not?

Right now, there's nothing at our current technology levels that makes me think that computers can take over most culturally relevant human activities in any kind of reasonable time scale. Unless the near future is Facebook clickbait articles.

And a big part of that is the same human imperfections you cite. We haven't really figured out how to create "intelligence" that rivals our capabilities in many areas (which Minsky summarized better than me). And when that happens, as Hami says, meh, we might not even be "human" anymore.

I just see this as something so far in the future that it's more a flight of fancy than a real concern.

The Brain

Quote from: celedhring on September 29, 2016, 03:35:32 PM
Quote from: Berkut on September 29, 2016, 03:25:11 PM
But what if the best "people" to write those algorithms is a computer?

And what if not?

Right now, there's nothing at our current technology levels that makes me think that computers can take over most culturally relevant human activities in any kind of reasonable time scale. Unless the near future is Facebook clickbait articles.

And a big part of that is the same human imperfections you cite. We haven't really figured out how to create "intelligence" that rivals our capabilities in many areas (which Minsky summarized better than me). And when that happens, as Hami says, meh, we might not even be "human" anymore.

If humans have been made otherwise redundant then demand for human culture stuff may take a sharp dive.
Women want me. Men want to be with me.

celedhring

Quote from: The Brain on September 29, 2016, 03:38:15 PM
Quote from: celedhring on September 29, 2016, 03:35:32 PM
Quote from: Berkut on September 29, 2016, 03:25:11 PM
But what if the best "people" to write those algorithms is a computer?

And what if not?

Right now, there's nothing at our current technology levels that makes me think that computers can take over most culturally relevant human activities in any kind of reasonable time scale. Unless the near future is Facebook clickbait articles.

And a big part of that is the same human imperfections you cite. We haven't really figured out how to create "intelligence" that rivals our capabilities in many areas (which Minsky summarized better than me). And when that happens, as Hami says, meh, we might not even be "human" anymore.

If humans have been made otherwise redundant then demand for human culture stuff may take a sharp dive.

Or maybe this purported machine age is such a utopia that the only thing humans have to do in their lives is produce and consume cultural interactions.


Berkut

Quote from: celedhring on September 29, 2016, 03:35:32 PM
Quote from: Berkut on September 29, 2016, 03:25:11 PM
But what if the best "people" to write those algorithms is a computer?

And what if not?
Then there is no problem.

But this is the argument that there is in fact something "special" about human intelligence that cannot be replicated except by... what? Evolution? Is that the only possible way to make human-like general intelligence?

I find that claim rather spectacular.

Quote

Right now, there's nothing at our current technology levels that makes me think that computers can take over most culturally relevant human activities in any kind of reasonable time scale. Unless the near future is Facebook clickbait articles.

I would think the current dominance of clickbait Facebook articles suggests that there is precious little, if any, human thinking that cannot be done by a non-human intelligence.

Quote

And a big part of that is the same human imperfections you cite. We haven't really figured out how to create "intelligence" that rivals our capabilities in many areas (which Minsky summarized better than me). And when that happens, as Hami says, meh, we might not even be "human" anymore.

But we've created plenty of thinking machines that make our capabilities seem incredibly primitive.

This is like arguing with someone at the beginning of the industrial revolution that, since we haven't invented an airplane yet and nothing else we've invented flies, airplanes are not possible - even though we understand the basics of the science and what it takes.

Yes, we have yet to create a general AI. But the progress we've made, and our understanding of how thinking actually works, suggest that it is not just possible, but inevitable.

Quote
I just see this as something so far in the future that it's more a flight of fancy than a real concern.

I don't think that is the case at all.
"If you think this has a happy ending, then you haven't been paying attention."

select * from users where clue > 0
0 rows returned

The Brain

Quote from: celedhring on September 29, 2016, 03:40:41 PM
Quote from: The Brain on September 29, 2016, 03:38:15 PM
Quote from: celedhring on September 29, 2016, 03:35:32 PM
Quote from: Berkut on September 29, 2016, 03:25:11 PM
But what if the best "people" to write those algorithms is a computer?

And what if not?

Right now, there's nothing at our current technology levels that makes me think that computers can take over most culturally relevant human activities in any kind of reasonable time scale. Unless the near future is Facebook clickbait articles.

And a big part of that is the same human imperfections you cite. We haven't really figured out how to create "intelligence" that rivals our capabilities in many areas (which Minsky summarized better than me). And when that happens, as Hami says, meh, we might not even be "human" anymore.

If humans have been made otherwise redundant then demand for human culture stuff may take a sharp dive.

Or maybe this purported machine age is such a utopia that the only thing humans have to do in their lives is produce and consume cultural interactions.

How do they pay for it if their labor is worthless and machines control property?
Women want me. Men want to be with me.

celedhring

Quote from: The Brain on September 29, 2016, 03:45:53 PM
How do they pay for it if their labor is worthless and machines control property?

By the "creating" part.

I don't control any means of producing food; in fact, I don't produce anything that has a physical presence, yet I'm paid enough to be able to consume these things.

Quote from: Berkut on September 29, 2016, 03:44:36 PM
This is like arguing with someone at the beginning of the industrial revolution that, since we haven't invented an airplane yet and nothing else we've invented flies, airplanes are not possible - even though we understand the basics of the science and what it takes.

Yes, we have yet to create a general AI. But the progress we've made, and our understanding of how thinking actually works, suggest that it is not just possible, but inevitable.

Except that we don't? Our understanding of cognitive functions is still extremely primitive. We can't even agree on what exactly makes us "sentient," much less replicate it. It takes more than Moore's Law to create a general AI, and we are very far from being there.

I believe that trying to imagine the presence of omnipotent AIs in our current society is a futile exercise, since our society will have changed radically due to other, more immediate pressures and challenges by the time that somehow ends up happening.

Berkut

Quote from: celedhring on September 29, 2016, 03:55:16 PM
Quote from: The Brain on September 29, 2016, 03:45:53 PM
How do they pay for it if their labor is worthless and machines control property?

By the "creating" part.

I don't control any means of producing food; in fact, I don't produce anything that has a physical presence, yet I'm paid enough to be able to consume these things.

Quote from: Berkut on September 29, 2016, 03:44:36 PM
This is like arguing with someone at the beginning of the industrial revolution that, since we haven't invented an airplane yet and nothing else we've invented flies, airplanes are not possible - even though we understand the basics of the science and what it takes.

Yes, we have yet to create a general AI. But the progress we've made, and our understanding of how thinking actually works, suggest that it is not just possible, but inevitable.

Except that we don't? Our understanding of cognitive functions is still extremely primitive. We can't even agree on what exactly makes us "sentient," much less replicate it. It takes more than Moore's Law to create a general AI, and we are very far from being there.

I think we might be far from understanding how cognition really works, but we are vastly farther along than we were even 20 years ago. The key is that we are at the point where we understand that what IS happening in biological intelligence is not actually "magic" in any way - it is just a chemical process analogous to the processes we now have transistors perform.

And again, we don't have to be smart enough to understand it ourselves, we just have to be smart enough to create something slightly smarter than we are - and then it is off to the races, and we cannot win.

Quote

I believe that trying to imagine the presence of omnipotent AIs in our current society is a futile exercise, since our society will have changed radically due to other, more immediate pressures and challenges by the time that somehow ends up happening.

OK, so you fall into the "It won't happen for a really long time because... well, because..." camp.
"If you think this has a happy ending, then you haven't been paying attention."

select * from users where clue > 0
0 rows returned

celedhring

Well, I just gave you my "because" in the paragraph you quoted previously, and in some earlier posts. You're bullish about the speed of AI evolution; I'm bearish. Can we agree to revisit the issue in 40 years and see who was right?

The Brain

Quote from: celedhring on September 29, 2016, 03:55:16 PM
Quote from: The Brain on September 29, 2016, 03:45:53 PM
How do they pay for it if their labor is worthless and machines control property?

By the "creating" part.

I don't control any means of producing food; in fact, I don't produce anything that has a physical presence, yet I'm paid enough to be able to consume these things.


And if your customers had nothing to pay you with, how would you eat?
Women want me. Men want to be with me.