Do you know who Claude Shannon was?

Started by Savonarola, September 29, 2022, 03:32:00 PM

Do you know who Claude Shannon was?

Yes: 3 (9.1%)
I've heard the name but I'm not familiar with his work: 1 (3.0%)
No: 21 (63.6%)
Give me a second to check Wikipedia... why yes I'm an authority on Clyde Shannon: 8 (24.2%)

Total Members Voted: 33

Savonarola

Claude Shannon was an electrical engineer who worked at Bell Labs.  His paper "A Mathematical Theory of Communication" is a major milestone in information theory and is the basis for all subsequent digital communication technology.  Plus he looked like the Phantom of Bell Labs:



I was recently reading a history of science and music called "Measure for Measure" by Thomas Levenson.  Shannon's work was key in digitizing music (both in synthesizers and in digital storage devices).  Levenson introduces Shannon assuming that the reader doesn't know who he is.  Obviously I'm familiar with his work, but I was curious whether he's well known outside the fields of information theory and communication engineering.
In Italy, for thirty years under the Borgias, they had warfare, terror, murder and bloodshed, but they produced Michelangelo, Leonardo da Vinci and the Renaissance. In Switzerland, they had brotherly love, they had five hundred years of democracy and peace—and what did that produce? The cuckoo clock

HVC

Never heard of him, but he definitely had people locked up in his basement
Being lazy is bad; unless you still get what you want, then it's called "patience".
Hubris must be punished. Severely.

PDH

I had a Cody Shannon in my Western Civ class.
I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth.
-Umberto Eco

-------
"I'm pretty sure my level of depression has nothing to do with how much of a fucking asshole you are."

-CdM

Josquius

No.
And I do have an interest in early computer science and the like.
I guess Turing overshadows him?
██████
██████
██████

Barrister

Posts here are my own private opinions.  I do not speak for my employer.

Savonarola

Quote from: Josquius on September 29, 2022, 03:41:13 PM
No.
And I do have an interest in early computer science and the like.
I guess Turing overshadows him?

They were in different fields (although if you did read the Wikipedia page, you'd know that Shannon invented a Roman numeral computer).  From what I remember, "A Mathematical Theory of Communication" uses telegraphy as its basis.
In Italy, for thirty years under the Borgias, they had warfare, terror, murder and bloodshed, but they produced Michelangelo, Leonardo da Vinci and the Renaissance. In Switzerland, they had brotherly love, they had five hundred years of democracy and peace—and what did that produce? The cuckoo clock


Admiral Yi


grumbler

Quote from: HVC on September 29, 2022, 03:34:25 PM
Never heard of him, but he definitely had people locked up in his basement

"What's the difference between a Ferrari and a trash can full of dead babies?"

"Claude Shannon didn't have a Ferrari in his garage."
The future is all around us, waiting, in moments of transition, to be born in moments of revelation. No one knows the shape of that future or where it will take us. We know only that it is always born in pain.   -G'Kar

Bayraktar!

Iormlund

I knew about the Nyquist-Shannon sampling theorem, but we didn't delve into the guy himself.

My uni offered some history of engineering courses, but I used my "anything goes" credits on things like programming & database architecture classes.

Maximus

Yes, but information science is my field.

Savonarola

I was reading a little more on communication theory.  Shannon used telegraphy as the basis for "A Mathematical Theory of Communication," and since then base 2 has been the basis of communication theory; that is, dot or dash and later 0 or 1.  So the basic unit of communication is the bit.

Shannon's paper owes a great deal to Ralph Hartley's work on information theory.  Hartley's central idea was that the lower the probability of the event communicated, the greater the information it conveys.  He defined information as the log of the inverse of the probability of the event occurring.1  Since he worked in base 10, the base 10 unit is named the hartley in his honor (and there's a "nat" unit for base e).  So if we had built base 10 computers we could be talking about Hartleys and Hartlays rather than bits and bytes.  (I don't think it's possible to build a base e computer, so no nats and nytes.)
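
Here's a rough sketch of that definition in Python (just my own illustration; the information() function name is mine, not anything from Hartley or Shannon).  The quantity is the same; only the base of the logarithm decides whether you're counting bits, hartleys, or nats:

import math

def information(p, base=2):
    # Self-information of an event with probability p: log_base(1 / p)
    return math.log(1 / p, base)

p = 0.5                        # an event that happens half the time
print(information(p, 2))       # 1.0 bit
print(information(p, 10))      # ~0.301 hartleys
print(information(p, math.e))  # ~0.693 nats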

1.)  That is, if I tell you that a random whole number between 1 and 10 is even, I've provided you with one bit of information.  There are 5 even numbers out of 10, and log2(10/5) = 1.  If I tell you that the number is 2, then I've provided you with about 3.3 bits of information.  Only 1 number qualifies, and log2(10/1) ≈ 3.3.
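
The same little sketch reproduces the footnote's arithmetic (again, just illustrative Python):

import math

# "The number is even": 5 of the 10 outcomes qualify
print(math.log(10 / 5, 2))   # 1.0 bit

# "The number is 2": only 1 of the 10 outcomes qualifies
print(math.log(10 / 1, 2))   # ~3.32 bits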
In Italy, for thirty years under the Borgias, they had warfare, terror, murder and bloodshed, but they produced Michelangelo, Leonardo da Vinci and the Renaissance. In Switzerland, they had brotherly love, they had five hundred years of democracy and peace—and what did that produce? The cuckoo clock