Poll
Question:
Do you know who Claude Shannon was?
Option 1: Yes
votes: 3
Option 2: I've heard the name but I'm not familiar with his work
votes: 1
Option 3: No
votes: 21
Option 4: Give me a second to check Wikipedia... why yes I'm an authority on Clyde Shannon
votes: 8
Claude Shannon was an electrical engineer who worked at Bell Labs. His paper "A Mathematical Theory of Communication" (https://people.math.harvard.edu/~ctm/home/text/others/shannon/entropy/entropy.pdf) is a major milestone in Information Theory and the basis for all subsequent digital communication technology. Plus he looked like the Phantom of Bell Labs:
(https://upload.wikimedia.org/wikipedia/commons/9/99/ClaudeShannon_MFO3807.jpg)
I was recently reading a history of science and music called "Measure for Measure" by Thomas Levenson. Shannon's work was key in digitizing music (both synthesizers and digital storage devices). Levenson introduces Shannon assuming that the reader doesn't know who he was. Obviously I'm familiar with his work, but I was curious whether he's well known outside the fields of Information Theory and Communication Engineering.
Never heard of him, but he definitely had people locked up in his basement
I had a Cody Shannon in my Western Civ class.
No.
And I do have an interest in early computer science and the like.
I guess Turing overshadows him?
Huge fan of his.
Quote from: Josquius on September 29, 2022, 03:41:13 PMNo.
And I do have an interest in early computer science and the like.
I guess Turing overshadows him?
They were in different fields (although if you read the Wikipedia page you'll know that Shannon invented a Roman numeral computer). From what I remember, "A Mathematical Theory of Communication" uses telegraphy as its basis.
No idea.
Dude had some pointy ears.
Quote from: HVC on September 29, 2022, 03:34:25 PMNever heard of him, but he definitely had people locked up in his basement
"What's the difference between a Ferrari and a trash can full of dead babies?"
"Claude Shannon didn't have a Ferrari in his garage."
I knew about the Nyquist-Shannon sampling theorem, but we didn't delve into the man himself.
My uni offered some history of engineering courses, but I used my "anything goes" credits on things like programming & database architecture classes.
Yes, but information science is my field.
I was reading a little more on communication theory. Shannon used telegraphy as the basis for "A Mathematical Theory of Communication," and ever since, base 2 has been the basis of communication theory: first dot or dash, later 0 or 1. So the basic unit of communication is the bit.
Shannon's paper owes a great deal to Ralph Hartley's work on information theory. Hartley's central idea was that the lower the probability of the event communicated, the more information it conveys. He defined information as the log of the inverse of the probability of the event occurring.[1] Since he worked in base 10, the base-10 unit is named the hartley in his honor (and there's a "nat" unit for base e). So if we had built base 10 computers, we could be talking about hartleys and hartlays rather than bits and bytes. (I don't think it's possible to build a base e computer, so no nats and nytes.)
[1] That is, if I tell you that a random number between 1 and 10 is even, I've given you one bit of information: there are 5 even numbers out of 10, and log2(10/5) = 1. If I tell you the number is 2, I've given you about 3.3 bits: only 1 of the 10 numbers qualifies, and log2(10/1) ≈ 3.3.
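If you want to play with the definition yourself, here's a quick Python sketch (my own illustration, not anything of Hartley's or Shannon's) that reproduces the footnote's numbers and shows how the unit changes with the log base:

import math

def self_information(p, base=2):
    # Information conveyed by an event of probability p: log_base(1/p).
    # Base 2 gives bits, base 10 gives hartleys, base e gives nats.
    return math.log(1 / p, base)

# The footnote's example: a random integer from 1 to 10.
print(self_information(5 / 10))           # "it's even" -> 1.0 bit
print(self_information(1 / 10))           # "it's 2"    -> ~3.32 bits
print(self_information(1 / 10, base=10))  # same event  -> 1.0 hartley
print(self_information(1 / 10, math.e))   # same event  -> ~2.30 nats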