[Each message is kin to all messages. "Up until that time, everyone thought that communication was involved in trying to find ways of communicating written language, spoken language, pictures, video, and all those different things--that all of these would require different ways of communicating," said Shannon's colleague Robert Gallager. "Claude said no, you can turn all of them into binary digits. And then you can find ways of communicating the binary digits." You can code any message as a stream of bits, without having to know where it will go; you can transmit any stream of bits, efficiently and reliably, without having to know where it came from. As information theorist Dave Forney put it, "bits are the universal interface."]
*A Mind at Play*
Imagine, if you will, that one central thinker hovers over nearly every aspect of modern life, but that you know nothing of him, not even his name. That is more than likely the case, because Claude Shannon, while revered in scientific circles, never sought out the public spotlight the way peers like Albert Einstein did. Instead, after virtually creating information theory and fathering the Information Age, he pursued personal interests: juggling, tinkering with robotics, and the like. His mind, as the title of this terrific biography suggests, was as content in play as in revolutionizing the sciences. Perhaps fittingly, such television appearances as he made tended to present him primarily as the maker of a maze-running robotic mouse rather than as one of the 20th century's most important mathematical geniuses.
The co-authors of this book do not have scientific backgrounds, so their choice of topic here may seem odd. Indeed, Mr. Soni is a former editor of the Huffington Post and Mr. Goodman is a political science professor at Columbia. Both have worked in and written about American politics, and they co-wrote a well-received biography of Cato. Shannon and information theory were not the logical next step. But they recognized an interesting and important story--and, most of all, an untold one--when they saw it, and they secured the cooperation of Shannon's since-deceased widow and his children to help produce a work that locates not only Shannon's work at the center of our lives but play at the center of Shannon's own life. The result is an improbably fun look at a rather quiet man.
It was Vannevar Bush who most fully realized Shannon's potential and brought him from the University of Michigan into the MIT graduate program to work on the early computer Bush was pioneering, a differential analyzer. Shannon was ideally suited to such work not just because of his mental gifts but because he was, had always been, and always would be a tinkerer as well. Of particular import for his future insights, it seems, Shannon made his own barbed-wire telegraph and won a Boy Scout wigwag contest, wigwag being a form of Morse code transmission using flag signals. When you hear that dot-dash lingo in old movies and TV shows, it seems an awfully primitive way to communicate. Shannon was to show that, in effect, all communication is that basic.
Shannon's master's thesis, A Symbolic Analysis of Relay and Switching Circuits, was the first great demonstration of his intellect. Therein he combined Boolean logic--whereby you can prove a statement true or false mathematically, using just a few operations, without ever knowing what it means--with two mechanical realities: that Bush's machine functioned via switches opening and closing, and that the relay circuits he had worked on during an internship at Bell Labs did the same. From these he derived the idea that "because Boole had shown how to resolve logic into a series of binary, true-false decisions, any system capable of representing binaries has access to the entire logical universe he described. 'The laws of thought' had been extended to the inanimate world." Dot/dash, on/off, 1/0...suddenly the seemingly simple yielded extraordinary complexity in the form of the binary-driven machines that surround us.
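The insight can be sketched in a few lines of code. Treat each relay as a 0-or-1 value and Boole's few operations become circuit elements, out of which any logical statement can be built. (A minimal illustration in Python; the function names are mine, not Shannon's notation.)

```python
# Model relays as 0/1 values; Boole's operations become circuit elements.
def AND(a, b):
    # Two switches in series: current flows only if both are closed.
    return a & b

def OR(a, b):
    # Two switches in parallel: current flows if either is closed.
    return a | b

def NOT(a):
    # An inverting relay: closed becomes open and vice versa.
    return 1 - a

# Any compound statement can now be wired from switches, e.g. exclusive-or:
def XOR(a, b):
    return OR(AND(a, NOT(b)), AND(NOT(a), b))

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", XOR(a, b))
```

Three primitive "circuits" and their compositions cover the whole truth-table universe, which is the thesis's point in miniature.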
This paper alone made his reputation and, along with the patronage of Bush and others, took him to Cold Spring Harbor, where he did ground-breaking genetic research, then back to Bell Labs, to Princeton's Institute for Advanced Study, to the National Defense Research Committee, and back again to a position at Bell Labs. The case for his inventing the Information Age rests largely on the 1948 paper he produced while at Bell: "A Mathematical Theory of Communication." The authors present the basic premises of his theory as follows:
The *information source* produces a message.
The *transmitter* encodes the message into a form capable of being sent as a signal.
The *channel* is the medium through which the signal passes.
The *noise source* represents the distortions and corruptions that afflict the signal on its way to the receiver.
The *receiver* decodes the message, reversing the action of the transmitter.
The *destination* is the recipient of the message.
In essence, he had shown that every system of communication is reducible to this one model.
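The model above can be traced end to end in a toy sketch of my own devising--not Shannon's mathematics, and with the crudest possible repetition code standing in for his coding theorems:

```python
import random

def transmitter(message):
    # Encode text as a stream of bits (8 bits per character), repeating
    # each bit three times as a crude protection against noise.
    bits = [int(b) for ch in message for b in format(ord(ch), "08b")]
    return [b for bit in bits for b in (bit, bit, bit)]

def channel(signal, noise=0.05, seed=0):
    # The noise source flips each bit with some small probability.
    rng = random.Random(seed)
    return [b ^ (rng.random() < noise) for b in signal]

def receiver(signal):
    # Reverse the transmitter: majority vote over each triple of bits,
    # then turn each group of 8 bits back into a character.
    bits = [int(sum(signal[i:i + 3]) >= 2) for i in range(0, len(signal), 3)]
    chars = ["".join(map(str, bits[i:i + 8])) for i in range(0, len(bits), 8)]
    return "".join(chr(int(c, 2)) for c in chars)

clean = receiver(channel(transmitter("hello"), noise=0.0))
print(clean)  # -> hello  (a noiseless channel recovers the message exactly)
noisy = receiver(channel(transmitter("hello")))
print(noisy)  # with mild noise the repetition code usually, not always, recovers it
```

Source, transmitter, channel, noise, receiver, destination: every piece of the diagram appears, and nothing in the pipeline cares what the message means.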
As with Boolean logic, it did not genuinely matter whether we knew what the message meant. But he went on to show that, since we generally do know the language in which the original message was produced and can apply rules of probability to deciphering it, the receiver can render its original meaning and deliver it to the destination. [Mind you, that's a poor translation of what he achieved, and you really should read the book for an adequate explanation.]
Soni and Goodman share one especially delicious legend about Shannon in conversation with John von Neumann at Princeton circa 1940:
Shannon approached the great man with his idea of information-as-resolved-uncertainty--which would come to stand at the heart of his work--and with an unassuming question. What should he call this thing? Von Neumann answered at once: say that information reduces "entropy." For one, it was a good, solid physics word. "And more importantly," he went on, "no one knows what entropy really is, so in debate you will always have the advantage."
Sublime.
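Whatever the merits of von Neumann's advice, the quantity itself has a precise definition: the entropy of a source is the average number of bits needed to resolve its uncertainty, H = -Σ p·log2(p). A minimal sketch (the helper name is mine):

```python
from math import log2

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over the outcomes."""
    # Outcomes with probability 0 contribute nothing, so skip them.
    return -sum(p * log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))  # -> 1.0: a fair coin flip resolves exactly one bit
print(entropy([0.9, 0.1]))  # roughly 0.47: a biased coin resolves less uncertainty
```

The more predictable the source, the less information each message carries--which is exactly the "information-as-resolved-uncertainty" idea Shannon brought to von Neumann.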
Now, having accomplished so much so soon, were Shannon in some other profession we might expect that he then moved from achievement to achievement. But mathematics (like most pro sports) is a young man's game, and it is fairly rare to have major breakthroughs to your credit later in life. That's not to say his story was over. Indeed, one of the features that makes this book so readable and accessible is the authors' portrayal of his life beyond the revolutionary theories. Returning eventually from Bell Labs to MIT as a professor, Shannon and his second wife lived in a Winchester, MA, home that seems to have essentially been a smart kids' clubhouse. Friends, colleagues and students frequented a house cluttered with unicycles and projects like a ball-bouncing robotic W. C. Fields. Shannon also turned his mind to esoteric topics: an academic paper on juggling, one pretty much forecasting the advent of computer chess, and one using physics to assist an American driving on England's reversed roads.
Throughout their account of his life, the authors make it very clear that Shannon's love of "play" and the joy he took in tinkering were central to his theoretical work. These very physical pursuits seem to have given him a particular comprehension of how ideas and theories could be applied. What is often referred to as his unique gift of "intuition" comes across as nothing so much as the notion that one can work backwards from the hands-on result one is trying to achieve and see how ideas get you there. The same Robert Gallager quoted above tells the following story in the book:
I had what I thought was a really neat research idea, for a much better communications system than what other people were building, with all sorts of bells and whistles. I went in to talk to him about it and I explained the problems I was having trying to analyze it. And he looked at it, sort of puzzled, and said, "Well, do you really need this assumption?" And I said, well, I suppose we could look at the problem without that assumption. And we went on for a while. And then he said, again, "Do you really need this other assumption?" And I saw immediately that would simplify the problem, although it started looking a little impractical and a little like a toy problem. And he kept doing this, about five or six times. [...]
At a certain point, I was getting upset, because I saw this neat research problem of mine had become almost trivial. But at a certain point, with all those pieces stripped out, we both saw how to solve it. And then we gradually put all these little assumptions back in and then, suddenly, we saw the solution to the whole problem. And that was just the way he worked. He would find the simplest example of something and then he would somehow sort out why that had worked and why that was the right way of looking at it.
While it's not precisely what Gallager meant, one is tempted by that dismissive "toy problem" as opposed to his "neat research problem." One of the seemingly simple-minded devices that Shannon constructed was his "Ultimate Machine," which did nothing except, when you flipped its switch on, extend a mechanical hand that reached out and flipped the switch back off. It is the quintessential solution to a "toy problem." But wasn't the nature of Shannon's mind such that he likewise constructed the Information Age out of the on/off binary? A mind at play indeed. And a truly fascinating life of an American original.
Shannon showed in 1948 how information could be transmitted efficiently across communication channels using coded messages. Shannon described a communication system as a combination of five essential components: an information source, a transmitter, a channel, a receiver and a destination (Shannon, 1948, p. 4; Wiener, 1948, p. 79). The information source produces a message to be communicated to the receiver. The transmitter operates on the message to produce a signal suitable for transmission over the channel, which is simply the medium of signal transmission. The receiver reconstructs the message from the signal. And the destination is the entity for which the message is intended. This idea might sound very dull to anyone who uses mobile phones every day, but without Shannon's work you wouldn't have a mobile phone! According to Shannon, communication amounts to the source of information producing a sequence of symbols, which is then reproduced by the receiver. The reproduction is only to some degree of accuracy--as you realise when you don't hear clearly during a mobile phone conversation.
In “A Mathematical Theory of Communication,” his legendary paper from 1948, Shannon proposed that data should be measured in bits—discrete values of zero or one. (He gave credit for the word’s invention to his colleague John Tukey, at what was then Bell Telephone Laboratories, who coined it as a contraction of the phrase “binary digit.”)