Science seeks the basic laws of nature. Mathematics searches for new theorems to build upon the old. Engineering builds systems to solve human needs. The three disciplines are interdependent but distinct. Very rarely does one individual simultaneously make central contributions to all three - but Claude Shannon was a rare individual.

Despite being the subject of the recent documentary The Bit Player - and someone whose work and research philosophy have inspired my own career - Shannon is not exactly a household name. He never won a Nobel Prize, and he wasn’t a celebrity like Albert Einstein or Richard Feynman, either before or after his death in 2001. But more than 70 years ago, in a single groundbreaking paper, he laid the foundation for the entire communication infrastructure underlying the modern information age.

Shannon was born in Gaylord, Michigan, in 1916, the son of a local businessman and a teacher. After graduating from the University of Michigan with degrees in electrical engineering and mathematics, he wrote a master’s thesis at the Massachusetts Institute of Technology that applied a mathematical discipline called Boolean algebra to the analysis and synthesis of switching circuits. It was a transformative work, turning circuit design from an art into a science, and is now considered to have been the starting point of digital circuit design.

Next, Shannon set his sights on an even bigger target: communication.

Communication is one of the most basic human needs. From smoke signals to carrier pigeons to the telephone to television, humans have always sought methods that would allow them to communicate farther, faster and more reliably. But the engineering of communication systems was always tied to the specific source and physical medium. Shannon instead asked, “Is there a grand unified theory for communication?” In a 1939 letter to his mentor, Vannevar Bush, Shannon outlined some of his initial ideas on “fundamental properties of general systems for the transmission of intelligence.” After working on the problem for a decade, Shannon finally published his masterpiece in 1948: “A Mathematical Theory of Communication.”

The heart of his theory is a simple but very general model of communication: A transmitter encodes information into a signal, which is corrupted by noise and then decoded by the receiver. Despite its simplicity, Shannon’s model incorporates two key insights: isolating the information and noise sources from the communication system to be designed, and modeling both of these sources probabilistically. He imagined the information source generating one of many possible messages to communicate, each of which had a certain probability. The probabilistic noise added further randomness for the receiver to disentangle.
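To make the model concrete, here is a minimal sketch in Python of those pieces: a probabilistic source, an encoder, a noisy channel and a decoder. The 3x repetition code and the 10% bit-flip probability are illustrative assumptions of mine, not details from Shannon’s paper; the point is only that both the message and the noise are drawn at random.

```python
import random

# A toy version of the transmitter -> noisy channel -> receiver model.
# The repetition code and flip probability are illustrative choices only.

FLIP_PROB = 0.1  # chance that noise flips any transmitted bit (assumed)

def encode(bits):
    # Transmitter: protect each source bit by repeating it three times.
    return [b for b in bits for _ in range(3)]

def channel(signal):
    # Probabilistic noise: flip each bit independently with FLIP_PROB.
    return [b ^ (random.random() < FLIP_PROB) for b in signal]

def decode(received):
    # Receiver: majority vote over each group of three bits.
    return [int(sum(received[i:i + 3]) >= 2)
            for i in range(0, len(received), 3)]

# Probabilistic source: one of 2^8 equally likely messages.
message = [random.randint(0, 1) for _ in range(8)]
recovered = decode(channel(encode(message)))
print(message, recovered, message == recovered)
```

A repetition code is far from the optimal codes Shannon’s theory promises; it is only meant to show the two probabilistic ingredients of the model, the random message and the random noise.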
Before Shannon, the problem of communication was primarily viewed as a deterministic signal-reconstruction problem: how to transform a received signal, distorted by the physical medium, to reconstruct the original as accurately as possible. Shannon’s genius lay in his observation that the key to communication is uncertainty. After all, if you knew ahead of time what I would say to you in this column, what would be the point of writing it?

This single observation shifted the communication problem from the physical to the abstract, allowing Shannon to model the uncertainty using probability. This came as a total shock to the communication engineers of the day.

Given that framework of uncertainty and probability, Shannon set out in his landmark paper to systematically determine the fundamental limit of communication. His answer came in three parts. Playing a central role in all three is the concept of an information “bit,” used by Shannon as the basic unit of uncertainty. A portmanteau of “binary digit,” a bit could be either a 1 or a 0, and Shannon’s paper is the first to use the word (though he said the mathematician John Tukey used it in a memo first).

First, Shannon came up with a formula for the minimum number of bits per second to represent the information, a number he called its entropy rate, H. This number quantifies the uncertainty involved in determining which message the source will generate. The lower the entropy rate, the less the uncertainty, and thus the easier it is to compress the message into something shorter. For example, texting at the rate of 100 English letters per minute means sending one out of 26^100 possible messages every minute, each represented by a sequence of 100 letters. One could encode all these possibilities into 470 bits, since 2^470 ≈ 26^100. If the sequences were equally likely, then Shannon’s formula would say that the entropy rate is indeed 470 bits per minute.
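As a quick check of that arithmetic (a sketch of the equally likely case only): when all messages are equally probable, the entropy reduces to the base-2 logarithm of the number of possibilities, so the rate is 100 × log2(26) ≈ 470 bits per minute. A few lines of Python reproduce the figure.

```python
import math

# The texting example: 100 letters per minute from a 26-letter
# alphabet, with all 26^100 sequences assumed equally likely.
letters_per_minute = 100
alphabet_size = 26

entropy_rate = letters_per_minute * math.log2(alphabet_size)
print(round(entropy_rate))  # 470 bits per minute, since 2^470 ≈ 26^100
```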