
1 Bits
1.1 The Mathematical Bit
1.2 The Circuit Bit
1.3 The Control Bit
1.4 The Physical Bit
1.5 The Quantum Bit
1.5.1 An Advantage of Qubits
1.6 The Classical Bit
1.7 Summary

Chapter 1
Bits

Information is measured in bits, just as length is measured in meters and time is measured in seconds. Of course, knowing the amount of information is not the same as knowing the information itself, what it means, or what it implies. In these notes we will not consider the content or meaning of information, just the quantity.

Different scales of length are needed in different circumstances. Sometimes we want to measure length in kilometers, sometimes in inches, and sometimes in Ångströms. Similarly, other scales for information besides bits are sometimes needed; in the context of physical systems information is often measured in Joules per Kelvin.

How is information quantified? Consider a situation or experiment that could have any of several possible outcomes. Examples might be flipping a coin (2 outcomes, heads or tails) or selecting a card from a deck of playing cards (52 possible outcomes). How compactly could one person (by convention often named Alice) tell another person (Bob) the outcome?

First consider the case of the two outcomes of flipping a coin, and let us suppose they are equally likely. If Alice wants to tell Bob the result of the coin toss, she could use several possible techniques, but they are all equivalent, in terms of the amount of information conveyed, to saying either "heads" or "tails" or to saying 0 or 1. We say that the information so conveyed is one bit.

If Alice flipped two coins, she could say which of the four possible outcomes actually happened, by saying 0 or 1 twice. Similarly, the result of an experiment with eight equally likely outcomes could be conveyed with three bits, and more generally 2^n outcomes with n bits. Thus the amount of information is the logarithm (to the base 2) of the number of equally likely outcomes.

Note that conveying information requires two phases.
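The logarithmic measure of information can be checked with a short computation (a sketch for illustration; the list of experiments is taken from the examples above):

```python
import math

# Number of equally likely outcomes for each experiment mentioned above
experiments = {
    "coin flip": 2,        # heads or tails -> 1 bit
    "two coins": 4,        # two bits, one per coin
    "eight outcomes": 8,   # three bits
    "card from a deck": 52,
}

for name, outcomes in experiments.items():
    bits = math.log2(outcomes)  # information = log2(number of outcomes)
    print(f"{name}: {outcomes} outcomes, {bits:.3f} bits")
```

Note that log2(52) is not an integer (about 5.7 bits); conveying a whole card with a fixed-length code therefore takes 6 bits per card, slightly more than the information content.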
First is the "setup" phase, in which Alice and Bob agree on what they will communicate about, and exactly what each sequence of bits means. This common understanding is called the code. For example, to convey the suit of a card chosen from a deck, their code might be that 00 means clubs, 01 diamonds, 10 hearts, and 11 spades. Agreeing on the code is done before the outcome is known. The setup phase can include informing the recipient that there is new information to be sent. Then there is the "outcome" phase, where actual sequences of 0 and 1 are sent. These sequences are the data. Using the agreed-upon code, Alice draws the card, and tells Bob the suit by sending two bits of data. She could do so repeatedly for multiple experiments, using the same code.

After Bob knows that a card is drawn but before receiving Alice's message, he is uncertain about the suit. His uncertainty, or lack of information, can be expressed in bits. Upon hearing the result, his uncertainty is reduced by the information he receives. Bob's uncertainty rises during the setup phase and then is reduced during the outcome phase.

Note some important things about information, some of which are illustrated in this example:

Author: Paul Penfield, Jr.
This document: http://www.mtl.mit.edu/Courses/6.050/2006/notes/chapter1.pdf
Version 1.3, February 7, 2006.
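The two-phase protocol for the card-suit example can be sketched in code (an illustrative sketch; the function names are not from the notes, only the 00/01/10/11 code is):

```python
# Setup phase: Alice and Bob agree on the code before any card is drawn.
CODE = {"clubs": "00", "diamonds": "01", "hearts": "10", "spades": "11"}
DECODE = {bits: suit for suit, bits in CODE.items()}

def alice_sends(suit):
    """Outcome phase: Alice encodes the observed suit as two bits of data."""
    return CODE[suit]

def bob_receives(data):
    """Bob applies the same agreed-upon code to recover the suit."""
    return DECODE[data]

# The same code can be reused for repeated experiments.
for suit in ("hearts", "clubs", "spades"):
    data = alice_sends(suit)
    assert bob_receives(data) == suit
    print(suit, "->", data)
```

Losing the code is as fatal as losing the data: without the `CODE` table, the two-bit messages are meaningless, which is one way information can be lost.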
Copyrightc 2006 Massachusetts Institu te of TechnologyStart of notes · back · next | 6.050J/2.110J home page | Site ma p | Se arch | About this document | Comments and inquiries11.1 The Mathematical Bit 2• Information can be learned through observation, experiment, or measurement• Information is subjective, or “observer-dependent.” What Alice knows is different from what Bobknows (if information were not subjective, there would be no need to communicate it)• A person’s uncertainty can b e increased upon learning that there is an observation about which infor-mation may be available, and then can be reduced by receiving that information• Information can be lost, either through loss of the data itself, or through loss of the code• The physical form of information is localized in space and time. As a consequence,– Information can be sent from one place to another– Information can be stored and then retrieved later1.1 The Mathematical BitAs we have seen, information can be communicated by sequences of 0 and 1 values. By using only 0 and1, we can deal with data from many different types of sources, and not have to worry ab out what the datameans. We are thereby using abstract, not specific, values. This very powerful approach lets us ignore manymessy details associated with sp e cific information processing and transmission systems.Bits are simple, having only two possible values, and the mathematics used to denote and manipulatesingle bits is not difficult. It is known as Boolean algebra, after the mathematician George Boole (1815 -1864). In some ways Boolean algebra is similar to the algebra of intege rs or real numbers which is taught inhigh school, but in some ways it is different.Algebras deal with variables that have certain possible values, and with functions which, when presentedwith one or more variables, return a result which again has certain possible values. 
In the case of Boolean algebra, the possible values are 0 and 1.

There are exactly four Boolean functions of a single variable. One of them, called the identity, simply returns its argument. Another, called not, or negation, or inversion, or complement, changes 0 into 1 and vice versa. The other two simply return either 0 or 1 regardless of the argument. Here is a table showing them:

  Argument x | IDENTITY | NOT | ZERO | ONE
      0      |    0     |  1  |  0   |  1
      1      |    1     |  0  |  0   |  1

Table 1.1: Boolean functions of a single variable

Note that Boolean algebra is much simpler than algebra dealing with integers or real numbers, each of which has infinitely many functions of a single variable.

How many Boolean functions are there of two variables A and B? Each of the two arguments can take on either of two values, so there are four possible input combinations. There are 16 different ways of assigning the two Boolean values to those four inputs. Of these 16, two simply ignore the inputs, four assign the output to be either A or B or their complement, and the other ten depend on both arguments. The most often used are AND, OR, XOR, NAND, and NOR, shown in Table 1.2.

It is tempting to think of the Boolean values 0 and 1 as the integers 0 and 1. Then AND would correspond to multiplication.
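The counting argument for two-variable Boolean functions can be verified by brute force (a sketch, not part of the notes): represent each function by its four-entry truth table and classify all 16.

```python
from itertools import product

INPUTS = list(product((0, 1), repeat=2))  # the four (A, B) input combinations

# A two-variable Boolean function is a choice of output for each of the four
# inputs, so there are 2**4 = 16 functions, each a 4-tuple of outputs.
functions = list(product((0, 1), repeat=4))

# The two constant functions (ZERO and ONE) ignore the inputs entirely.
constant = sum(1 for f in functions if len(set(f)) == 1)

# The four functions equal to A, B, NOT A, or NOT B depend on one input only.
single = [
    tuple(a for a, b in INPUTS),      # A
    tuple(b for a, b in INPUTS),      # B
    tuple(1 - a for a, b in INPUTS),  # NOT A
    tuple(1 - b for a, b in INPUTS),  # NOT B
]
depends_on_one = sum(1 for f in functions if f in single)

both = len(functions) - constant - depends_on_one
print(constant, depends_on_one, both)  # -> 2 4 10
```

This confirms the tally in the text: 2 constant functions, 4 that depend on a single argument, and 10 (including AND, OR, XOR, NAND, and NOR) that depend on both.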