MIT 6.050J - Bits

Contents

1 Bits
  1.1 The Mathematical Bit
  1.2 The Physical Bit
  1.3 The Classical Bit
  1.4 The Quantum Bit
  1.5 An Advantage of Qubits

Chapter 1
Bits

Author: Paul Penfield, Jr.
Version 1.0.2, January 30, 2003. Copyright © 2003 Massachusetts Institute of Technology
URL: http://www-mtl.mit.edu/Courses/6.050/notes/chapter1.pdf
TOC: http://www-mtl.mit.edu/Courses/6.050/notes/index.html

Information is measured in bits, just as length is measured in meters and time is measured in seconds. Of course knowing the amount of information is not the same as knowing the information itself, what it means, or what it implies. In these notes we will not consider the content or meaning of information, just the quantity.

Different scales of length are needed in different circumstances. Sometimes we want to measure length in kilometers, sometimes in inches, and sometimes in Ångströms. Similarly, other scales for information besides bits are sometimes needed; in the context of physical systems information is often measured in Joules per Kelvin.

How is information quantified? Consider a situation that could have any of several possible outcomes. An example might be flipping a coin (2 outcomes, heads or tails) or selecting a card from a deck of playing cards (52 possible outcomes). How compactly could one person (by convention usually named Alice) tell another person (Bob) this outcome?

First consider the case of the two outcomes of flipping a coin, and let us suppose they are equally likely. If Alice wants to tell Bob the result of the coin toss, she could use several possible techniques, but they are all equivalent, in terms of the amount of information conveyed, to saying either "heads" or "tails" or, for that matter, to saying 0 or 1. We say that the information so conveyed is one bit.

If Alice flipped two coins, she could say which of the four possible outcomes actually happened, by saying 0 or 1 twice. Similarly, the result of an experiment with eight equally likely outcomes could be conveyed with three bits, and more generally 2^n outcomes with n bits. Thus the amount of information is the logarithm (to the base 2) of the number of equally likely outcomes.

Note that conveying information requires two phases. First is the "setup" phase, in which Alice and Bob agree on what they will communicate about, and exactly what each sequence of bits means. This common understanding is called the code. This is done before the outcome is known. Then, there is the "communication" phase, where actual sequences of 0 and 1 are sent. These sequences are the data. Thus to convey the suit of a card chosen from a deck, their code might be that 00 means clubs, 01 diamonds, 10 hearts, and 11 spades. Using that code, Alice draws the card, and tells Bob the suit by sending two bits of data. She could do so repeatedly for multiple experiments, using the same code.
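As a small illustration of the two ideas above, the bit count for equally likely outcomes and the separation of the code (fixed in the setup phase) from the data (sent in the communication phase), here is a short Python sketch. The suit code matches the one in the text; the variable and function names are illustrative only and are not part of the original notes.

```python
import math

# Bits needed to single out one of n equally likely outcomes: log2(n).
# A fixed-length binary code has to round this up to a whole number of bits.
for outcomes in (2, 4, 8, 52):
    exact = math.log2(outcomes)
    print(f"{outcomes} outcomes: {exact:.2f} bits "
          f"({math.ceil(exact)} bits with a fixed-length code)")

# The "code", agreed on during the setup phase: two bits per suit.
suit_code = {"clubs": "00", "diamonds": "01", "hearts": "10", "spades": "11"}
decode = {bits: suit for suit, bits in suit_code.items()}

# The "data", sent during the communication phase.
data = suit_code["hearts"]        # Alice draws a heart and sends "10"
print("Alice sends", data, "-> Bob decodes", decode[data])
```

Since log2(52) is about 5.7, identifying the full card (rather than just its suit) with a fixed-length code would take 6 bits per card.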
After Bob knows that a card is drawn but before receiving Alice's message, he is uncertain about the suit. His uncertainty, or lack of information, can also be expressed in bits. Upon hearing the result, his uncertainty is reduced by the information he receives. Bob's uncertainty rises during the setup phase and then is reduced during the communication phase.

Note some important things about information, some of which are illustrated in this example.

• Information can be learned through observation, experiment, or measurement.

• Information is subjective, or "observer-dependent." What Alice knows is different from what Bob knows. (If information were not subjective, there would be no need to communicate it.)

• A person's uncertainty can be increased upon learning that there is an observation about which information may be available, and then can be reduced by receiving that information.

• Information can be lost, either through loss of the data itself, or through loss of the code.

• The physical form of information is localized in space and time. As a consequence . . .
  - Information can be sent from one place to another.
  - Information can be stored and then retrieved later.

1.1 The Mathematical Bit

As we have seen, information can be communicated by sequences of 0 and 1 values. This very powerful abstraction lets us ignore many of the details associated with specific information processing and transmission systems.

Bits are simple, having only two possible values, and the mathematics used to manipulate single bits is not difficult. It is known as Boolean algebra, after the mathematician George Boole (1815-1864). In some ways it is similar to the algebra of integers or real numbers which is taught in high school, but in some ways it is different.

Algebras deal with variables that have certain possible values, and with functions which, when presented with one or more variables, return a result which again has certain possible values. In the case of Boolean algebra, the possible values are 0 and 1.

There are exactly four Boolean functions of a single variable. One of them, called the identity, simply returns its argument. Another, called not, or negation, or inversion, or complement, changes 0 into 1 and vice versa. The other two simply return either 0 or 1 regardless of the argument. The easiest way to describe these functions is to simply give a table with their results:

                      f(x)
  Argument x   IDENTITY  NOT  ZERO  ONE
      0            0      1     0    1
      1            1      0     0    1

  Table 1.1: Boolean functions of a single variable

Note that Boolean algebra is much simpler than algebra dealing with integers or real numbers, each of which has infinitely many functions of a single variable.

How many Boolean functions are there of two variables A and B? Each of the two arguments can take on either of two values, so there are four possible input combinations. There are 16 different ways of assigning the two Boolean values to four inputs. Of these 16, two simply ignore the input, four assign the output to be either A or B or their complement, and the other ten depend on both arguments. The most often used are AND, OR, XOR, NAND, and NOR, as shown in Table 1.2.

It is tempting to think of the Boolean values 0 and 1 as the same as the integers 0 and 1. Then AND would correspond to multiplication and OR to addition, sort of. However, familiar results from ordinary algebra simply do not hold for Boolean algebra, so such analogies are dangerous. It is absolutely necessary, though sometimes difficult, to distinguish the integers from the Boolean values. This task is made more difficult by the standard notation used for Boolean algebra.
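Table 1.1 can be transcribed almost directly into code. The following Python sketch writes out the four single-variable functions; the integers 0 and 1 are used here only as labels for the two Boolean values (the chapter itself warns against identifying them with the integers), and the function names simply follow the table.

```python
# The four Boolean functions of a single variable (Table 1.1).
IDENTITY = lambda x: x        # returns its argument
NOT      = lambda x: 1 - x    # complement: swaps 0 and 1
ZERO     = lambda x: 0        # returns 0 regardless of the argument
ONE      = lambda x: 1        # returns 1 regardless of the argument

print(f"{'x':>1} {'IDENTITY':>9} {'NOT':>4} {'ZERO':>5} {'ONE':>4}")
for x in (0, 1):
    print(f"{x:>1} {IDENTITY(x):>9} {NOT(x):>4} {ZERO(x):>5} {ONE(x):>4}")
```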

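In the same spirit, here is a sketch of the two-variable functions named in the text (Table 1.2 itself is not part of this preview), together with a quick count confirming that there are 16 such functions and a reminder of why the analogy with integer arithmetic is dangerous.

```python
from itertools import product

# The most commonly used two-variable Boolean functions named in the text.
AND  = lambda a, b: a & b
OR   = lambda a, b: a | b
XOR  = lambda a, b: a ^ b
NAND = lambda a, b: 1 - (a & b)
NOR  = lambda a, b: 1 - (a | b)

print(f"{'A':>1} {'B':>1} {'AND':>4} {'OR':>3} {'XOR':>4} {'NAND':>5} {'NOR':>4}")
for a, b in product((0, 1), repeat=2):
    print(f"{a:>1} {b:>1} {AND(a, b):>4} {OR(a, b):>3} {XOR(a, b):>4} "
          f"{NAND(a, b):>5} {NOR(a, b):>4}")

# A two-variable function is fixed by its output on the four input
# combinations, so there are 2**4 = 16 such functions in total.
print(len(set(product((0, 1), repeat=4))), "Boolean functions of two variables")

# Why the analogy with integer arithmetic is dangerous:
# OR looks like addition, but OR(1, 1) is 1 while 1 + 1 is 2.
print(OR(1, 1), "versus", 1 + 1)
```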
