Controlled Physical Random Functions and Applications

BLAISE GASSEND, MARTEN VAN DIJK, DWAINE CLARKE, EMINA TORLAK, and SRINIVAS DEVADAS
Massachusetts Institute of Technology
and
PIM TUYLS
Philips Research

The cryptographic protocols that we use in everyday life rely on the secure storage of keys in consumer devices. Protecting these keys from invasive attackers, who open a device to steal its key, is a challenging problem. We propose controlled physical random functions (CPUFs) as an alternative to storing keys and describe the core protocols that are needed to use CPUFs. A physical random function (PUF) is a physical system with an input and output. The functional relationship between input and output looks like that of a random function. The particular relationship is unique to a specific instance of a PUF; hence, one needs access to a particular PUF instance to evaluate the function it embodies. The cryptographic applications of a PUF are quite limited unless the PUF is combined with an algorithm that limits the ways in which the PUF can be evaluated; this is a CPUF. A major difficulty in using CPUFs is that only a small set of the PUF's outputs can be known, and the unknown outputs are unrelated to the known ones. We present protocols that get around this difficulty and allow a chain of trust to be established between the CPUF manufacturer and a party that wishes to interact securely with the PUF device. We also present some elementary applications, such as certified execution.

Categories and Subject Descriptors: K.6.5 [General]: Security and Protection—Physical Security

General Terms: Security, Theory

Additional Key Words and Phrases: Certified execution, physical security, physical random function, physical unclonable function, trusted computing

ACM Reference Format:
Gassend, B., van Dijk, M., Clarke, D., Torlak, E., Devadas, S., and Tuyls, P. 2008. Controlled physical random functions and applications. ACM Trans. Inform. Syst. Secur. 10, 4, Article 15 (January 2008), 22 pages. DOI = 10.1145/1284680.1284683 http://doi.acm.org/10.1145/1284680.1284683

Authors' address: Blaise Gassend, Marten van Dijk, Dwaine Clarke, Emina Torlak, and Srinivas Devadas, Massachusetts Institute of Technology, 77 Massachusetts Avenue, Cambridge, MA 02139; email: [email protected]; Pim Tuyls, Philips Research, ISS WB 6038, Prof Holstlaan 4, Eindhoven.

1. INTRODUCTION

Typically, cryptography is used to secure communication between two parties connected by an untrusted network. In such communication, each party has privately stored key information which allows it to encrypt, decrypt, and authenticate the communication. It is implicitly assumed that each party is capable of securing its private information. This assumption is reasonable when a party is a military installation, or even a person, but it breaks down completely for low-cost consumer devices. Once a secret key is compromised, eavesdropping and impersonation attacks become possible. In a world where portable devices need to authenticate credit card transactions and prevent copyright circumvention, protecting the keys in the devices that surround us is of utmost importance.

To be successful, a key protection scheme has to protect keys from application programming interface (API) attacks, in which the device's API is tricked into releasing trusted information; from noninvasive attacks, in which the key is deduced from unintended signals emanating from the device; and from invasive attacks, in which the attacker opens the device to find the key. All these types of attacks have been demonstrated in real systems [Kocher et al. 1999; Anderson and Kuhn 1996, 1997; Gutmann 1996].

Focusing on the problem of invasive attacks, it is apparent that once a device has been opened, the large difference in state between a 0 and a 1 makes it relatively easy to read out the device's digitally stored secrets. Traditionally, such attacks are avoided by detecting intrusion and erasing the key memory when an intrusion is detected [Smith and Weingart 1999]. However, tamper-sensing environments are expensive to produce and, as long as a key is being protected, the intrusion sensors need to be powered, further increasing costs.

Since it is the digital nature of the secret key material that makes it easy to extract invasively, we can try to use information of a continuous nature instead. For example, by measuring a complex physical system and performing suitable processing, a key can be generated [Gassend 2003; Suh et al. 2005; Skoric et al. 2005]. The invasive adversary now has to study a complex physical system, measure it, and simulate it precisely enough to determine the device's key. With careful design, the complex physical system can be fabricated such that an invasive adversary who wants to measure it has to destroy it in the process. Thus, unless the adversary successfully models or clones the physical system, his tampering will be noticed.
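To make the "suitable processing" step concrete, the sketch below shows one way a device could turn a noisy measurement of a complex physical system into a stable key: correct the measurement against public helper data, then hash the result. This is only an illustrative sketch under assumed interfaces; measure_physical_system, correct_noise, and the trivial XOR-based correction are hypothetical placeholders, and the works cited above describe the actual measurement and processing schemes.

```python
# Minimal sketch of a physically obfuscated key (illustrative only; the cited
# works use more sophisticated measurement and error-correction schemes).
# All function names here are hypothetical placeholders, not the paper's API.

import hashlib

def measure_physical_system() -> bytes:
    """Placeholder: read a noisy, device-unique value from the physical system.
    A real device would return slightly different raw bits on each measurement."""
    return bytes([0x5A, 0x13, 0xC7, 0x88])  # stand-in for a raw measurement

def correct_noise(raw: bytes, helper_data: bytes) -> bytes:
    """Placeholder for error correction: publicly stored helper data lets the
    device map each noisy measurement back to the same stable value. A trivial
    XOR stands in for a real error-correcting code."""
    return bytes(r ^ h for r, h in zip(raw, helper_data))

def derive_key(helper_data: bytes) -> bytes:
    """Measure, correct, and hash down to a fixed-length symmetric key."""
    raw = measure_physical_system()
    stable = correct_noise(raw, helper_data)
    return hashlib.sha256(stable).digest()

# The device re-derives the same key on demand instead of storing it.
key = derive_key(helper_data=bytes(4))
print(key.hex())
```

Because only the helper data needs to be stored, the key can be re-derived on demand rather than kept in nonvolatile memory; as the next paragraph observes, however, the derived key still appears on the device in digital form while it is in use.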
These physically obfuscated keys seem to increase the difficulty of an attack, but they still have a single digital point of failure. When the device is in use, the single physically obfuscated master key is present on it in digital form. If an adversary can get that key, he has totally broken the device's security. Going one step farther, we get to physical random functions: instead of being used to generate the same key every time, the complex physical system is parameterizable. For each input to the physical system, a different key is produced. Thus the complexity of
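To contrast this with a single physically obfuscated master key, the following sketch models the challenge-response view of a physical random function: each input (challenge) selects a different device-specific output. The SimulatedPUF class and its internal device_secret are assumptions made purely for illustration; in a real PUF the responses arise from unclonable manufacturing variation rather than from a stored secret.

```python
# Minimal sketch of the challenge-response view of a physical random function
# (PUF). This is a software stand-in for exposition, not the authors' hardware.

import hashlib
import hmac

class SimulatedPUF:
    """Software stand-in for one PUF instance. The hidden 'device_secret'
    plays the role of the unclonable physical structure; in a real PUF it is
    never stored digitally."""

    def __init__(self, device_secret: bytes):
        self._device_secret = device_secret

    def respond(self, challenge: bytes) -> bytes:
        # Each challenge yields an unpredictable, device-unique response.
        return hmac.new(self._device_secret, challenge, hashlib.sha256).digest()

# A party that collected a challenge-response pair while the device was trusted
# can later re-evaluate the same challenge to check it is the same device.
puf = SimulatedPUF(device_secret=b"manufacturing-variation-stand-in")
challenge = b"challenge-001"
stored_response = puf.respond(challenge)          # collected in advance
assert puf.respond(challenge) == stored_response  # later: authenticate the device
```

As the abstract notes, the outputs of a real PUF for unseen challenges are unrelated to the known ones, so revealing one challenge-response pair does not expose the rest; the HMAC stand-in above has that property only while its internal secret stays hidden.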

