Berkeley COMPSCI 161 - Lecture Notes

CS 194-1 (CS 161) Computer Security
Lecture 2: Threat Models, Security Goals, Access Control
August 30, 2006
Prof. Anthony D. Joseph
http://cs161.org/
Joseph CS161 ©UCB Fall 2006

Review: What is Computer Security Today?
• Computing in the presence of an adversary!
  – An adversary is the security field's defining characteristic
• Reliability, robustness, and fault tolerance deal with random failures
• Security deals with surviving the actions of a knowledgeable attacker dedicated to causing harm
• Wherever there is an adversary, there is a computer security problem!

Review: Analyze to Learn!
• Study attackers and think about how to break into systems, to learn attack tactics
  – Analyze previous successful attacks
• Arms race for solutions…
  – (Some) attackers are intelligent
    » Attacks will change and get better with time
  – Deploy a new defense, they respond, you build a better defense, they respond, you…
    » Try to anticipate future attacks
• Security is like a game of chess
  – Except the attackers often get the last move!

Review: Security Evaluation Process
• We need a framework to help you think through the ways that an adversary might penetrate system security
• Start with security goals:
  – What properties do we want the system to have, even when it is under attack?
  – What are we trying to protect from the attacker?
  – Or, to look at it the other way around, what are we trying to prevent?

Goals for Today
• How do we assess threats to a system?
• How do we create a threat model?
• What is access control, and what is its role?

Threat Assessment
• Some questions:
  – What kinds of threats might we face?
  – What capabilities might we expect adversaries to have?
  – What are the limits on what the adversary might be able to do to us?
• The result is a threat model: a characterization of the threats the system must deal with
  – Think: Who? What? Why?

Developing a Threat Model
• First, decide how much we can predict about the kinds of adversaries we will face
  – Sometimes we know very well who the adversary is, and even their capabilities, motivations, and limitations
    » Cold War: the US military oriented itself toward its main enemy (the Soviets) and focused on understanding the USSR's military capabilities, effectiveness, and responsiveness
• If we know the potential adversary, we can craft a threat model that reflects that adversary's abilities and options, and nothing more
  – However, the adversary is often unknown

Thinking Generically
• We must then reason more generically about unavoidable limitations of the adversary
  – Silly example: physics means the adversary can't exceed the speed of light
• We can usually look at the system design and identify what an adversary might do
  – Ex: If the system never sends secret information over wireless networks, then we don't need to worry about the threat of wireless eavesdropping
  – Ex: If the system design means people might discuss secrets by phone, then the threat model needs to include possible phone-company insider threats: eavesdropping, re-routing, impersonation

What to Ignore?
• A good threat model also specifies the threats we don't care to defend against
  – Ex: home security – I don't worry about a team of burglars flying a helicopter over my house and rappelling down my chimney
  – Why not? There are many easier ways to break into my house…
• We can classify adversaries by their motivation
  – Ex: an adversary motivated by financial gain won't spend more money on an attack than they'll gain
  – A burglar won't spend thousands of dollars to steal a car radio
• Motives are as varied as human nature
  – We have to prepare for all eventualities…

Helpful to Examine Incentives
• Ex: Do fast-food places make more profit on soft drinks than on food?
  – Would expect some places to try to boost drink sales (e.g., by salting french fries heavily)
• Ex: Do customer-service reps earn bonuses for handling more than X calls per hour?
  – Would expect some reps to cut long calls short, or to transfer trouble customers to other departments when possible
• Ex: Do spammers make money from those who respond, while losing nothing from those who don't?
  – Would expect spammers to send their emails as widely as possible, no matter how unpopular it makes them

Incentives
• As a rule of thumb, organizations tend not to act against their own self-interest (at least not often…)
• Incentives (frequently) influence behavior
  – They expose the motivations of potential adversaries
• Incentives are particularly relevant when two parties have opposing interests
  – When incentives clash, conflict often follows
  – In that case, it is worth looking deeply at the potential for attacks by one such party against the other

Threat Assessment Summary
• Remember the three W's:
  – Who are the adversaries we might face?
  – How might they try to attack us, and what are their capabilities?
  – Why might they be motivated to attack us, and what are their incentives?
• Given the security goals and threat model, the last step is performing a security analysis

Administrivia
• Space still available in this class
  – Talk to Michael-David Sasson today
• Three sections on Thursdays in 320 Soda
  – 101. 10:00–11:00
  – 102. 11:00–12:00
  – 103. 3:00–4:00
• 18 students have final exam conflicts (CS 162 and EE 122)
• No account forms needed; use named accounts (details in HW #1)
• Minor changes to project and HW due dates

BREAK

Security Analysis
• Checking whether there are attacks (within the threat model) that successfully violate the security goals
  – Often highly technical and dependent on system details
  – We'll show you many security analysis methods
• One analogy:
  – The threat model defines the set of moves the adversary is allowed to make
  – The system design defines how the defender plays the game
  – The security goals define the success condition: if the adversary violates any goal, he wins; otherwise, the defender wins
  – Security analysis is examining all moves and counter-moves to see who has a winning strategy

Another Analogy
• Mystery writers like to talk about means, motive, and
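The "three W's" threat model and the game-style security analysis described in the lecture can be sketched as a toy program. This is purely illustrative: the lecture defines no code, and every name below (`Threat`, `ThreatModel`, `defender_wins`, the home-security example fields) is my own invention under that reading.

```python
from dataclasses import dataclass, field

# Hypothetical sketch (not from the lecture): a threat model as a record of
# the three W's, plus the threats we explicitly choose to ignore.

@dataclass
class Threat:
    who: str   # Who is the adversary?
    what: str  # What might they do, and with what capabilities?
    why: str   # Why are they motivated (what are their incentives)?

@dataclass
class ThreatModel:
    in_scope: list = field(default_factory=list)      # threats we defend against
    out_of_scope: list = field(default_factory=list)  # threats we knowingly ignore

home = ThreatModel(
    in_scope=[Threat(who="burglar",
                     what="forces a ground-floor door or window",
                     why="financial gain; won't spend more than the loot is worth")],
    out_of_scope=["helicopter team rappelling down the chimney"],
)

# The game analogy: the defender "wins" only if every adversary move allowed
# by the threat model leaves every security goal intact.
def defender_wins(moves, goals):
    return all(goal(move) for move in moves for goal in goals)

# Toy goal: the burglar's move fails as long as entry points stay locked.
locked = True
goals = [lambda move: locked]
print(defender_wins([t.what for t in home.in_scope], goals))  # True
```

Note how `out_of_scope` mirrors the "What to Ignore?" slide: a threat model is as much about the threats deliberately left undefended as about the ones analyzed.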

