Berkeley COMPSCI 161 - Lecture Notes


CS 194-1 / CS 161: Computer Security
Lecture 2: Threat models, security goals, access control
August 30, 2006
Prof. Anthony D. Joseph (UC Berkeley), http://cs161.org

Review: What is Computer Security?
- Computing in the presence of an adversary: an adversary is the security field's defining characteristic.
- Reliability, robustness, and fault tolerance (FT) deal with random failures. Security means dealing with, and surviving, the actions of a knowledgeable attacker dedicated to causing harm.
- Wherever there is an adversary, there is a computer security problem.

Review: Analyze to Learn
- Study attackers and think about how to break into systems, in order to learn attack tactics.
- Analyze previous successful attacks.
- Expect an arms race for solutions: some attackers are intelligent, and attacks will change and get better with time. Deploy a new defense, they respond; you build a better defense, they respond; you ...
- Try to anticipate future attacks. Security is like a game of chess, except the attackers often get the last move.

Review: Security Evaluation Process
- We need a framework to help you think through the ways an adversary might penetrate system security.
- Start with security goals: what properties do we want the system to have, even when it is under attack? What are we trying to protect from the attacker? Or, to look at it the other way around, what are we trying to prevent?

Goals for Today
- How do we assess threats to a system?
- How do we create a threat model?
- What is access control, and what is its role?

Threat Assessment
- Some questions: What kinds of threats might we face? What capabilities might we expect adversaries to have? What are the limits on what the adversary might be able to do to us?
- The result is a threat model: a characterization of the threats the system must deal with.
- Think: who? what? why?
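The slides keep the threat model informal, but it can help to see one written down. Below is a minimal sketch, not from the lecture, of recording the who/what/why of a threat model as plain data; the class names, fields, and the example adversary are all hypothetical, chosen only to mirror the questions above.

```python
# A sketch of writing the "who / what / why" of a threat model down as data.
# Not from the lecture: class names, fields, and the example adversary are hypothetical.
from dataclasses import dataclass, field
from typing import List


@dataclass
class Adversary:
    who: str                  # who might attack us
    capabilities: List[str]   # what they can do
    motivation: str           # why they would bother


@dataclass
class ThreatModel:
    adversaries: List[Adversary] = field(default_factory=list)
    # Threats we explicitly choose not to defend against (see "What to Ignore" below).
    out_of_scope: List[str] = field(default_factory=list)


# Example: a home-security threat model in this style.
home = ThreatModel(
    adversaries=[
        Adversary(
            who="opportunistic burglar",
            capabilities=["pick simple locks", "break a window"],
            motivation="financial gain",
        )
    ],
    out_of_scope=["a team of burglars rappelling from a helicopter"],
)
print(home.adversaries[0].who)   # -> opportunistic burglar
```

The out_of_scope list anticipates a point made below: a good threat model also names the threats you choose not to defend against.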
Developing a Threat Model
- First, decide how much we can predict about the kinds of adversaries we will face.
- Sometimes we know very well who the adversary is, and even their capabilities, motivations, and limitations.
  - Cold War example: the US military was oriented towards one main enemy (the Soviets) and focused on understanding the USSR's military capabilities, effectiveness, and responsiveness.
- If we know the potential adversary, we can craft a threat model that reflects that adversary's abilities and options, and nothing more.
- However, often the adversary is unknown.

Thinking Generically
- When the adversary is unknown, we must reason more generically about the unavoidable limitations of the adversary.
  - Silly example: physics means the adversary can't exceed the speed of light.
- We can usually look at the system design and identify what an adversary might do.
  - Ex: if the system never sends secret information over wireless networks, then we don't need to worry about the threat of wireless eavesdropping.
  - Ex: if the system design means people might discuss secrets by phone, then the threat model needs to include possible phone-company insider threats (eavesdropping, re-routing, impersonation).

What to Ignore
- A good threat model also specifies the threats we don't care to defend against.
  - Ex (home security): I don't worry about a team of burglars flying a helicopter over my house and rappelling down my chimney. Why not? There are many easier ways to break into my house.
- We can classify adversaries by their motivation.
  - Ex: a financial-gain motivation means the attacker won't spend more money on the attack than they'll gain; a burglar won't spend $1,000s to steal a car radio.
- Motives are as varied as human nature, so we have to prepare for all eventualities.

Helpful to Examine Incentives
- As a rule of thumb, organizations tend not to act against their own self-interest (at least, not often); incentives frequently influence behavior.
- Ex: Do fast-food places make more profit on soft drinks than on food? Then we would expect some places to try to boost drink sales, e.g., by salting french fries heavily.
- Ex: Do customer-service reps earn bonuses for handling more than X calls per hour? Then we would expect some reps to cut long calls short, or to transfer troublesome customers to other departments when possible.
- Ex: Do spammers make money from those who respond, while losing nothing from those who don't? Then we would expect spammers to send their emails as widely as possible, no matter how unpopular it makes them.

Incentives
- Examining incentives exposes the motivations of potential adversaries.
- Incentives are particularly relevant when two parties have opposing interests: when incentives clash, conflict often follows.
- In that case, it is worth looking deeply at the potential for attacks by one such party against the other.

Threat Assessment Summary
- Remember the three W's:
  - Who are the adversaries we might face?
  - How might they try to attack us, and what are their capabilities?
  - Why might they be motivated to attack us, and what are their incentives?
- Given the security goals and the threat model, the last step is performing a security analysis.

Administrivia
- Space is still available in this class; talk to Michael David Sasson today.
- Three sections on Thursdays in 320 Soda: section 101 (10:00-11:00), section 102 (11:00-12:00), section 103 (3:00-4:00).
- 18 students have final-exam conflicts (CS 162 and EE 122).
- No account forms are needed; use named accounts (details in HW 1).
- Minor changes to the project and HW due dates.

BREAK

Security Analysis
- Security analysis asks whether there are attacks within the threat model that successfully violate the security goals.
- It is often highly technical and dependent on system details; we'll show you many security analysis methods.
- One analogy is a game:
  - The threat model defines the set of moves the adversary is allowed to make.
  - The system design defines how the defender plays the game.
  - The security goals define the success condition: if the adversary violates any goal, he wins; otherwise the defender wins.
  - Security analysis is examining all moves and counter-moves to see who has a winning strategy.

Another Analogy
- Mystery writers like to talk about means, motive, and opportunity; security evaluation is a similar way of thinking.
- Threat assessment examines the means and the motive; security analysis examines what opportunity the adversary might have to do harm.

Identify the Security Goals
- What are my security goals? What are we trying to protect?
  - Protecting assets from theft or tampering (integrity).
  - Protecting my personal safety.

Perform a Threat Assessment
- What threats does the system need to protect against?
- Ex: if ...
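Tying these pieces together (security goals, a threat model, and the game analogy from the Security Analysis section), here is a minimal sketch, not from the lecture, of what the check looks like: the adversary's moves come from the threat model, each security goal is a predicate over the system's state, and the analysis asks whether any move violates any goal. All function and state names are hypothetical.

```python
# A toy sketch of the game analogy, not from the lecture. Moves come from the
# threat model, goals are predicates over system state, and the analysis checks
# every move against every goal. All names here are hypothetical.

def analyze(initial_state, adversary_moves, security_goals):
    """Return the first (move, goal) pair where a move violates a goal, else None."""
    for move in adversary_moves:
        state_after = move(dict(initial_state))   # each move acts on a fresh copy
        for goal in security_goals:
            if not goal(state_after):
                return move.__name__, goal.__name__   # the adversary wins
    return None                                       # the defender wins (against these moves)


# Home-security instance: one goal, two candidate moves from the threat model.
def valuables_not_stolen(state):        # security goal
    return not state["radio_stolen"]

def rattle_locked_front_door(state):    # a move the defenses already resist
    return state

def smash_car_window(state):            # a move that violates the goal
    state["radio_stolen"] = True
    return state

print(analyze({"radio_stolen": False},
              [rattle_locked_front_door, smash_car_window],
              [valuables_not_stolen]))
# -> ('smash_car_window', 'valuables_not_stolen')
```

A real security analysis would of course reason about sequences of moves and counter-moves rather than single moves, but the win condition is the same: the adversary wins if any reachable state violates a goal.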

