June 1, 2004 — Computer Security: Art and Science, ©2002-2004 Matt Bishop

Slide #25-1: Chapter 25: Intrusion Detection
• Principles
• Basics
• Models of Intrusion Detection
• Architecture of an IDS
• Organization
• Incident Response

Slide #25-2: Principles of Intrusion Detection
• Characteristics of systems not under attack
  – User and process actions conform to a statistically predictable pattern
  – User and process actions do not include sequences of actions that subvert the security policy
  – Process actions correspond to a set of specifications describing what the processes are allowed to do
• A system under attack fails to meet at least one of these characteristics

Slide #25-3: Example
• Goal: insert a back door into a system
  – Intruder will modify a system configuration file or program
  – Requires privilege; attacker enters the system as an unprivileged user and must acquire privilege
• Nonprivileged user may not normally acquire privilege (violates #1)
• Attacker may break in using a sequence of commands that violates the security policy (violates #2)
• Attacker may cause a program to act in ways that violate the program's specification (violates #3)

Slide #25-4: Basic Intrusion Detection
• An attack tool is an automated script designed to violate a security policy
• Example: rootkit
  – Includes a password sniffer
  – Designed to hide itself using Trojaned versions of various programs (ps, ls, find, netstat, etc.)
  – Adds back doors (login, telnetd, etc.)
  – Has tools to clean up log entries (zapper, etc.)

Slide #25-5: Detection
• Rootkit configuration files cause ls, du, etc. to hide information
  – ls lists all files in a directory, except those hidden by the configuration file
  – dirdump (a local program to list directory entries) lists them too
• Run both and compare counts
  – If they differ, ls is doctored
• Other approaches possible

Slide #25-6: Key Point
• The rootkit does not alter kernel or file structures to conceal files, processes, and network connections
  – It alters the programs or system calls that interpret those structures
  – Find some entry point for interpretation that the rootkit did not alter
  – The inconsistency is an anomaly (violates #1)

Slide #25-7: Denning's Model
• Hypothesis: exploiting vulnerabilities requires abnormal use of normal commands or instructions
  – Includes deviation from usual actions
  – Includes execution of actions leading to break-ins
  – Includes actions inconsistent with specifications of privileged programs

Slide #25-8: Goals of IDS
• Detect a wide variety of intrusions
  – Previously known and unknown attacks
  – Suggests a need to learn/adapt to new attacks or changes in behavior
• Detect intrusions in a timely fashion
  – May need to be real-time, especially when the system responds to the intrusion
    • Problem: analyzing commands may impact the response time of the system
  – May suffice to report that an intrusion occurred a few minutes or hours ago

Slide #25-9: Goals of IDS
• Present analysis in a simple, easy-to-understand format
  – Ideally a binary indicator
  – Usually more complex, allowing the analyst to examine the suspected attack
  – User interface critical, especially when monitoring many systems
• Be accurate
  – Minimize false positives and false negatives
  – Minimize time spent verifying attacks and looking for them

Slide #25-10: Models of Intrusion Detection
• Anomaly detection
  – What is usual is known
  – What is unusual is bad
• Misuse detection
  – What is bad is known
  – What is not bad is good
• Specification-based detection
  – What is good is known
  – What is not good is bad
Slide #25-11: Anomaly Detection
• Analyzes a set of characteristics of the system and compares their values with expected values; reports when computed statistics do not match expected statistics
  – Threshold metrics
  – Statistical moments
  – Markov model

Slide #25-12: Threshold Metrics
• Counts the number of events that occur
  – Between m and n events (inclusive) expected to occur
  – If the number falls outside this range, anomalous
• Example
  – Windows: lock the user out after k failed sequential login attempts. Range is (0, k–1).
    • k or more failed logins deemed anomalous

Slide #25-13: Difficulties
• Appropriate threshold may depend on non-obvious factors
  – Typing skill of users
  – If keyboards are US keyboards, and most users are French, typing errors are very common
    • Dvorak vs. non-Dvorak within the US

Slide #25-14: Statistical Moments
• Analyzer computes standard deviation (first two moments) and other measures of correlation (higher moments)
  – If measured values fall outside the expected interval for particular moments, anomalous
• Potential problem
  – Profile may evolve over time; solution is to weigh data appropriately or alter rules to take changes into account

Slide #25-15: Example: IDES
• Developed at SRI International to test Denning's model
  – Represents users, login sessions, and other entities as an ordered sequence of statistics <q_{0,j}, …, q_{n,j}>
  – q_{i,j} (statistic i for day j) is a count or time interval
  – Weighting favors recent behavior over past behavior
    • A_{k,j} is the sum of counts making up the metric of the kth statistic on the jth day
    • q_{k,l+1} = A_{k,l+1} – A_{k,l} + 2^(–rt) q_{k,l}, where t is the number of log entries/total time since start, and r is a factor determined through experience

Slide #25-16: Example: Haystack
• Let A_n be the nth count or time-interval statistic
• Defines bounds T_L and T_U such that 90% of the values for the A_i lie between T_L and T_U
• Haystack computes A_{n+1}
  – Then checks that T_L ≤ A_{n+1} ≤ T_U
  – If false, anomalous
• Thresholds updated
  – A_i can change rapidly; as long as the thresholds are met, all is well
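The ls-versus-dirdump comparison on slide #25-5 can be sketched in Python. Since dirdump is a local program rather than a standard tool, os.scandir stands in here as the trusted second view of the directory; the sketch assumes a Unix system with ls on the PATH.

```python
import os
import subprocess

def count_via_ls(path):
    """Count entries reported by the (possibly Trojaned) ls binary."""
    out = subprocess.run(["ls", "-A", path],
                         capture_output=True, text=True, check=True)
    return len(out.stdout.splitlines())

def count_via_scandir(path):
    """Count entries via a direct directory read, bypassing ls."""
    return sum(1 for _ in os.scandir(path))

def ls_looks_doctored(path):
    # A mismatch between the two views is the anomaly (violates #1).
    return count_via_ls(path) != count_via_scandir(path)
```

On a clean system the two counts agree; a rootkit that alters only ls (and not the underlying system calls) produces a mismatch, which is exactly the unaltered-entry-point idea from slide #25-6.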
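The threshold metric of slide #25-12 can be sketched as a small monitor; k and the reset-on-success behavior follow the failed-login example, and the class name is only illustrative.

```python
class FailedLoginMonitor:
    """Threshold metric: k or more sequential failed logins is anomalous."""

    def __init__(self, k):
        self.k = k
        self.failures = 0  # length of the current run of failures

    def record(self, success):
        """Record one login attempt; return True if the count is anomalous."""
        if success:
            self.failures = 0        # a success ends the sequential run
        else:
            self.failures += 1
        # Normal range is 0..k-1 failures (inclusive); k or more is anomalous.
        return self.failures >= self.k
```

With k = 3, the first two failures fall inside the expected range and only the third is flagged.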
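The statistical-moments test of slide #25-14 reduces, for the first two moments, to checking whether a new measurement falls outside an interval around the profile mean. A minimal sketch, assuming a simple mean ± n·σ interval (the slide does not fix the interval's width):

```python
import statistics

def anomalous(history, value, n_sigma=2.0):
    """Flag value if it lies outside mean +/- n_sigma standard deviations
    of the historical profile (i.e., the first two moments)."""
    mean = statistics.fmean(history)
    sd = statistics.pstdev(history)
    return abs(value - mean) > n_sigma * sd
```

The slide's "profile may evolve" caveat applies directly: if history is never refreshed or re-weighted, legitimate drift in user behavior will eventually be flagged as anomalous.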
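The IDES weighting rule on slide #25-15 is a one-line update; the sketch below just transcribes the slide's formula, with parameter names chosen to mirror its symbols.

```python
def ides_update(q_prev, a_prev, a_new, r, t):
    """One step of the IDES exponentially weighted statistic:
        q_{k,l+1} = A_{k,l+1} - A_{k,l} + 2**(-r*t) * q_{k,l}
    The 2**(-r*t) factor decays the old value of the statistic, so
    recent behavior dominates the profile, as the slide describes."""
    return a_new - a_prev + 2 ** (-r * t) * q_prev
```

For example, with r·t = 1 the previous statistic is halved before the new count difference is added in.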
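The Haystack check of slide #25-16 needs bounds T_L and T_U that contain 90% of past values of the statistic. The slide does not fix how the bounds are computed; this sketch assumes simple empirical percentiles over the history.

```python
def haystack_bounds(history, coverage=0.90):
    """Bounds T_L, T_U such that roughly `coverage` of past values of the
    statistic lie between them, taken here as empirical percentiles."""
    s = sorted(history)
    tail = (1.0 - coverage) / 2.0          # e.g., 5% in each tail for 90%
    lo = int(tail * (len(s) - 1))
    hi = int(round((1.0 - tail) * (len(s) - 1)))
    return s[lo], s[hi]

def haystack_anomalous(history, a_next, coverage=0.90):
    """Flag A_{n+1} as anomalous unless T_L <= A_{n+1} <= T_U."""
    t_l, t_u = haystack_bounds(history, coverage)
    return not (t_l <= a_next <= t_u)
```

Recomputing the bounds from a sliding window of recent values gives the threshold updating the slide mentions: the statistic may change rapidly, but as long as each new value stays inside the current bounds, nothing is flagged.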


UCD ECS 154B - Chapter 25- Intrusion Detection
