U of I CS 498 - Evaluating Systems

Evaluating Systems
Information Assurance, Fall 2006

Reading Material
• Chapter 21, Computer Security: Art and Science
• The Orange Book and the whole Rainbow Series
  – http://www.radium.ncsc.mil/tpep/library/rainbow/
• The Common Criteria
  – Lists all evaluated protection profiles and products
  – http://www.commoncriteriaportal.org

Outline
• Motivation for system evaluation
• Specific evaluation systems
  – TCSEC/Orange Book
  – Interim systems
  – Common Criteria

Evaluation Goals
• Oriented to the purchaser/user of a system
• Assurance that the system operates as advertised

Evaluation Options
• Rely on vendor/developer evidence
  – Self-evaluate vendor design docs, test results, etc.
  – Base the decision on the reputation of the vendor
• Rely on an expert
  – Read product evaluations from a trusted source
  – Penetration testing

Formal Evaluation
• Provides a systematic framework for system evaluation
  – More consistent evaluation
  – Better basis for comparing similar products
• Trusted third-party system for evaluation
• Originally driven by the needs of government and the military

TCSEC: 1983-1999
• Trusted Computer System Evaluation Criteria (TCSEC), also called the Orange Book
  – Specifies evaluation classes (C1, C2, B1, B2, B3, A1)
  – Specifies functionality and assurance requirements for each class
• Functional model builds on
  – BLP (mandatory labeling)
  – Reference monitors

TCSEC Functional Requirements
• DAC
• Object reuse
  – Sufficient clearing of objects between uses in a resource pool
  – E.g., zeroing pages in the memory system
• MAC and labels
• Identification and authentication
• Audit
  – Requirements increase at higher classes
• Trusted path
  – Non-spoofable means to interact with the TCB
  – Ctrl-Alt-Del in Windows

TCSEC Assurance Requirements
• Configuration management
  – For the TCB
• Trusted distribution
  – Integrity of the mapping between master copy and installations
• System architecture
  – Small and modular
• Design specification – varies between classes
• Verification – varies between classes
• Testing
• Product documentation

TCSEC Classes
• D – catch-all
• C1 – Discretionary Protection
  – Identification and authentication, and DAC
  – Minimal assurance
• C2 – Controlled Access Protection
  – Adds object reuse and auditing
  – More testing requirements
  – Windows NT 3.5 evaluated at C2
• B1 – Labeled Security Protection
  – Adds MAC for some objects
  – Stronger testing requirements; informal model of the security policy
  – Trusted Unixes tended to be B1
• B2 – Structured Protection
  – MAC for all objects; additional logging; trusted path; least privilege
  – Covert channel analysis, configuration management, more documentation, formal model of the security policy
• B3 – Security Domains
  – Implements a full reference validation mechanism (RVM)
  – Requirements on code modularity, layering, simplicity
  – More stringent testing and documentation
• A1 – Verified Protection
  – Same functional requirements as B3
  – Significant use of formal methods in assurance
  – Honeywell's SCOMP

TCSEC Evaluation Process
• Originally controlled by the government
  – No fee to the vendor
  – Could reject an evaluation application if the product was not of interest to the government
• Later introduced fee-based evaluation labs
• Evaluation phases
  – Design analysis – no source code access
  – Test analysis
  – Final review

TCSEC Evaluation Issues
• Evaluates a specific configuration
  – E.g., Windows NT with no applications installed and no network
  – New patches and versions require re-certification
  – RAMP introduced to ease re-certifications
• Long time for evaluation
  – Sometimes a product was obsolete before the evaluation finished
• Criteria creep
  – B1 means something more in 1999 than it did in 1989

Interim Efforts in the '90s
• Canadian Trusted Computer Product Evaluation Criteria (CTCPEC)
• Information Technology Security Evaluation Criteria (ITSEC) – Western Europe
• Commercial International Security Requirements (CISR) – AmEx and EDS
• Federal Criteria – NSA and NIST

FIPS 140
• Framework for evaluating cryptographic modules
• Still in use
• Addresses
  – Functionality
  – Assurance
  – Physical security

Common Criteria – 1998 to Today
• Pulls together the international evaluation efforts
  – Evaluations mean something between countries
• Three top-level documents
  – Common Criteria documents
    • Describe functional and assurance requirements; define Evaluation Assurance Levels (EALs)
  – CC Evaluation Methodology (CEM)
    • More details on the evaluation
    • Complete through EAL5 (at least)
  – Evaluation Scheme
    • Nation-specific rules for how CC evaluations are performed in that country
    • Directed by NIST in the US

CC Terminology
• Target of Evaluation (TOE)
  – The product being evaluated
• TOE Security Policy (TSP)
  – Rules that regulate how assets are managed, protected, and distributed in a product
• TOE Security Functions (TSF)
  – Implementation of the TSP
  – Generalization of the TCB

Protection Profile (PP)
• Profile that describes the security requirements for a class of products
  – List of evaluated PPs: http://www.commoncriteriaportal.org/public/expert/index.php?menu=8
• Replaces the fixed set of classes from TCSEC
• ISSO created some initial profiles to match TCSEC classes
  – Controlled Access Protection Profile (CAPP) corresponds to C2
  – Labeled Security Protection Profile (LSPP) corresponds to B1

Product Evaluation
• Define a security target (ST)
  – May leverage an evaluated protection profile
• Product is evaluated with respect to the ST

CC Functional Requirements
• Defined in a taxonomy
  – Top level: 11 classes
    • E.g., FAU – Security Audit; FDP – User Data Protection
  – Each class divided into families
    • E.g., FDP_ACC – Access Control Policy
  – Each family divided into components
    • E.g., FDP_ACC.2 – Complete Access Control
  – Each component contains requirements and dependencies on other requirements

CC Assurance Requirements
• Similar class/family/component taxonomy
• Eight product-oriented assurance classes
  – ACM – Configuration Management
  – ADO – Delivery and Operation
  – ADV – Development
  – AGD – Guidance Documentation
  – ALC – Life Cycle
  – ATE – Tests
  – AVA – Vulnerability Analysis
  – AMA – Maintenance of Assurance

Evaluation Assurance Levels
• 7 fixed EALs
  – EAL1 – Functionally Tested
  – EAL2 – Structurally Tested
  – EAL3 – Methodically Tested and Checked
    • Analogous to C2
  – EAL4 – Methodically Designed, Tested, and Reviewed