15-819K: Logic Programming
Lecture 18: Proof Terms
Frank Pfenning
October 31, 2006

In this lecture we will substantiate an earlier claim that logic programming not only permits the representation of specifications and the implementation of algorithms, but also the realization of proofs of correctness of algorithms. We will do so by first showing how deductions can be represented as terms, so-called proof terms. We also discuss the problem of checking the validity of deductions, which amounts to type checking the terms that represent them.

18.1 Deductions as Terms

In logic programming we think of computation as proof search. However, so far search either succeeds and returns an answer substitution, fails, or diverges. In the case of success it is reasonable to expect that the logic programming engine could also return a deduction of the instantiated goal. But this raises the question of how to represent deductions. In the traditional logic programming literature we find ideas such as a list of the rules that have been applied to solve a goal. There is, however, a much better answer. We can think of an inference rule as a function which takes deductions of the premisses to a deduction of the conclusion. Such a function is a constructor for proof terms. For example, when interpreting

    ------------- pz         plus(M, N, P)
    plus(z, N, N)        ------------------- ps
                         plus(s(M), N, s(P))

we extract two constructors

    pz : plus(z, N, N)
    ps : plus(M, N, P) → plus(s(M), N, s(P)).

The only unusual thing about these constructors is their type. The type of pz, for example, is a proposition.

The idea that propositions can be types is a crucial observation of the Curry-Howard isomorphism in functional programming, which also identifies proofs (of a proposition) with programs (of the corresponding type). Here, the correspondence of propositions with types is still perfect, but proofs are not programs (which, instead, are given by inference rules).

As an example of how a complete proof is interpreted as a term, consider the computation of 2 + 2 = 4.

    ------------- pz
    plus(0, 2, 2)
    ------------- ps
    plus(1, 2, 3)
    ------------- ps
    plus(2, 2, 4)

Here we have abbreviated z = 0, s(z) = 1, and so on. This deduction becomes the very simple term

    ps(ps(pz)).

Our typing judgment should be such that

    ps(ps(pz)) : plus(2, 2, 4)

so that a proposition acts as the type of its proofs.

As a slightly more complex example, consider multiplication

    -------------- tz       times(M, N, P)   plus(P, N, Q)
    times(z, N, z)          ----------------------------- ts
                                  times(s(M), N, Q)

which yields the following constructors

    tz : times(z, N, z)
    ts : times(M, N, P), plus(P, N, Q) → times(s(M), N, Q).

Now the deduction that 2 * 2 = 4, namely

                                           ------------- pz
                                           plus(0, 2, 2)
    -------------- tz    ------------- pz  ------------- ps
    times(0, 2, 0)       plus(0, 2, 2)     plus(1, 2, 3)
    ---------------------------------- ts  ------------- ps
            times(1, 2, 2)                 plus(2, 2, 4)
    --------------------------------------------------- ts
                      times(2, 2, 4)

becomes the term

    ts(ts(tz, pz), ps(ps(pz))) : times(2, 2, 4).

The tree structure of the deduction is reflected in the corresponding tree structure of the term.
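As an aside, this reading of constructors can be transcribed almost directly into a functional language with indexed data types. The following is a minimal sketch in Haskell, using GADTs to play the role of the indexed type families introduced in the next section; the encoding and all names in it (Nat, Plus, Pz, and so on) are illustrative choices of ours, not part of these notes. Strictly speaking, DataKinds quantifies over promoted types rather than terms, so this only approximates dependency.

    {-# LANGUAGE DataKinds, GADTs, KindSignatures #-}

    import Data.Kind (Type)

    -- Unary natural numbers; DataKinds promotes Z and S to the type level.
    data Nat = Z | S Nat

    -- plus(M, N, P) as a data type indexed by three Nats: a closed value
    -- of type Plus m n p is exactly a deduction of plus(m, n, p).
    data Plus :: Nat -> Nat -> Nat -> Type where
      Pz :: Plus Z n n                        -- pz : plus(z, N, N)
      Ps :: Plus m n p -> Plus (S m) n (S p)  -- ps : plus(M,N,P) → plus(s(M),N,s(P))

    -- times(M, N, P), whose second constructor takes two premises.
    data Times :: Nat -> Nat -> Nat -> Type where
      Tz :: Times Z n Z                                   -- tz
      Ts :: Times m n p -> Plus p n q -> Times (S m) n q  -- ts

    -- The deduction of 2 + 2 = 4 is the term ps(ps(pz)).
    twoPlusTwo :: Plus (S (S Z)) (S (S Z)) (S (S (S (S Z))))
    twoPlusTwo = Ps (Ps Pz)

    -- The deduction of 2 * 2 = 4 is ts(ts(tz, pz), ps(ps(pz))); its
    -- tree structure mirrors the deduction tree above.
    twoTimesTwo :: Times (S (S Z)) (S (S Z)) (S (S (S (S Z))))
    twoTimesTwo = Ts (Ts Tz Pz) (Ps (Ps Pz))

Asking GHCi for the type of the unannotated term Ps (Ps Pz) yields Plus ('S ('S 'Z)) n ('S ('S n)), which is exactly the most general, schematic typing discussed in Section 18.2 below.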
18.2 Indexed Types

Looking at our example signature,

    pz : plus(z, N, N)
    ps : plus(M, N, P) → plus(s(M), N, s(P))
    tz : times(z, N, z)
    ts : times(M, N, P), plus(P, N, Q) → times(s(M), N, Q)

we can observe a new phenomenon. The types of our constructors contain terms, both constants (such as z) and variables (such as M, N, or P). We say that plus is a type family indexed by terms. In general, under the propositions-as-types interpretation, predicates are interpreted as type families indexed by terms.

The free variables in the declarations are interpreted schematically, just like in inference rules. So pz is really a family of constants, indexed by N.

This has some interesting consequences. For example, we found earlier that

    ps(ps(pz)) : plus(2, 2, 4).

However, we can also check that

    ps(ps(pz)) : plus(2, 3, 5).

In fact, our type system will admit a most general type:

    ps(ps(pz)) : plus(s(s(z)), N, s(s(N))).

This schematic type captures all types of the term on the left, because any type of ps(ps(pz)) is an instance of plus(s(s(z)), N, s(s(N))).

18.3 Typing Rules

In order to write down the typing rules, it is convenient to make quantification over the schematic variables in an indexed type declaration explicit. We write ∀x:τ for quantification over a single variable and, following our general notational convention, ∀x̄:τ̄ for a sequence of quantifiers. We have already introduced quantified propositions, so here we emphasize their role in quantifying over the schematic variables of a proposition viewed as a type. The example of addition would be written as

    z : nat
    s : nat → nat
    pz : ∀N:nat. plus(z, N, N)
    ps : ∀M:nat. ∀N:nat. ∀P:nat. plus(M, N, P) → plus(s(M), N, s(P))

We call such explicitly quantified types dependent types. Unlike other formulations (for example, in the LF logical framework), but similarly to our treatment of polymorphism, the quantifiers do not affect the term language. We write '∀' to emphasize the logical reading of the quantifiers; in a fully dependent type theory they would be written as 'Π'.

There are many similarities between polymorphic and dependent types. We will see that, in addition to the notation, the typing rules are also analogous. Nevertheless, they are different concepts: polymorphism quantifies over types, while dependency quantifies over terms. We review the rule for polymorphic typing.

    dom(θ̂) = ᾱ    f : ∀ᾱ. σ̄ → τ ∈ Σ    ∆ ⊢ t̄ : σ̄θ̂
    -----------------------------------------------
                    ∆ ⊢ f(t̄) : τθ̂

Recall that θ̂ is a substitution of types for type variables, and that ∆ contains declarations x:τ as well as α type.

The rule for dependent types is analogous, using ordinary substitutions instead of type substitutions.

    dom(θ) = x̄    c : ∀x̄:τ̄. Q̄ → P ∈ Σ    ∆ ⊢ t̄ : Q̄θ
    ------------------------------------------------
                    ∆ ⊢ c(t̄) : Pθ

We have written P and Q instead of τ and σ to emphasize the interpretation of the types as propositions.

If we require θ to be a ground substitution (that is, cod(θ) = ∅), then we can use this rule to determine ground typings such as

    ps(ps(pz)) : plus(2, 2, 4).

If we allow free variables, that is, cod(θ) = dom(∆), then we can write out schematic typings, such as

    n:nat ⊢ ps(ps(pz)) : plus(s(s(z)), n, s(s(n))).

In either case we want the substitution to be well-typed. In the presence of dependent types, the formalization of this would lead us a bit far afield, so we can think of it just as before: we always substitute a term of type τ for a variable of type τ.
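Read algorithmically, the dependent typing rule suggests a simple procedure: to type c(t̄), look up the declaration of c, rename its schematic variables apart, type each argument against the corresponding premise, and return the instantiated conclusion Pθ. The sketch below is a naive Haskell rendering of this idea for the plus signature, computing a most general type by standard first-order unification; the term representation and all names (Tm, walk, synth, and so on) are our own, and it illustrates only one possible reading of the rule, not the checking algorithm developed in these notes.

    import Control.Monad (foldM, guard)

    -- First-order terms: schematic variables and constructor applications.
    -- Objects (z, s(M)), proof terms (pz, ps(D)), and the propositions they
    -- inhabit (plus(M, N, P)) all share this one representation.
    data Tm = Var String | App String [Tm] deriving (Eq, Show)

    -- A declaration  c : ∀x̄:τ̄. Q1, ..., Qn → P  as (premises, conclusion).
    data Decl = Decl [Tm] Tm

    -- The signature for addition; times would be declared analogously.
    sig :: [(String, Decl)]
    sig =
      [ ("pz", Decl [] (plus z (Var "N") (Var "N")))
      , ("ps", Decl [plus (Var "M") (Var "N") (Var "P")]
                    (plus (s (Var "M")) (Var "N") (s (Var "P"))))
      ]
      where plus m n p = App "plus" [m, n, p]
            z          = App "z" []
            s t        = App "s" [t]

    -- Substitutions θ, kept triangular and resolved by walking.
    type Subst = [(String, Tm)]

    walk :: Subst -> Tm -> Tm
    walk th t@(Var x) = maybe t (walk th) (lookup x th)
    walk _  t         = t

    resolve :: Subst -> Tm -> Tm
    resolve th t = case walk th t of
      Var x    -> Var x
      App f ts -> App f (map (resolve th) ts)

    -- Standard first-order unification with an occurs check.
    unify :: Subst -> Tm -> Tm -> Maybe Subst
    unify th t0 u0 = case (walk th t0, walk th u0) of
      (Var x, Var y) | x == y -> Just th
      (Var x, u)              -> bind x u
      (t, Var y)              -> bind y t
      (App f ts, App g us)
        | f == g && length ts == length us ->
            foldM (\acc (a, b) -> unify acc a b) th (zip ts us)
      _ -> Nothing
      where bind x t   = guard (not (occurs x t)) >> Just ((x, t) : th)
            occurs x t = case walk th t of
              Var y    -> x == y
              App _ ts -> any (occurs x) ts

    -- Rename a declaration's schematic variables apart, as each use of an
    -- inference rule implicitly does.
    freshen :: Int -> Decl -> Decl
    freshen n (Decl qs p) = Decl (map ren qs) (ren p)
      where ren (Var x)    = Var (x ++ "#" ++ show n)
            ren (App f ts) = App f (map ren ts)

    -- synth th n d computes a most general Pθ with d : Pθ, checking each
    -- argument of d against its premise Qθ, per the dependent typing rule.
    synth :: Subst -> Int -> Tm -> Maybe (Tm, Subst, Int)
    synth th n (App c ds) = do
      decl <- lookup c sig
      let Decl qs p = freshen n decl
      guard (length ds == length qs)
      (th', n') <- foldM arg (th, n + 1) (zip ds qs)
      Just (p, th', n')
      where arg (th0, k) (d, q) = do
              (ty, th1, k') <- synth th0 k d
              th2 <- unify th1 ty q
              Just (th2, k')
    synth _ _ (Var _) = Nothing   -- proof terms here are closed

    -- check d p: does the proof term d prove the proposition p?
    check :: Tm -> Tm -> Bool
    check d p = case synth [] 0 d of
      Just (ty, th, _) -> maybe False (const True) (unify th ty p)
      Nothing          -> False

    -- The most general type of ps(ps(pz)): plus(s(s(z)), N, s(s(N))),
    -- up to renaming of the schematic variable.
    mostGeneral :: Maybe Tm
    mostGeneral = fmap (\(ty, th, _) -> resolve th ty) (synth [] 0 pspspz)
      where pspspz = App "ps" [App "ps" [App "pz" []]]

    -- A ground instance: ps(ps(pz)) : plus(2, 2, 4) checks.
    demo :: Bool
    demo = check (App "ps" [App "ps" [App "pz" []]])
                 (App "plus" [num 2, num 2, num 4])
      where num k = if k == 0 then App "z" [] else App "s" [num (k - 1)]

Keeping θ triangular and resolving variables with walk keeps unify and resolve short, and freshening with a counter plays the role of the implicit renaming of schematic variables at each rule application.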
18.4 Type Checking

Ordinarily in logic programming a
