Standardization for the Coinductive Lambda-Calculus
Abstract
In the calculus Λco of possibly non-wellfounded λ-terms, standardization is proved for a parallel notion of reduction. For this system confluence has recently been established by means of a bounding argument for the number of reductions provoked by the joining function which witnesses the confluence statement. Similarly, bounds have to be introduced in order to turn the proof of standardization for the wellfounded λ-calculus into a sound coinductive argument, thus limiting the number of reduction steps arising in the process of standardization. This leads to elementary complexity bounds for the length of the resulting standard reduction sequence in terms of the length of the input sequence. A fortiori, these bounds also apply to the usual wellfounded λ-calculus, strengthening previous results by Xi.
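To fix intuitions, here is a minimal sketch (my own construction, not the paper's development) of possibly non-wellfounded λ-terms together with one parallel reduction step. Haskell's laziness lets an ordinary recursive datatype stand in for the coinductive type Λco, and the function parallel computes the Tait/Martin-Löf complete development, i.e. the maximal instance of a parallel reduction relation; all names here (Term, shift, subst, parallel, omegaSpine) are illustrative choices, not the paper's notation.

-- Possibly non-wellfounded λ-terms in de Bruijn style: under laziness,
-- values of this type may be infinite, e.g. omegaSpine below.
data Term
  = Var Int          -- de Bruijn index
  | App Term Term
  | Lam Term

-- Shift free indices >= c by d (needed to move a term under a binder).
shift :: Int -> Int -> Term -> Term
shift d c (Var k)   = Var (if k >= c then k + d else k)
shift d c (App s t) = App (shift d c s) (shift d c t)
shift d c (Lam t)   = Lam (shift d (c + 1) t)

-- Capture-avoiding substitution of u for index j.
subst :: Int -> Term -> Term -> Term
subst j u (Var k)
  | k == j    = u
  | k >  j    = Var (k - 1)
  | otherwise = Var k
subst j u (App s t) = App (subst j u s) (subst j u t)
subst j u (Lam t)   = Lam (subst (j + 1) (shift 1 0 u) t)

-- Complete development: contract every β-redex visible in the input
-- simultaneously. On infinite terms this is productive thanks to laziness.
parallel :: Term -> Term
parallel (App (Lam t) u) = subst 0 (parallel u) (parallel t)
parallel (App s t)       = App (parallel s) (parallel t)
parallel (Lam t)         = Lam (parallel t)
parallel v               = v

-- A non-wellfounded term: an infinite left spine of applications.
omegaSpine :: Term
omegaSpine = App omegaSpine (Var 0)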
Related papers
Rewriting Techniques and …, 2002
1993
We add extensional equalities for the functional and product types to the typed λ-calculus with not only products and terminal object, but also sums and bounded recursion (a version of recursion that does not allow recursive calls of infinite length). We provide a confluent and strongly normalizing (thus decidable) rewriting system for the calculus, that stays confluent when allowing unbounded recursion. For that, we turn the extensional equalities into expansion rules, and not into contractions as is done traditionally. We first prove the calculus to be weakly confluent, which is a more complex and interesting task than for the usual λ-calculus. Then we provide an effective mechanism to simulate expansions without expansion rules, so that the strong normalization of the calculus can be derived from that of the underlying, traditional, non extensional system. These results give us the confluence of the full calculus, but we also show how to deduce confluence without the weak confluence property, using only our technique of simulating expansions.
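As a toy illustration of the orientation chosen there (my sketch with illustrative names, not the paper's rewriting system): the extensional law for functions becomes an expansion t ⤳ λx. t x, guarded so that it never fires on a term that is already a λ-abstraction, which is the standard restriction keeping expansionary η from looping.

-- Simple types and de Bruijn terms for the sketch.
data Ty = Base | Arr Ty Ty deriving (Eq, Show)
data Tm = V Int | A Tm Tm | L Ty Tm deriving Show

-- Shift free indices so the expanded term can move under the new binder.
shiftTm :: Int -> Int -> Tm -> Tm
shiftTm d c (V k)   = V (if k >= c then k + d else k)
shiftTm d c (A s t) = A (shiftTm d c s) (shiftTm d c t)
shiftTm d c (L a t) = L a (shiftTm d (c + 1) t)

-- One outermost η-expansion step, given the type of the subject.
etaExpand :: Ty -> Tm -> Tm
etaExpand (Arr _ _) t@(L _ _) = t                      -- already a λ: no step
etaExpand (Arr a _) t         = L a (A (shiftTm 1 0 t) (V 0))
etaExpand _         t         = t

Full systems also restrict expansion so that it does not fire in the function position of an application; that refinement is omitted here.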
Information and Computation, 2002
For a notion of reduction in a λ-calculus one can ask whether a term satisfies conservation and uniform normalization. Conservation means that single-step reductions of the term preserve infinite reduction paths from the term. Uniform normalization means that either the term will have no reduction paths leading to a normal form or all reduction paths will lead to a normal form. In the classical conservation theorem for λI the distinction between the two notions is not clear: uniform normalization implies conservation, and conservation also implies uniform normalization. The reason for this is that λI is closed under reduction, due to the fact that reductions never erase terms in λI. More generally for nonerasing reductions, the two notions are equivalent on a set closed under reduction. However, when turning to erasing reductions the distinction becomes important, as conservation no longer implies uniform normalization. This paper presents a new technique for finding uniformly normalizing subsets of a λ-calculus. This is done by combining a syntactic and a semantic criterion. The technique is demonstrated by several applications. The technique is used to present a new uniformly normalizing subset of the pure λ-calculus; this subset is a superset of λI and thus contains erasing K-redexes. The technique is also used to prove strong normalization from weak normalization of the simply typed λ-calculus extended with pairs; this is an extension of techniques developed recently by Sørensen and Xi. Before presenting the technique the paper presents a simple proof of a slightly weaker form of the characterization of perpetual redexes by Bergstra and Klop; this is a step towards the later applications of the technique.
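A standard example (not specific to this paper) shows what is at stake with erasure: let Ω = (λx. x x)(λx. x x). The term (λx. y) Ω has an infinite reduction path that keeps contracting Ω, yet the erasing step (λx. y) Ω → y reaches a normal form immediately. That step fails to preserve the infinite path, so conservation fails, and the term is weakly but not uniformly normalizing; in λI this cannot happen, since x must occur in the abstraction body and Ω would survive the contraction.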
Theoretical Computer Science, 2009
I give a proof of the confluence of combinatory strong reduction that does not use the corresponding result for the λ-calculus. I also give simple and direct proofs of a standardization theorem for this reduction and of the strong normalization of simply typed terms.
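For reference (standard definitions, not quoted from this paper): weak combinatory reduction is generated by the rules K x y → x and S x y z → x z (y z); strong reduction additionally validates a ξ-like rule through bracket abstraction λ*, so that M → N entails λ*x. M → λ*x. N, which is what makes its confluence and standardization harder than for weak reduction.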
Electronic Notes in Theoretical Computer Science, 2002
A general reducibility method is developed for proving reduction properties of lambda terms typeable in intersection type systems with and without the universal type Ω. Sufficient conditions for its application are derived. This method leads to uniform proofs of confluence, standardization, and weak head normalization of terms typeable in the system with the type Ω. The method extends Tait's reducibility method for the proof of strong normalization of the simply typed lambda calculus, Krivine's extension of the same method for the strong normalization of the intersection type system without Ω, and Statman-Mitchell's logical relation method for the proof of confluence of βη-reduction on the simply typed lambda terms. As a consequence, the confluence and the standardization of all (untyped) lambda terms are obtained.
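For orientation (standard rules, not quoted from the paper), the system with the universal type Ω extends intersection typing with the axiom Γ ⊢ M : Ω for every term M, alongside the intersection introduction: from Γ ⊢ M : σ and Γ ⊢ M : τ infer Γ ⊢ M : σ ∩ τ. It is the Ω axiom that lets arbitrary, even unsolvable, subterms be typed, which is why properties of all untyped terms follow from properties of typeable ones.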
arXiv, 2021
We study the reduction in a λ-calculus derived from Moggi's computational one, which we call the computational core. The reduction relation consists of rules obtained by orienting three monadic laws. Such laws, in particular associativity and identity, introduce intricacies in the operational analysis. We investigate the central notions of returning a value versus having a normal form, and address the question of normalizing strategies. As in [AFG19], the cornerstone of our analysis is factorization (also called semi-standardization in the literature): any reduction sequence can be reorganized so as to first perform specific steps and then everything else. Via factorization we obtain the key result relating reduction and evaluation; we then analyze the property of having a normal form (normalization) and, by an easy argument similar to that in [Cra09] (which in turn simplifies the one in [Plo75]), define a family of normalizing strategies, i.e. subreductions that are guaranteed to reach a normal form whenever one exists. As a corollary, observational equivalence contains the equational theory.
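Schematically (a standard way to state semi-standardization, in my notation rather than the paper's): writing →e for the "essential" steps performed first (head steps, or their computational analogue here) and →i for the remaining internal steps, factorization says →* ⊆ →e* · →i*, i.e. every reduction sequence can be rearranged into an essential prefix followed by internal steps only.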
Information Processing Letters, 1995
The classical notion of β-reduction in the λ-calculus has an arbitrary syntactically-imposed sequentiality. A new notion of reduction β' is defined which is a generalization of β-reduction. This notion of reduction is shown to satisfy the Church-Rosser property as well as some other fundamental properties, leading to the conclusion that this generalized notion of β'-reduction can be used in place of β-reduction without sacrificing any of the fundamental properties.
Softwaretechnik-Trends, 1999
Inductive characterizations of the sets of terms, the subset of strongly normalizing terms and normal forms are studied in order to reprove weak and strong normalization for the simply typed λ-calculus and for an extension by sum types with permutative conversions. The analogous treatment of a new system with generalized applications inspired by generalized elimination rules in natural deduction, advocated by von Plato, shows the flexibility of the approach, which does not use the strong computability/candidate style à la Tait and Girard. It is also shown that the extension of the system with permutative conversions by η-rules is still strongly normalizing, and likewise for an extension of the system of generalized applications by a rule of "immediate simplification". By introducing an infinitely branching inductive rule the method even extends to Gödel's T.
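The flavour of such inductive characterizations can be conveyed by the standard grammar of β-normal forms (a generic sketch, not the paper's definition for generalized applications): a normal form is a λ-abstraction of a normal form or a neutral term, and a neutral term is a variable applied to normal arguments.

-- Plain λ-terms in de Bruijn style, for this sketch only.
data T = Vr Int | Ap T T | Lm T

-- Mutually recursive predicates mirroring the inductive characterization:
--   normal ::= λ normal | neutral        neutral ::= x normal*
isNormal, isNeutral :: T -> Bool
isNormal (Lm t)    = isNormal t
isNormal t         = isNeutral t
isNeutral (Vr _)   = True
isNeutral (Ap s t) = isNeutral s && isNormal t
isNeutral _        = False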
Journal of Functional Programming, 1994
We present the complete development, in Gallina, of the residual theory of β-reduction in pure λ-calculus. The main result is the Prism Theorem, and its corollary Lévy's Cube Lemma, a strong form of the parallel-moves lemma, itself a key step towards the confluence theorem and its usual corollaries (Church-Rosser, uniqueness of normal forms). Gallina is the specification language of the Coq Proof Assistant (Dowek et al., 1991; Huet, 1992b). It is a specific concrete syntax for its abstract framework, the Calculus of Inductive Constructions (Paulin-Mohring, 1993). It may be thought of as a smooth mixture of higher-order predicate calculus with recursive definitions, inductively defined data types and inductive predicate definitions reminiscent of logic programming. The development presented here was fully checked in the current distribution version Coq V5.8. We just state the lemmas in the order in which they are proved, omitting the proof justifications. The full transcript is available.
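In such developments (my paraphrase, not the Gallina statement), the parallel-moves lemma of which the Cube Lemma is a strong form asserts the diamond property of parallel reduction ⇒: if t ⇒ t₁ and t ⇒ t₂, then there is a t₃ with t₁ ⇒ t₃ and t₂ ⇒ t₃. The stronger form computes t₃ via residuals, so the two completing reductions are determined rather than merely asserted to exist.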
Journal of Functional Programming, 1996
We exhibit confluent and effectively weakly normalizing (thus decidable) rewriting systems for the full equational theory underlying cartesian closed categories, and for polymorphic extensions of it. The λ-calculus extended with surjective pairing has been well-studied in the last two decades. It is not confluent in the untyped case, and confluent in the typed case. But to the best of our knowledge the present work is the first treatment of the λ-calculus extended with surjective pairing and terminal object via a confluent rewriting system, and is the first solution to the decidability problem of the full equational theory of cartesian closed categories extended with polymorphic types. Our approach yields conservativity results as well. In separate papers we apply our results to the study of provable type isomorphisms, and to the decidability of equality in a typed λ-calculus with subtyping.
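For the reader's orientation (a standard presentation, not the paper's exact rules), the extensional laws are again oriented as expansions: a term t of product type expands to ⟨π₁ t, π₂ t⟩, a term of function type to λx. t x, and a term of terminal type to the canonical element ⋆, each guarded so as not to fire on terms already of that shape. It is these guards that recover confluence where the contractive orientation of surjective pairing famously fails in the untyped case.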
