
Normalization-by-evaluation

16 papers · 0 followers

About this topic
Normalization-by-evaluation is a technique in programming language theory and type systems that computes the normal form of a term by evaluating it in a semantic domain and then reading the resulting value back into syntax. This method allows terms of a typed lambda calculus to be normalized without a separate rewriting engine, facilitating the implementation of functional programming languages and type inference algorithms.
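For readers new to the technique, the sketch below shows its shape for the untyped λ-calculus in Haskell: terms are evaluated into a semantic domain whose functions are Haskell functions, and a quote (readback) function converts values back into syntactic normal forms. This is a minimal illustration with names of our choosing, not the algorithm of any particular paper listed here.

```haskell
-- Minimal normalization by evaluation for the untyped lambda calculus.
-- Syntax uses de Bruijn indices; readback uses de Bruijn levels to
-- generate fresh variables. Illustrative only.

data Term = Var Int | Lam Term | App Term Term
  deriving Show

data Value = VLam (Value -> Value)      -- functions live in Haskell
           | VNeutral Neutral           -- stuck terms

data Neutral = NVar Int                 -- de Bruijn level
             | NApp Neutral Value

-- Evaluate a term in an environment of values.
eval :: [Value] -> Term -> Value
eval env (Var i)   = env !! i
eval env (Lam b)   = VLam (\v -> eval (v : env) b)
eval env (App f a) = apply (eval env f) (eval env a)

apply :: Value -> Value -> Value
apply (VLam f)     v = f v
apply (VNeutral n) v = VNeutral (NApp n v)

-- Read a value back into syntax; d counts enclosing binders so that
-- levels can be converted back to indices.
quote :: Int -> Value -> Term
quote d (VLam f)     = Lam (quote (d + 1) (f (VNeutral (NVar d))))
quote d (VNeutral n) = quoteNe d n

quoteNe :: Int -> Neutral -> Term
quoteNe d (NVar lvl) = Var (d - lvl - 1)
quoteNe d (NApp n v) = App (quoteNe d n) (quote d v)

-- Normalization = evaluation followed by readback.
normalize :: Term -> Term
normalize t = quote 0 (eval [] t)

main :: IO ()
main = print (normalize (App (Lam (Var 0)) (Lam (Var 0))))  -- Lam (Var 0)
```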

Key research themes

1. How do normalization techniques improve neural network training stability and generalization through statistical and geometric transformations?

This research focus investigates normalization methods as architectural or algorithmic components in deep neural networks, aimed at stabilizing and accelerating training, improving generalization, and refining optimization dynamics. It explores the interplay between normalization-induced transformations of layer activations or model parameters and their effects on training convergence, loss-landscape smoothness, and model robustness. The works examine normalization from the perspectives of both computational efficiency and the theoretical understanding of convergence behavior.

Key finding: Introduces Stochastic Whitening Batch Normalization (SWBN), which estimates whitening matrices incrementally online, eliminating expensive operations like SVD or matrix inversion. SWBN decouples whitening from...
Key finding: Provides a precise quantitative comparison of the convergence and stability behaviors of gradient descent with Batch Normalization (BNGD) versus ordinary gradient descent for the ordinary least squares problem. Shows that...
Key finding: Proposes GhostNorm (independent normalization within mini-batch groups) and a novel Sequential Normalization (SeqNorm) that applies such normalization sequentially across multiple input dimensions. Contrary to common belief...
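For orientation, the transformation these variants build on is ordinary batch normalization: standardize each feature using mini-batch statistics, then apply a learned scale and shift. Below is a minimal single-feature sketch in Haskell with illustrative names; real implementations operate on tensors and also maintain running statistics for inference.

```haskell
-- Standard batch normalization over one feature: subtract the
-- mini-batch mean, divide by the mini-batch standard deviation,
-- then apply learned scale (gamma) and shift (beta).
batchNorm :: Double -> Double -> Double -> [Double] -> [Double]
batchNorm gamma beta eps xs = map norm xs
  where
    n      = fromIntegral (length xs)
    mu     = sum xs / n
    var    = sum [ (x - mu) ^ 2 | x <- xs ] / n
    norm x = gamma * (x - mu) / sqrt (var + eps) + beta

main :: IO ()
main = print (batchNorm 1.0 0.0 1e-5 [1.0, 2.0, 3.0, 4.0])
```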

2. In what ways do neuroscience-inspired and information-theoretic normalization methods enable unsupervised regularization and attention in deep networks?

This theme covers normalization approaches motivated by principles from neuroscience and information theory, focusing on how neural networks can regularize implicit representations through statistical regularity and description length minimization. These methods conceptualize training as a model selection or compression process, deriving normalization factors from information-theoretic quantities or biologically inspired attention mechanisms. The research seeks to harness such normalization to improve learning dynamics, robustness to data distribution shifts, and representation efficiency beyond conventional batch normalization.

Key finding: Introduces Regularity Normalization (RN) as an unsupervised attention mechanism based on the Minimum Description Length principle. RN computes an adaptive normalization factor corresponding to universal code length within...
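As a rough illustration of the general idea of normalizing by an information-theoretic quantity (our simplification, not the paper's Regularity Normalization algorithm), the sketch below scores each activation by its negative log-density under a Gaussian fitted to the batch, treats that score as an approximate code length, and reweights activations so that statistically surprising units are emphasized.

```haskell
-- Approximate code length of x under a Gaussian with mean mu and
-- variance var (its negative log-density, in nats).
codeLength :: Double -> Double -> Double -> Double
codeLength mu var x =
  0.5 * log (2 * pi * var) + (x - mu) ^ 2 / (2 * var)

softmax :: [Double] -> [Double]
softmax zs = [ e / total | e <- es ]
  where
    m     = maximum zs
    es    = [ exp (z - m) | z <- zs ]
    total = sum es

-- Reweight activations by their relative code length; the mean
-- rescaling factor over the batch is 1.
regularityNormalize :: [Double] -> [Double]
regularityNormalize xs =
  [ n * w * x | (x, w) <- zip xs ws ]
  where
    n   = fromIntegral (length xs)
    mu  = sum xs / n
    var = max 1e-8 (sum [ (x - mu) ^ 2 | x <- xs ] / n)
    ws  = softmax [ codeLength mu var x | x <- xs ]
```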

3. How can batch normalization variants be modified to enhance adversarial robustness while preserving training benefits?

Research under this theme evaluates the adversarial vulnerabilities introduced by batch normalization and proposes normalization variants or modifications that mitigate this weakness. The underlying problem involves distribution shifts caused by adversarial inputs affecting the batch statistics used in BN, impairing robustness. The works analyze how replacing or adapting these statistics during inference, or redesigning normalization layers, can retain accelerated convergence and generalization without compromising security against adversarial attacks.

Key finding: Identifies that adversarial inputs cause distribution shifts that render Batch Normalization’s train-time estimated statistics inaccurate, increasing model vulnerability. Demonstrates that using batch statistics computed at...
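The mitigation turns on which statistics the normalization layer consults at inference time. The sketch below (single-feature Haskell, illustrative names) contrasts the two choices: standard BN normalizes test inputs with stored train-time running statistics, while the robust variant recomputes statistics from the batch at hand.

```haskell
data Stats = Stats { statMean :: Double, statVar :: Double }

batchStats :: [Double] -> Stats
batchStats xs = Stats mu var
  where
    n   = fromIntegral (length xs)
    mu  = sum xs / n
    var = sum [ (x - mu) ^ 2 | x <- xs ] / n

normalizeWith :: Stats -> [Double] -> [Double]
normalizeWith (Stats mu var) xs =
  [ (x - mu) / sqrt (var + 1e-5) | x <- xs ]

-- Standard BN at inference: use stored train-time running statistics,
-- which adversarial distribution shift can render inaccurate.
inferWithRunningStats :: Stats -> [Double] -> [Double]
inferWithRunningStats running = normalizeWith running

-- Robust variant per the finding above: recompute statistics from
-- the current test batch.
inferWithBatchStats :: [Double] -> [Double]
inferWithBatchStats xs = normalizeWith (batchStats xs) xs
```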

All papers in Normalization-by-evaluation

This paper introduces a new recursion principle for inductive data modulo α-equivalence of bound names. It makes use of Odersky-style local names when recursing over bound names. It is formulated in an extension of Gödel's System T...
We show that Hyland and Ong's game semantics for PCF can be presented using normalization by evaluation (nbe). We use the bijective correspondence between innocent well-bracketed strategies and PCF Böhm trees, and show how operations on...
We extend normalization by evaluation (first presented in [5]) from the pure typed λ-calculus to general higher type term rewriting systems. We distinguish between computational rules and proper rewrite rules, and define a domain theoretic...
This paper describes formalizations of Tait's normalization proof for the simply typed λ-calculus in the proof assistants Minlog, Coq and Isabelle/HOL. From the formal proofs programs are machine-extracted that implement variants of the...
The effect of a fully-premixed pilot flame on the velocity-forced flame response of a fully premixed flame in a single-nozzle lean-premixed swirl combustor operating on natural gas fuel is investigated. Measurements of the flame transfer...
We prove the correctness of an algorithm for normalizing untyped combinator terms by evaluation. The algorithm is written in the functional programming language Haskell, and we prove that it lazily computes the combinatory Böhm tree of...
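The cited normalizer is written in Haskell; without reproducing its exact construction, a minimal glued-evaluation sketch in the same spirit, for closed S/K combinator terms, looks like this (representation and names are ours):

```haskell
-- Untyped S/K combinator terms; :@ is left-associative application.
data Term = S | K | Term :@ Term
  deriving Show
infixl 9 :@

-- A semantic value pairs the normal-form syntax accumulated so far
-- with its behaviour under application.
data Value = V { readback :: Term, applyV :: Value -> Value }

-- K x y = x
vK :: Value
vK = V K (\x -> V (K :@ readback x) (\_ -> x))

-- S x y z = x z (y z)
vS :: Value
vS = V S (\x ->
       V (S :@ readback x) (\y ->
         V (S :@ readback x :@ readback y) (\z ->
           applyV (applyV x z) (applyV y z))))

eval :: Term -> Value
eval S        = vS
eval K        = vK
eval (t :@ u) = applyV (eval t) (eval u)

-- Normalization by evaluation: interpret, then read back. Haskell's
-- laziness means the output syntax is produced on demand, echoing the
-- lazy computation of Böhm trees described above.
normalize :: Term -> Term
normalize = readback . eval

main :: IO ()
main = print (normalize (K :@ S :@ K))  -- prints S
```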
In higher-order abstract syntax, the variables and bindings of an object language are represented by variables and bindings of a meta-language. Let us consider the simply typed λ-calculus as object language and Haskell as meta-language....
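Concretely, that representation can be rendered in Haskell as below (a minimal sketch, untyped for brevity where the entry's setting is simply typed): the object-language binder is a Haskell function, so substitution comes for free from the meta-language.

```haskell
-- Higher-order abstract syntax: Lam carries a Haskell function, so
-- object-language binding reuses meta-language binding.
data Exp = Lam (Exp -> Exp) | App Exp Exp

-- Weak head normalization: object-level beta reduction is just
-- Haskell application of the body to the argument.
whnf :: Exp -> Exp
whnf (App f a) = case whnf f of
                   Lam g -> whnf (g a)
                   f'    -> App f' a
whnf e = e
```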
In these lecture notes we give an introduction to functional programming with dependent types. We use the dependently typed programming language Agda which is an extension of Martin-Löf type theory. First we show how to do simply typed...
We present an Agda formalization of a normalization proof for simply-typed lambda terms. The normalizer consists of two coinductively defined functions in the delay monad: One is a standard evaluator of lambda terms to closures, the other...
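The delay monad used by that normalizer can be transcribed to Haskell as below; this is only a sketch of the type and its monad structure (the formalization itself is in Agda, where Delay is defined coinductively).

```haskell
-- A computation in the delay monad either returns a value now or
-- takes one more observable step, possibly forever.
data Delay a = Now a | Later (Delay a)

instance Functor Delay where
  fmap f (Now a)   = Now (f a)
  fmap f (Later d) = Later (fmap f d)

instance Applicative Delay where
  pure = Now
  Now f   <*> d = fmap f d
  Later f <*> d = Later (f <*> d)

instance Monad Delay where
  Now a   >>= k = k a
  Later d >>= k = Later (d >>= k)

-- Run with fuel: partiality is explicit, so a caller can bound the
-- number of steps it is willing to wait.
runFor :: Int -> Delay a -> Maybe a
runFor _ (Now a)   = Just a
runFor 0 (Later _) = Nothing
runFor n (Later d) = runFor (n - 1) d
```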
A core programming language is presented that allows structural recursion over open LF objects and contexts. The main technical tool is a coverage checking algorithm that also generates valid recursive calls. Termination of call-by-value...
The decidability of equality is proved for Martin-Löf type theory with a universe à la Russell and typed βη-equality judgements. A corollary of this result is that the constructor for dependent function types is injective, a property...
Dependent function types Fun A (λx. B) (= (x : A) → B) with η. Predicative universes Set₀, Set₁, .... Natural numbers. We handle large eliminations (types defined by cases and recursion), in contrast to Harper & Pfenning (2005). Scales...
In this paper we develop the language theory underpinning the logical framework PLF. This language features lambda abstraction with patterns and application via pattern-matching. Reductions are allowed in patterns. The framework is...
We study monadic translations of the call-by-name (cbn) and the call-by-value (cbv) fragments of the classical sequent calculus λμμ̃ by Curien and Herbelin and give modular and syntactic proofs of strong normalization. The target of the...
This paper revisits the results of Barendregt and Ghilezan [3] and generalizes them for classical logic. Instead of λ-calculus, we use here λµ-calculus as the basic term calculus. We consider two extensionally equivalent type assignment...
Normalization of muscle activity has been commonly used to determine the amount of force exerted by a muscle. The most widely used reference point for normalization is the maximum voluntary contraction (MVC). However, MVCs are often...
In this paper we solve the decision problem for simply typed lambda calculus with categorical coproduct (strong disjoint sum) types. While this calculus is both natural and simple, the decision problem is a long-standing...
We present an abstract machine and a reduction semantics for the lambda calculus extended with control operators that give access to delimited continuations in the CPS hierarchy. The abstract machine is derived from an evaluator in...
We analyze a normalization function for the simply typed λ-calculus based on hereditary substitutions, a technique developed by Pfenning et al. The normalizer is implemented in Agda, a total language where all programs terminate. It...
We introduce a notion of Kripke model for classical logic for which we constructively prove soundness and cut-free completeness. We discuss the novelty of the notion and its potential applications.
We present direct proofs of termination of evaluation for typed delimited-control operators shift and reset using a variant of Tait's method with context-based reducibility predicates. We address both call by value and call by name, and...
We present a context-based approach to proving termination of evaluation in reduction semantics (i.e., a form of operational semantics with explicit representation of reduction contexts), using Tait-style reducibility predicates defined...
The CIL compiler for core Standard ML compiles whole programs using a novel typed intermediate language (TIL) with intersection and union types and flow labels on both terms and types. The CIL term representation duplicates portions of...
Typed λ-calculi have been objects of theoretical study for many years. Recently, it has been shown that all the inductively defined types (including numbers, booleans, lists, and trees, as well as more complex structures like typed terms...
We formulate principles of induction and recursion for a variant of lambda calculus in its original syntax (i.e., with only one sort of names for both bound and free variables) in which α-conversion is based upon name swapping as in...
Plotkin, in his seminal article "Call-by-name, call-by-value and the λ-calculus", formalized evaluation strategies and simulations using operational semantics and continuations. In particular, he showed how call-by-name evaluation could be...
We formalize two proofs of weak head normalization for the simply typed lambda calculus in first-order minimal logic: one for normal-order reduction, and one for applicative-order reduction in the object language. Subsequently we use...
Robin Milner coined the slogan "well typed programs cannot go wrong", advertising the power of types in functional languages like ML and Haskell to catch runtime errors. Nowadays, we can and should go further: dependently typed programming...
We construct a logical framework supporting datatypes that mix binding and computation, implemented as a universe in the dependently typed programming language Agda 2. We represent binding pronominally, using well-scoped de Bruijn...