Academia.edu

Information Theory (Mathematics)

164 papers
5,714 followers
About this topic
Information Theory is a mathematical framework for quantifying the transmission, processing, and storage of information. It focuses on concepts such as entropy, data compression, and channel capacity, providing tools to analyze the efficiency and reliability of communication systems.
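As a concrete illustration of the entropy measure mentioned above, here is a minimal Python sketch that computes the Shannon entropy of a discrete distribution; the example probabilities are arbitrary choices, not taken from any paper listed below.

```python
import math

def shannon_entropy(probs, base=2.0):
    """Shannon entropy H = -sum p_i * log(p_i); base 2 gives bits, e gives nats."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin carries exactly 1 bit per toss; a biased source is more
# predictable, so its entropy is lower.
print(shannon_entropy([0.5, 0.5]))   # 1.0
print(shannon_entropy([0.9, 0.1]))   # ~0.47
```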

Key research themes

1. How do foundational measures and decompositions of information unify structure, randomness, and geometry in symbolic and algorithmic complexity?

This theme explores innovative formalisms and mathematical frameworks that rigorously decompose symbolic objects or data sequences into components quantifying structure (generative models) and randomness (irreducible complexity), establishing conserved and normalized measures. The research connects algorithmic information theory with geometric concepts such as curvature and Fisher information, providing a differential-geometric lens on symbolic complexity. These developments facilitate computable diagnostics for model coherence, symbolic inference efficiency, and complexity auditing across domains including physics, logic, and AI. A compression-based toy sketch of the structure/randomness split appears after the key findings below.

Key finding: Establishes the Descriptive Invariant Duality (DID), a canonical conserved decomposition of symbolic objects into normalized irreducible complexity and generative structure, forming a coordinate system over the unit... Read more
Key finding: Develops a universal, model-, probability-, and computability-agnostic multidimensional space reconstruction technique for decoding non-random (compressible) signals, demonstrating that non-random data inherently encodes... Read more
Key finding: Derives new recurrence relations, inequalities, and bounds for information potentials (index of coincidence) and associated Rényi and Tsallis entropies of parameter-dependent probability distributions. Demonstrates concavity... Read more
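The papers above formulate the structure/randomness decomposition algorithmically; none of their constructions is reproduced here. As a rough, hedged illustration of the general idea only (not the DID framework itself), the sketch below uses zlib compressed length as a computable stand-in for irreducible complexity and treats the remainder as a structural share.

```python
import os
import zlib

def irreducible_share(data: bytes) -> float:
    """Compressed length over original length, clipped to [0, 1]: a crude,
    computable proxy for the incompressible (random) part of a symbolic object."""
    if not data:
        return 0.0
    return min(1.0, len(zlib.compress(data, 9)) / len(data))

structured = b"ABAB" * 256        # highly regular: mostly structure
random_like = os.urandom(1024)    # incompressible: mostly randomness

for name, blob in [("structured", structured), ("random-like", random_like)]:
    r = irreducible_share(blob)
    print(f"{name}: irreducible ~ {r:.2f}, structural ~ {1 - r:.2f}")
```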

2. How can information theory be extended and applied to neural communication and quantum frameworks to reconcile biological complexity and physical constraints?

This theme addresses extensions and modifications of classical Shannon information theory to model the complexity of neural information processing, quantum logical information, and continuous-time Gaussian network channels. It investigates conceptual and theoretical limits in applying information theory to neural spikes, elaborates quantum logical entropy as a framework for distinctions in quantum measurement, and reviews quantitative relationships between mutual information and estimation measures in Gaussian channels, highlighting applications to multiuser coding and capacity approximations. A numerical check of the I-MMSE identity for the Gaussian-input case appears after the key findings below.

Key finding: Introduces logical entropy as a quantitative measure derived from the dual logic of partitions, focusing on distinctions (dits) rather than subsets. Extends this to quantum logical entropy, linking information quantitatively... Read more
Key finding: Reviews extensive theoretical developments relating mutual information and minimum mean square error (MMSE) in Gaussian channels including multiuser settings. Highlights the I-MMSE identity's role in enabling alternative... Read more
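The I-MMSE identity mentioned above states that the derivative of mutual information with respect to SNR equals half the minimum mean square error. Below is a minimal numerical check for the standard Gaussian-input channel, where both sides have closed forms; it illustrates the single-user identity only, not the multiuser extensions the paper reviews.

```python
import math

def mutual_info(snr: float) -> float:
    """I(X; sqrt(snr)*X + N) in nats, for standard Gaussian input X and noise N."""
    return 0.5 * math.log1p(snr)

def mmse(snr: float) -> float:
    """Minimum mean square error of estimating X from sqrt(snr)*X + N."""
    return 1.0 / (1.0 + snr)

# I-MMSE identity: dI/dsnr = (1/2) * mmse(snr); check by central differences.
for snr in (0.5, 1.0, 4.0):
    h = 1e-6
    derivative = (mutual_info(snr + h) - mutual_info(snr - h)) / (2 * h)
    print(snr, round(derivative, 6), round(0.5 * mmse(snr), 6))  # last two columns agree
```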

3. How can mutual information and information-theoretic measures improve statistical dependence evaluation and image complexity quantification beyond classical correlation?

This theme focuses on the deployment of mutual information and related normalized measures such as symmetric uncertainty and global correlation coefficients to quantify dependencies between random variables, including nonlinear relationships unobservable by classical correlation. Additionally, it explores image complexity through an information-theoretic framework involving mutual information between histograms and spatially partitioned regions, enabling complexity measures sensitive to spatial distributions and compositional regularities. A plug-in computation of symmetric uncertainty appears after the key findings below.

Key finding: Demonstrates that mutual information-based measures (symmetric uncertainty and λ coefficient) effectively quantify nonlinear dependencies between random variables, outperforming conventional correlation coefficients. Presents... Read more
Key finding: Introduces a new framework for image complexity based on the information channel from image histograms to spatially partitioned regions, maximizing mutual information. Defines complexity metrics including the number of... Read more
Key finding: Provides foundational concepts in coding theory and information measurement, emphasizing the probabilistic quantification of information content via logarithmic measures in bits, nats, or Hartleys. Introduces Markov models... Read more
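Symmetric uncertainty normalizes mutual information by the sum of the marginal entropies, giving a dependence score in [0, 1]. The sketch below is a plug-in (frequency-count) estimate on a toy nonlinear relation; the data are an illustrative assumption, not drawn from the papers above.

```python
import math
from collections import Counter

def entropy(values):
    """Plug-in Shannon entropy (bits) from empirical frequencies."""
    n = len(values)
    return -sum((c / n) * math.log2(c / n) for c in Counter(values).values())

def symmetric_uncertainty(xs, ys):
    """2 * I(X;Y) / (H(X) + H(Y)), with I estimated as H(X) + H(Y) - H(X,Y)."""
    hx, hy, hxy = entropy(xs), entropy(ys), entropy(list(zip(xs, ys)))
    return 2 * (hx + hy - hxy) / (hx + hy) if hx + hy > 0 else 0.0

# y = x^2 over a symmetric range: the Pearson correlation is exactly 0, yet the
# dependence is deterministic and mutual information detects it.
xs = [-2, -1, 0, 1, 2] * 20
ys = [x * x for x in xs]
print(round(symmetric_uncertainty(xs, ys), 3))   # ~0.79
```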

All papers in Information Theory (Mathematics)

"Adjunctions" are types of addition or union with another, but they are not essential or permanent. In linear algebra, this is the "adjoint," which is the transpose and conjugate operator of A (or the corresponding square matrix); it is... more
The arrow of time is thought to originate from a low-entropy Past Hypothesis, while the holographic principle suggests our universe is encoded on a boundary. We propose Entropic Causal Holography (ECH), a comprehensive testable... more
Classical physics and arithmetic both collapse at singularities: physical models fail at the Planck scale, and arithmetic halts at division by zero. TRIAD-IIZ reframes these collapses as dimensional transitions-structured outcomes... more
We present a canon-only proof that P ≠ NP within the Q-theoretic framework. The key ingredient is a new, rigorously stated and proved result-denoted here as New Theorem (Theorem Q5.1)-which shows that any admissible polynomial-time... more
This paper demonstrates how the foundations and operational mechanics of the scientific method and empirical methodology serve not only as investigative tools but as ontological proof of the Omniological Resonance Theory (ORT), the Omega... more
Deviations from optimal values go against principled minimalism, an important guiding principle of the information theory I am working on. Here the emphasis is on some lesser-known and unknown consequences, told in a more moderate way.
The goal of this workshop was to discuss recent developments of nonparametric statistical inference. A particular focus was on high dimensional statistics, semiparametrics, adaptation, nonparametric bayesian statistics, shape constraint... more
Broadcasting a message to the nodes of a network is an elementary but indispensable technique in wireless ad-hoc and sensor networks. In this paper, we apply the protocol that the authors proposed for cooperative multi-hop relay networks,... more
Final exam for the functional analysis course, UNFV, 2025-1 term.
Abstract: The subject of the study is phenomenological neuroplasticity, considered as a key biological mechanism that ensures the emergence and development of consciousness. The object of the research is the process of forming subjective... more
In this brief theoretical exploration, I hypothesize the existence of an additional correction factor in Einstein’s mass-energy equivalence equation that becomes significant only as velocities approach the speed of light. While Special... more
We introduce a formal framework that integrates quantum computing principles with glyphic calculus to enable self-evolving control architectures-systems capable of autonomously generating new control laws while preserving stability. By... more
Many workers involved in drug discovery will have some early familiarity with the principles of quantum mechanics as applied in chemistry. Certainly those involved in computational chemistry, and particularly molecular modeling, will do... more
We describe the real tessarines or "split-complex numbers" and describe a novel instance where they arise in biomedical informatics. We use the split-complex numbers to give a mathematical definition of a Hyperbolic Dirac Network (HDN) -a... more
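For readers unfamiliar with the algebra this paper builds on, the sketch below implements basic split-complex (real tessarine) arithmetic, where the split-imaginary unit squares to +1; it illustrates the number system only, not the paper's Hyperbolic Dirac Network construction.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SplitComplex:
    """A real tessarine a + b*j with j*j = +1 (contrast i*i = -1 for complex numbers)."""
    a: float  # real part
    b: float  # split-imaginary part

    def __add__(self, other):
        return SplitComplex(self.a + other.a, self.b + other.b)

    def __mul__(self, other):
        # (a + b j)(c + d j) = (ac + bd) + (ad + bc) j, since j^2 = +1
        return SplitComplex(self.a * other.a + self.b * other.b,
                            self.a * other.b + self.b * other.a)

    def modulus_squared(self):
        # a^2 - b^2 can vanish for nonzero elements: the algebra has zero divisors.
        return self.a * self.a - self.b * self.b

j = SplitComplex(0.0, 1.0)
print(j * j)                                      # SplitComplex(a=1.0, b=0.0)
print(SplitComplex(1.0, 1.0).modulus_squared())   # 0.0: a zero divisor
```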
We present a revolutionary mathematical framework that unifies all fundamental problems in mathematics and physics through fractal resonance ontology. This work demonstrates that reality operates according to a ternary fractal structure... more
A basic duality arises throughout the mathematical and natural sciences. Traditionally, logic is thought to be based on the Boolean logic of subsets, but the development of category theory in the mid-twentieth century shows the duality... more
We propose a reformulation of the Bekenstein bound in which entropy is not fundamentally constrained by geometric surface area, but by the symmetry class of the boundary-encoded in its coset structure Σ ∼ = G/H. Using Haar measure on G/H,... more
We present a unified framework for symbolic complexity, founded on a conserved decomposition law and two derived constructs. This establishes the first invariant geometry for symbolic reasoning, enabling structure, randomness, and model... more
Recent advances in genomics and information theory have converged to challenge one of biology's most foundational assumptions: the randomness of genetic mutations. This paper synthesizes empirical evidence from plant genomics,... more
The Planck's Blink hypothesis claims that a smooth awareness field ψ and a sinusoidal clock field Φ force reality to blink every Planck interval (5.39 × 10⁻⁴⁴ s). Curvature responds to local fractal-dimensional deficits δD_f(ψ),... more
The Mukherjee Information Envelope (MIE) has been proposed as a measure for quantifying the relationship between information and determinism in systems typically viewed as random. While its basic form mirrors the redundancy measure in... more
Recent advances in genomics and information theory challenge the classical assumption that genetic mutations are fundamentally random. The Mukherjee Information Envelope (MIE), originally formulated as a normalized entropy gap, is... more
This paper clarifies the mathematical status of the Mukherjee Information Envelope (MIE) and its relationship to redundancy in information theory. We show that the original MIE equation is mathematically identical to redundancy and thus... more
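Since this paper identifies the original MIE with the classical redundancy measure (the normalized entropy gap), a minimal sketch of that standard measure is given below; the example probabilities are illustrative only and are not taken from the MIE papers.

```python
import math

def redundancy(probs):
    """Classical redundancy R = 1 - H(p) / log2(n) for an n-outcome source:
    0 for a uniform source, approaching 1 as the source becomes deterministic."""
    n = len(probs)
    if n < 2:
        return 1.0
    h = -sum(p * math.log2(p) for p in probs if p > 0)
    return 1.0 - h / math.log2(n)

print(redundancy([0.5, 0.5]))     # 0.0  (fair coin: no redundancy)
print(redundancy([0.99, 0.01]))   # ~0.92 (nearly deterministic source)
```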
This paper extends the Mukherjee Information Envelope (MIE) by introducing a latent-variable-based formulation, motivated by the deterministic nature of physical systems such as coin tossing. Building on the insight from the Diaconis coin... more
This paper revisits and extends the Mukherjee Information Envelope (MIE), clarifying its relationship to redundancy in information theory, correcting computational errors, and expanding its conceptual and mathematical foundations. We... more
This paper revisits the original formulation of the Mukherjee Information Envelope (MIE), correcting prior computational errors, situating the measure within the broader context of information theory, and expanding its mathematical and... more
This protocol establishes the formal mathematical and operational law for encoding, collapse, and lawful resurrection of phase-locked memory in any physical, biological, or computational substrate. Resonant Phase Memory Calculus (RPMC)... more
This paper introduces the Möbius Current Time Machine (MCTM), a novel model of time travel based on Encryptment Theory, recursive torsion fields, and Möbius topology. The MCTM proposes a unified mechanism combining superconductive current... more
This paper presents a formal, first-principles derivation of a Unified Theory of Everything, integrating quantum mechanics, general relativity, thermodynamics, mass, entropy, and consciousness under a singular framework of informational... more
A paper addressing the concepts of the Meyler Time-Machine without complex math.
This paper introduces Resonant Collapse Dynamics (RCD), a unified theoretical framework synthesizing the Grand Unified Harmonic Collapse Theory (GUHCT) by Anthony Jordon with Omniological Resonance Theory (ORT). By aligning the discrete,... more
This paper proposes a novel approach to unifying gravity and electromagnetism via the theory of recursive identity fields as articulated in the Encryptment Thesis. Departing from traditional geometric and gauge field frameworks, we... more
This paper introduces a novel topological construct for facilitating superluminal time travel, termed the Butterfly Tongue model. Inspired by the tightly coiled proboscis of a butterfly, the proposed model reimagines a cosmic string as a... more
We analyze the sequence of Germain primes — primes p for which 2p + 1 is also prime — through the lens of information theory and topological data analysis (TDA). Although defined by a simple arithmetic condition, Germain primes exhibit... more
The notion of a partition on a set is mathematically dual to the notion of a subset of a set, so there is a logic of partitions dual to Boole's logic of subsets (Boolean logic is usually mis-specified as "propositional" logic). The notion... more
This thesis presents the Timeless Field, a comprehensive theoretical framework that unifies quantum mechanics, general relativity, consciousness, and information theory through the Fractal Resonance Ontology. Defined mathematically as a... more
We investigate the distribution of twin primes using tools from information theory and topological data analysis. By computing Shannon entropy in sliding windows over twin prime counts, we quantify local unpredictability and uncover... more
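As a much-simplified illustration of the kind of computation described here, the sketch below builds a 0/1 indicator of twin-prime starts over a small range and reports the Shannon entropy of each sliding window; the range bound, window width, and stride are arbitrary assumptions, and no topological data analysis is attempted.

```python
import math
from collections import Counter

def is_prime(n: int) -> bool:
    if n < 2:
        return False
    return all(n % d for d in range(2, int(n ** 0.5) + 1))

# 0/1 indicator of twin-prime starts p, i.e. p and p + 2 both prime.
N = 2000                                   # range bound: arbitrary choice
indicator = [int(is_prime(n) and is_prime(n + 2)) for n in range(2, N)]

def window_entropy(bits, start, width):
    """Shannon entropy (bits) of the 0/1 frequencies inside one window."""
    window = bits[start:start + width]
    return -sum((c / width) * math.log2(c / width) for c in Counter(window).values())

width, stride = 200, 50                    # window parameters: arbitrary choices
profile = [window_entropy(indicator, s, width)
           for s in range(0, len(indicator) - width, stride)]
print([round(h, 3) for h in profile])      # local unpredictability profile
```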
Starting at the logical level of the logic of partitions, dual to the usual Boolean logic of subsets, the notion of logical entropy, i.e., information as distinctions, is developed as the quantification of the distinctions of... more
Atena Editora
Accept this invitation to travel through the mathematical structure of temporal pathways, viewed through a crystalline kaleidoscope of information geometry. A shimmer on the horizon where desert meets sky, time travel is brought into... more
This paper presents a statistical methodology based upon information theory for adjusting mortality tables to obtain exactly some known individual characteristics, while obtaining a table that is as close as possible to a standard one.... more
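The paper's methodology adjusts a full mortality table to match several known characteristics at once; as a much-reduced, hedged illustration of the underlying minimum cross-entropy idea (which may differ in detail from the estimator actually used), the sketch below tilts a reference distribution to hit a single prescribed mean while staying as close as possible to the reference in Kullback-Leibler divergence. All names and numbers are illustrative.

```python
import math

def tilt(q, x, lam):
    """Exponentially tilted distribution p_i proportional to q_i * exp(lam * x_i)."""
    w = [qi * math.exp(lam * xi) for qi, xi in zip(q, x)]
    z = sum(w)
    return [wi / z for wi in w]

def adjust_to_mean(q, x, target, lo=-50.0, hi=50.0, tol=1e-10):
    """Among all p with mean equal to target, exponential tilting yields the one
    minimizing D(p || q); the tilted mean is increasing in lam, so bisect on lam."""
    p = list(q)
    for _ in range(200):
        mid = (lo + hi) / 2.0
        p = tilt(q, x, mid)
        mean = sum(pi * xi for pi, xi in zip(p, x))
        if abs(mean - target) < tol:
            break
        if mean < target:
            lo = mid
        else:
            hi = mid
    return p

# Toy example: move the mean of a reference distribution from 2.0 to 2.5 while
# staying as close as possible (in KL divergence) to the reference weights.
x = [1.0, 2.0, 3.0, 4.0]
q = [0.4, 0.3, 0.2, 0.1]
p = adjust_to_mean(q, x, target=2.5)
print([round(pi, 4) for pi in p], round(sum(pi * xi for pi, xi in zip(p, x)), 6))
```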
This work introduces the Dirac oscillator as an analytically solvable system for the Dirac equation. We study the model through various information-theoretic measures, correlating their values with the... more
This is an essay in what might be called "mathematical metaphysics." There is a fundamental duality that runs through mathematics and the natural sciences. The duality starts at the logical level; it is represented by the Boolean logic of... more
Given an observed stochastic process, computational mechanics provides an explicit and efficient method of constructing a minimal hidden Markov model within the class of maximally predictive models. Here, the corresponding so-called... more
Logical probability theory was developed as a quantitative measure based on Boole's logic of subsets. But information theory was developed into a mature theory by Claude Shannon with no such connection to logic. A recent development in... more
The logical basis for information theory is the newly developed logic of partitions that is dual to the usual Boolean logic of subsets. The key concept is a "distinction" of a partition, an ordered pair of elements in distinct blocks of... more
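The distinction count described here has a standard quantitative form: logical entropy h(π) = |dit(π)| / |U|², equivalently 1 − Σ_B (|B|/|U|)². The sketch below computes it both ways for an example partition (the partition itself is an illustrative assumption).

```python
from itertools import product

def logical_entropy_by_distinctions(partition):
    """h(pi) = |dit(pi)| / |U|^2, counting ordered pairs in distinct blocks."""
    block_of = {x: i for i, block in enumerate(partition) for x in block}
    universe = list(block_of)
    n = len(universe)
    dits = sum(1 for u, v in product(universe, repeat=2) if block_of[u] != block_of[v])
    return dits / (n * n)

def logical_entropy_closed_form(partition):
    """Equivalent closed form: 1 - sum over blocks B of (|B| / |U|)^2."""
    n = sum(len(block) for block in partition)
    return 1.0 - sum((len(block) / n) ** 2 for block in partition)

pi = [{1, 2}, {3}, {4, 5, 6}]
print(logical_entropy_by_distinctions(pi), logical_entropy_closed_form(pi))  # both 22/36
```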
In the present communication we introduce a dynamic measure of inaccuracy between two past lifetime distributions over the interval (0, t). Based on the proportional reversed hazard rate model (PRHRM), a characterization problem for this... more