Key research themes
1. How do foundational measures and decompositions of information unify structure, randomness, and geometry in symbolic and algorithmic complexity?
This theme covers formalisms and mathematical frameworks that decompose symbolic objects or data sequences into a component quantifying structure (generative models) and a component quantifying randomness (irreducible complexity), with conserved and normalized measures defined over the two parts. The research connects algorithmic information theory with geometric concepts such as curvature and Fisher information, providing a differential-geometric lens on symbolic complexity. These developments support computable diagnostics for model coherence, symbolic inference efficiency, and complexity auditing across domains including physics, logic, and AI.
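The decomposition formalisms themselves are not reproduced in this summary. As a rough illustration of the general idea of splitting a description into a structural part (a generative model) and a random part (an incompressible residual), the sketch below uses zlib compression lengths as a computable proxy for algorithmic complexity and searches over a family of simple periodic models. The function names, the periodic model family, and the use of compression lengths are assumptions made for this example, not the constructions used in the surveyed work.

```python
import zlib
import random

def compressed_len(text: str) -> int:
    """Length in bytes of zlib-compressed text: a crude, computable stand-in
    for algorithmic (Kolmogorov) complexity, which is uncomputable."""
    return len(zlib.compress(text.encode(), 9))

def two_part_decomposition(seq: str, max_period: int = 32) -> dict:
    """Toy two-part code: choose the repeating block ('model') that minimizes
    model cost + residual cost, where the residual marks the positions at
    which the model's prediction fails. The two costs play the roles of the
    'structure' and 'randomness' components; their shares give a normalized
    split of the total description length."""
    best = None
    for p in range(1, max_period + 1):
        model = seq[:p]                                    # candidate generative model
        prediction = (model * (len(seq) // p + 1))[:len(seq)]
        residual = "".join("1" if a != b else "0" for a, b in zip(seq, prediction))
        cost = (compressed_len(model), compressed_len(residual))
        if best is None or sum(cost) < sum(best[1]):
            best = (p, cost)
    p, (structure, randomness) = best
    total = structure + randomness
    return {"period": p, "structure_bytes": structure, "randomness_bytes": randomness,
            "structure_share": structure / total, "randomness_share": randomness / total}

if __name__ == "__main__":
    random.seed(0)
    # Periodic sequence: both parts compress to a handful of bytes.
    print(two_part_decomposition("abcabcabc" * 400))
    # Random sequence: the residual dominates, so the randomness share is high.
    print(two_part_decomposition("".join(random.choice("ab") for _ in range(3600))))
```

The point of the sketch is only that a split of description length into a model part and a residual part can be made concrete and computable; richer model families would replace the periodic candidates without changing the overall pattern.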
2. How can information theory be extended and applied to neural communication and quantum frameworks to reconcile biological complexity and physical constraints?
This theme addresses extensions and modifications of classical Shannon information theory in three settings: the complexity of neural information processing, quantum logical information, and continuous-time Gaussian network channels. It examines conceptual and theoretical limits in applying information theory to neural spike trains, develops quantum logical entropy as a framework for the distinctions made by quantum measurement, and reviews quantitative relationships between mutual information and estimation measures in Gaussian channels, with applications to multiuser coding and capacity approximations.
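The best-known relationship of the kind mentioned last is the I-MMSE identity for the scalar Gaussian channel: the derivative of mutual information with respect to SNR equals half the minimum mean-square error of estimating the input from the output. The short sketch below checks this numerically for a Gaussian input, where I(snr) = ½·ln(1 + snr) nats and MMSE(snr) = 1/(1 + snr); it is an illustrative check, not a reproduction of the reviewed derivations, and the grid of SNR values is an arbitrary choice.

```python
import numpy as np

def mutual_info_nats(snr):
    """I(snr) = 0.5*ln(1+snr): mutual information of Y = sqrt(snr)*X + N
    with standard Gaussian input X and standard Gaussian noise N."""
    return 0.5 * np.log1p(snr)

def mmse(snr):
    """Minimum mean-square error of estimating X from Y for the same channel."""
    return 1.0 / (1.0 + snr)

snrs = np.linspace(0.1, 10.0, 200)
# Numerical derivative of I(snr) versus the I-MMSE prediction MMSE(snr)/2.
dI_dsnr = np.gradient(mutual_info_nats(snrs), snrs)
print(np.max(np.abs(dI_dsnr - 0.5 * mmse(snrs))))   # small: only discretization error
```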
3. How can mutual information and information-theoretic measures improve statistical dependence evaluation and image complexity quantification beyond classical correlation?
This theme focuses on mutual information and related normalized measures, such as symmetric uncertainty and the global correlation coefficient, for quantifying dependence between random variables, including nonlinear relationships that classical (Pearson) correlation cannot detect. It also explores image complexity through an information-theoretic framework based on mutual information between histograms and spatially partitioned regions, yielding complexity measures sensitive to spatial distribution and compositional regularity.
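As a minimal sketch of the dependence-evaluation side of this theme (the image-complexity construction is not reproduced here), the code below estimates mutual information and symmetric uncertainty from histograms and contrasts them with Pearson correlation on a nonlinear relationship. The bin count and the example relationship Y = X² + noise are illustrative choices, and the plug-in histogram estimator is only one of several possible MI estimators.

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Plug-in estimate of I(X;Y) in bits from a joint histogram."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

def entropy_bits(x, bins=16):
    """Plug-in estimate of H(X) in bits from a marginal histogram."""
    p, _ = np.histogram(x, bins=bins)
    p = p / p.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def symmetric_uncertainty(x, y, bins=16):
    """SU(X,Y) = 2*I(X;Y) / (H(X) + H(Y)): a dependence measure normalized to [0, 1]."""
    return 2 * mutual_information(x, y, bins) / (entropy_bits(x, bins) + entropy_bits(y, bins))

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 50_000)
y = x**2 + 0.05 * rng.normal(size=x.size)    # strong but nonlinear dependence

print("Pearson r:", np.corrcoef(x, y)[0, 1])             # near zero: correlation misses it
print("Mutual information (bits):", mutual_information(x, y))
print("Symmetric uncertainty:", symmetric_uncertainty(x, y))
```

Because the dependence is symmetric in X, the Pearson coefficient is close to zero, while the mutual-information-based measures report a clear dependence; this is the gap beyond classical correlation that the theme highlights.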