Academia.edu

Transfer Entropy

364 papers
24 followers

About this topic
Transfer entropy is a statistical measure used to quantify the amount of information transferred from one stochastic process to another. It assesses the directional influence between time series, capturing the flow of information and causal relationships, thereby providing insights into the dynamics of complex systems.
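For a concrete sense of the definition above, here is a minimal plug-in (histogram) estimator of transfer entropy for discretised time series with history length 1. This is an illustrative sketch, not the estimator used by any particular paper on this page; the function name and the demo data are my own, and real applications typically need longer histories, bias correction, and significance testing with surrogates.

```python
from collections import Counter
from math import log2
import random

def transfer_entropy(x, y):
    """Plug-in estimate of transfer entropy T(X -> Y) in bits for
    discrete series, using history length 1:
    sum over (y_{t+1}, y_t, x_t) of
    p(y_{t+1}, y_t, x_t) * log2[ p(y_{t+1}|y_t, x_t) / p(y_{t+1}|y_t) ]."""
    triples = list(zip(y[1:], y[:-1], x[:-1]))          # (y_{t+1}, y_t, x_t)
    n = len(triples)
    c_full = Counter(triples)
    c_yy = Counter((yn, yp) for yn, yp, _ in triples)   # (y_{t+1}, y_t)
    c_yx = Counter((yp, xp) for _, yp, xp in triples)   # (y_t, x_t)
    c_y = Counter(yp for _, yp, _ in triples)           # y_t
    te = 0.0
    for (yn, yp, xp), c in c_full.items():
        # ratio of conditional probabilities expressed via counts
        te += (c / n) * log2((c * c_y[yp]) / (c_yy[(yn, yp)] * c_yx[(yp, xp)]))
    return te

# demo: y is a one-step-lagged copy of x, so information flows x -> y
random.seed(0)
x = [random.randint(0, 1) for _ in range(2000)]
y = [0] + x[:-1]
te_xy = transfer_entropy(x, y)   # large: x's past fully predicts y's next step
te_yx = transfer_entropy(y, x)   # near zero: no feedback from y to x
```

The asymmetry between `te_xy` and `te_yx` is the point of the measure: unlike mutual information, it is directional.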

Key research themes

1. How can Transfer Entropy be extended and interpreted to quantify sensitivity and critical transitions in complex dynamical systems?

This research theme explores theoretical generalizations and interpretations of transfer entropy to capture nuanced aspects of information transfer, especially in systems exhibiting phase transitions, criticality, or sensitivity to control parameters. Understanding these extensions is crucial for applying information-theoretic tools to detect and characterize abrupt changes in complex systems within physics, statistical mechanics, and dynamical systems.

Key finding: Introduces Fisher Transfer Entropy (FTE), a novel information-theoretic measure blending transfer entropy and Fisher information to quantify sensitivity gains to control parameters during state transitions within another... Read more
Key finding: Demonstrates that for a broad class of non-i.i.d., non-ergodic processes, Boltzmann entropy remains valid but assumes generalized functional forms identical to generalized entropies. By considering asymptotic reversible... Read more
Key finding: Proves an exact universal relation between Renyi entropy flow and full counting statistics of energy transfer for a system weakly coupled to a non-equilibrium external time-dependent system. This link allows quantification of... Read more

2. What are the mathematical properties, composition rules, and physical interpretations of generalized entropies related to or extending Transfer Entropy frameworks?

This theme focuses on the foundational mathematical structures behind entropies extending Boltzmann-Gibbs measures, including Tsallis and Renyi entropies, their nonadditive composition rules, and connections to physical properties such as finite heat baths and phase space topology. These generalizations provide alternative frameworks to understand information metrics like transfer entropy in systems with complex correlations or non-Markovian dynamics.

Key finding: Derives how deviations from classical additive entropy arise naturally from finite heat capacity baths and temperature fluctuations, linking the Tsallis and Renyi entropy parameters (q) to physical heat-bath constraints.... Read more
Key finding: Demonstrates that nonadditivity in Tsallis entropy fundamentally arises from the topology of phase space, particularly in small finite systems of point-like particles. Investigates the source term in a nonextensive transport... Read more
Key finding: Analyzes essential properties of Tsallis entropy, including its behavior under transformations, stochastic orders, and aging properties of random lifetimes. Studies its use as an uncertainty measure for coherent and mixed... Read more

3. How does transfer entropy relate to the physical nature, meaning, and experimental quantification of entropy and information flow in thermal and quantum systems?

This theme investigates conceptual and practical aspects of entropy from thermodynamic and quantum perspectives, especially regarding how transfer entropy and related measures connect with entropy's physical meaning, production, and experimental measurement. It includes clarifying the semantics of entropy, the role of symmetry and disorder, and entropy changes in open quantum systems governed by non-Hermitian dynamics, facilitating interpretation and quantification of information transfer in physical systems.

Key finding: Extends von Neumann entropy to quantum systems described by non-Hermitian Hamiltonians accounting for sinks or sources, showing that entropy production is no longer zero but can increase or decrease depending on system... Read more
Key finding: Provides a nuanced physical interpretation of thermodynamic entropy as thermal displacement of thermal energy relative to absolute temperature, arguing entropy reflects random thermal energy redistribution within material... Read more
Key finding: Proposes that ordering corresponds to the introduction of symmetry in physical systems and shows that increasing symmetry diminishes entropy, using binary 1D and 2D systems of elementary magnets as examples. The study... Read more

All papers in Transfer Entropy

Many multivariate time series anomaly detection frameworks have been proposed and widely applied. However, most of these frameworks do not consider intrinsic relationships between variables in multivariate time series data, thus ignoring... more
Background/Introduction: Widespread network disruption has been hypothesized to be an important predictor of outcomes in patients with refractory temporal lobe epilepsy (TLE). Most studies examining functional network disruption in... more
The functional brain network arises from the coordinated activity of different regions. Each region's activity requires feedback from other active areas to adjust itself. As a result, any disruption of this coordination can... more
Transfer entropy (TE) captures the directed relationships between two variables. Partial transfer entropy (PTE) accounts for the presence of all confounding variables of a multivariate system and infers only about direct causality.... more
Information causality measures have proven to be very effective in uncovering the connectivity patterns of multivariate systems. The non-uniform embedding (NUE) scheme has been developed to address the “curse of dimensionality”, since the... more
The entropy production in stochastic dynamical systems is linked to the structure of their causal representation in terms of Bayesian networks. Such a connection was formalized for bipartite (or multipartite) systems with an integral... more
The irreversibility of trajectories in stochastic dynamical systems is linked to the structure of their causal representation in terms of Bayesian networks. We consider stochastic maps resulting from a time discretization with interval τ... more
Recent theoretical and empirical work has focused on the variability of network dynamics in maturation. Such variability seems to reflect the spontaneous formation and dissolution of different functional networks. We sought to extend... more
Identification of causal structures and quantification of direct information flows in complex systems is a challenging yet important task, with practical applications in many fields. Data generated by dynamical processes or large-scale... more
Synchronization of chaotic oscillators has become well characterized by errors which shrink relative to a synchronization manifold. This manifold is the identity function in the case of identical systems, or some other slow manifold in... more
A basic systems question concerns the concept of closure, meaning autonomy (closed) in the sense of describing the (sub)system as fully consistent within itself. Alternatively, the system may be nonautonomous (open), meaning it receives... more
Causal inference is perhaps one of the most fundamental concepts in science, beginning originally from the works of some of the ancient philosophers, through today, but also weaved strongly in current work from statisticians, machine... more
We propose an entropy statistic designed to assess the behavior of slowly varying parameters of real systems. Based on correlation entropy, the method uses symbol dynamics and analysis of increments to achieve sufficient recurrence in a... more
This work proposes a unified thermodynamic and semantic theory of civilizational collapse, termed Entropology—the study of entropy as it manifests not only in energy flows but also in symbolic, legal, and infrastructural systems.... more
The analogy between information theory and income distribution analysis is exploited to derive a number of measures of distributional change. These include not only counterparts of the regular entropy measures, but also of the entire.... more
This article presents a structured literature review of dynamic volatility spillovers between spot and futures markets during financial crises, applying the TCCM (Theory, Context, Characteristics, Methodology) framework. Drawing from... more
Trajectories from a pair of interacting zebrafish are used to test for the existence of anticipatory dynamics in natural systems. Anticipatory dynamics (AD) is unusual in that causal events are not necessarily ordered by their temporal... more
Measures of complexity are of immediate interest for the field of autonomous robots both as a means to classify the behavior and as an objective function for the autonomous development of robot behavior. In the present paper we consider... more
Human brain activity maps are produced by functional MRI (fMRI) research that describes the average level of engagement during a specific task of various brain regions. Functional connectivity describes the interrelationship, integrated... more
In a recent work we proposed the corrected transfer entropy (CTE), which reduces the bias in the estimation of transfer entropy (TE), a measure of Granger causality for bivariate time series making use of the conditional mutual... more
In the aim to explore the complex relationships between S&P500, VIX and volume we introduce a Granger causality test using the nonlinear statistic of Asymmetric Partial Transfer Entropy (APTE). Through a simulation exercise, it arises... more
In this paper, a framework is developed for the identification of causal effects from nonstationary time series. Focusing on causality measures that make use of delay vectors from time series, the idea is to account for non-stationarity... more
Measures of the direction and strength of the interdependence among time series from multivariate systems are evaluated based on their statistical significance and discrimination ability. The best-known measures estimating direct causal... more
Information causality measures, i.e. transfer entropy and symbolic transfer entropy, are modified using the concept of surrogate data in order to identify correctly the presence and direction of causal effects. The measures are evaluated... more
An extension of transfer entropy, called partial transfer entropy, is proposed here to detect causal effects among observed interacting systems, and particularly distinguish among direct and indirect causal effects. The measure is... more
In this paper, we introduce the partial symbolic transfer entropy (PSTE), an extension of the symbolic transfer entropy that accounts only for the direct causal effects among the components of a multivariate system. It is an information... more
An extension of transfer entropy, called partial transfer entropy (PTE), is proposed to detect causal effects among observed interacting systems, and particularly to distinguish direct from indirect causal effects. PTE is compared to a... more
Measures of the direction and strength of the interdependence between two time series are evaluated and modified in order to reduce the bias in the estimation of the measures, so that they give zero values when there is no causal effect.... more
Complexity of industrial plants and their stringent environmental and safety regulations have necessitated early detection and isolation of process faults. All the existing fault isolation methods can be categorized into two general... more
Self-entropy (SE) and transfer entropy (TE) are widely utilized in biomedical signal processing to assess the information stored into a system and transferred from a source to a destination respectively. The study proposes a more specific... more
Cross-frequency interactions, a form of oscillatory neural activity, are thought to play an essential role in the integration of distributed information in the brain. Indeed, phase-amplitude interactions are believed to allow for the... more
Neural oscillations are present in the brain at different spatial and temporal scales, and they are linked to several cognitive functions. Furthermore, the information carried by their phases is fundamental for the coordination of... more
Nature is full of random networks of complex topology describing such apparently disparate systems as biological, economic, or informational ones. Their most characteristic feature is the apparent scale-free character of interconnections... more
We introduce a novel measure, Fisher transfer entropy (FTE), which quantifies a gain in sensitivity to a control parameter of a state transition, in the context of another observable source. The new measure captures both transient and... more
Entropy measures in their various incarnations play an important role in the study of stochastic time series providing important insights into both the correlative and the causative structure of the stochastic relationships between the... more
There is growing evidence that for a range of dynamical systems featuring complex interactions between large ensembles of interacting elements, mutual information peaks at order/disorder phase transitions. We conjecture that, by contrast,... more
As a causality criterion we propose the conditional relative entropy. The relationship with information theoretic functionals mutual information and entropy is established. The conditional relative entropy criterion is compared with 3... more
The concept of "Industry 5.0" is driving significant changes in the production of chemical products and energy, promoting a shift towards a decarbonized and circular economy. Digitalization, robotics, communications, and artificial... more
Data from social media provide unprecedented opportunities to investigate the processes that govern the dynamics of collective social phenomena. We consider an information theoretical approach to define and measure the temporal and... more
We propose a novel measure to detect temporal ordering in the activity of individual neurons in a local network, which is thought to be a hallmark of activity-dependent synaptic modifications during learning. The measure, called Causal... more
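Several entries above introduce partial transfer entropy (PTE), which conditions TE on the remaining variables of a multivariate system so that only direct coupling is scored. The conditioning step can be sketched with the same plug-in approach as plain TE; this is an illustrative toy (history length 1, discrete symbols, names of my own choosing), not the estimator of any specific paper listed here.

```python
from collections import Counter
from math import log2
import random

def partial_transfer_entropy(x, y, z):
    """Plug-in estimate of PTE T(X -> Y | Z) in bits for discrete
    series, history length 1: conditioning on z_t screens out
    influence from X that is routed through Z."""
    quads = list(zip(y[1:], y[:-1], x[:-1], z[:-1]))   # (y_{t+1}, y_t, x_t, z_t)
    n = len(quads)
    c_full = Counter(quads)
    c_yyz = Counter((yn, yp, zp) for yn, yp, _, zp in quads)    # (y_{t+1}, y_t, z_t)
    c_yxz = Counter((yp, xp, zp) for _, yp, xp, zp in quads)    # (y_t, x_t, z_t)
    c_yz = Counter((yp, zp) for _, yp, _, zp in quads)          # (y_t, z_t)
    pte = 0.0
    for (yn, yp, xp, zp), c in c_full.items():
        # p(y+|y,x,z) / p(y+|y,z) expressed via counts
        pte += (c / n) * log2((c * c_yz[(yp, zp)]) /
                              (c_yyz[(yn, yp, zp)] * c_yxz[(yp, xp, zp)]))
    return pte

# demo 1: X drives Y directly; Z is an independent bystander
random.seed(1)
x = [random.randint(0, 1) for _ in range(4000)]
z = [random.randint(0, 1) for _ in range(4000)]
y = [0] + x[:-1]
pte_direct = partial_transfer_entropy(x, y, z)      # stays large

# demo 2: X reaches Y only through Z (chain X -> Z -> Y)
z2 = [0] + x[:-1]
y2 = [0] + z2[:-1]
pte_indirect = partial_transfer_entropy(x, y2, z2)  # conditioning removes it
```

In the chain case, plain TE from X to Y would be positive even though the coupling is indirect; conditioning on Z drives the estimate to zero, which is exactly the direct-versus-indirect distinction the PTE papers above address.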