Any real-life situation involving nodes and edges can be modeled as a graph. Road networks are one such example: roads can be modeled as edges and cities as nodes. Such a model can be used to analyze and optimize... more
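A minimal sketch of this modeling idea (the cities, roads, and distances below are invented; a shortest-path query is one example of the analysis such a model supports):

```python
import heapq

# Toy road network: cities are nodes, roads are weighted edges
# (distances in km). All names and distances are made up.
roads = {
    "A": {"B": 5, "C": 9},
    "B": {"A": 5, "C": 3, "D": 7},
    "C": {"A": 9, "B": 3, "D": 2},
    "D": {"B": 7, "C": 2},
}

def shortest_distance(graph, start, goal):
    """Dijkstra's algorithm: shortest road distance between two cities."""
    dist = {start: 0}
    heap = [(0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == goal:
            return d
        if d > dist.get(node, float("inf")):
            continue
        for nbr, w in graph[node].items():
            nd = d + w
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                heapq.heappush(heap, (nd, nbr))
    return float("inf")

print(shortest_distance(roads, "A", "D"))  # A-B-C-D = 10
```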
This paper introduces a time- and state-dependent measure of integrated information, phi, which captures the repertoire of causal states available to a system as a whole. Specifically, phi quantifies how much information is generated... more
    • Information Theory
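The paper's formal definition of phi is not reproduced here; the following toy computation, for an assumed two-unit copy system, only illustrates the underlying move of comparing the whole system's causal repertoire against the repertoire that remains once the parts are cut apart:

```python
import itertools
import math

# Toy two-unit system where each unit copies the other's previous state.
# This is NOT the paper's exact definition of phi -- just an assumed
# illustration of "information generated by the whole over its parts".

states = list(itertools.product([0, 1], repeat=2))

def step(past):
    a, b = past
    return (b, a)  # each unit copies the other

def whole_repertoire(current):
    """P(past | current) under a uniform prior over past states."""
    hits = [p for p in states if step(p) == current]
    return {p: (1 / len(hits) if p in hits else 0.0) for p in states}

def cut_repertoire(current):
    """Same, but with the connection between the units severed: each
    unit's input is replaced by noise, so the past is unconstrained."""
    return {p: 1 / len(states) for p in states}

def kl(p, q):
    return sum(p[s] * math.log2(p[s] / q[s]) for s in states if p[s] > 0)

current = (1, 0)
phi_like = kl(whole_repertoire(current), cut_repertoire(current))
print(f"toy integrated-information value: {phi_like} bits")  # 2.0
```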
According to the integrated information theory, the quantity of consciousness is the amount of integrated information generated by a complex of elements, and the quality of experience is specified by the informational relationships it... more
    • Information Theory, Information Geometry
The brain is never inactive. Neurons fire at leisurely rates most of the time, even in sleep (1), although occasionally they fire more intensely, for example, when presented with certain stimuli. Coordinated changes in the activity... more
    • Brain Imaging
The Donagi-Markman cubic is the differential of the period map for algebraic completely integrable systems. Here we prove a formula for the cubic in the case of Hitchin’s system for an arbitrary semisimple Lie algebra g. This was originally stated... more
    • Algebraic Geometry, Representation Theory
Many natural processes occur over characteristic spatial and temporal scales. This paper presents tools for (i) flexibly and scalably coarse-graining cellular automata and (ii) identifying which coarse-grainings express an automaton’s... more
    • Information Theory
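As a rough illustration of what coarse-graining a cellular automaton means (not the paper's method for finding good coarse-grainings), the sketch below simulates elementary rule 110 and projects each block of three cells to a single supercell by majority vote:

```python
import random

# Illustrative coarse-graining of an elementary cellular automaton.
# The paper's contribution is tools for finding projections that respect
# the dynamics; the majority projection here is just an assumed example.

RULE = 110
WIDTH = 30  # divisible by the block size
STEPS = 8

def step(cells, rule=RULE):
    n = len(cells)
    return [
        (rule >> ((cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

def coarse_grain(cells, block=3):
    # Majority vote within each non-overlapping block of cells.
    return [int(sum(cells[i:i + block]) * 2 > block)
            for i in range(0, len(cells), block)]

random.seed(0)
cells = [random.randint(0, 1) for _ in range(WIDTH)]
for _ in range(STEPS):
    print("micro:", "".join(map(str, cells)),
          " macro:", "".join(map(str, coarse_grain(cells))))
    cells = step(cells)
```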
The moduli space of G-bundles on an elliptic curve with additional flag structure admits a Poisson structure. The bivector can be defined using double loop group, loop group and sheaf cohomology constructions. We investigate the links... more
    • Algebraic Geometry, Representation Theory
This paper relates a recently proposed measure of information integration to experiments investigating the evoked high-density electroencephalography (EEG) response to transcranial magnetic stimulation (TMS) during wakefulness, early... more
    • Brain Imaging, Computational Neuroscience
The internal structure of a measuring device, which depends on what its components are and how they are organized, determines how it categorizes its inputs. This paper presents a geometric approach to studying the internal structure of... more
    • Information Theory, Category Theory
Time plays an essential role in the diffusion of information, influence and disease over networks. In many cases we only observe when a node copies information, makes a decision or becomes infected – but the connectivity, transmission... more
    • Social Networks, Optimization (Mathematics), Social Network Analysis (Social Sciences)
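The data regime described here can be made concrete with a small simulation: a cascade spreads over a hidden network with random transmission delays, and all we get to observe are node infection times. The network and rates below are invented:

```python
import heapq
import random

# We observe WHEN each node is infected, not the edges that carried the
# contagion. This simulates a diffusion with exponentially distributed
# transmission delays over an assumed hidden network and records only
# the infection times -- the input to the inference problem above.

random.seed(1)
hidden_edges = {  # node -> {neighbor: transmission rate}
    0: {1: 1.0, 2: 0.5},
    1: {3: 2.0},
    2: {3: 0.3, 4: 1.5},
    3: {4: 0.8},
    4: {},
}

def simulate_cascade(source):
    """Return observed infection times; the edge list stays hidden."""
    times = {source: 0.0}
    heap = [(0.0, source)]
    while heap:
        t, node = heapq.heappop(heap)
        if t > times.get(node, float("inf")):
            continue
        for nbr, rate in hidden_edges[node].items():
            t_nbr = t + random.expovariate(rate)
            if t_nbr < times.get(nbr, float("inf")):
                times[nbr] = t_nbr
                heapq.heappush(heap, (t_nbr, nbr))
    return times

print(simulate_cascade(source=0))  # e.g. {0: 0.0, 1: 0.31, ...}
```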
Cognitive neuroscience provides us with both clues and paradoxes about the neural substrate of consciousness. For example, we know that certain corticothalamic circuits are essential for conscious experience, whereas cerebellar circuits... more
    • Philosophy of Mind, Information Theory, Cognitive Neuroscience
We information-theoretically reformulate two measures of capacity from statistical learning theory: empirical VC-entropy and empirical Rademacher complexity. We show these capacity measures count the number of hypotheses about a dataset... more
    • Philosophy of Science, Information Theory, Machine Learning, Karl Popper
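Empirical Rademacher complexity itself is straightforward to estimate by Monte Carlo for a finite hypothesis class; the sketch below uses an invented class of one-dimensional threshold classifiers (this illustrates the standard definition, not the paper's reformulation):

```python
import numpy as np

# Monte Carlo estimate of empirical Rademacher complexity
#   R_hat(H) = E_sigma [ sup_{h in H} (1/n) * sum_i sigma_i * h(x_i) ]
# for a small finite hypothesis class on an invented sample.

rng = np.random.default_rng(0)
n = 20
x = np.sort(rng.uniform(0, 1, size=n))

# Hypotheses: h_t(x) = +1 if x >= t else -1, for a grid of thresholds.
thresholds = np.linspace(0, 1, 11)
H = np.where(x[None, :] >= thresholds[:, None], 1, -1)  # shape (|H|, n)

num_draws = 10_000
sigmas = rng.choice([-1, 1], size=(num_draws, n))
# For each sigma draw, take the best correlation any hypothesis achieves.
correlations = sigmas @ H.T / n            # shape (num_draws, |H|)
rademacher = correlations.max(axis=1).mean()
print(f"estimated empirical Rademacher complexity: {rademacher:.3f}")
```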
Broadly speaking, there are two approaches to quantifying information. The first, Shannon information, takes events as belonging to ensembles and quantifies the information resulting from observing the given event in terms of the number... more
    • Philosophy of Science, Information Theory, Machine Learning, Statistical machine learning
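A minimal illustration of the Shannon side of this distinction, with an invented ensemble: the information from observing an event of probability p is -log2(p) bits, and the ensemble's entropy is the expected value of that surprisal:

```python
import math

# Shannon information: an event drawn from an ensemble with probability
# p carries -log2(p) bits ("surprisal"); the ensemble's entropy is the
# expected surprisal. The distribution below is invented.

def surprisal(p):
    return -math.log2(p)

def entropy(dist):
    return sum(p * surprisal(p) for p in dist.values() if p > 0)

weather = {"sun": 0.5, "cloud": 0.25, "rain": 0.125, "snow": 0.125}
for event, p in weather.items():
    print(f"observing {event!r}: {surprisal(p):.2f} bits")
print(f"ensemble entropy: {entropy(weather):.2f} bits")  # 1.75 bits
```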
This paper investigates how a population of neuron-like agents can use metabolic cost to communicate the importance of their actions. Although decision-making by individual agents has been extensively studied, questions regarding how... more
    • Information Theory, Reinforcement Learning
Many methods for causal inference generate directed acyclic graphs (DAGs) that formalize causal relations between n variables. Given the joint distribution on all these variables, the DAG contains all information about how intervening on... more
    • Information Theory
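A worked example of what the DAG adds beyond the joint distribution, using an assumed three-variable confounded graph and invented numbers: the interventional distribution follows from truncated factorization and differs from simple conditioning:

```python
# For the assumed DAG  Z -> X, Z -> Y, X -> Y,  truncated factorization
# gives the interventional distribution
#   P(y | do(x)) = sum_z P(y | x, z) * P(z)
# which generally differs from the observational P(y | x).
# All the numbers below are invented.

P_z = {0: 0.5, 1: 0.5}
P_x1_given_z = {0: 0.2, 1: 0.8}                    # P(X=1 | Z=z)
P_y1_given_xz = {(0, 0): 0.1, (1, 0): 0.5,         # P(Y=1 | X=x, Z=z)
                 (0, 1): 0.4, (1, 1): 0.9}

# Observational P(Y=1 | X=1): weight z by P(z | X=1) via Bayes' rule.
weights = {z: P_x1_given_z[z] * P_z[z] for z in P_z}
total = sum(weights.values())
obs = sum(P_y1_given_xz[(1, z)] * weights[z] / total for z in P_z)

# Interventional P(Y=1 | do(X=1)): Z keeps its marginal distribution.
do = sum(P_y1_given_xz[(1, z)] * P_z[z] for z in P_z)

print(f"P(Y=1 | X=1)     = {obs:.3f}")  # 0.820
print(f"P(Y=1 | do(X=1)) = {do:.3f}")   # 0.700
```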
In this work we investigate the possibilities offered by a minimal framework of artificial spiking neurons to be deployed in silico. Here we introduce a hierarchical network architecture of spiking neurons which learns to recognize moving... more
    • Computational Neuroscience
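For context, a minimal leaky integrate-and-fire neuron, the usual building block of such spiking models, is sketched below; parameters are invented and the paper's hierarchical architecture is not reproduced:

```python
import numpy as np

# Minimal leaky integrate-and-fire (LIF) neuron driven by noisy input.
# All parameters are assumptions for illustration.

dt, tau, v_thresh, v_reset = 1e-3, 20e-3, 1.0, 0.0
steps = 1000

rng = np.random.default_rng(0)
input_current = 1.2 + 0.5 * rng.standard_normal(steps)  # noisy drive

v = 0.0
spike_times = []
for t in range(steps):
    # Membrane potential leaks toward 0 and integrates the input.
    v += dt / tau * (-v + input_current[t])
    if v >= v_thresh:          # threshold crossing -> spike
        spike_times.append(t * dt)
        v = v_reset            # reset after the spike

print(f"{len(spike_times)} spikes in 1 s, first few: {spike_times[:5]}")
```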
This paper suggests a learning-theoretic perspective on how synaptic plasticity benefits global brain functioning. We introduce a model, the selectron, that (i) arises as the fast time constant limit of leaky integrate-and-fire neurons... more
Neurons deep in cortex interact with the environment extremely indirectly; the spikes they receive and produce are pre- and post-processed by millions of other neurons. This paper proposes two information-theoretic constraints guiding the... more
    • Information Theory
We propose a novel Bayesian approach to solve stochastic optimization problems that involve finding extrema of noisy, nonlinear functions. Previous work has focused on representing possible functions explicitly, which leads to a... more
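For contrast with approaches that represent the function explicitly, here is a generic Bayesian sketch (not the paper's method): an independent Gaussian posterior per candidate point, with Thompson sampling choosing where to evaluate next. The objective and noise level are invented:

```python
import numpy as np

# Generic Bayesian flavor of noisy optimization (NOT the paper's method):
# keep an independent Gaussian posterior over the unknown function's
# value at each candidate point, and pick the next sample location by
# Thompson sampling.

rng = np.random.default_rng(0)
xs = np.linspace(-2, 2, 41)
f = lambda x: -(x - 0.7) ** 2          # hidden objective, maximum at 0.7
noise = 0.5

mean = np.zeros_like(xs)               # posterior mean per point
count = np.zeros_like(xs)              # observations per point

for _ in range(200):
    std = noise / np.sqrt(count + 1)   # uncertainty shrinks with samples
    sample = rng.normal(mean, std)     # one Thompson sample per point
    i = int(np.argmax(sample))         # evaluate where the sample peaks
    y = f(xs[i]) + noise * rng.standard_normal()
    count[i] += 1
    mean[i] += (y - mean[i]) / count[i]  # running average of observations

best = xs[int(np.argmax(mean))]
print(f"estimated maximizer: {best:.2f} (true: 0.70)")
```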
The bias/variance tradeoff is fundamental to learning: increasing a model's complexity can improve its fit on training data, but potentially worsens performance on future samples. Remarkably, however, the human brain... more
    • Neuronal Plasticity
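The tradeoff is easy to reproduce numerically: fitting polynomials of increasing degree to invented noisy data, training error keeps falling while test error eventually rises:

```python
import numpy as np

# Classic bias/variance demonstration: as polynomial degree grows,
# training error keeps falling while test error eventually rises.
# The data-generating process below is invented.

rng = np.random.default_rng(0)
true_f = lambda x: np.sin(2 * x)

x_train = rng.uniform(-2, 2, 20)
y_train = true_f(x_train) + 0.3 * rng.standard_normal(20)
x_test = rng.uniform(-2, 2, 200)
y_test = true_f(x_test) + 0.3 * rng.standard_normal(200)

for degree in (1, 3, 6, 9, 12):
    coeffs = np.polyfit(x_train, y_train, degree)
    train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_mse = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    print(f"degree {degree:2d}: train MSE {train_mse:.3f}, test MSE {test_mse:.3f}")
```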