Victoria University of Wellington
Mathematics and Statistics
Any real-life situation involving nodes and edges can be modeled as a graph. Road networks are one such example: roads can be modeled as edges and cities as nodes. Such a model can be used to analyze and optimize...
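As a minimal sketch of the setup this abstract describes (cities as nodes, roads as weighted edges), the following Python snippet builds a toy road graph and finds a shortest route with Dijkstra's algorithm; all city names and distances are invented for illustration.

```python
import heapq

# Toy road network: cities are nodes, roads are weighted edges (distances in km).
# All names and distances below are invented for illustration.
roads = {
    "A": {"B": 120, "C": 90},
    "B": {"A": 120, "D": 60},
    "C": {"A": 90, "D": 150},
    "D": {"B": 60, "C": 150},
}

def shortest_route(graph, source, target):
    """Dijkstra's algorithm: returns (total distance, list of cities)."""
    queue = [(0, source, [source])]
    visited = set()
    while queue:
        dist, city, path = heapq.heappop(queue)
        if city == target:
            return dist, path
        if city in visited:
            continue
        visited.add(city)
        for neighbour, road_len in graph[city].items():
            if neighbour not in visited:
                heapq.heappush(queue, (dist + road_len, neighbour, path + [neighbour]))
    return float("inf"), []

print(shortest_route(roads, "A", "D"))  # (180, ['A', 'B', 'D'])
```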
This paper introduces a time- and state-dependent measure of integrated information, phi, which captures the repertoire of causal states available to a system as a whole. Specifically, phi quantifies how much information is generated...
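The full phi of the paper minimizes over partitions of the system, which is beyond a short sketch; the snippet below only illustrates the effective-information ingredient for a toy deterministic two-element system under a maximum-entropy prior. The choice of system and update rule is my own, made for simplicity.

```python
from itertools import product
from math import log2

# Toy system: two binary elements, each of which copies the other's previous
# state (a deterministic update). This sketches only the effective-information
# ingredient; the paper's phi also minimizes over partitions of the system.
def update(state):
    a, b = state
    return (b, a)

states = list(product([0, 1], repeat=2))

def effective_information(observed):
    """Bits generated by observing `observed`, assuming a uniform (max-entropy)
    prior over prior states: log2(#states) - log2(#states mapping to observed)."""
    preimage = [s for s in states if update(s) == observed]
    return log2(len(states)) - log2(len(preimage))

for s in states:
    print(s, effective_information(s))  # 2.0 bits each: the update is a bijection
```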
According to the integrated information theory, the quantity of consciousness is the amount of integrated information generated by a complex of elements, and the quality of experience is specified by the informational relationships it...
The brain is never inactive. Neurons fire at leisurely rates most of the time, even in sleep (1), although occasionally they fire more intensely, for example, when presented with certain stimuli. Coordinated changes in the activity...
The Donagi-Markman cubic is the differential of the period map for algebraic completely integrable systems. Here we prove a formula for the cubic in the case of Hitchin’s system for an arbitrary semisimple Lie algebra g. This was originally stated...
Many natural processes occur over characteristic spatial and temporal scales. This paper presents tools for (i) flexibly and scalably coarse-graining cellular automata and (ii) identifying which coarse-grainings express an automaton’s...
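As a rough illustration of the first point (coarse-graining a cellular automaton), the sketch below runs an elementary CA and block-averages its spacetime diagram. The paper's criterion for which coarse-grainings express the dynamics is not implemented here, and rule 110 with 2-by-2 majority blocks is an arbitrary choice of mine.

```python
import numpy as np

def step(row, rule=110):
    """One update of an elementary CA with periodic boundaries."""
    left, right = np.roll(row, 1), np.roll(row, -1)
    idx = 4 * left + 2 * row + right               # neighbourhood as a 3-bit index
    table = np.array([(rule >> i) & 1 for i in range(8)])
    return table[idx]

def run(width=64, steps=64, seed=0):
    rng = np.random.default_rng(seed)
    row = rng.integers(0, 2, width)
    rows = [row]
    for _ in range(steps - 1):
        row = step(row)
        rows.append(row)
    return np.array(rows)

def coarse_grain(spacetime, k=2):
    """Map each k-by-k block of the spacetime diagram to its majority value."""
    t, w = spacetime.shape
    blocks = spacetime[: t - t % k, : w - w % k].reshape(t // k, k, w // k, k)
    return (blocks.mean(axis=(1, 3)) >= 0.5).astype(int)

fine = run()
coarse = coarse_grain(fine)
print(fine.shape, "->", coarse.shape)  # (64, 64) -> (32, 32)
```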
The moduli space of G-bundles on an elliptic curve with additional flag structure admits a Poisson structure. The bivector can be defined using double loop group, loop group and sheaf cohomology constructions. We investigate the links...
This paper relates a recently proposed measure of information integration to experiments investigating the evoked high-density electroencephalography (EEG) response to transcranial magnetic stimulation (TMS) during wakefulness, early...
The internal structure of a measuring device, which depends on what its components are and how they are organized, determines how it categorizes its inputs. This paper presents a geometric approach to studying the internal structure of...
Time plays an essential role in the diffusion of information, influence and disease over networks. In many cases we only observe when a node copies information, makes a decision or becomes infected – but the connectivity, transmission...
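A minimal sketch of this observation model: simulate diffusion over a hidden directed graph with exponentially distributed transmission delays and record only the infection times. The graph, rates, and horizon below are invented; inferring the connectivity back from such timestamps is the paper's subject and is not reproduced here.

```python
import heapq
import random

edges = {  # hidden directed graph: node -> {neighbour: transmission rate}
    "a": {"b": 1.0, "c": 0.5},
    "b": {"d": 2.0},
    "c": {"d": 0.2},
    "d": {},
}

def simulate_cascade(source, horizon=10.0, seed=1):
    random.seed(seed)
    infected = {source: 0.0}                      # node -> infection time
    queue = [(0.0, source)]
    while queue:
        t, node = heapq.heappop(queue)
        if t > infected[node]:
            continue                              # stale entry; a faster path won
        for nbr, rate in edges[node].items():
            t_nbr = t + random.expovariate(rate)  # exponential transmission delay
            if t_nbr <= horizon and t_nbr < infected.get(nbr, float("inf")):
                infected[nbr] = t_nbr
                heapq.heappush(queue, (t_nbr, nbr))
    return infected  # only these timestamps are observed, not the edges used

print(simulate_cascade("a"))
```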
Cognitive neuroscience provides us with both clues and paradoxes about the neural substrate of consciousness. For example, we know that certain corticothalamic circuits are essential for conscious experience, whereas cerebellar circuits...
We information-theoretically reformulate two measures of capacity from statistical learning theory: empirical VC-entropy and empirical Rademacher complexity. We show these capacity measures count the number of hypotheses about a dataset...
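For reference, a Monte Carlo estimate of empirical Rademacher complexity for a finite hypothesis class looks roughly as follows. This follows the standard textbook definition rather than the paper's counting reformulation, and the hypothesis class is randomly generated purely for illustration.

```python
import numpy as np

# Empirical Rademacher complexity of a finite class H on a fixed sample:
#   R_hat = E_sigma[ max_h (1/n) * sum_i sigma_i * h(x_i) ],
# where sigma_i are independent uniform +/-1 signs. Hypotheses are represented
# by their +/-1 predictions on the n data points.
rng = np.random.default_rng(0)
n, num_hypotheses = 50, 8
H = rng.choice([-1, 1], size=(num_hypotheses, n))   # toy, randomly drawn class

def empirical_rademacher(H, trials=10_000):
    n = H.shape[1]
    total = 0.0
    for _ in range(trials):
        sigma = rng.choice([-1, 1], size=n)         # random sign vector
        total += np.max(H @ sigma) / n              # best correlation in the class
    return total / trials

print(empirical_rademacher(H))  # grows with |H|; bounded by ~sqrt(2*log|H|/n)
```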
Broadly speaking, there are two approaches to quantifying information. The first, Shannon information, takes events as belonging to ensembles and quantifies the information resulting from observing the given event in terms of the number...
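A small worked example of the first (Shannon) notion, with an invented ensemble: the information in observing an event is its surprisal, -log2 p, and entropy is the ensemble average of surprisal.

```python
from math import log2

def surprisal(p):
    """Information (in bits) gained by observing an event of probability p."""
    return -log2(p)

def entropy(dist):
    """Average surprisal over an ensemble of event probabilities."""
    return sum(p * surprisal(p) for p in dist if p > 0)

weather = {"sun": 0.5, "rain": 0.25, "snow": 0.25}       # invented ensemble
for event, p in weather.items():
    print(f"{event}: {surprisal(p):.2f} bits")           # sun: 1.00, rain/snow: 2.00
print(f"entropy: {entropy(weather.values()):.2f} bits")  # 1.50 bits
```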
This paper investigates how a population of neuron-like agents can use metabolic cost to communicate the importance of their actions. Although decision-making by individual agents has been extensively studied, questions regarding how...
Many methods for causal inference generate directed acyclic graphs (DAGs) that formalize causal relations between n variables. Given the joint distribution on all these variables, the DAG contains all information about how intervening on...
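A toy illustration of what intervening on a DAG means, assuming a linear Gaussian model X -> Y -> Z with invented coefficients: setting Y by intervention (do(Y = y0)) severs the edge into Y, so the observational correlation between X and Z disappears while the mechanism for Z is unchanged.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000

def sample(do_y=None):
    """Sample from the toy DAG X -> Y -> Z, optionally under do(Y = do_y)."""
    x = rng.normal(size=N)
    y = np.full(N, do_y) if do_y is not None else 2.0 * x + rng.normal(size=N)
    z = -1.0 * y + rng.normal(size=N)              # downstream mechanism unchanged
    return x, y, z

x, y, z = sample()
print("observational cov(X, Z):", np.cov(x, z)[0, 1])   # ~ -2: X and Z correlate
x, y, z = sample(do_y=1.0)
print("interventional cov(X, Z):", np.cov(x, z)[0, 1])  # ~ 0: the intervention severs it
```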
In this work we investigate the possibilities offered by a minimal framework of artificial spiking neurons to be deployed in silico. Here we introduce a hierarchical network architecture of spiking neurons which learns to recognize moving...
This paper suggests a learning-theoretic perspective on how synaptic plasticity benefits global brain functioning. We introduce a model, the selectron, that (i) arises as the fast time constant limit of leaky integrate-and-fire neurons...
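For context, a standard leaky integrate-and-fire simulation is sketched below; this is the textbook model the abstract starts from, not the selectron itself, and the parameters are generic rather than the paper's.

```python
import numpy as np

def simulate_lif(inputs, tau=20.0, threshold=1.0, dt=1.0, v_reset=0.0):
    """Leaky integrate-and-fire neuron.

    inputs: array of input current per time step (dt in ms).
    Returns the list of time steps at which the neuron spikes.
    """
    v, spikes = 0.0, []
    for t, i_t in enumerate(inputs):
        v += dt * (-v / tau + i_t)      # leak toward rest plus driving input
        if v >= threshold:              # threshold crossing: spike and reset
            spikes.append(t)
            v = v_reset
    return spikes

rng = np.random.default_rng(0)
print(simulate_lif(rng.uniform(0.0, 0.12, size=200)))  # an irregular spike train
```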
Neurons deep in cortex interact with the environment extremely indirectly; the spikes they receive and produce are pre- and post-processed by millions of other neurons. This paper proposes two information-theoretic constraints guiding the...
The bias/variance tradeoff is fundamental to learning: increasing a model's complexity can improve its fit on training data, but potentially worsens performance on future samples. Remarkably, however, the human brain...
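A minimal demonstration of the tradeoff, with invented data: polynomial fits of increasing degree drive training error down, while error on a held-out sample eventually rises.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_data(n=30):
    """Noisy observations of an invented target function."""
    x = rng.uniform(-1, 1, n)
    return x, np.sin(3 * x) + 0.3 * rng.normal(size=n)

x_train, y_train = make_data()
x_test, y_test = make_data()

# Higher-degree polynomials (more complexity) fit the training sample better
# but can generalize worse to the held-out sample.
for degree in (1, 3, 9):
    coeffs = np.polyfit(x_train, y_train, degree)
    train_err = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_err = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    print(f"degree {degree}: train MSE {train_err:.3f}, test MSE {test_err:.3f}")
```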