What are the irreducible components of information that underlie all domains of knowledge and reality? This paper proposes a cross-disciplinary “factor analysis” of information to identify its prime factors—the minimal, universal...
The Kosmos Theory (KTS) proposes a unified view of reality, structured in island-universes, autonomous fragments of information that remain interconnected by holographic coherence. These universes constitute nuclei of a greater whole, in...
This paper constructs a unified theoretical framework for AI semantic dynamics, modeling the behavior of Large Language Models (LLMs) as dynamic evolutionary processes in high-dimensional semantic space. Based on the Unified Dynamic...
We present Computational Emergence Theory (CET)—a mathematically rigorous framework wherein physical reality emerges from self-organizing computational processes on infinite graph spaces, with fundamental laws arising as ergodic...
This paper introduces the Dynamic Rate Theory as a novel framework for analyzing the P vs. NP problem, integrating cognitive deconstruction methods with mathematical structural modeling. It proposes a shift from binary logical...
Intrusion Detection by Sensors. A region can be protected using a sensor network. Each sensor has a sensing range r; a sensor detects an object entering its sensing range.
We consider several variations of the problems of covering a set of barriers (modeled as line segments) using sensors that can detect any intruder crossing any of the barriers. Sensors are initially located in the plane and they can...
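A minimal sketch of the coverage primitive behind such barrier problems (not the authors' algorithms): it checks whether a set of disk sensors covers a horizontal barrier segment. The sensor positions, the range r, and the barrier endpoints below are hypothetical.

```python
import math

def covered_intervals(sensors, r, x0, x1):
    """Project each sensing disk onto the barrier y = 0 between x0 and x1."""
    intervals = []
    for (sx, sy) in sensors:
        if abs(sy) <= r:                       # the disk actually reaches the barrier line
            half = math.sqrt(r * r - sy * sy)  # half-length of the covered chord
            intervals.append((max(x0, sx - half), min(x1, sx + half)))
    return sorted(i for i in intervals if i[0] < i[1])

def barrier_covered(sensors, r, x0, x1):
    """True iff the union of the projected intervals covers [x0, x1]."""
    reach = x0
    for a, b in covered_intervals(sensors, r, x0, x1):
        if a > reach:            # a gap before the next interval starts
            return False
        reach = max(reach, b)
    return reach >= x1

# Hypothetical instance: barrier from (0, 0) to (10, 0), sensing range r = 2.
print(barrier_covered([(1, 1), (4, 0), (7, 1), (9, 0)], 2.0, 0.0, 10.0))
```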
We have proven that any implementation of the concept of 'copy number' underlying Assembly Theory (AT) and its assembly index (Ai) is equivalent to Shannon entropy, and not fundamentally or methodologically different from algorithms like...
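A hedged sketch of the equivalence being argued, under my own toy setup (the block length of 2 and the example strings are assumptions, not taken from AT): counting copies of building blocks and feeding the counts into Shannon entropy already captures the statistics a copy-number index relies on.

```python
import math
from collections import Counter

def block_entropy(s, k=2):
    """Shannon entropy (bits per block) of the copy counts of length-k blocks."""
    blocks = [s[i:i + k] for i in range(0, len(s) - k + 1, k)]
    counts = Counter(blocks)
    n = sum(counts.values())
    return -sum(c / n * math.log2(c / n) for c in counts.values())

# A string built from many copies of one block vs. a more varied string.
print(block_entropy("ABABABABABABABAB"))   # 0.0 (a single repeated block)
print(block_entropy("ABCDEFGHABCDWXYZ"))   # higher: more distinct blocks, fewer copies
```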
We present an agnostic signal reconstruction method for zero-knowledge one-way communication channels in which a receiver aims to interpret a message sent by an unknown source about which no prior knowledge is available and to which no...
The wide application of digital design and the advances of digital fabrication and robotic processes have facilitated the materialization of bespoke geometries. In turn, this has raised the issue of how architects can reduce design complexity...
The use of Bayesian Networks (BNs) as classifiers in different fields of application has recently witnessed noticeable growth. Yet, the application of Naïve Bayes, and even of the augmented Naïve Bayes, to...
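For readers unfamiliar with the classifier under discussion, here is a minimal Naïve Bayes sketch over categorical features; the toy data, feature names and Laplace smoothing constant are assumptions for illustration, not the paper's experimental setup.

```python
import math
from collections import Counter, defaultdict

def train_nb(rows, labels, alpha=1.0):
    """Estimate class priors and per-class feature-value counts (Laplace smoothing alpha)."""
    class_counts = Counter(labels)
    feat_counts = defaultdict(Counter)            # (feature index, class) -> value counts
    for row, y in zip(rows, labels):
        for i, v in enumerate(row):
            feat_counts[(i, y)][v] += 1
    return class_counts, feat_counts, alpha

def predict_nb(model, row):
    class_counts, feat_counts, alpha = model
    total = sum(class_counts.values())
    scores = {}
    for y, cy in class_counts.items():
        score = math.log(cy / total)              # log prior
        for i, v in enumerate(row):
            counts = feat_counts[(i, y)]
            vocab = len(counts) + 1               # crude vocabulary size for smoothing
            score += math.log((counts[v] + alpha) / (cy + alpha * vocab))
        scores[y] = score
    return max(scores, key=scores.get)

# Hypothetical toy data: (outlook, temperature) -> play
X = [("sunny", "hot"), ("sunny", "mild"), ("rainy", "mild"), ("rainy", "cool")]
y = ["no", "no", "yes", "yes"]
print(predict_nb(train_nb(X, y), ("rainy", "hot")))   # 'yes'
```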
The concept of complexity as considered in terms of its algorithmic definition proposed by G. J. Chaitin and A. N. Kolmogorov is revisited for the dynamical complexity of music. When music pieces are cast in the form of time series of...
In this note, we elaborate on directions for efficient graph analysis, taking into account the dynamic nature of complex networks.
The article is devoted to the dialectical analysis of complexity, emergence and management as fundamental essences of modern organizations. It considers approaches to the quantitative assessment of complexity as a constructive...
This article reviews pixon‐based image reconstruction, which in its current formulation uses a multiresolution language to quantify an image's algorithmic information content (AIC) using Bayesian techniques. Each pixon (or its...
Computer-based shape grammar implementations aim to support creative design exploration by automating rule-application. This paper reviews existing shape grammar implementations in terms of their algorithmic complexity, extends the...
Sources that generate symbolic sequences with algorithmic nature may differ in statistical complexity because they create structures that follow algorithmic schemes, rather than generating symbols from a probabilistic function assuming...
We describe an approach to linguistic theory based on algorithmic complexity, and motivated by Rissanen's minimum description length analysis. The approach adds a critical second stage to Chomsky's original conception of generative...
Thanks to the simplicity and robustness of its calculation methods, algorithmic (or Kolmogorov) complexity appears as a useful tool to reveal chaotic dynamics when experimental time series are too short and noisy to apply Takens’...
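A hedged sketch of the kind of compression-based estimate such analyses rely on (not the authors' exact procedure): binarize each short series around its median and compare the compressed size of a periodic signal with that of a chaotic logistic-map orbit. The series length and map parameters are arbitrary choices.

```python
import math, zlib

def compressed_bits(series):
    """Binarize around the median and return the zlib-compressed size in bits."""
    med = sorted(series)[len(series) // 2]
    bits = "".join("1" if x > med else "0" for x in series)
    return 8 * len(zlib.compress(bits.encode(), 9))

# Two equally long short series: a periodic signal vs. a chaotic logistic-map orbit.
periodic = [math.sin(2 * math.pi * i / 8) for i in range(512)]
x, chaotic = 0.4, []
for _ in range(512):
    x = 4.0 * x * (1.0 - x)      # logistic map at r = 4 (chaotic regime)
    chaotic.append(x)

print(compressed_bits(periodic), "<", compressed_bits(chaotic))
```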
The Universe is the ultimate information system. The capacity to communicate is built into every object, from electrons to cells to stars. But information is not possible without order. Order imposes a limit on the speed of information...
I propose a simple representation language for undirected graphs that can be encoded as a bitstring, in which equivalence is topological equivalence. I also present an algorithm for computing the complexity of an arbitrary undirected network.
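In the same spirit, though not the paper's representation language, a minimal sketch: serialize the upper triangle of the adjacency matrix as a bitstring under a fixed vertex ordering and take its compressed size as a crude complexity score. The example graphs are illustrative.

```python
import random, zlib
from itertools import combinations

def adjacency_bitstring(n, edges):
    """Upper-triangular adjacency matrix of an n-vertex graph, serialized as a 0/1 string."""
    edge_set = {frozenset(e) for e in edges}
    return "".join("1" if frozenset((i, j)) in edge_set else "0"
                   for i, j in combinations(range(n), 2))

def bitstring_complexity(bits):
    """Crude complexity score: compressed size of the bitstring in bytes."""
    return len(zlib.compress(bits.encode(), 9))

random.seed(0)
n = 40
complete = list(combinations(range(n), 2))                                      # highly regular
erdos_renyi = [e for e in combinations(range(n), 2) if random.random() < 0.5]   # random

print(bitstring_complexity(adjacency_bitstring(n, complete)))     # small
print(bitstring_complexity(adjacency_bitstring(n, erdos_renyi)))  # noticeably larger
```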
Linear functional analysis, historically founded by Fourier and Legendre, played a significant role in providing a unified vision of mathematical transformations between vector spaces. The possibility of extending this approach is explored...
Graphs considered in this paper are finite simple undirected graphs. Let G = (V(G), E(G)) be a graph with E(G) = {e_1, e_2, ..., e_m}, for some positive integer m. The edge space of G, denoted by ℰ(G), is a vector space over the...
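To make the definition concrete, a small illustration of my own (not taken from the paper): edge subsets are represented as 0/1 vectors indexed by E(G), and vector addition over GF(2) corresponds to the symmetric difference of edge sets.

```python
# Edge space of a graph over GF(2): subsets of E(G) as 0/1 vectors,
# with addition = componentwise XOR = symmetric difference of edge sets.
edges = [("a", "b"), ("b", "c"), ("c", "a"), ("c", "d")]    # E(G) = {e_1, e_2, e_3, e_4}

def to_vector(edge_subset):
    return [1 if e in edge_subset else 0 for e in edges]

def add(u, v):
    return [(x + y) % 2 for x, y in zip(u, v)]              # GF(2) addition

triangle = to_vector({("a", "b"), ("b", "c"), ("c", "a")})  # a cycle
path     = to_vector({("b", "c"), ("c", "d")})

print(add(triangle, path))   # [1, 0, 1, 1] -> the symmetric difference {ab, ca, cd}
```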
Motivated by algorithmic information theory, the problem of program discovery can help find candidates of underlying generative mechanisms of natural and artificial phenomena. The uncomputability of such an inverse problem, however,...
Background: The objective of this research is to compare the relational and non-relational (NoSQL) database systems approaches in order to store, recover, query and persist standardized medical information in the form of ISO/EN 13606...
The Robust Stable Marriage problem (RSM) is a variant of the classic Stable Marriage problem in which the robustness of a given stable matching is measured by the number of modifications required to find an alternative stable matching...
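For context on the underlying object (the robustness measure itself is the paper's contribution and is not reproduced here), a standard Gale-Shapley sketch that produces a stable matching for a hypothetical toy instance:

```python
def gale_shapley(men_prefs, women_prefs):
    """Classic deferred-acceptance algorithm; returns a stable matching (man -> woman)."""
    rank = {w: {m: i for i, m in enumerate(prefs)} for w, prefs in women_prefs.items()}
    free = list(men_prefs)                    # men not yet matched
    next_choice = {m: 0 for m in men_prefs}   # index of the next woman to propose to
    engaged = {}                              # woman -> man
    while free:
        m = free.pop()
        w = men_prefs[m][next_choice[m]]
        next_choice[m] += 1
        if w not in engaged:
            engaged[w] = m
        elif rank[w][m] < rank[w][engaged[w]]:    # w prefers the new proposer
            free.append(engaged[w])
            engaged[w] = m
        else:
            free.append(m)
    return {m: w for w, m in engaged.items()}

men = {"m1": ["w1", "w2"], "m2": ["w1", "w2"]}
women = {"w1": ["m2", "m1"], "w2": ["m1", "m2"]}
print(gale_shapley(men, women))   # {'m2': 'w1', 'm1': 'w2'} is stable here
```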
This paper describes the theory of pixon-based image reconstruction. After a brief introduction of the basic concepts of the pixon, the paper concentrates primarily on our current implementation of the techniques along with the...
This article attempts to show how summary statistics can be misleading if not complemented by directly inspecting the data. Instead, algorithmic complexity measures can be used to uniquely identify each data element. The data science...
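As a toy illustration of the point (my example, not the article's data): two samples with identical summary statistics but different structure, which a compression-based complexity score does tell apart.

```python
import random, statistics, zlib

random.seed(1)
ordered = list(range(200))             # perfectly structured
shuffled = ordered[:]
random.shuffle(shuffled)               # same values, structure destroyed

for name, data in [("ordered", ordered), ("shuffled", shuffled)]:
    compressed = len(zlib.compress(bytes(data), 9))
    print(name, "mean=", statistics.mean(data),
          "stdev=", round(statistics.stdev(data), 2),
          "compressed bytes=", compressed)
# Mean and standard deviation are identical; only the compressed size
# (a stand-in for algorithmic complexity) distinguishes the two samples.
```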
Based on the principles of information theory, measure theory, and theoretical computer science, we introduce a signal deconvolution method with a wide range of applications to coding theory, particularly in zero-knowledge one-way...
Experimental studies of the cultural evolution of language have focused on how constraints on learning and communication drive emergence of linguistic structure. Yet language is typically transmitted by experts who adjust the input in...
In two experiments, Friedenberg and Liby (2016) studied how a diversity of complexity estimates, such as density, number of blocks, GIF compression rate and edge length, impacts the perception of beauty of semirandom two-dimensional...
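A hedged sketch of the kind of estimates being compared, computed for a small binary grid: density, total edge length, and a compression rate. The grid size and the use of zlib in place of GIF compression are assumptions for illustration.

```python
import random, zlib

def pattern_estimates(grid):
    """Density, total edge length, and a compression rate for a 0/1 grid."""
    n = len(grid)
    density = sum(map(sum, grid)) / (n * n)
    horizontal = sum(grid[i][j] != grid[i][j + 1] for i in range(n) for j in range(n - 1))
    vertical = sum(grid[i][j] != grid[i + 1][j] for i in range(n - 1) for j in range(n))
    raw = bytes(cell for row in grid for cell in row)
    compression_rate = len(zlib.compress(raw, 9)) / len(raw)   # zlib stands in for GIF
    return density, horizontal + vertical, compression_rate

random.seed(0)
semirandom = [[random.randint(0, 1) for _ in range(16)] for _ in range(16)]
striped = [[(i // 4) % 2 for _ in range(16)] for i in range(16)]

print(pattern_estimates(semirandom))  # many edges, compresses poorly
print(pattern_estimates(striped))     # few edges, compresses well
```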
An optical algorithm is proposed to extract region(s) of maximum/minimum intensity from a 2D image.
The paper introduces the notion of adaptive space-variant coordinate transformations and proposes a way to implement them optically. It also discusses the applications of such coordinate transformations in image processing and...
A multi-cut rearrangement of a string S is a string S′ obtained from S by an operation called a k-cut rearrangement, which consists of (1) cutting S at a given number k of places in S, making S the concatenated string X1·X2·X3·…·Xk·Xk+1,...
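A direct sketch of the operation as defined; the cut positions and the block permutation chosen below are arbitrary.

```python
def k_cut_rearrangement(s, cuts, order):
    """Cut s at the given positions into k+1 blocks and reassemble them in `order`."""
    positions = [0] + sorted(cuts) + [len(s)]
    blocks = [s[positions[i]:positions[i + 1]] for i in range(len(positions) - 1)]
    assert sorted(order) == list(range(len(blocks)))   # `order` must permute the blocks
    return "".join(blocks[i] for i in order)

# A 3-cut rearrangement of "ABCDEFGH": blocks AB | CD | EF | GH, reassembled.
print(k_cut_rearrangement("ABCDEFGH", cuts=[2, 4, 6], order=[2, 0, 3, 1]))  # EFABGHCD
```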
We demonstrate that the assembly pathway method underlying "Assembly Theory" (AT) is a suboptimal restricted version of Huffman's encoding (Shannon-Fano type) for 'counting copies,' the stated objective of the authors of AT, introduced...
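A hedged sketch of the comparison being drawn (not AT's own pipeline): build a Huffman code over the copy counts of the blocks of a string and compare its average code length with the Shannon entropy bound it approaches.

```python
import heapq, math
from collections import Counter

def huffman_lengths(counts):
    """Code length (in bits) assigned to each symbol by Huffman's algorithm."""
    heap = [(c, i, {sym: 0}) for i, (sym, c) in enumerate(counts.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        c1, _, t1 = heapq.heappop(heap)
        c2, _, t2 = heapq.heappop(heap)
        merged = {s: d + 1 for s, d in {**t1, **t2}.items()}   # one level deeper in the tree
        heapq.heappush(heap, (c1 + c2, tie, merged))
        tie += 1
    return heap[0][2]

s = "ABABCDABABCDEFAB"
counts = Counter(s[i:i + 2] for i in range(0, len(s), 2))   # copy counts of 2-blocks
n = sum(counts.values())
lengths = huffman_lengths(counts)
avg_code = sum(counts[b] / n * lengths[b] for b in counts)
entropy = -sum(c / n * math.log2(c / n) for c in counts.values())
print(round(avg_code, 3), ">=", round(entropy, 3))   # Huffman stays within 1 bit of H
```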
A hybrid method based on the combination of the generalized forward-backward method (GFBM) and the Green's function for the grounded dielectric slab, together with the acceleration of the combination via a discrete Fourier transform (DFT) based...
In this article we use the idea of algorithmic complexity (AC) to study various cosmological scenarios, and as a means of quantizing the gravitational interaction. We look at 5D and 7D cosmological models where the Universe begins as a...
A methodological proposal to estimate a Tailored to the Problem Specificity mathematical transformation is developed. To begin, Linear Analysis is briefly visited because of its significant role in providing a unified vision of mathematical...
We show that numerical approximations of Kolmogorov complexity (K) of graphs and networks capture some group-theoretic and topological properties of empirical networks, ranging from metabolic to social networks, and of small synthetic...
Nowadays, Bayesian Networks (BNs) constitute one of the most complete, self-sustained and coherent formalisms for knowledge acquisition, representation and application through computer systems. Yet, the learning of these BNs...
Very large databases are a major opportunity for science, and data analytics is a remarkable new field of investigation in computer science. The effectiveness of these tools is used to support a philosophy against the scientific method as...
We introduce a family of unsupervised, domain-free, and (asymptotically) model-independent algorithms based on the principles of algorithmic information theory designed to minimize the loss of algorithmic information. The method...
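A heavily hedged sketch of the general idea, not the authors' algorithms: when reducing data, prefer deletions that perturb a compression-based proxy of its algorithmic information the least. The greedy single-element scan below is purely illustrative.

```python
import zlib

def info(seq):
    """Compression-based proxy for the algorithmic information of a byte sequence."""
    return len(zlib.compress(bytes(seq), 9))

def deletion_impact(seq):
    """How much removing each element perturbs the compression-based proxy."""
    base = info(seq)
    return [abs(info(seq[:i] + seq[i + 1:]) - base) for i in range(len(seq))]

data = [0] * 60 + [7] + [0] * 60          # one anomalous element in a regular sequence
impact = deletion_impact(data)
# Deleting one of the repeated zeros barely changes the proxy; deleting the
# anomaly at index 60 is expected to change it the most.
print(impact[0], impact[60])
```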