
Computational Complexity

20,856 papers
73,144 followers
About this topic
Computational Complexity is a branch of computer science that studies the resources required to solve computational problems, particularly in terms of time and space. It classifies problems based on their inherent difficulty and the efficiency of algorithms, often using complexity classes such as P, NP, and NP-complete to categorize problem-solving capabilities.
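As a concrete illustration of what the classes above capture, here is a minimal Python sketch (illustrative only, not tied to any paper on this page): deciding satisfiability of a CNF formula is NP-complete, but verifying a proposed assignment takes time polynomial in the formula size, which is exactly what membership in NP requires.

```python
# Minimal sketch: verifying a certificate for SAT in polynomial time.
# A CNF formula is a list of clauses; each clause is a list of signed
# variable indices (positive = the variable, negative = its negation).

def verify_sat(clauses, assignment):
    """Check a candidate assignment against a CNF formula.

    Runs in O(total number of literals), i.e. polynomial time, which
    illustrates why SAT is in NP: certificates are easy to check even
    though finding one may be hard.
    """
    def literal_true(lit):
        value = assignment[abs(lit)]
        return value if lit > 0 else not value

    return all(any(literal_true(lit) for lit in clause) for clause in clauses)

# (x1 OR NOT x2) AND (x2 OR x3)
formula = [[1, -2], [2, 3]]
print(verify_sat(formula, {1: True, 2: False, 3: True}))    # True
print(verify_sat(formula, {1: False, 2: False, 3: False}))  # False
```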

Key research themes

1. How do logical systems and cognitive models relate to computational tractability in mathematics and cognition?

This research theme explores the connection between descriptive complexity theory—linking logical expressiveness to computational complexity classes—and models of human cognitive capacities, especially in mathematical cognition. The significance lies in assessing whether the traditional computational complexity cutoff at class P (polynomial-time tractability) adequately characterizes the feasibility of computational models for human reasoning, and in examining the limitations of such strict classifications for capturing the nuances of cognition and mathematical problem solving.

Key finding: This work advances the critique of the P-cognition thesis, which restricts feasible cognitive functions to those in complexity class P, by arguing that computational complexity measures are too coarse to serve as strict...

2. What methods best capture and quantify computational complexity in quantum field theories and holography?

Research in this area investigates various approaches to defining and measuring complexity in quantum field theory (QFT), often motivated by holographic duality (AdS/CFT correspondence), and how complexity evolves over time. Distinguishing among these approaches, especially circuit complexity derived from wave functions versus geometric proposals like complexity=volume or complexity=action conjectures, has strong implications for understanding quantum computational resources, black hole physics, and quantum information theory. This work also probes the applicability and limitations of bounds such as Lloyd's bound within these contexts.

Key finding: This paper proposes a novel testing procedure employing information-theoretic measures (Loschmidt echo and fidelity) to discriminate among definitions of complexity in QFT, avoiding issues like gate or basis dependence; a toy numerical illustration of the echo follows these findings. It...
Key finding: Analyzing Lloyd's computational speed bound in the context of holographic complexity, this work identifies that standard assumptions about orthogonalizing quantum gates are violated for large AdS black holes, which involve...
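For orientation on the quantities named above, a minimal numerical sketch of a Loschmidt echo for a toy finite-dimensional system (two coupled qubits), not the QFT setting of these papers: the echo L(t) = |⟨ψ₀| e^{iH₀t} e^{-iHt} |ψ₀⟩|² measures how distinguishable evolution under a perturbed Hamiltonian is from the unperturbed one. The Hamiltonians and parameters below are assumptions chosen only for illustration.

```python
# Toy Loschmidt echo L(t) = |<psi0| exp(i H0 t) exp(-i H t) |psi0>|^2
# for two coupled qubits; H is H0 plus a small perturbation.
import numpy as np
from scipy.linalg import expm

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2, dtype=complex)

H0 = np.kron(sz, I2) + np.kron(I2, sz)   # unperturbed two-qubit Hamiltonian
H = H0 + 0.3 * np.kron(sx, sx)           # same Hamiltonian plus a small coupling

psi0 = np.zeros(4, dtype=complex)
psi0[0] = 1.0                            # |00> initial state

for t in np.linspace(0.0, 5.0, 6):
    echo = abs(psi0.conj() @ expm(1j * H0 * t) @ expm(-1j * H * t) @ psi0) ** 2
    print(f"t = {t:.1f}  L(t) = {echo:.4f}")
```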

3. How can cognitive complexity of computer programs be formally modeled and assessed to reflect human comprehension challenges?

This research strand focuses on defining, quantifying, and validating complexity metrics for computer programs from cognitive perspectives, emphasizing the subjective human effort in understanding programs. Methods draw on cognitive load theory, schema theory, and hierarchical models of task complexity, aiming to develop frameworks that capture the interplay of programming constructs, plan complexity, and inter-schema interactions. Such frameworks are instrumental for computing education research, instructional design, and improving curriculum sequencing based on objective yet cognitively grounded measures.

Key finding: This article proposes the Cognitive Complexity of Computer Programs (CCCP) framework, which evaluates program complexity by measuring plan depth and maximal plan interactivity, reflecting the complexity of requisite cognitive...
Key finding: Building upon cognitive psychology and educational research, this work introduces "Rules of Program Behavior" to better characterize program semantics for learners’ mental models and proposes a theoretical framework that...
Key finding: This study argues that cognitive complexity should not be confined to fixed quantitative metrics but expressed as subjective ratings incorporating human factor variability. Using divisive hierarchical clustering and user...
Key finding: The paper introduces a measure of software complexity grounded on cognitive weights assigned to basic control structures (BCS) reflecting the mental effort during comprehension. It proposes aggregating these weights through...
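To make the cognitive-weight idea concrete, here is a hedged sketch that aggregates per-construct weights over a small program structure. The weight values and the sum-then-multiply aggregation are illustrative assumptions in the spirit of the measure described above, not the exact scheme of the cited paper.

```python
# Illustrative cognitive-weight aggregation over basic control structures (BCS).
# Assumed weights (not necessarily those used in the cited work):
WEIGHTS = {"sequence": 1, "branch": 2, "iteration": 3, "recursion": 3}

def cognitive_weight(node):
    """node = ("bcs_kind", [children]); children are nested structures.

    Sequentially composed children add their weights; nesting multiplies
    the enclosing structure's weight by the weight of its body, reflecting
    the intuition that nested control flow compounds comprehension effort.
    """
    kind, children = node
    body = sum(cognitive_weight(c) for c in children) if children else 1
    return WEIGHTS[kind] * body

# A loop containing a branch, followed by straight-line code:
program = ("sequence", [
    ("iteration", [("branch", [])]),   # loop with an if inside: 3 * 2 = 6
    ("sequence", []),                  # straight-line code: 1
])
print(cognitive_weight(program))       # 6 + 1 = 7
```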

All papers in Computational Complexity

The design of several database query languages has been influenced by Codd's relational algebra. This paper discusses the difficulty of optimizing queries based on the relational algebra operations select, project, and join. A matrix,... more
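For readers new to the operators named above, a minimal sketch of select, project, and join over in-memory relations (lists of dicts). This is a generic illustration of the relational algebra, not the query-optimization analysis of the paper.

```python
# Relational algebra basics over relations represented as lists of dicts.

def select(relation, predicate):
    """sigma: keep the tuples satisfying the predicate."""
    return [row for row in relation if predicate(row)]

def project(relation, attributes):
    """pi: keep only the named attributes (duplicates removed)."""
    seen, out = set(), []
    for row in relation:
        key = tuple(row[a] for a in attributes)
        if key not in seen:
            seen.add(key)
            out.append(dict(zip(attributes, key)))
    return out

def natural_join(r, s):
    """Join on all attribute names the two relations share."""
    common = set(r[0]) & set(s[0]) if r and s else set()
    return [dict(**a, **{k: v for k, v in b.items() if k not in common})
            for a in r for b in s
            if all(a[c] == b[c] for c in common)]

emp = [{"eid": 1, "dept": "A"}, {"eid": 2, "dept": "B"}]
dep = [{"dept": "A", "mgr": "alice"}, {"dept": "B", "mgr": "bob"}]
print(project(select(natural_join(emp, dep), lambda t: t["dept"] == "A"),
              ["eid", "mgr"]))   # [{'eid': 1, 'mgr': 'alice'}]
```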
A new fourth order box-scheme for the Poisson problem in a square with Dirichlet boundary conditions is introduced, extending the approach in . The design is based on a "hermitian box" approach, combining the approximation of the gradient... more
Orthogonal frequency-division multiplexing (OFDM) is a widely used technique in modern wireless communication systems that provides robustness to channel fading and immunity to impulse interference. Despite its advantages, one of... more
Motion estimation using the one-bit transform (1BT) was proposed to achieve a large reduction in computation. However, it degrades the predicted image by almost 1 dB compared with full search. In this paper, we propose a modification to... more
In this paper we ask which properties of a distributed network can be computed from a small amount of local information provided by its nodes. The distributed model we consider is a restriction of the classical CONGEST (distributed) model... more
The notions of universality and completeness are central in the theories of computation and computational complexity. However, proving lower bounds and necessary conditions remains hard in most of the cases. In this article, we introduce... more
We wish to tile a rectangle or a torus with only vertical and horizontal bars of a given length, such that the number of bars in every column and row equals given numbers. We present results for particular instances and for a more general... more
The model of cellular automata is fascinating because very simple local rules can generate complex global behaviors. The relationship between local and global function is subject of many studies. We tackle this question by using results... more
We study cellular automata with respect to a new communication complexity problem: each of two players knows half of some finite word, and must be able to tell whether the state of the central cell will follow a given evolution, by... more
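A hedged sketch of the locality that makes such communication questions natural: in an elementary cellular automaton (rule 110 chosen arbitrarily here), the state of the central cell after t steps depends only on the initial cells within distance t, so each player's half of the word fixes part of that light cone.

```python
# Elementary cellular automaton step (rule 110 as an arbitrary example).
RULE = 110

def step(cells):
    """One synchronous update with fixed (zero) boundary conditions."""
    n = len(cells)
    nxt = [0] * n
    for i in range(n):
        left = cells[i - 1] if i > 0 else 0
        right = cells[i + 1] if i < n - 1 else 0
        neighborhood = (left << 2) | (cells[i] << 1) | right
        nxt[i] = (RULE >> neighborhood) & 1
    return nxt

word = [0, 1, 1, 0, 1, 0, 0, 1, 1]   # two players would each hold half of this
state = word
for t in range(3):
    state = step(state)
# After t steps, state[mid] depends only on word[mid - t : mid + t + 1].
print(state[len(word) // 2])
```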
For the design of complex digital signal processing systems, block diagram oriented synthesis of real time software for programmable target processors has become an important design aid. The synthesis approach discussed in this paper is... more
GSAT is a randomized greedy local repair procedure that was introduced for solving propositional satisfiability and constraint satisfaction problems. We present an improvement to GSAT that is sensitive to the problem's structure. When the... more
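For context, a compact sketch of plain GSAT (without the structure-sensitive improvement the paper adds): repeatedly start from a random assignment and greedily flip the variable whose flip satisfies the most clauses. Parameter names and the restart budget below are illustrative.

```python
# Plain GSAT: greedy local repair for CNF satisfiability.
import random

def num_satisfied(clauses, assign):
    return sum(any((lit > 0) == assign[abs(lit)] for lit in c) for c in clauses)

def gsat(clauses, n_vars, max_tries=10, max_flips=100):
    for _ in range(max_tries):
        assign = {v: random.random() < 0.5 for v in range(1, n_vars + 1)}
        for _ in range(max_flips):
            if num_satisfied(clauses, assign) == len(clauses):
                return assign
            # Greedily flip the variable whose flip yields the best score.
            def score(v):
                assign[v] = not assign[v]
                s = num_satisfied(clauses, assign)
                assign[v] = not assign[v]
                return s
            best = max(range(1, n_vars + 1), key=score)
            assign[best] = not assign[best]
    return None  # no satisfying assignment found within the budget

print(gsat([[1, -2], [2, 3], [-1, -3]], 3))
```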
Abstract: In the context of influence propagation in a social graph, we can identify three orthogonal dimensions-the number of seed nodes activated at the beginning (known as budget), the expected number of activated nodes at the end of... more
The document is published under a Creative Commons Attribution 4.0 International license (CC BY 4.0). Web: sociocomplex.net Logo: CC BY 4.0. This white paper analyzes, for the first time, the main forces of Spanish research... more
This paper considers multiple symbol differential detection (DD) for both single-antenna and multiple-antenna systems over flat Ricean-fading channels. We derive the optimal multiple symbol detection (MSD) decision rules for both M-ary... more
Consider the problem of computing the intersection of k sorted sets. In the comparison model, we prove a new lower bound which depends on the non-deterministic complexity of the instance, and implies that the algorithm of Demaine,... more
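A hedged baseline for the problem just stated (not the adaptive algorithm of Demaine et al. that the lower bound is compared against): iterate over the smallest set and binary-search each candidate in the others, which already shows how instance structure drives the cost.

```python
# Intersection of k sorted lists: probe each element of the smallest list
# in the other lists with binary search.
from bisect import bisect_left

def contains(sorted_list, x):
    i = bisect_left(sorted_list, x)
    return i < len(sorted_list) and sorted_list[i] == x

def intersect_sorted(sets):
    sets = sorted(sets, key=len)          # start from the smallest set
    smallest, rest = sets[0], sets[1:]
    return [x for x in smallest if all(contains(s, x) for s in rest)]

print(intersect_sorted([[1, 3, 5, 7, 9], [3, 4, 5, 9], [0, 3, 9, 12]]))  # [3, 9]
```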
Electrical muscle stimulation demonstrates potential for preventing muscle atrophy and for restoring functional movement after spinal cord injury (SCI). Control systems used to optimize delivery of electrical stimulation protocols depend... more
Electrical muscle stimulation demonstrates potential for restoring functional movement and preventing muscle atrophy after spinal cord injury (SCI). Control systems used to optimize delivery of electrical stimulation protocols depend upon... more
Dirac showed that a 2-connected graph of order n with minimum degree δ has circumference at least min{2δ, n}. We prove that a 2-connected, triangle-free graph G of order n with minimum degree δ either has circumference at least min{4δ-4,... more
We present and study an agent-based model of T-Cell cross-regulation in the adaptive immune system, which we apply to binary classification. Our method expands an existing analytical model of T-cell cross-regulation that was used to study... more
This paper proposes a complete theoretical framework regarding the extreme states of the universe, arguing that physical reality must necessarily exist under bidirectional constraints: an upper bound of Planck density (ρ_P ≈ 5.1 × 10^96... more
We propose an efficient parallelization strategy for a graph based lattice Boltzmann implementation and present performance results for a variety of complex geometries.
This paper presents a probabilistic model for combining cluster ensembles utilizing information theoretic measures. Starting from a co-association matrix which summarizes the ensemble, we extract a set of association distributions, which... more
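For readers unfamiliar with the starting object, a minimal sketch of a co-association matrix: entry (i, j) is the fraction of base clusterings that place points i and j in the same cluster. The paper's probabilistic combination model builds on top of this; the sketch stops at the matrix itself.

```python
# Co-association matrix from an ensemble of clusterings (label vectors).
import numpy as np

def co_association(labelings):
    """labelings: list of 1-D integer label arrays, one clustering per row."""
    labelings = np.asarray(labelings)
    m, n = labelings.shape
    coassoc = np.zeros((n, n))
    for labels in labelings:
        coassoc += (labels[:, None] == labels[None, :]).astype(float)
    return coassoc / m

ensemble = [[0, 0, 1, 1],
            [0, 0, 0, 1],
            [1, 1, 0, 0]]
print(co_association(ensemble))
```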
After a sequence of improvements Boyd, Sitters, van der Ster, and Stougie proved that any 2-connected graph whose n vertices have degree 3, i.e., a cubic 2-connected graph, has a Hamiltonian tour of length at most (4/3)n, establishing in... more
A graph $G$ is maximal nontraceable (MNT) if $G$ does not have a hamiltonian path but, for every $e\in E\left( \overline{G}\right) $, the graph $G+e$ has a hamiltonian path. A graph $G$ is 1-tough if for every vertex cut $S$ of $G$ the... more
An extended version of low-complexity IP Core for image/video transformations based on the CORDIC architecture is presented. This IP core is able to perform quantized 8×8 IDCT and quantized 8×8/4×4 H.264-inverse integer transforms on a... more
Divisorial gonality and stable divisorial gonality are graph parameters, which have an origin in algebraic geometry. Divisorial gonality of a connected graph G can be defined with help of a chip firing game on G. The stable divisorial... more
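To give an idea of the chip-firing game behind divisorial gonality, a minimal sketch of one set-firing move: every vertex in a chosen set sends one chip along each edge leaving the set. Whether a divisor can be made effective by such moves is what gonality-style parameters quantify; the graph and divisor below are illustrative.

```python
# One chip-firing move on a graph: firing a set A sends one chip along
# every edge from A to V \ A.
def fire_set(adj, divisor, fire_from):
    """adj: dict vertex -> list of neighbours; divisor: dict vertex -> chips."""
    new_div = dict(divisor)
    for v in fire_from:
        for u in adj[v]:
            if u not in fire_from:
                new_div[v] -= 1
                new_div[u] += 1
    return new_div

def is_effective(divisor):
    return all(chips >= 0 for chips in divisor.values())

# A 4-cycle with 2 chips on vertex 0.
adj = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
divisor = {0: 2, 1: 0, 2: 0, 3: 0}
after = fire_set(adj, divisor, {0})
print(after, is_effective(after))   # {0: 0, 1: 1, 2: 0, 3: 1} True
```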
Connectivity problems like k-Path and k-Disjoint Paths relate to many important milestones in parameterized complexity, namely the Graph Minors Project, color coding, and the recent development of techniques for obtaining kernelization... more
The notion of treewidth plays an important role in theoretical and practical studies of graph problems. It has been recognized that, especially in practical environments, when computing the treewidth of a graph it is invaluable to first... more
One of the typical important criteria to be considered in real-time control applications is the computational complexity of the controllers, observers, and models applied. In this paper, a singular value decomposition (SVD)-based... more
The problem of non-stationary interference suppression in direct sequence spread-spectrum (DS-SS) systems is considered. The phase of interference is approximated by a polynomial within the considered interval. According to the local... more
Wireless sensor networks consist of various sensor nodes which operate in a memory, energy and bandwidth constrained environment. Target tracking is an important application in wireless sensor networks and in this paper we propose an... more
Many communications embedded systems implement decimation filters. In particular, base-band stage in multistandard receivers is composed of cascade of decimation filters performing channel selection. The number of used filter and the kind... more
The abilities of detecting contradictions and rearranging the cognitive space in order to cope with them are important to be embedded in the BDI architecture of an agent acting in a complex and dynamic world. However, to be accomplished... more
Nowadays, computational processes tend to be more parallelizable, especially those that are polynomially bounded, enabling the budgeting of computer architectures and networks capable of meeting non-functional requirements. This... more
Abstract. In this paper we will discuss two different translations between RDF (Resource Description Framework) and Conceptual Graphs (CGs). These translations will allow tools like Cogui and Cogitant to be able to import and export RDF(S)... more
This paper describes a recursive estimation procedure for multivariate binary densities (probability distributions of vectors of Bernoulli random variables) using orthogonal expansions. For d covariates, there are 2^d basis coefficients... more
The most natural convexities on graphs are path convexities defined by a system P of paths in a graph G that contains all geodesics. The canonical choices for P are provided by selecting all paths, triangle paths, monophonic paths and... more
A ranking on a graph is an assignment of positive integers to its vertices such that any path between two vertices of the same rank contains a vertex of strictly larger rank. A ranking is locally minimal if reducing the rank of any single... more
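A hedged checker for the ranking definition above: an assignment is a valid ranking iff, for every rank r, no connected component of the subgraph induced by vertices of rank at most r contains two vertices of rank exactly r. The graph representation below is an assumption; testing local minimality (the paper's focus) could reuse such a checker.

```python
# Check whether an integer assignment is a valid vertex ranking.
from collections import deque

def is_ranking(adj, rank):
    """adj: dict v -> list of neighbours; rank: dict v -> positive int."""
    for r in set(rank.values()):
        allowed = {v for v in adj if rank[v] <= r}
        seen = set()
        for start in allowed:
            if start in seen:
                continue
            # BFS one component of the subgraph induced by `allowed`.
            component, queue = [], deque([start])
            seen.add(start)
            while queue:
                v = queue.popleft()
                component.append(v)
                for u in adj[v]:
                    if u in allowed and u not in seen:
                        seen.add(u)
                        queue.append(u)
            if sum(1 for v in component if rank[v] == r) > 1:
                return False   # two equal-rank vertices joined by a low-rank path
    return True

path = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}      # path on 4 vertices
print(is_ranking(path, {0: 1, 1: 2, 2: 1, 3: 3}))  # True
print(is_ranking(path, {0: 1, 1: 1, 2: 2, 3: 1}))  # False: 0 and 1 share rank 1
```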
This paper presents an analysis of the I/O (read and write) complexities of external sorting algorithms that use no additional disk space. Each algorithm sorts N records, partitioning them into blocks of size B. Analyzing the... more
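As a rough companion to the kind of analysis described, a hedged back-of-the-envelope for standard k-way external merge sort (not the in-place, no-extra-disk-space variants the paper studies): with N records and blocks of B records, the pass count and total block I/Os follow directly. Parameter names are illustrative.

```python
# Back-of-the-envelope I/O count for a k-way external merge sort.
import math

def external_sort_ios(n_records, block_size, fan_in):
    """Each pass reads and writes every block once: 2 * ceil(N/B) block I/Os.

    One run-formation pass plus ceil(log_fan_in(number of runs)) merge passes,
    assuming (for simplicity) one sorted run per block after run formation.
    """
    blocks = math.ceil(n_records / block_size)
    initial_runs = blocks
    merge_passes = math.ceil(math.log(initial_runs, fan_in)) if initial_runs > 1 else 0
    passes = 1 + merge_passes
    return passes, 2 * blocks * passes

passes, ios = external_sort_ios(n_records=10**7, block_size=10**4, fan_in=100)
print(passes, ios)   # 3 passes, 6000 block I/Os for these assumed parameters
```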
Common approaches for robot navigation use Bayesian filters like particle filters, Kalman filters and their extended forms. We present an alternative and supplementing approach using constraint techniques based on spatial constraints... more
Modeling the environment is crucial for a mobile robot. Common approaches use Bayesian filters like particle filters, Kalman filters and their extended forms. We present an alternative and supplementing approach using constraint... more
The demand for higher capacity wireless communication networks has motivated research in the techniques of adaptive beamforming using smart antennas. The technique is to radiate narrow beams in a desired direction and to suppress... more
A few cases are known where the computational analogue of some basic information theoretical results is much harder to prove or even not known to hold. A notable example is Yao's XOR Lemma. Actually, even Direct Sum Conjectures can be... more
We study non-deterministic communication protocols in which no input has too many witnesses. Define n_k(f) to be the minimum complexity of a non-deterministic protocol for the function f in which each input has at most k witnesses. We... more
A fundamental lemma of Yao states that computational weak unpredictability of Boolean predicates is amplified when the results of several independent instances are XORed together. We survey two known proofs of Yao’s Lemma and present a third... more
A (q, k, t)-design matrix is an m×n matrix whose pattern of zeros/non-zeros satisfies the following design-like condition: each row has at most q non-zeros, each column has at least k non-zeros and the supports of every two columns... more
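A small checker for the combinatorial condition just stated, assuming the truncated clause reads "the supports of every two columns intersect in at most t rows" (a common form of the definition); treat that reading as an assumption.

```python
# Check the (q, k, t)-design condition on a 0/1 pattern matrix:
#   every row has at most q non-zeros,
#   every column has at least k non-zeros,
#   any two column supports share at most t rows (assumed reading).
import numpy as np

def is_design_matrix(a, q, k, t):
    a = (np.asarray(a) != 0)
    if (a.sum(axis=1) > q).any():      # row sparsity
        return False
    if (a.sum(axis=0) < k).any():      # column density
        return False
    n_cols = a.shape[1]
    for i in range(n_cols):
        for j in range(i + 1, n_cols):
            if int((a[:, i] & a[:, j]).sum()) > t:
                return False
    return True

pattern = [[1, 1, 0],
           [1, 0, 1],
           [0, 1, 1]]
print(is_design_matrix(pattern, q=2, k=2, t=1))  # True
```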
Berman and Schnitger (10) gave a randomized reduction from approximating MAX-SNP problems (24) within constant factors arbitrarily close to 1 to approximating clique within a factor of n^ε (for some ε > 0). This reduction was further studied by... more