Noisy random Boolean formulae: A statistical physics perspective
2010, Physical Review E
Abstract
Typical properties of computing circuits composed of noisy logical gates are studied using statistical physics methodology. A growth model that gives rise to typical random Boolean functions is mapped onto a layered Ising spin system, which facilitates the study of the ability of such functions to represent arbitrary formulae with a given level of error, of the tolerable level of gate noise, and of its dependence on formula depth and complexity, on the gates used, and on the properties of the function inputs. Bounds on performance derived in the information theory literature for specific gates are straightforwardly retrieved, generalized, and identified as the corresponding typical-case phase transitions. The framework is then employed to derive results on error rates, function depth, and sensitivity, and on their dependence on the gate type and noise model used, results that are difficult to obtain via the traditional methods of this field.
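To make the setting concrete, the following is a minimal Python sketch, not the paper's actual growth model, of a noisy random Boolean formula: a complete binary tree of 2-input NAND gates in which each gate output is flipped independently with probability eps (an ε-noise model). The gate choice, the noise model, the input distribution, and all parameter values are illustrative assumptions; the sketch only shows how an empirical output error rate can be measured as a function of formula depth and gate noise.

```python
import random


def build_formula(depth, n_inputs, rng):
    """Recursively build a random formula: a complete binary tree of 2-input
    NAND gates of the given depth, with leaves drawn uniformly from the inputs."""
    if depth == 0:
        return rng.randrange(n_inputs)          # leaf: index of an input bit
    return (build_formula(depth - 1, n_inputs, rng),
            build_formula(depth - 1, n_inputs, rng))


def evaluate(node, x, eps, rng):
    """Evaluate the formula on input bits x; every gate output is flipped
    independently with probability eps (epsilon-noise on the gates)."""
    if isinstance(node, int):                   # leaf: noiseless input bit
        return x[node]
    left = evaluate(node[0], x, eps, rng)
    right = evaluate(node[1], x, eps, rng)
    out = 1 - (left & right)                    # noiseless NAND
    if rng.random() < eps:                      # gate noise
        out ^= 1
    return out


def error_rate(depth, n_inputs=4, eps=0.05, trials=2000, seed=0):
    """Empirical probability that the noisy output differs from the noiseless one."""
    rng = random.Random(seed)
    errors = 0
    for _ in range(trials):
        formula = build_formula(depth, n_inputs, rng)
        x = [rng.randrange(2) for _ in range(n_inputs)]
        clean = evaluate(formula, x, 0.0, rng)  # eps = 0: noiseless reference
        noisy = evaluate(formula, x, eps, rng)
        errors += int(clean != noisy)
    return errors / trials


if __name__ == "__main__":
    for depth in (1, 2, 4, 6, 8):
        print(f"depth {depth}: empirical output error ~ {error_rate(depth):.3f}")
```

NAND is used here only because it is a universal gate commonly analyzed in the noisy-formula literature; any other two-input gate could be substituted in evaluate() without changing the structure of the sketch.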
References (31)
- N[S^{L−1}; Ŝ^{L−1}] = e_k. The latter is used to show that ℓ′ = L − 2 gives F_1^{ℓ′}[y, z] F_0^{ℓ′}[y, z] = 1, and so on, until we conclude that P^ℓ(S, Ŝ) = ik for all ℓ.
- S. Borkar, IEEE Micro 25, 10 (2005).
- A. Ekert and R. Jozsa, Rev. Mod. Phys. 68, 733 (1996).
- J. von Neumann, Probabilistic logics and the synthesis of reliable organisms from unreliable components, in Automata Studies (Princeton University Press, Princeton, NJ, 1956), pp. 43-98.
- N. Pippenger, IEEE Trans. Inf. Theory 34, 194 (1988).
- T. Feder, IEEE Trans. Inf. Theory 35, 569 (1989).
- B. Hajek and T. Weller, IEEE Trans. Inf. Theory 37, 388 (1991).
- W. Evans and L. Schulman, IEEE Trans. Inf. Theory 49, 3094 (2003).
- W. Evans and N. Pippenger, IEEE Trans. Inf. Theory 44, 1299 (1998).
- F. Unger, IEEE Trans. Inf. Theory 54, 3693 (2008).
- M. Mézard and A. Montanari, Information, Physics, and Computation (Oxford University Press, Oxford, 2009).
- Y. Kabashima and D. Saad, J. Phys. A: Math. Gen. 37, R1 (2004).
- P. Savický, Discrete Math. 83 (1990).
- A. Brodsky and N. Pippenger, Random Struct. Algor. 27, 490 (2005).
- B. Steinbach and C. Lang, Artif. Intell. Rev. 20, 319 (2003).
- R. B. Boppana, Inform. Process. Lett. 63, 257 (1997).
- H. Lefmann and P. Savický, Random Struct. Algor. 10, 337 (1997).
- B. Chauvin, P. Flajolet, D. Gardy, and B. Gittenberger, Comb. Probab. Comput. 13, 475 (2004).
- D. Gardy and A. Woods, DMTCS Proceedings AD, 139 (2005).
- J. P. L. Hatchett, B. Wemmenhove, I. P. Castillo, T. Nikoletopoulos, N. S. Skantzos, and A. C. C. Coolen, J. Phys. A: Math. Gen. 37, 6201 (2004).
- C. De Dominicis, Phys. Rev. B 18, 4913 (1978).
- A. Mozeika, D. Saad, and J. Raymond, Phys. Rev. Lett. 103, 248701 (2009).
- M. Minsky and S. Papert, Perceptrons: An Introduction to Computational Geometry (MIT Press, Cambridge, MA, 1972), 2nd ed.
- J. M. Ortega, SIAM Journal on Numerical Analysis 10, 268 (1973).
- A. C. C. Coolen, R. Kühn, and P. Sollich, Theory of Neural Information Processing Systems (Oxford University Press, Oxford, 2005).
- K. Mimura and A. C. C. Coolen, J. Phys. A: Math. Theor. 42, 415001 (2009).
- I. Neri and D. Bollé, J. Stat. Mech. Theory Exp. 2009, P08009 (2009).
- B. Derrida, E. Gardner, and A. Zippelius, Europhys. Lett. 4, 167 (1987).
- H. Fournier, D. Gardy, and A. Genitrini, in 6th SIAM Workshop on Analytic Algorithmics and Combinatorics (ANALCO) (2009), pp. 51-57.
- A. Mozeika and D. Saad, in progress.
- To establish the connection between [28] and our work we use the mapping S_i = 1 − 2x_i from x_i ∈ {0, 1} to S_i ∈ {−1, +1}; a short illustrative check of this change of variables is given below.
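As a quick check of the mapping in the note above (an illustration added here, not part of the paper), the snippet below verifies S = 1 − 2x, its inverse x = (1 − S)/2, and the standard identity that XOR of two bits corresponds to the product of the associated spins, which is what allows Boolean relations to be written as spin interactions.

```python
def to_spin(x):
    """Map a Boolean variable x in {0, 1} to an Ising spin S in {+1, -1} via S = 1 - 2x."""
    return 1 - 2 * x


def to_bool(s):
    """Inverse mapping: x = (1 - S) / 2."""
    return (1 - s) // 2


# Verify the mapping, its inverse, and the identity 1 - 2*(x1 XOR x2) = S1 * S2.
for x1 in (0, 1):
    for x2 in (0, 1):
        s1, s2 = to_spin(x1), to_spin(x2)
        assert to_bool(s1) == x1 and to_bool(s2) == x2
        assert s1 * s2 == to_spin(x1 ^ x2)
print("mapping S = 1 - 2x and XOR-to-product identity verified")
```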