On a learning model with growing memory

1974, Cybernetics

https://doi.org/10.1007/BF01068453

Abstract

In this article we present an example of a growing structure that consists, at each instant of time, of a specified automaton (a head) and finitely many finite automata (elements) of the same type. This structure interacts with a pair of Bernoulli random generators that yield ones and zeros with probabilities unknown to the structure, which interrogates one of the generators during each interval. The principal result is as follows: for any initial state of the structure and any generators, the limiting average frequency of interrogation of the generator that yields a "one" with the higher probability is equal to 1. In this sense the structure realizes an optimal training algorithm. The algorithm consists of alternating so-called learning and operating cycles. In Sec. 1 we formulate a sufficient condition for the optimality of such algorithms. In Sec. 2 we describe the automata forming the structure, and in Sec. 3 we show that the algorithm it realizes satisfies this sufficient condition. Each element of the structure has three states, and the minimum possible number of states is in any case not smaller than two. In Sec. 4 we present bounds related to a probability-theoretic scheme whose simulation underlies the operation of the structure; these bounds are perhaps of intrinsic interest.
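The abstract does not specify the internal construction of the growing-memory structure, but the setting it describes is the classical two-armed bandit: two Bernoulli generators with unknown success probabilities, interrogated one at a time, with optimality meaning that the long-run frequency of interrogating the better generator tends to 1. The following is a hypothetical, minimal sketch of that setting; the decision rule here is a simple epsilon-decreasing strategy (a stand-in for the paper's actual automaton structure, not a reconstruction of it), which also drives the interrogation frequency of the better generator toward 1.

```python
import random

def simulate(p, q, steps=200_000, seed=0):
    """Two Bernoulli 'generators' with success probabilities p and q.

    Decision rule (illustrative only, NOT the paper's structure):
    mostly interrogate the generator with the higher empirical mean,
    exploring a random generator with probability ~ 1/sqrt(t).
    Returns the average frequency of interrogating the better generator.
    """
    rng = random.Random(seed)
    probs = (p, q)
    counts = [0, 0]   # how many times each generator was interrogated
    ones = [0, 0]     # how many ones each generator produced
    for t in range(1, steps + 1):
        if rng.random() < t ** -0.5:
            # decaying exploration: interrogate a generator at random
            arm = rng.randrange(2)
        else:
            # exploitation: interrogate the empirically better generator
            means = [ones[i] / counts[i] if counts[i] else 0.5
                     for i in range(2)]
            arm = 0 if means[0] >= means[1] else 1
        counts[arm] += 1
        ones[arm] += rng.random() < probs[arm]
    best = 0 if p >= q else 1
    return counts[best] / steps
```

For a well-separated pair such as `simulate(0.7, 0.4)`, the returned frequency is close to 1, illustrating the optimality criterion stated in the abstract; the exploration probability must decay slowly enough that both generators are sampled infinitely often, yet fast enough that its contribution to the average frequency vanishes.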
