Academia.edu


Cryptography based on neural networks: analytical results

2002, Journal of Physics A: Mathematical and General

https://doi.org/10.1088/0305-4470/35/47/104

Abstract

The mutual learning process between two feed-forward parity networks with discrete and continuous weights is studied analytically. The number of steps required to achieve full synchronization between the two networks is found to be finite in the case of discrete weights. The synchronization process is shown to be non-self-averaging, and the analytical solution is based on random auxiliary variables. The learning time of an attacker trying to imitate one of the networks is examined analytically and found to be much longer than the synchronization time. The analytical results are in agreement with simulations.
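The mutual-learning protocol for discrete weights can be illustrated with a minimal simulation: two parity machines receive a common random input and apply a Hebbian update only when their parity outputs agree, and then only to hidden units whose output matches the network output. The sizes K, N and the weight bound L below are illustrative choices, not values taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
K, N, L = 3, 10, 3  # hidden units, inputs per unit, weight bound (illustrative)

def evaluate(w, x):
    """Hidden-unit signs sigma and parity output tau = prod(sigma)."""
    sigma = np.sign(np.sum(w * x, axis=1))
    sigma[sigma == 0] = 1  # break ties deterministically
    return sigma, int(np.prod(sigma))

def hebbian_update(w, x, sigma, tau):
    """Update only hidden units whose output matches tau; clip to [-L, L]."""
    for k in range(K):
        if sigma[k] == tau:
            w[k] = np.clip(w[k] + tau * x[k], -L, L)

wA = rng.integers(-L, L + 1, size=(K, N))
wB = rng.integers(-L, L + 1, size=(K, N))

t = 0
while not np.array_equal(wA, wB):
    x = rng.choice([-1, 1], size=(K, N))   # common random input
    sA, tauA = evaluate(wA, x)
    sB, tauB = evaluate(wB, x)
    if tauA == tauB:                       # mutual learning: update on agreement
        hebbian_update(wA, x, sA, tauA)
        hebbian_update(wB, x, sB, tauB)
    t += 1

print("full synchronization after", t, "steps")
```

Because the discrete weights perform a bounded random walk driven by common inputs, the loop terminates after a finite number of steps, in line with the abstract's central result.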

FAQs

What explains the synchronization mechanism in Parity Machines (PMs)?

The study shows that synchronization is achieved through Hebbian learning and that, for discrete weights, it requires only a finite number of steps; simulations with L = 3 yield t_synch ≈ 400.

How does the architecture of PMs affect cryptographic security?

The paper demonstrates that with common inputs and mutual learning, PMs can create ephemeral keys that resist imitation by attackers, maintaining security even when the attacker knows the network architecture.
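The asymmetry between the parties and an attacker can be sketched by adding a third network that observes the common inputs and the exchanged outputs but cannot influence them: it may apply the same Hebbian rule only on steps where its own output happens to agree with the public one. All sizes below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
K, N, L = 3, 10, 3  # hidden units, inputs per unit, weight bound (illustrative)

def evaluate(w, x):
    sigma = np.sign(np.sum(w * x, axis=1))
    sigma[sigma == 0] = 1
    return sigma, int(np.prod(sigma))

def hebbian_update(w, x, sigma, tau):
    for k in range(K):
        if sigma[k] == tau:
            w[k] = np.clip(w[k] + tau * x[k], -L, L)

wA, wB, wE = (rng.integers(-L, L + 1, size=(K, N)) for _ in range(3))

t = 0
while not np.array_equal(wA, wB):
    x = rng.choice([-1, 1], size=(K, N))   # inputs are public
    sA, tauA = evaluate(wA, x)
    sB, tauB = evaluate(wB, x)
    sE, tauE = evaluate(wE, x)
    if tauA == tauB:                       # parties update each other
        hebbian_update(wA, x, sA, tauA)
        hebbian_update(wB, x, sB, tauB)
        if tauE == tauA:                   # eavesdropper can only listen
            hebbian_update(wE, x, sE, tauE)
    t += 1

match = np.mean(wE == wA)                  # fraction of attacker weights matching
print(f"A/B synchronized at t={t}; attacker weight match = {match:.2f}")
```

Since the attacker never influences the other networks' updates, it typically lags behind the mutually learning pair, which is the mechanism behind the much longer imitation time reported in the abstract.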

What are the conditions for achieving full synchronization in continuous weight PMs?

Synchronization of continuous-weight PMs requires normalizing the weights after each update; the synchronization time then scales with the input size, as derived from coupled differential equations.
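The role of normalization can be made concrete with a minimal sketch, reduced here to two single-layer units so that the normalization step stays visible (the paper treats full parity machines; the dimension N, learning rate and step count below are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(2)
N = 50          # input dimension (illustrative)
eta = 1.0 / N   # learning rate scaled with input size (illustrative)

def normalize(w):
    # renormalize after every update so that |w| = sqrt(N)
    return w * np.sqrt(N) / np.linalg.norm(w)

wA = normalize(rng.normal(size=N))
wB = normalize(rng.normal(size=N))

for _ in range(30000):
    x = rng.choice([-1.0, 1.0], size=N)
    sA, sB = np.sign(wA @ x), np.sign(wB @ x)
    if sA == sB:                         # update only when outputs agree
        wA = normalize(wA + eta * sA * x)
        wB = normalize(wB + eta * sB * x)

rho = float(wA @ wB) / N                 # overlap rho -> 1 as the units synchronize
print(f"overlap rho = {rho:.3f}")
```

On agreeing steps both units receive the identical increment, so renormalization steadily shrinks their difference vector and the overlap rho approaches 1; with continuous weights full synchronization is approached asymptotically rather than reached in a finite number of steps.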

How do different update rules impact the mutual learning of PMs?

The analytical results show that weight updates occur only when the two networks' outputs agree, and only in hidden units whose output matches the network output; the resulting dynamics therefore depend on which internal representations of the hidden units realize the agreed output.

When was the bridge between neural networks and cryptography established?

The connection was established numerically in recent literature; the present paper supplies the corresponding analytical results, pointing to potential applications of neural networks in secure cryptographic protocols.
