Optimal storage capacity of neural networks at finite temperatures

Publication date and time: 1993-06-15T01:54:55Z

Authors and affiliations:
G. M. Shimi (Department of Physics and Center for Theoretical Physics Seoul National University, Seoul 151-742, Korea)
D. Kim (Department of Physics and Center for Theoretical Physics Seoul National University, Seoul 151-742, Korea)
M. Y. Choi (Department of Physics and Center for Theoretical Physics Seoul National University, Seoul 151-742, Korea)

Journal reference: none found
Comments: 22 pages in LaTeX, 4 figures upon request, SNUTP-93-26
Primary category: cond-mat

All categories: cond-mat

Abstract: Gardner's analysis of the optimal storage capacity of neural networks is extended to study finite-temperature effects. The typical volume of the space of interactions is calculated for strongly-diluted networks as a function of the storage ratio $\alpha$, the temperature $T$, and the tolerance parameter $m$, from which the optimal storage capacity $\alpha_c$ is obtained as a function of $T$ and $m$. At zero temperature it is found that $\alpha_c = 2$ regardless of $m$, while at finite temperatures $\alpha_c$ in general increases with the tolerance. We show how the best performance for given $\alpha$ and $T$ is obtained, which reveals a first-order transition from high-quality to low-quality performance at low temperatures. An approximate criterion for recalling, valid near $m=1$, is also discussed.
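For context, the zero-temperature value $\alpha_c = 2$ quoted in the abstract is consistent with Gardner's replica-symmetric capacity formula for a spherical perceptron with stability threshold $\kappa$; a minimal sketch in standard notation (not defined in the abstract, and with $\kappa = 0$ corresponding to the marginal case):

```latex
% Gardner's capacity (replica-symmetric, spherical constraint):
%   \alpha_c(\kappa)^{-1} = \int_{-\kappa}^{\infty} Dt\, (t+\kappa)^2,
%   with the Gaussian measure Dt = \frac{dt}{\sqrt{2\pi}} e^{-t^2/2}.
\alpha_c(\kappa) = \left[ \int_{-\kappa}^{\infty} \frac{dt}{\sqrt{2\pi}}\,
  e^{-t^2/2}\,(t+\kappa)^2 \right]^{-1}
% At \kappa = 0: \int_0^\infty Dt\, t^2 = \tfrac{1}{2}, hence \alpha_c(0) = 2.
```

The paper extends this zero-temperature calculation to finite $T$ and finite tolerance $m$, where the capacity ceases to be universal.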

Category: Physics