First Modern Backpropagation

In his 1976 paper “Taylor Expansion of the Accumulated Rounding Error,” S. Linnainmaa presented a novel approach to analyzing accumulated rounding errors in numerical computations through Taylor series expansion. By systematically expanding the rounding...
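
The paper's recursive expansion of accumulated error is, in modern terms, reverse-mode automatic differentiation, the algorithmic core of backpropagation. The sketch below is not from the paper; it is a minimal modern illustration of how derivatives are accumulated by a single backward sweep over a recorded computation graph, with the `Var` class and helper names invented for illustration.

```python
# Minimal sketch of reverse-mode accumulation (the algorithmic core that
# Linnainmaa's error analysis anticipates). Names and structure here are
# illustrative, not taken from the 1976 paper.


class Var:
    """Scalar node in a computation graph with a gradient accumulator."""

    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents   # list of (parent_node, local_derivative)
        self.grad = 0.0

    def __add__(self, other):
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

    def backward(self):
        # Reverse sweep: propagate d(output)/d(node) from the output back
        # through the recorded local derivatives (the chain rule).
        self.grad = 1.0
        for node in reversed(topo_order(self)):
            for parent, local in node.parents:
                parent.grad += node.grad * local


def topo_order(root):
    order, seen = [], set()

    def visit(node):
        if id(node) not in seen:
            seen.add(id(node))
            for parent, _ in node.parents:
                visit(parent)
            order.append(node)

    visit(root)
    return order


# Example: f(x, y) = x * y + x  ->  df/dx = y + 1, df/dy = x
x, y = Var(2.0), Var(3.0)
f = x * y + x
f.backward()
print(f.value, x.grad, y.grad)   # 8.0 4.0 2.0
```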

First version of dropout

The 1990 paper “A Stochastic Version of the Delta Rule” by Stephen J. Hanson introduces a modified delta rule that injects synaptic noise into the weight-update process of a neural network. This stochastic approach, termed the Stochastic...
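
The rule treats each weight as a Gaussian random variable whose mean and spread are both adapted, so every forward pass sees a slightly different network. The sketch below illustrates that idea on a single linear unit; the learning rates, the shrinkage rule for the standard deviation, and the toy data are illustrative assumptions, not Hanson's exact formulation.

```python
# Hedged sketch of the stochastic-delta-rule idea: weights are Gaussian random
# variables (mean and standard deviation both adapted), with a noisy sample
# drawn on every forward pass. Constants and the single-unit setup are
# assumptions for illustration only.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: learn y = 2*x1 - x2 with a single linear unit.
X = rng.normal(size=(200, 2))
y = X @ np.array([2.0, -1.0])

mu = np.zeros(2)          # mean of each synaptic weight
sigma = np.full(2, 0.5)   # standard deviation of each synaptic weight
lr_mu, lr_sigma, decay = 0.05, 0.01, 0.999

for epoch in range(100):
    for x_i, y_i in zip(X, y):
        w = mu + sigma * rng.normal(size=2)         # sample weights for this pass
        err = y_i - x_i @ w                         # delta-rule error term
        mu += lr_mu * err * x_i                     # update the weight means
        sigma -= lr_sigma * abs(err) * np.abs(x_i)  # simplified shrinkage of the noise
        sigma = np.clip(sigma * decay, 1e-3, None)  # anneal toward determinism

print(mu)   # should approach [2, -1]
```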

Ising Model

The first non-learning recurrent neural network (RNN) architecture, known as the Ising model or Lenz-Ising model, was introduced by physicist Wilhelm Lenz in 1920 and analyzed by his student Ernst Ising in 1925. The model was originally developed to understand ferromagnetism in...
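
Read as a network, each lattice site is a binary threshold unit whose next state depends on the states of its neighbours, which is why the model is cited as the earliest recurrent architecture. The sketch below shows zero-temperature dynamics on a small one-dimensional ring; the lattice size, coupling strength, and update schedule are illustrative choices, not taken from Lenz's or Ising's papers.

```python
# Minimal sketch of zero-temperature dynamics on a small Ising system: each
# spin repeatedly updates from the states of its neighbours, like a binary
# threshold neuron in a recurrent net. All parameters are illustrative.
import numpy as np

rng = np.random.default_rng(1)
n = 20                                  # 1D ring of n spins
J = 1.0                                 # ferromagnetic nearest-neighbour coupling
s = rng.choice([-1, 1], size=n)         # random initial spin configuration


def energy(spins):
    # E = -J * sum over nearest-neighbour pairs of s_i * s_j (periodic boundary)
    return -J * np.sum(spins * np.roll(spins, 1))


for sweep in range(50):
    for i in rng.permutation(n):
        # Local field from the two neighbours; flip to align with it.
        h = J * (s[(i - 1) % n] + s[(i + 1) % n])
        s[i] = 1 if h >= 0 else -1      # threshold update, like a binary neuron

print(energy(s), s)                     # energy decreases toward an aligned state
```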

Adaptive Ising Model

In his 1972 paper “Learning Patterns and Pattern Sequences by Self-Organizing Nets of Threshold Elements,” S.-I. Amari proposed a pioneering approach to pattern recognition and sequence learning through the use of self-organizing neural networks. These...
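
To make the idea concrete, the sketch below implements a network of threshold elements with correlation-type (Hebbian) weights that recalls a stored binary pattern through recurrent updates. The pattern count, the sign threshold, and the outer-product learning rule are standard textbook choices used for illustration, not a reproduction of Amari's 1972 formulation.

```python
# Hedged sketch of a net of threshold elements with correlation (Hebbian)
# weights recalling a stored pattern. Sizes and rules are textbook choices,
# not taken from the 1972 paper.
import numpy as np

rng = np.random.default_rng(2)
n, n_patterns = 64, 3
patterns = rng.choice([-1, 1], size=(n_patterns, n))

# Correlation (Hebbian) weights from the stored patterns, no self-connections.
W = patterns.T @ patterns / n
np.fill_diagonal(W, 0.0)

# Start from a corrupted version of the first pattern and let the
# threshold elements settle.
state = patterns[0].copy()
flip = rng.choice(n, size=10, replace=False)
state[flip] *= -1

for step in range(10):
    state = np.where(W @ state >= 0, 1, -1)   # synchronous threshold update

print(np.mean(state == patterns[0]))          # ~1.0: the pattern is recovered
```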

Linear regression

Linear regression remains a frequently used method for modeling linear dependencies between phenomena. It is often the first off-the-shelf method in fields such as econometrics, biology, engineering, and the social sciences, owing to its simplicity and...
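
In its most common form, ordinary least squares, the coefficients are chosen to minimize the squared error between predictions and observations. The sketch below fits such a model to synthetic data; the data-generating coefficients and noise level are arbitrary illustrative values.

```python
# Minimal sketch of ordinary least squares: fit coefficients by solving the
# least-squares problem on synthetic data with known generating coefficients.
import numpy as np

rng = np.random.default_rng(3)
n = 200
X = rng.normal(size=(n, 2))
y = 1.5 * X[:, 0] - 0.7 * X[:, 1] + 0.3 + rng.normal(scale=0.1, size=n)

# Add an intercept column and solve min_beta ||y - A @ beta||^2.
A = np.column_stack([np.ones(n), X])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)

print(beta)   # approximately [0.3, 1.5, -0.7]
```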