Blind Descent: A Prequel to Gradient Descent
We describe an alternative to gradient descent for backpropagation through a neural network, which we call Blind Descent. We believe Blind Descent can augment backpropagation, both as an initialisation method and when training saturates. By design, Blind Descent does not suffer from exploding or vanishing gradients.
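The abstract does not spell out the update rule, but the key claim, that no gradients are computed, can be illustrated with a generic gradient-free step: propose a random perturbation of the weights and keep it only if the loss decreases. This is a minimal sketch of that idea, not necessarily the paper's exact method; the function names and the linear toy model are illustrative assumptions.

```python
import numpy as np

def loss(w, X, y):
    # Mean squared error of a linear model; stands in for any network loss.
    return np.mean((X @ w - y) ** 2)

def blind_descent_step(w, X, y, lr=0.1, rng=None):
    # Propose a random perturbation scaled by the learning rate and accept
    # it only if the loss decreases. No gradients are ever computed, so
    # vanishing or exploding gradients cannot occur.
    rng = np.random.default_rng() if rng is None else rng
    candidate = w + lr * rng.standard_normal(w.shape)
    return candidate if loss(candidate, X, y) < loss(w, X, y) else w

# Toy regression problem: y = 2*x0 - 3*x1.
rng = np.random.default_rng(0)
X = rng.standard_normal((64, 2))
y = X @ np.array([2.0, -3.0])

w = np.zeros(2)
for _ in range(500):
    w = blind_descent_step(w, X, y, lr=0.1, rng=rng)
```

Because each step is accepted only when it improves the loss, the procedure is monotonically non-increasing in the loss, which is also why it could plausibly serve as a cheap initialiser before switching to gradient-based training.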