Derivation: Derivatives for Common Neural Network Activation Functions
The material in this post has been migrated, with Python implementations, to my GitHub Pages website.
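As a brief illustration of the post's topic, here is a minimal sketch (not the migrated implementation itself) of the two derivative identities the tags refer to: the logistic sigmoid satisfies σ'(x) = σ(x)(1 − σ(x)), and tanh satisfies d/dx tanh(x) = 1 − tanh²(x). The function names below are assumptions for this example; each closed-form derivative is checked against a central finite difference.

```python
import math

def sigmoid(x):
    # Logistic sigmoid: sigma(x) = 1 / (1 + exp(-x))
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_prime(x):
    # Closed-form derivative: sigma'(x) = sigma(x) * (1 - sigma(x))
    s = sigmoid(x)
    return s * (1.0 - s)

def tanh_prime(x):
    # Closed-form derivative: d/dx tanh(x) = 1 - tanh(x)**2
    return 1.0 - math.tanh(x) ** 2

# Sanity check: compare each closed form to a central finite difference
h = 1e-6
for x in (-1.5, 0.0, 2.0):
    fd_sigmoid = (sigmoid(x + h) - sigmoid(x - h)) / (2 * h)
    fd_tanh = (math.tanh(x + h) - math.tanh(x - h)) / (2 * h)
    assert abs(sigmoid_prime(x) - fd_sigmoid) < 1e-6
    assert abs(tanh_prime(x) - fd_tanh) < 1e-6
```

These identities are why backpropagation through sigmoid and tanh layers is cheap: the derivative can be computed from the activation value already produced in the forward pass.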
Posted on September 8, 2014, in Classification, Derivations, Machine Learning, Neural Networks, Regression and tagged Backpropagation, backpropagation algorithm, Logistic Sigmoid, Neural Networks, Quotient Rule, Tanh Function.