Anonymous

Which Function Is Better Use In Backpropagation Neural Networks, Sigmoid(x) Or Tanh(x)?


1 Answer

Akshay Kalbag answered
The sigmoid(x) function is the better choice for a backpropagation neural network. It serves two purposes in such a network: it introduces non-linearity into the model, and it keeps signals bounded within a specified range.

A popular neural-net element computes a linear combination of its input signals and then applies a bounded sigmoid(x) function to the result. This model can be seen as a smoothed variant of the classical threshold neuron.
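As a minimal sketch of that element, here is one such neuron in Python (the weights, bias, and inputs below are made-up illustrative values, not from any particular network):

```python
import math

def sigmoid(x):
    # Logistic function: squashes any real input into the open interval (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def neuron(inputs, weights, bias):
    # Linear combination of the input signals, followed by the
    # bounded sigmoid squashing described above
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return sigmoid(z)

# Example with arbitrary illustrative values
out = neuron([1.0, 2.0], [0.5, -0.3], 0.1)
```

Because the sigmoid is bounded, the neuron's output always stays between 0 and 1 no matter how large the weighted sum grows, which is the "signals remain within a range" property mentioned above.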

There is one fundamental reason the sigmoid(x) function is so popular in backpropagation networks: its derivative can be written in terms of the function itself, d/dt sig(t) = sig(t)[1 - sig(t)]. This simple polynomial relationship between the function and its derivative makes the gradient very cheap to compute during backpropagation. (It is worth noting that tanh(t) enjoys a similarly convenient identity, d/dt tanh(t) = 1 - tanh(t)^2, which is one reason it is also a common choice.)
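The derivative identity is easy to verify numerically. The sketch below computes the derivative via sig(t)[1 - sig(t)] and checks it against a central finite-difference approximation (the test point 0.7 and step size are arbitrary choices for illustration):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_prime(x):
    # Derivative via the identity d/dt sig(t) = sig(t) * (1 - sig(t));
    # reuses the already-computed sigmoid value, so it costs almost nothing
    s = sigmoid(x)
    return s * (1.0 - s)

# Central finite-difference approximation at an arbitrary point
h = 1e-6
x = 0.7
numeric = (sigmoid(x + h) - sigmoid(x - h)) / (2.0 * h)
```

During backpropagation this matters because the forward pass has already computed sig(t) at every neuron, so the gradient comes almost for free with one multiply and one subtract.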

