A hyperbolic tangent activation function that maps any real-valued input to the open interval (−1, 1), commonly used in neural networks before ReLU became dominant.