Rectified Linear Unit Function


"IThis article will explain what a Rectified Linear Function is in ANN. How can ReLU Activation Function Hyperbolic be used? Let us refresh our memories about activation functions and define these terms."

In artificial neural networks, the Rectified Linear Unit function, also known as the ReLU activation function, is an activation function defined as the positive part of its argument. It can be written as f(x) = max(0, x), where x is the sum of the weighted input signals to an artificial neuron. The ReLU function is also known as a ramp function and is analogous to half-wave rectification in electrical engineering.
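To make the definition concrete, here is a minimal sketch of ReLU in standard C++; the function name `relu` and the sample values are our own choices for this example:

```cpp
#include <iostream>
#include <algorithm>

// ReLU activation: returns the positive part of its argument, f(x) = max(0, x)
double relu(double x)
{
    return std::max(0.0, x);
}

int main()
{
    // Sample weighted input sums to an artificial neuron
    double inputs[] = { -2.5, -0.1, 0.0, 0.7, 3.2 };

    for (double x : inputs)
    {
        // Negative inputs map to 0; positive inputs pass through unchanged
        std::cout << "ReLU(" << x << ") = " << relu(x) << "\n";
    }

    return 0;
}
```

Note that negative inputs produce 0 while positive inputs are passed through unchanged, which is exactly the "ramp" shape the definition describes.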
