Activation functions

 My notes about Activation Functions

(In progress)


ReLU - Rectified Linear Unit 

Acts as a threshold.

Everything below 0 is set to 0.

Everything above 0 stays as it is.

In other words, ReLU(x) = max(0, x).
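
A minimal sketch in Python, just to make the behaviour concrete (the function name relu is my own):

    def relu(x):
        # Everything below 0 becomes 0; everything above 0 stays as it is.
        return max(0.0, x)

    print([relu(v) for v in [-2.0, -0.5, 0.0, 1.5]])  # [0.0, 0.0, 0.0, 1.5]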

Identity function 

Passes the input through unchanged: f(x) = x.
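
The same kind of sketch in Python (identity is my own name for it):

    def identity(x):
        # Pass the input through unchanged: f(x) = x.
        return x

    print([identity(v) for v in [-2.0, 0.0, 1.5]])  # [-2.0, 0.0, 1.5]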


Modulus function 

Passes the absolute value of the input: f(x) = |x|.
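
And a sketch for the modulus (again, the function name is my own):

    def modulus(x):
        # Return the absolute value: f(x) = |x|.
        return abs(x)

    print([modulus(v) for v in [-2.0, 0.0, 1.5]])  # [2.0, 0.0, 1.5]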