Unofficial PyTorch implementation of "Squareplus: A Softplus-Like Algebraic Rectifier"

Squareplus

Squareplus is a Softplus-like activation function: a very simple, smooth approximation of ReLU.

The form of squareplus is very simple; it uses only addition, multiplication, division, and square root:

    squareplus(x) = (x + √(x² + b)) / 2

Here b > 0. When b = 0, it degenerates to ReLU(x) = max(x, 0).
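Below is a minimal PyTorch sketch of this definition (not necessarily identical to this repo's code; the default b = 4.0 is just an illustrative choice, not prescribed by the paper or the repo):

```python
import torch

def squareplus(x: torch.Tensor, b: float = 4.0) -> torch.Tensor:
    # squareplus(x) = (x + sqrt(x^2 + b)) / 2
    # b > 0 yields a smooth approximation of ReLU; b = 0 recovers ReLU exactly.
    return 0.5 * (x + torch.sqrt(x * x + b))

class SquarePlus(torch.nn.Module):
    """Module wrapper so the activation can be dropped into nn.Sequential."""

    def __init__(self, b: float = 4.0):
        super().__init__()
        self.b = b  # illustrative default; tune per task

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return squareplus(x, self.b)
```

As a sanity check, squareplus(x, b=0.0) matches torch.relu(x) up to floating-point rounding, since (x + |x|) / 2 = max(x, 0).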

The original paper points out that, because squareplus uses only addition, multiplication, division, and square root, it is faster than Softplus and similar functions, mainly on the CPU.
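One rough way to check this claim on your own machine is a CPU micro-benchmark like the sketch below (the tensor size and run count are arbitrary choices, and timings will vary by hardware):

```python
import time

import torch
import torch.nn.functional as F

x = torch.randn(10_000_000)  # large CPU tensor so per-call overhead is negligible

def bench(fn, runs: int = 20) -> float:
    fn(x)  # warm-up run
    start = time.perf_counter()
    for _ in range(runs):
        fn(x)
    return (time.perf_counter() - start) / runs

print(f"softplus:   {bench(F.softplus):.4f} s/run")
print(f"squareplus: {bench(lambda t: 0.5 * (t + torch.sqrt(t * t + 4.0))):.4f} s/run")
```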

In Jianlin Su's