Below is an overview of the common activation functions. In TensorFlow they all live in tf.nn.
The most common is the ReLU function, which zeroes out every value below 0, i.e. relu(x) = max(0, x).
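A minimal sketch of relu (assuming TensorFlow 2.x with eager execution; the sample values are just for illustration):

```python
import tensorflow as tf

# relu(x) = max(0, x): every negative input is set to 0
x = tf.constant([-3.0, -1.0, 0.0, 2.0, 10.0])
print(tf.nn.relu(x).numpy())  # -> [ 0.  0.  0.  2. 10.]
```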
As you can see, ReLU's output is unbounded above (it can grow toward positive infinity), which can lead to exploding or vanishing gradients. The relu6 function adds an upper bound on top of relu: relu6(x) = min(max(0, x), 6), so 6 is the upper bound.
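The same kind of sketch for relu6 (again assuming TensorFlow 2.x eager execution):

```python
import tensorflow as tf

# relu6(x) = min(max(0, x), 6): like relu, but clipped at the upper bound 6
x = tf.constant([-3.0, 2.0, 6.0, 10.0])
print(tf.nn.relu6(x).numpy())  # -> [0. 2. 6. 6.]
```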
Next is the sigmoid function. Because its output is not zero-centered, it is rarely used directly as an activation function nowadays. The tanh function is similar in shape to sigmoid, but its output is zero-centered.
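A quick look at both on the same inputs (same TensorFlow 2.x assumption); note how sigmoid never goes below 0 while tanh is symmetric around it:

```python
import tensorflow as tf

x = tf.constant([-1.0, 0.0, 1.0])
# sigmoid(x) = 1 / (1 + exp(-x)): range (0, 1), so outputs are never negative
print(tf.nn.sigmoid(x).numpy())  # -> [0.26894143 0.5 0.7310586 ]
# tanh(x): range (-1, 1), zero-centered
print(tf.nn.tanh(x).numpy())     # -> [-0.7615942  0.  0.7615942]
```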
softsign: this function also gets used as an activation function. Its form is x / (abs(x) + 1), and it can be seen as a continuous, smooth approximation of the sign function.
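A small example (same TensorFlow 2.x assumption):

```python
import tensorflow as tf

# softsign(x) = x / (|x| + 1): squashes into (-1, 1), smoothly approximating sign(x)
x = tf.constant([-1.0, 0.0, 1.0])
print(tf.nn.softsign(x).numpy())  # -> [-0.5  0.   0.5]
```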
softplus: a smooth version of the ReLU function. Its form is log(exp(x) + 1).
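Another small example (same TensorFlow 2.x assumption); note softplus(0) = log(2) rather than 0:

```python
import tensorflow as tf

# softplus(x) = log(exp(x) + 1): a smooth version of relu
x = tf.constant([-1.0, 0.0, 1.0])
print(tf.nn.softplus(x).numpy())  # -> [0.31326166 0.6931472 1.3132616]
```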
The Exponential Linear Unit (ELU) is very similar to softplus, except that its bottom asymptote is -1 instead of 0. Its form is exp(x) - 1 if x < 0, else x. In other words, ELU is bounded below by -1 and is a smooth approximation of ReLU.
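A sketch that shows the -1 asymptote (same TensorFlow 2.x assumption):

```python
import tensorflow as tf

# elu(x) = x if x >= 0 else exp(x) - 1: approaches -1 as x -> -inf
x = tf.constant([-10.0, -1.0, 0.0, 1.0])
print(tf.nn.elu(x).numpy())  # -> [-0.9999546 -0.63212055  0.  1.]
```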
A comparison plot of all the activations above:
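A minimal sketch to generate such a comparison plot (assuming TensorFlow 2.x and matplotlib; the input range [-6, 6] is an arbitrary choice for display):

```python
import matplotlib.pyplot as plt
import tensorflow as tf

# Evaluate each activation on the same input range and overlay the curves
x = tf.linspace(-6.0, 6.0, 200)
activations = {
    'relu': tf.nn.relu,
    'relu6': tf.nn.relu6,
    'sigmoid': tf.nn.sigmoid,
    'tanh': tf.nn.tanh,
    'softsign': tf.nn.softsign,
    'softplus': tf.nn.softplus,
    'elu': tf.nn.elu,
}
for name, fn in activations.items():
    plt.plot(x.numpy(), fn(x).numpy(), label=name)
plt.legend(loc='upper left')
plt.grid(True)
plt.show()
```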