# Deep Learning Fundamentals

## Activation Functions

### Common Activation Functions

#### 1. Sigmoid

The function is defined as $$ f(x) = \frac{1}{1 + e^{-x}} $$, and its range is $$ (0,1) $$. Its graph:

![pic](https://raw.githubusercontent.com/deel-learn/DeepLearning-500-questions/master/ch3_%E6%B7%B1%E5%BA%A6%E5%AD%A6%E4%B9%A0%E5%9F%BA%E7%A1%80/img/ch3/3-26.png)

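As a quick illustration, the definition translates directly to NumPy (a minimal sketch; the function name `sigmoid` is ours, not from the source):

```python
import numpy as np

def sigmoid(x):
    # f(x) = 1 / (1 + e^{-x}): squashes any real input into (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

print(sigmoid(0.0))  # 0.5, the midpoint of the range
```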
#### 2. Tanh

The function is defined as $$ f(x) = \tanh(x) = \frac{e^x - e^{-x}}{e^x + e^{-x}} $$, and its range is $$ (-1,1) $$. Its graph:

![pic](https://raw.githubusercontent.com/deel-learn/DeepLearning-500-questions/master/ch3_%E6%B7%B1%E5%BA%A6%E5%AD%A6%E4%B9%A0%E5%9F%BA%E7%A1%80/img/ch3/3-27.png)
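A sketch of the same definition in NumPy (NumPy ships `np.tanh`; the expression is written out here only to mirror the formula above):

```python
import numpy as np

def tanh(x):
    # f(x) = (e^x - e^{-x}) / (e^x + e^{-x}); zero-centered, range (-1, 1)
    return (np.exp(x) - np.exp(-x)) / (np.exp(x) + np.exp(-x))

print(tanh(0.0))  # 0.0
```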
#### 3. ReLU

The function is defined as $$ f(x) = \max(0, x) $$, and its range is $$ [0,+\infty) $$. Its graph:

![pic](https://raw.githubusercontent.com/deel-learn/DeepLearning-500-questions/master/ch3_%E6%B7%B1%E5%BA%A6%E5%AD%A6%E4%B9%A0%E5%9F%BA%E7%A1%80/img/ch3/3-28.png)
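The element-wise maximum maps directly onto `np.maximum` (a minimal sketch, not from the source):

```python
import numpy as np

def relu(x):
    # f(x) = max(0, x): negative inputs are zeroed, positive inputs pass through
    return np.maximum(0, x)

print(relu(np.array([-2.0, 0.0, 3.0])))  # [0. 0. 3.]
```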
#### 4. Leaky ReLU

The function is defined as $$ f(x) = \begin{cases} ax, & x < 0 \\ x, & x \geq 0 \end{cases} $$, and its range is $$ (-\infty,+\infty) $$. Its graph (with $$ a = 0.5 $$):

![pic](https://raw.githubusercontent.com/deel-learn/DeepLearning-500-questions/master/ch3_%E6%B7%B1%E5%BA%A6%E5%AD%A6%E4%B9%A0%E5%9F%BA%E7%A1%80/img/ch3/3-29.png)
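The piecewise definition can be sketched with `np.where`; the default `a=0.5` matches the value used in the figure above (the function name and keyword argument are our choice):

```python
import numpy as np

def leaky_relu(x, a=0.5):
    # f(x) = a*x for x < 0, x otherwise: a small slope keeps
    # gradients flowing for negative inputs, unlike plain ReLU
    return np.where(x < 0, a * x, x)

print(leaky_relu(np.array([-2.0, 3.0])))  # [-1.  3.]
```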
#### 5. Softplus

The function is defined as $$ f(x) = \ln(1 + e^x) $$, and its range is $$ (0,+\infty) $$. Its graph:

![pic](https://raw.githubusercontent.com/deel-learn/DeepLearning-500-questions/master/ch3_%E6%B7%B1%E5%BA%A6%E5%AD%A6%E4%B9%A0%E5%9F%BA%E7%A1%80/img/ch3/3-30.png)
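A minimal NumPy sketch of the definition; `np.log1p(np.exp(x))` computes $\ln(1 + e^x)$ with better accuracy than `np.log(1 + np.exp(x))` when $e^x$ is tiny:

```python
import numpy as np

def softplus(x):
    # f(x) = ln(1 + e^x): a smooth, strictly positive approximation of ReLU
    return np.log1p(np.exp(x))

print(softplus(0.0))  # ln(2), approximately 0.6931
```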
#### 6. Softmax

The function is defined as $$ \sigma(z)_j = \frac{e^{z_j}}{\sum_{k=1}^{K} e^{z_k}} $$.

Softmax is mainly used in the output layer of multi-class classification networks.
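The definition above normalizes the exponentials so the outputs form a probability distribution. A common numerically stable sketch subtracts the maximum before exponentiating, which does not change the result (our implementation, not from the source):

```python
import numpy as np

def softmax(z):
    # sigma(z)_j = e^{z_j} / sum_k e^{z_k}
    # Subtracting max(z) prevents overflow in np.exp without
    # changing the ratios, hence the same output.
    e = np.exp(z - np.max(z))
    return e / e.sum()

p = softmax(np.array([1.0, 2.0, 3.0]))
print(p.sum())  # sums to 1 (up to floating-point error)
```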