diff --git a/Chapters/3-Activation-Functions/INDEX.md b/Chapters/3-Activation-Functions/INDEX.md
index 6d4404a..a646206 100644
--- a/Chapters/3-Activation-Functions/INDEX.md
+++ b/Chapters/3-Activation-Functions/INDEX.md
@@ -14,7 +14,7 @@ their **derivative is near 0**
 ## List of Non-Saturating Activation Functions
-
+
 ### ReLU
@@ -487,8 +487,6 @@ for `activation-functions`**
 and it is used to **deal with numerical instabilities**
 and as a **component for other losses**
-
-
 [^PReLU]: [Microsoft Paper | arXiv:1502.01852v1 [cs.CV] 6 Feb 2015](https://arxiv.org/pdf/1502.01852v1)
 [^RReLU]: [Empirical Evaluation of Rectified Activations in Convolution Network](https://arxiv.org/pdf/1505.00853v2)
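
For context on the first hunk: its surrounding lines carry the chapter's core distinction, saturating functions whose **derivative is near 0** at the tails versus the non-saturating list headed by ReLU. A minimal NumPy sketch of that contrast (illustration only, not part of the patched `INDEX.md`): the sigmoid's gradient collapses toward 0 as |x| grows, while ReLU's gradient stays at 1 on the whole positive side.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_grad(x):
    # sigmoid'(x) = sigmoid(x) * (1 - sigmoid(x)): saturates toward 0 for large |x|
    s = sigmoid(x)
    return s * (1.0 - s)

def relu_grad(x):
    # ReLU'(x) = 1 for x > 0, else 0 (taking subgradient 0 at x = 0):
    # the gradient never shrinks on the active side, hence "non-saturating"
    return (x > 0).astype(np.float64)

for x in np.array([0.0, 2.0, 5.0, 10.0]):
    print(f"x = {x:5.1f}   sigmoid' = {sigmoid_grad(x):.6f}   ReLU' = {relu_grad(x):.0f}")
    # at x = 10.0, sigmoid' is already ~0.000045 while ReLU' is still 1
```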
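
The second hunk's context ("deal with **numerical instabilities**" and "**component for other losses**") reads like the chapter's Softplus discussion; that attribution is an assumption on my part, since the relevant heading sits outside the hunk. Under that assumption, a sketch of why the stability point matters: the textbook form log(1 + e^x) overflows for large x, while the algebraically equivalent max(x, 0) + log(1 + e^-|x|) form does not.

```python
import numpy as np

def softplus_naive(x):
    # log(1 + e^x): e^x overflows to inf around x = 710 in float64
    return np.log(1.0 + np.exp(x))

def softplus_stable(x):
    # Equivalent rewrite: max(x, 0) + log1p(e^-|x|).
    # The exponent is always <= 0, so exp() never overflows.
    return np.maximum(x, 0.0) + np.log1p(np.exp(-np.abs(x)))

x = np.array([-1000.0, -1.0, 0.0, 1.0, 1000.0])
print(softplus_naive(x))   # [0. 0.3133 0.6931 1.3133 inf] plus an overflow warning
print(softplus_stable(x))  # [0. 0.3133 0.6931 1.3133 1000.]
```

The "component for other losses" remark likely refers to the same identity: softplus(-x) = log(1 + e^-x) is the logistic loss term that appears inside binary cross-entropy and related objectives.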