Gaussian Error Linear Unit Activates Neural Networks Beyond ReLU | Synced
Across the experiments, GELU consistently outperforms both ReLU and ELU, making it a viable alternative to these earlier nonlinearities.
Source: Synced | AI Technology & Industry Review
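For reference, GELU weights an input by the standard normal CDF, GELU(x) = x·Φ(x), and is often computed via a tanh approximation. A minimal Python sketch (not taken from the article) of both forms:

```python
import math

def gelu(x: float) -> float:
    """Exact GELU: x * Phi(x), where Phi is the standard normal CDF."""
    return 0.5 * x * (1.0 + math.erf(x / math.sqrt(2.0)))

def gelu_tanh(x: float) -> float:
    """Widely used tanh approximation of GELU."""
    return 0.5 * x * (1.0 + math.tanh(
        math.sqrt(2.0 / math.pi) * (x + 0.044715 * x ** 3)))
```

Unlike ReLU, which zeroes all negative inputs, GELU lets small negative values pass through attenuated, which is the behavior the experiments compare.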