Issue | SHS Web Conf., Volume 144, 2022 — 2022 International Conference on Science and Technology Ethics and Human Future (STEHF 2022)
---|---
Article Number | 02006
Number of page(s) | 5
Section | Mobile Communication Technology and Prospects of Frontier Technology
DOI | https://doi.org/10.1051/shsconf/202214402006
Published online | 26 August 2022
RELU-Function and Derived Function Review
Networking Academy, Haikou University of Economics, Hainan, China, 571127
* Corresponding author: gaoming@cas-harbour.org
The activation function plays an important role in training deep neural networks (DNNs) and improving their performance. The rectified linear unit (ReLU) function provides the non-linearity a DNN requires. However, few papers survey and compare the various ReLU-derived activation functions: most work focuses on the efficiency and accuracy a model achieves with a particular activation function, rather than on the properties of, and differences between, these functions. This paper therefore organizes the ReLU function and its derived functions, and compares the accuracy of the different ReLU variants on the MNIST dataset. In our experiments, the ReLU function performs best, while the SELU and ELU functions perform poorly.
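For reference, the three activation functions the abstract compares can be sketched in NumPy as below. The ReLU and ELU formulas are standard; the SELU constants are the fixed self-normalizing values from the original SELU paper. This is an illustrative sketch, not the implementation used in the paper's experiments.

```python
import numpy as np

def relu(x):
    # ReLU: max(0, x) — zero for negative inputs, identity otherwise
    return np.maximum(0.0, x)

def elu(x, alpha=1.0):
    # ELU: identity for x > 0, smooth negative branch alpha * (e^x - 1)
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def selu(x, alpha=1.6732632423543772, scale=1.0507009873554805):
    # SELU: a scaled ELU with fixed constants chosen for self-normalization
    return scale * np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

x = np.array([-2.0, -0.5, 0.0, 1.5])
print(relu(x))  # negatives clipped to 0
print(elu(x))   # negatives saturate toward -alpha
print(selu(x))  # ELU shape, rescaled by the scale constant
```

Unlike ReLU, the ELU and SELU variants produce negative outputs for negative inputs, which keeps mean activations closer to zero; the experiments in this paper nonetheless find plain ReLU the most accurate on MNIST.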
© The Authors, published by EDP Sciences, 2022
This is an Open Access article distributed under the terms of the Creative Commons Attribution License 4.0, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.