Adaptive Activation Functions Using Fractional Calculus

Julio Zamora Esquivel, Adan Cruz Vargas, Rodrigo Camacho Perez, Paulo Lopez Meyer, Hector Cordourier, Omesh Tickoo; Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV) Workshops, 2019, pp. 0-0


We introduce a generalization methodology for the automatic selection of the activation functions inside a neural network, taking advantage of concepts defined in fractional calculus. This methodology enables the neural network to define and optimize its own activation functions during the training process, by treating the fractional order of the derivative of a given primitive activation function as an additional trainable hyper-parameter. By following this approach, the neurons inside the network can adjust their activation functions, e.g. from MLP to RBF networks, to best fit the input data and reduce the output error. The results show the benefits of this technique implemented on a ResNet18 topology, which outperforms the accuracy of a ResNet100 trained on CIFAR10 reported in the literature.
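The core idea can be sketched in a few lines of PyTorch: make the derivative order `a` of a primitive activation a learnable parameter, and evaluate the order-`a` derivative numerically. This is a minimal illustration only, not the paper's exact formulation; the class name, the choice of sigmoid as the primitive, and the use of a truncated Grünwald-Letnikov finite-difference series are assumptions made here for concreteness.

```python
import torch

class FractionalActivation(torch.nn.Module):
    """Sketch of a trainable fractional-order activation (hypothetical;
    the paper's exact parameterization may differ). The derivative order
    `a` of a primitive activation f (sigmoid here) is a learnable
    parameter, and D^a f is approximated with a truncated
    Grunwald-Letnikov series."""

    def __init__(self, order=0.5, h=0.1, terms=32):
        super().__init__()
        self.a = torch.nn.Parameter(torch.tensor(float(order)))  # trainable order
        self.h = float(h)    # finite-difference step of the GL series
        self.terms = terms   # truncation length of the series

    def forward(self, x):
        a = self.a
        k = torch.arange(1, self.terms, dtype=x.dtype, device=x.device)
        # GL weights w_k = (-1)^k * binom(a, k), built with the stable
        # recurrence w_0 = 1, w_k = w_{k-1} * (k - 1 - a) / k,
        # which stays differentiable with respect to a.
        w = torch.cat([torch.ones(1, dtype=x.dtype, device=x.device),
                       torch.cumprod((k - 1 - a) / k, dim=0)])
        shifts = torch.arange(self.terms, dtype=x.dtype, device=x.device) * self.h
        # D^a f(x) ~= h^(-a) * sum_k w_k * f(x - k*h)
        vals = torch.sigmoid(x.unsqueeze(-1) - shifts)
        return (self.h ** -a) * (vals * w).sum(dim=-1)
```

At `a = 0` this module returns the sigmoid itself, and as `a` approaches 1 it approaches the sigmoid's first derivative, a bell-shaped, RBF-like curve; since `a` receives gradients like any other weight, gradient descent can move each neuron between these regimes, in the spirit of the MLP-to-RBF example in the abstract.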

Related Material

@InProceedings{Zamora_Esquivel_2019_ICCV,
  author = {Zamora Esquivel, Julio and Cruz Vargas, Adan and Camacho Perez, Rodrigo and Lopez Meyer, Paulo and Cordourier, Hector and Tickoo, Omesh},
  title = {Adaptive Activation Functions Using Fractional Calculus},
  booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV) Workshops},
  month = {Oct},
  year = {2019}
}