Parabola-Based Artificial Neural Network Activation Functions

Authors

Khachumov M., Emelyanova Yu., Khachumov V.

Annotation

The paper deals with the construction of logical functions using artificial neurons with various activation functions. In particular, we propose a method for constructing a parabola-based nonlinearity that expands the logical capabilities of individual neurons and of neural networks built from them. Like the sigmoid, the alternative nonlinearity is suitable for training a neural network by error backpropagation. We give an algorithm for training the XOR function in this way using two neurons. The paper also shows that the XOR function can be built with a single neuron by using a rotated parabola as the activation function (solving the XOR problem). We conduct experimental research on training neural networks with the standard sigmoid, the s-parabola, and the rotated parabola activation functions, which shows the promise of the proposed approach. The new nonlinearity makes it possible to reduce the overall computational cost of training a neural network and to speed up computation in tasks that require a large number of neurons.
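The exact forms of the s-parabola and rotated parabola activations are given in the paper itself, not in this annotation. As a minimal illustrative sketch of the underlying idea, the snippet below uses a plain quadratic f(z) = z² as a stand-in parabola-shaped activation and shows that a single neuron with such a nonlinearity can realize XOR, which a single sigmoid or threshold neuron cannot; the weights w1 = 1, w2 = -1, b = 0 are chosen here for illustration and are not taken from the paper.

```python
# Sketch (not the paper's exact formulation): one neuron with a parabola-shaped
# activation f(z) = z**2 realizes XOR, because the quadratic maps the collinear
# pre-activations of (0,0) and (1,1) to 0 and those of (0,1) and (1,0) to 1.

def parabola_neuron(x1, x2, w1=1.0, w2=-1.0, b=0.0):
    """Single neuron: weighted sum followed by a quadratic (parabola) activation."""
    z = w1 * x1 + w2 * x2 + b   # linear pre-activation
    return z ** 2               # parabola-shaped activation

# XOR truth table: output 0 for (0,0) and (1,1), output 1 for (0,1) and (1,0).
for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x1, x2, "->", parabola_neuron(x1, x2))
```

The same principle motivates the parabola-based activations studied in the paper: a non-monotonic, parabola-like response lets one neuron carve out a non-convex decision region, so fewer neurons are needed for logical functions such as XOR.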

External links

DOI: 10.1109/RusAutoCon58002.2023.10272855

Download the PDF or read it online in the IEEE Xplore digital library (registration required): https://ieeexplore.ieee.org/document/10272855

On the website of the Intelligent Control Laboratory of the Program Systems Institute of RAS: https://icontrol.psiras.ru/publications/

Reference link

Khachumov M., Emelyanova Yu., Khachumov V. Parabola-Based Artificial Neural Network Activation Functions // IEEE Proceedings of the 2023 International Russian Automation Conference (RusAutoCon), Sochi, Russian Federation, 2023, pp. 249–254.