The paper addresses the construction of logical functions using artificial neurons with various activation functions. In particular, we propose a method for constructing a parabola-based nonlinearity that expands the logical capabilities of neurons and of neural networks built from them. Like the sigmoid, the alternative nonlinearity is suitable for training a neural network by error backpropagation. We give an algorithm for tuning the XOR function with two such neurons. The paper also shows that the XOR function can be built with a single neuron by applying a rotated parabola (solving the XOR problem). We conduct experimental research on neural network training with the standard sigmoid, the s-parabola, and the rotated parabola activation functions; the results demonstrate the promise of the proposed approach. The new nonlinearity makes it possible to reduce the overall computational complexity of training a neural network and to speed up calculations in tasks that require a large number of neurons.
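The single-neuron XOR claim above can be illustrated with a minimal sketch. Note that this is an assumption-laden toy example, not the paper's exact formulation: here the parabola activation is taken as f(z) = 1 - (z - 1)^2, and the neuron's weights and bias are chosen by hand rather than by the backpropagation procedure the paper describes. The key point it demonstrates is that XOR is non-monotone in the weighted sum z = x1 + x2 (values 0, 1, 2 map to outputs 0, 1, 0), so a non-monotone, parabola-shaped activation lets one neuron realize it.

```python
def parabola_activation(z):
    """Downward parabola peaking at z = 1: f(0) = f(2) = 0, f(1) = 1.
    Illustrative choice only; the paper's s-parabola / rotated parabola
    may be parameterized differently."""
    return 1.0 - (z - 1.0) ** 2


def xor_neuron(x1, x2, w1=1.0, w2=1.0, b=0.0):
    """A single neuron: weighted sum followed by the parabola nonlinearity.
    Weights are hand-picked here, not learned."""
    z = w1 * x1 + w2 * x2 + b
    return parabola_activation(z)


# The four XOR input pairs map to 0, 1, 1, 0.
for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, "->", xor_neuron(x1, x2))
```

A sigmoid neuron cannot do this with one unit, since a monotone activation of a linear sum yields a linearly separable decision boundary, while the parabola's two roots give a band-shaped boundary that isolates the middle value z = 1.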
Download the PDF or read online in the IEEE Xplore library (in English, registration required): https://ieeexplore.ieee.org/document/10272855
On the website of the Intelligent Control Laboratory of the Ailamazyan Program Systems Institute of the Russian Academy of Sciences: https://icontrol.psiras.ru/publications/
Khachumov M., Emelyanova Yu., Khachumov V. Parabola-Based Artificial Neural Network Activation Functions // Proceedings of the 2023 IEEE International Russian Automation Conference (RusAutoCon), Sochi, Russian Federation, 2023, pp. 249–254.