Authors: Tomáš Buriánek; Sebastián Basterrech
Addresses: Department of Computer Science, Faculty of Electrical Engineering and Computer Science, VŠB-Technical University of Ostrava, Ostrava-Poruba, Czech Republic; Department of Computer Science, Faculty of Electrical Engineering, Czech Technical University, Prague, Czech Republic
Abstract: A type of feedforward neural network with a specific architecture was developed around ten years ago under the name of flexible neural tree (FNT). The model has two families of adjustable parameters: the parameters present in the activation functions of the neurons, and the topology of the tree. The method uses meta-heuristic algorithms to find a good tree topology and the set of embedded parameters. The technique has been successfully applied to machine learning problems with time-series and sequential data. The canonical FNT was introduced with the radial basis function as the activation function of the neurons. In this article, we analyse the performance of the FNT when different types of activation functions are present in the tree. We present a comparative analysis among different types of neurons. We study the performance of the model when the following four types of neurons are used: Gaussian, hyperbolic tangent, the Fermi function, and a linear variation of the Fermi function. The empirical analysis was conducted on a well-known simulated time-series benchmark and a real-world networking problem.
Keywords: feedforward neural network; flexible neural tree; FNT; neuron activation function; time-series modelling; temporal learning.
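The four activation functions compared in the abstract can be sketched as below. This is a minimal illustration, not the paper's implementation: the parameterization of the Gaussian (width and centre) and the exact form of the "linear variation of the Fermi function" are assumptions, since the FNT's flexible neurons carry tunable per-node parameters not shown here.

```python
import math

def gaussian(x, a=1.0, b=0.0):
    # Radial basis (Gaussian) activation; a = width, b = centre.
    # In the canonical FNT, a and b are adjustable per-neuron parameters.
    return math.exp(-((x - b) / a) ** 2)

def hyperbolic_tangent(x):
    # Hyperbolic tangent activation, range (-1, 1).
    return math.tanh(x)

def fermi(x):
    # Fermi function (logistic sigmoid), range (0, 1).
    return 1.0 / (1.0 + math.exp(-x))

def linear_fermi(x, alpha=0.1):
    # A hypothetical "linear variation" of the Fermi function:
    # a sigmoid with an added linear term (the paper's exact
    # definition may differ).
    return fermi(x) + alpha * x
```

Each neuron in the tree applies one such function to a weighted combination of its children's outputs; the meta-heuristic search then tunes both the tree shape and these per-neuron parameters.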
International Journal of Advanced Intelligence Paradigms, 2023 Vol.25 No.3/4, pp.360 - 373
Received: 04 Aug 2017
Accepted: 10 Mar 2018
Published online: 19 Jul 2023