Learning smooth dendrite morphological neurons by stochastic gradient descent for pattern classification

Wilfrido Gómez-Flores and Humberto Sossa.


Abstract

This article presents a learning algorithm for dendrite morphological neurons (DMN) based on stochastic gradient descent (SGD). In particular, we focus on a DMN topology that comprises spherical dendrites, smooth maximum activation function nodes, and a softmax output layer, whose original learning algorithm proceeds in two independent stages: (1) the dendrites' centroids are learned by k-means, and (2) the softmax layer weights are adjusted by gradient descent. A drawback of this learning method is that the two stages are decoupled; once the dendrites' centroids are defined, they remain static while the weights are learned, so no feedback corrects the dendrites' positions to improve classification performance. To overcome this issue, we derive the delta rules for adjusting the dendrites' centroids and the output layer weights by minimizing the cross-entropy loss function under an SGD scheme. This gradient descent-based learning is feasible because the smooth maximum activation function that interfaces the dendrite units with the output layer is differentiable. The proposed DMN is compared against eight morphological neuron models with distinct topologies and learning methods, as well as four well-established classifiers: support vector machine (SVM), multilayer perceptron (MLP), random forest (RF), and k-nearest neighbors (k-NN). In addition, classification performance is evaluated on 81 datasets. The experimental results show that the proposed method tends to outperform the other DMN methods and is competitive with, or even better than, SVM, MLP, RF, and k-NN. Thus, it is an effective alternative for pattern classification. Moreover, training DMNs with SGD standardizes this neural model, bringing it in line with current artificial neural networks.
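
The end-to-end training idea described above can be made concrete with a short sketch. The code below is a minimal illustration under stated assumptions, not the authors' exact formulation: it assumes the spherical dendrite response is the negative squared distance to the centroid, d_k(x) = -||x - c_k||^2, and uses the common log-sum-exp form of the smooth maximum, smax_beta(d) = (1/beta) log sum_k exp(beta * d_k), whose differentiability is what lets the cross-entropy gradient flow back to the centroids. The sharpness parameter beta, the class names, and the per-class dendrite layout are illustrative choices, and automatic differentiation stands in for the hand-derived delta rules.

# Minimal sketch (assumptions noted above): spherical dendrites, a
# log-sum-exp smooth maximum per class, a softmax output layer, and
# joint SGD training of centroids and weights on the cross-entropy loss.
import torch

class SmoothDMN(torch.nn.Module):
    def __init__(self, n_features, n_classes, dendrites_per_class, beta=1.0):
        super().__init__()
        # One set of spherical dendrite centroids per class (learnable).
        self.centroids = torch.nn.Parameter(
            torch.randn(n_classes, dendrites_per_class, n_features))
        # Output layer over the per-class smooth-maximum responses.
        self.out = torch.nn.Linear(n_classes, n_classes)
        self.beta = beta

    def forward(self, x):
        # Negative squared distance to every centroid:
        # shape (batch, classes, dendrites).
        diff = x[:, None, None, :] - self.centroids[None, :, :, :]
        d = -(diff ** 2).sum(-1)
        # Smooth maximum over each class's dendrites; differentiable,
        # so gradients reach the centroids.
        s = torch.logsumexp(self.beta * d, dim=-1) / self.beta
        return self.out(s)  # logits; softmax is folded into the loss

model = SmoothDMN(n_features=2, n_classes=3, dendrites_per_class=4)
opt = torch.optim.SGD(model.parameters(), lr=0.05)

x = torch.randn(32, 2)            # toy batch
y = torch.randint(0, 3, (32,))    # toy labels
for _ in range(100):
    opt.zero_grad()
    loss = torch.nn.functional.cross_entropy(model(x), y)
    loss.backward()               # gradients for centroids and weights jointly
    opt.step()

In this sketch, replacing torch.logsumexp with a hard maximum would break the gradient path to all non-winning centroids, which is precisely why the smooth maximum matters for single-stage training.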


https://doi.org/10.1016/j.neunet.2023.09.033
