2005 IEEE International Conference on Systems, Man and Cybernetics
Abstract: This paper presents a novel network, called the Scale Equalized Higher-order Neural Network (SEHNN), based on the concept of Scale Equalization (SE). We show that SE is particularly useful in alleviating the scale divergence problem that plagues higher-order networks. SE comprises two main processes: setting the initial weight vector and conducting the matrix transformation. An illustrative embodiment of SEHNN is built on the Sigma-Pi Network (SPN) and applied to the task of function approximation. Empirical results verify that SEHNN outperforms other higher-order networks in terms of computational efficiency: compared to the SPN and the Pi-Sigma Network (PSN), SEHNN requires fewer epochs to complete training.
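For readers unfamiliar with the higher-order terms involved, a minimal sketch of a single Sigma-Pi unit may help; the function and variable names below are illustrative assumptions, not taken from the paper, and the paper's exact formulation (term selection, activation) may differ.

```python
import numpy as np

def sigma_pi(x, weights, index_sets):
    """One Sigma-Pi unit: a weighted sum of products over input subsets.

    x          : 1-D input vector
    weights    : one weight per higher-order term
    index_sets : for each term, the input indices multiplied together
    """
    return sum(w * np.prod(x[list(s)]) for w, s in zip(weights, index_sets))

# Example: y = 0.5*x0 + 2.0*(x1*x2), a second-order term alongside a linear one
x = np.array([1.0, 2.0, 3.0])
y = sigma_pi(x, weights=[0.5, 2.0], index_sets=[(0,), (1, 2)])
```

Because the product terms can span very different magnitudes, the scales of the weight gradients diverge as the order grows, which is the problem SE is designed to alleviate.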