
Please use this identifier to cite or link to this item: http://ntour.ntou.edu.tw:8080/ir/handle/987654321/28624

Title: Scale Equalized Higher-order Neural Networks
Authors: Chien-Ming Lin;Keng-Hsuan Wu;Jung-Hua Wang
Contributors: NTOU:Department of Electrical Engineering
National Taiwan Ocean University: Department of Electrical Engineering
Keywords: SEHNN;Scale Equalization;Higher-order Neural Network;function approximation
Date: 2005-10
Issue Date: 2011-10-21T02:38:35Z
Publisher: 2005 IEEE International Conference on Systems, Man and Cybernetics
Abstract: This paper presents a novel network, called the Scale Equalized Higher-order Neural Network (SEHNN), based on the concept of Scale Equalization (SE). We show that SE is particularly useful in alleviating the scale divergence problem that plagues higher-order networks. SE comprises two main processes: setting the initial weight vector and conducting the matrix transformation. An illustrative embodiment of SEHNN is built on the Sigma-Pi Network (SPN) and applied to the task of function approximation. Empirical results verify that SEHNN outperforms other higher-order networks in terms of computational efficiency. Compared to the SPN and the Pi-Sigma Network (PSN), SEHNN requires fewer epochs to complete the training process.
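The abstract describes SE only at a high level (initial-weight setting plus a matrix transformation), so the following Python sketch is not the paper's method. It only illustrates, under assumed details, why the unequal scales of higher-order product terms in a Sigma-Pi-style model cause trouble in function approximation, and how a simple per-term rescaling (a hypothetical stand-in for SE) keeps the terms comparable.

```python
# Illustrative sketch only: a generic higher-order (Sigma-Pi-style) model for
# 1-D function approximation, with a simple per-term rescaling to hint at why
# "scale equalization" matters. The paper's actual SE procedure (initial
# weight vector + matrix transformation) is NOT reproduced here.
import numpy as np

def higher_order_features(x, order=5):
    """Build product terms x^1 ... x^order for a scalar input.
    Higher-order terms grow or shrink rapidly in magnitude, which is the
    scale-divergence problem the paper addresses."""
    return np.stack([x ** k for k in range(1, order + 1)], axis=1)

def equalize(features):
    """Hypothetical rescaling: divide each term by its RMS over the training
    set so all terms contribute on a comparable scale."""
    scale = np.sqrt(np.mean(features ** 2, axis=0)) + 1e-12
    return features / scale, scale

# Toy function-approximation task: fit y = sin(pi * x) on [0, 2].
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 2.0, size=200)
y = np.sin(np.pi * x)

phi, _ = equalize(higher_order_features(x, order=5))

# Linear output weights fitted by least squares (a stand-in for the
# gradient-based training used with the Sigma-Pi Network in the paper).
w, *_ = np.linalg.lstsq(phi, y, rcond=None)
pred = phi @ w
print("training RMSE:", np.sqrt(np.mean((pred - y) ** 2)))
```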
Relation: 1, pp.816-821
URI: http://ntour.ntou.edu.tw/handle/987654321/28624
Appears in Collections: [Department of Electrical Engineering] Lectures and Conference Papers

Files in This Item:

File: index.html (0 Kb, HTML)
