National Taiwan Ocean University Institutional Repository:Item 987654321/50983

Please use this identifier to cite or link to this item:

Title: Improved representation-burden conservation network for learning nonstationary VQ
Contributors: Department of Electrical Engineering, National Taiwan Ocean University
Keywords: dynamic network
self-development networks
competitive learning
input density mapping
vector quantization
conscience principle
Date: 1998-08
Issue Date: 2018-11-06T02:21:34Z
Publisher: Neural Processing Letters
Abstract: In a recent publication [1], it was shown that a biologically plausible RCN (Representation-burden Conservation Network), in which conservation is achieved by bounding the summed representation-burden of all neurons at the constant 1, is effective in learning stationary vector quantization. Based on the conservation principle, this paper introduces a new approach for designing a dynamic RCN that processes both stationary and non-stationary inputs. We show that, in response to changes in the input statistics, the dynamic RCN improves on its original counterpart both in incremental learning capability and in self-organizing the network structure. Performance comparisons between the dynamic RCN and other self-development models are also presented. Simulation results show that the dynamic RCN is very effective in training a near-optimal vector quantizer, in that it manages to keep a balance between the equiprobable and equidistortion criteria.
Relation: 8(1) pp.41-53
Appears in Collections:[Department of Electrical Engineering] Periodical Articles
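The abstract's core idea can be illustrated with a minimal sketch of conscience-type competitive learning for vector quantization, where each unit carries a "burden" (its estimated win frequency) and the burdens are kept summing to 1, penalizing over-used units so that codeword usage tends toward equiprobability. This is an illustrative toy, not the paper's exact RCN update rule; all function and parameter names (`conscience_vq`, `bias_gain`, `beta`) are hypothetical.

```python
import numpy as np

def conscience_vq(data, n_units=4, epochs=20, lr=0.1,
                  bias_gain=10.0, beta=0.01, seed=0):
    """Competitive-learning VQ with a conscience-style burden term.

    Each unit i keeps a burden p[i] estimating how often it wins.
    The burdens always sum to 1 (the conservation constraint), and
    units whose burden exceeds the fair share 1/n_units are penalized
    during winner selection, pushing the quantizer toward
    equiprobable codeword use.
    """
    rng = np.random.default_rng(seed)
    # Initialize the codebook from random data points.
    codebook = data[rng.choice(len(data), n_units, replace=False)].astype(float)
    p = np.full(n_units, 1.0 / n_units)  # burdens, summing to 1

    for _ in range(epochs):
        for x in rng.permutation(data):
            d = np.sum((codebook - x) ** 2, axis=1)
            # Conscience bias: over-used units look farther away.
            biased = d + bias_gain * (p - 1.0 / n_units)
            w = int(np.argmin(biased))
            # Move the winning codeword toward the input.
            codebook[w] += lr * (x - codebook[w])
            # Exponential burden update; note sum(p) stays exactly 1:
            # (1 - beta) * 1 + beta = 1.
            p *= (1.0 - beta)
            p[w] += beta
    return codebook, p
```

Because the burden update is a convex combination, the conservation constraint (burdens summing to 1) holds at every step without explicit renormalization, which is the property the abstract attributes to the RCN's bounded summed representation-burden.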

All items in NTOUR are protected by copyright, with all rights reserved.

