BulDML at Institute of Mathematics and Informatics > ITHEA > International Journal ITA > 2003 > Volume 10 Number 2

Please use this identifier to cite or link to this item: http://hdl.handle.net/10525/933

Title: Applications of the Sufficiency Principle in Acceleration of Neural Networks Training
Authors: Krissilov, Victor
Krissilov, Anatoly
Oleshko, Dmitry
Keywords: Neural Networks
Training
Issue Date: 2003
Publisher: Institute of Information Theories and Applications FOI ITHEA
Abstract: One of the problems in solving AI tasks by neurocomputing methods is considerable training time. This problem is especially pronounced when high quality is required in forecast reliability or pattern recognition. Some formalised ways of increasing networks' training speed without losing precision are proposed here. The offered approaches are based on the Sufficiency Principle, which is a formal representation of the aim of a concrete task and the conditions (limitations) of its solving [1]. This is a development of the concept that extends the formal description of aims to the context of such AI tasks as classification, pattern recognition, estimation, etc.
URI: http://hdl.handle.net/10525/933
ISSN: 1313-0463
Appears in Collections:Volume 10 Number 2
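The abstract's core idea — stopping training as soon as the task's stated aim (a required quality level) is reached, rather than training to maximal precision — can be sketched as follows. This is an illustrative interpretation, not the authors' algorithm from the paper: the toy task, the logistic unit, and the `required_accuracy` threshold are all assumptions introduced here.

```python
import math
import random

random.seed(0)

# Toy linearly separable task: label points by the sign of x1 + x2.
X = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(200)]
y = [1.0 if x1 + x2 > 0 else 0.0 for (x1, x2) in X]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(X, y, required_accuracy=None, max_epochs=500, lr=0.5):
    """Batch gradient descent on a single logistic unit.

    If required_accuracy is given, training stops as soon as that
    accuracy is reached (a sufficiency-style stopping criterion);
    otherwise it runs for all max_epochs.
    Returns (epochs_used, final_accuracy).
    """
    w = [0.0, 0.0]
    b = 0.0
    acc = 0.0
    n = len(y)
    for epoch in range(1, max_epochs + 1):
        gw = [0.0, 0.0]
        gb = 0.0
        correct = 0
        for (x1, x2), t in zip(X, y):
            p = sigmoid(w[0] * x1 + w[1] * x2 + b)
            e = p - t
            gw[0] += e * x1
            gw[1] += e * x2
            gb += e
            if (p > 0.5) == (t > 0.5):
                correct += 1
        w[0] -= lr * gw[0] / n
        w[1] -= lr * gw[1] / n
        b -= lr * gb / n
        acc = correct / n
        if required_accuracy is not None and acc >= required_accuracy:
            return epoch, acc
    return max_epochs, acc

# Sufficiency-style stop at the accuracy the task actually demands
# versus training for the full epoch budget.
epochs_sufficient, acc_sufficient = train(X, y, required_accuracy=0.95)
epochs_full, acc_full = train(X, y)
print(epochs_sufficient, epochs_full)
```

On this toy task the sufficiency-style criterion halts in far fewer epochs than the fixed budget while still meeting the required accuracy, which is the kind of speed-up the abstract claims for formally stated task aims.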

Files in This Item:

File: ijita10-2-p10.pdf
Size: 100.55 kB
Format: Adobe PDF

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
