Chinese Journal of Computational Physics ›› 1995, Vol. 12 ›› Issue (2): 203-206.


A Fast Neural Network Algorithm for Classification

Chen Guoxin1, Zhang Chengfu2   

  1. Laboratory of Computational Physics, Institute of Applied Physics and Computational Mathematics, Beijing 100088;
    2. Department of Physics, Peking University, Beijing 100871
  • Received: 1993-08-09  Revised: 1994-06-22  Online: 1995-06-25  Published: 1995-06-25
  • Supported by:
    The National 863 High Technology Program and the National Natural Science Foundation of China

A HIGH-SPEED NEURAL NETWORK ALGORITHM FOR CLASSIFICATION PROBLEMS

Chen Guoxin1, Zhang Chengfu2   

  1. Laboratory of Computational Physics, Institute of Applied Physics and Computational Mathematics, P.O.Box 8009, Beijing 100088;
    2. Department of Physics, Peking University, Beijing 100871
  • Received:1993-08-09 Revised:1994-06-22 Online:1995-06-25 Published:1995-06-25

Abstract: The shortcomings of the BP (Back-Propagation) algorithm are analyzed in depth. On the basis of the BP algorithm, a fast algorithm that trains a multi-layer network layer by layer is proposed. Its main points are: a) the layers of the network are trained one at a time instead of all together; b) the hidden-unit layer is given explicit guidance; c) a weight-assignment rule, i.e. an appropriate "energy function", is chosen according to the specific problem; d) the merits of the BP algorithm are preserved. For a number of test problems the training speed is several orders of magnitude higher than that of the BP algorithm. The algorithm can also be used to study the running mechanism of multi-layer networks.
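
For reference, the "energy function" mentioned in point c) generalizes the quadratic error that standard BP minimizes by gradient descent; the textbook BP form (not the problem-specific weighting rules proposed in the paper) is

E = \frac{1}{2} \sum_{p} \sum_{k} \left( t_k^{(p)} - o_k^{(p)} \right)^2, \qquad \Delta w_{ij} = -\eta \, \frac{\partial E}{\partial w_{ij}},

where t_k^{(p)} and o_k^{(p)} are the target and actual outputs of output unit k for training pattern p, and \eta is the learning rate.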

Key words: artificial neural network, BP algorithm, multi-layer network, layer-by-layer training

Abstract: An Artificial Neural Network (ANN) is a complex network system composed of many simple elements extensively connected with one another, and it can be regarded as a simulation or abstraction of a biological neural system. The Multi-Layer Network (MLN) is one of the most important ANN models. The essential algorithm for training an MLN is the BP (Back-Propagation) algorithm. Based on the BP algorithm, a high-speed algorithm that trains the MLN layer by layer is proposed, with the following main points: a) the MLN is trained layer by layer instead of all at once; b) the hidden units are given explicit guidance; c) an appropriate "energy function" is chosen according to the specific problem; d) the merits of the BP algorithm are preserved. Simulation results show that the training speed increases by several orders of magnitude. The algorithm can also be applied to study the running mechanism of the MLN.
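
Below is a minimal, hypothetical sketch (Python/NumPy) of the layer-by-layer idea on the XOR problem. The explicit hidden-unit target codes (OR and AND of the inputs) and the plain quadratic error are assumptions made for illustration only; they do not reproduce the paper's hidden-unit guidance or its problem-specific energy functions.

# Hypothetical layer-by-layer training sketch on XOR (illustration only).
# The hidden-layer target codes and the quadratic error are assumptions,
# not the concrete rules proposed in the paper.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_layer(X, T, lr=1.0, epochs=10000):
    """Train a single sigmoid layer alone by gradient descent on 0.5*||Y - T||^2."""
    W = rng.normal(scale=0.1, size=(X.shape[1], T.shape[1]))
    b = np.zeros(T.shape[1])
    for _ in range(epochs):
        Y = sigmoid(X @ W + b)
        delta = (Y - T) * Y * (1.0 - Y)     # dE/d(pre-activation) for the quadratic error
        W -= lr * X.T @ delta / len(X)
        b -= lr * delta.mean(axis=0)
    return W, b

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)                      # XOR targets
H_target = np.array([[0, 0], [1, 0], [1, 0], [1, 1]], dtype=float)   # assumed hidden codes: OR, AND

# Step 1: train the hidden layer against its assigned target codes.
W1, b1 = train_layer(X, H_target)
H = sigmoid(X @ W1 + b1)

# Step 2: with the hidden layer frozen, train the output layer on H.
W2, b2 = train_layer(H, y)
print(np.round(sigmoid(H @ W2 + b2), 2))    # should approximate [0, 1, 1, 0]

With the hidden layer trained first and then frozen, each stage reduces to training a single sigmoid layer, which is what makes a layer-by-layer scheme cheap compared with propagating errors through the whole network at once.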

Key words: ANN, BP algorithm, MLN, training layer-by-layer
